100+ datasets found
  1. Sparse Traffic Dataset

    • universe.roboflow.com
    zip
    Updated Nov 29, 2023
    Cite
    UGA Grad School (2023). Sparse Traffic Dataset [Dataset]. https://universe.roboflow.com/uga-grad-school/sparse-traffic-dataset
    Explore at:
    Available download formats: zip
    Dataset updated
    Nov 29, 2023
    Dataset provided by
    University of Georgia Graduate School (http://grad.uga.edu/)
    Authors
    UGA Grad School
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Variables measured
    Traffic
    Description

    Sparse Traffic Dataset

    ## Overview
    
    Sparse Traffic Dataset is a dataset for classification tasks - it contains Traffic annotations for 244 images.
    
    ## Getting Started
    
    You can download this dataset for use within your own projects, or fork it into a workspace on Roboflow to create your own model.
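
    A minimal sketch of pulling this dataset locally with the roboflow pip package is below. The workspace and project slugs are taken from the dataset URL above; the API key, version number, and export format are placeholders, not part of this listing.

    ```python
    # Sketch: download the Sparse Traffic Dataset from Roboflow Universe.
    # API key, version number (1) and export format ("folder") are assumptions.
    from roboflow import Roboflow

    rf = Roboflow(api_key="YOUR_API_KEY")  # placeholder key
    project = rf.workspace("uga-grad-school").project("sparse-traffic-dataset")
    dataset = project.version(1).download("folder")  # assumed version/format
    print(dataset.location)  # local folder containing the 244 annotated images
    ```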
    
      ## License
    
      This dataset is available under the [CC BY 4.0 license](https://creativecommons.org/licenses/by/4.0/).
    
  2. Data from: Sparse Machine Learning Methods for Understanding Large Text...

    • data.nasa.gov
    • gimi9.com
    • +3more
    Updated Mar 31, 2025
    Cite
    nasa.gov (2025). Sparse Machine Learning Methods for Understanding Large Text Corpora [Dataset]. https://data.nasa.gov/dataset/sparse-machine-learning-methods-for-understanding-large-text-corpora
    Explore at:
    Dataset updated
    Mar 31, 2025
    Dataset provided by
    NASA (http://nasa.gov/)
    Description

    Sparse machine learning has recently emerged as a powerful tool to obtain models of high-dimensional data with a high degree of interpretability, at low computational cost. This paper posits that these methods can be extremely useful for understanding large collections of text documents, without requiring user expertise in machine learning. Our approach relies on three main ingredients: (a) multi-document text summarization and (b) comparative summarization of two corpora, both using sparse regression or classification; (c) sparse principal components and sparse graphical models for unsupervised analysis and visualization of large text corpora. We validate our approach using a corpus of Aviation Safety Reporting System (ASRS) reports and demonstrate that the methods can reveal causal and contributing factors in runway incursions. Furthermore, we show that the methods automatically discover four main tasks that pilots perform during flight, which can aid in further understanding the causal and contributing factors to runway incursions and other drivers for aviation safety incidents. Citation: L. El Ghaoui, G. C. Li, V. Duong, V. Pham, A. N. Srivastava, and K. Bhaduri, “Sparse Machine Learning Methods for Understanding Large Text Corpora,” Proceedings of the Conference on Intelligent Data Understanding, 2011.
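
    A hedged sketch of the comparative-summarization ingredient described above, using L1-penalized (sparse) logistic regression on bag-of-words counts via scikit-learn. The toy corpora and parameters are placeholders, not the paper's setup.

    ```python
    # Sketch: comparative summarization of two corpora via sparse (L1) logistic
    # regression on bag-of-words features. Nonzero weights pick out the few
    # terms that distinguish corpus A from corpus B. Illustrative only.
    import numpy as np
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.linear_model import LogisticRegression

    corpus_a = ["runway incursion during taxi", "tower cleared wrong runway"]
    corpus_b = ["smooth departure from gate", "normal climb to cruise altitude"]
    docs = corpus_a + corpus_b
    labels = np.array([1] * len(corpus_a) + [0] * len(corpus_b))

    vec = CountVectorizer()
    X = vec.fit_transform(docs)  # sparse document-term matrix

    clf = LogisticRegression(penalty="l1", solver="liblinear", C=1.0)
    clf.fit(X, labels)

    terms = np.array(vec.get_feature_names_out())
    weights = clf.coef_.ravel()
    print(list(zip(terms[weights != 0], weights[weights != 0])))
    ```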

  3. Floreat-f2 - Sparse Point Cloud LAS - Aug 2021 - Datasets - data.wa.gov.au

    • catalogue.data.wa.gov.au
    Updated Aug 17, 2021
    + more versions
    Cite
    (2021). Floreat-f2 - Sparse Point Cloud LAS - Aug 2021 - Datasets - data.wa.gov.au [Dataset]. https://catalogue.data.wa.gov.au/dataset/floreat-f2-sparse-point-cloud-las-aug-2021
    Explore at:
    Dataset updated
    Aug 17, 2021
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Area covered
    Floreat, Western Australia
    Description

    This is the first capture of the area North of the Floreat Surf Life Saving Club: these sand dunes were captured by UAV imagery on 17 Aug 2021 for the Cambridge Coastcare beach dune modelling and monitoring project. It was created as part of an initiative to innovatively monitor coastal dune erosion and visualize these changes over time for future management and mitigation. This data includes Orthomosaic, DSM, DTM, Elevation Contours, 3D Mesh, 3D Point Cloud and LiDAR constructed from over 500 images captured from UAV (drone) and processed in Pix4D. All datasets can be freely accessed through DataWA, along with links to an animated video fly-through of this 3D data model and to the Sketchfab visualisation of the 3D textured mesh. The dataset is a Sparse 3D Point Cloud (i.e. a 3D set of points): the X, Y, Z position and colour information is stored for each point of the point cloud. This dataset is of the area North of Floreat SLSC (2021 Flight-2 project area).

  4. Sparse Solutions for Single Class SVMs: A Bi-Criterion Approach - Dataset -...

    • data.nasa.gov
    Updated Mar 31, 2025
    + more versions
    Cite
    nasa.gov (2025). Sparse Solutions for Single Class SVMs: A Bi-Criterion Approach - Dataset - NASA Open Data Portal [Dataset]. https://data.nasa.gov/dataset/sparse-solutions-for-single-class-svms-a-bi-criterion-approach
    Explore at:
    Dataset updated
    Mar 31, 2025
    Dataset provided by
    NASA (http://nasa.gov/)
    Description

    In this paper we propose an innovative learning algorithm - a variation of the one-class ν-Support Vector Machine (SVM) learning algorithm - to produce sparser solutions with much reduced computational complexity. The proposed technique returns an approximate solution, nearly as good as the solution set obtained by the classical approach, by minimizing the original risk function along with a regularization term. We introduce a bi-criterion optimization that helps guide the search towards the optimal set in much reduced time. The outcome of the proposed learning technique was compared with the benchmark one-class Support Vector Machines algorithm, which more often leads to solutions with redundant support vectors. Throughout the analysis, the problem size for both optimization routines was kept consistent. We have tested the proposed algorithm on a variety of data sources under different conditions to demonstrate its effectiveness. In all cases the proposed algorithm closely preserves the accuracy of standard one-class ν-SVMs while reducing both training time and test time by several factors.
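
    For orientation, a hedged sketch of the baseline one-class ν-SVM that such sparse variants are compared against, using scikit-learn on synthetic data (this is not the paper's algorithm or data).

    ```python
    # Sketch: baseline one-class nu-SVM. The number of support vectors drives
    # test-time cost, which is what sparser solutions aim to reduce.
    # Data and parameters are illustrative only.
    import numpy as np
    from sklearn.svm import OneClassSVM

    rng = np.random.default_rng(0)
    X_train = rng.normal(size=(500, 2))                    # nominal data
    X_test = np.vstack([rng.normal(size=(50, 2)),
                        rng.normal(loc=5.0, size=(50, 2))])  # half anomalous

    # nu upper-bounds the outlier fraction and lower-bounds the support-vector fraction.
    ocsvm = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale").fit(X_train)
    print("support vectors:", ocsvm.support_vectors_.shape[0])
    print("predictions (+1 inlier / -1 outlier):", ocsvm.predict(X_test)[:10])
    ```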

  5. UF Sparse Matrix Dataset

    • service.tib.eu
    • resodate.org
    Updated Nov 25, 2024
    Cite
    (2024). UF Sparse Matrix Dataset [Dataset]. https://service.tib.eu/ldmservice/dataset/uf-sparse-matrix-dataset
    Explore at:
    Dataset updated
    Nov 25, 2024
    Description

    Sparse matrices from the UF Sparse Matrix Collection, used for testing on 274-node systems.
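
    Matrices from the UF (SuiteSparse) collection are commonly distributed in Matrix Market (.mtx) form; a minimal hedged sketch of loading one with SciPy is below. The file name is a placeholder, since the listing does not name the specific matrices used.

    ```python
    # Sketch: load a UF/SuiteSparse matrix stored in Matrix Market format.
    import scipy.io
    import scipy.sparse

    A = scipy.io.mmread("some_uf_matrix.mtx")   # placeholder file; returns COO
    A = scipy.sparse.csr_matrix(A)              # convert for fast mat-vec
    print(A.shape, A.nnz, "nonzeros")
    ```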

  6. NEST

    • huggingface.co
    Cite
    Vincent Maillou, NEST [Dataset]. https://huggingface.co/datasets/vincent-maillou/NEST
    Explore at:
    Authors
    Vincent Maillou
    Description

    NEST: NEw Sparse maTrix dataset

    NEST is a new sparse matrix dataset. Its purpose is to define a modern set of sparse matrices arising in current, relevant scientific applications, in order to further improve sparse numerical methods. NEST can be seen as a continuation of the Sparse Matrix Market datasets and contains some curated sparse matrices from it as legacy references. The matrices are stored as COO sparse matrices in scipy.sparse .npz archive format. Conversion utils to/from the… See the full description on the dataset page: https://huggingface.co/datasets/vincent-maillou/NEST.
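
    Since the page states the matrices are COO matrices saved as scipy.sparse .npz archives, a minimal loading sketch would look like the following (the file name is a placeholder).

    ```python
    # Sketch: load one NEST matrix, assuming it was written with scipy.sparse.save_npz.
    import scipy.sparse

    A = scipy.sparse.load_npz("some_nest_matrix.npz")  # placeholder file; COO matrix
    print(type(A), A.shape, A.nnz)
    A_csr = A.tocsr()  # convert if row-wise access or mat-vec is needed
    ```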

  7. Elastic Source Imaging with Sparse Data - Dataset - LDM

    • service.tib.eu
    Updated Dec 16, 2024
    Share
    FacebookFacebook
    TwitterTwitter
    Email
    Click to copy link
    Link copied
    Close
    Cite
    (2024). Elastic Source Imaging with Sparse Data - Dataset - LDM [Dataset]. https://service.tib.eu/ldmservice/dataset/elastic-source-imaging-with-sparse-data
    Explore at:
    Dataset updated
    Dec 16, 2024
    Description

    The dataset used in this paper for elastic source imaging with very sparse data, both in space and time.

  8. Python Sparse Matrix - Open Problem Multimodal

    • kaggle.com
    zip
    Updated Oct 14, 2022
    Cite
    Wei Xie (2022). Python Sparse Matrix - Open Problem Multimodal [Dataset]. https://www.kaggle.com/datasets/stautxie/python-sparse-matrix-open-problem-multimodal
    Explore at:
    Available download formats: zip (10844965431 bytes)
    Dataset updated
    Oct 14, 2022
    Authors
    Wei Xie
    License

    https://cdla.io/sharing-1-0/

    Description

    Dataset

    This dataset was created by Wei Xie

    Released under Community Data License Agreement - Sharing - Version 1.0

    Contents

  9. Grating Lobes and Spatial Aliasing in Sparse Array Beampatterns

    • catalog.data.gov
    • gimi9.com
    • +1more
    Updated Jul 29, 2022
    + more versions
    Cite
    National Institute of Standards and Technology (2022). Grating Lobes and Spatial Aliasing in Sparse Array Beampatterns [Dataset]. https://catalog.data.gov/dataset/grating-lobes-and-spatial-aliasing-in-sparse-array-beampatterns
    Explore at:
    Dataset updated
    Jul 29, 2022
    Dataset provided by
    National Institute of Standards and Technology (http://www.nist.gov/)
    Description

    Calculated beam pattern in Fourier space of a unitary input given two sparsely sampled synthetic aperture arrays: 1. a regularly spaced array sampled at 2*lambda, where lambda is the wavelength of the 40 GHz signal, and 2. the regularly spaced array with random perturbations (of order ~<lambda) to the (x,y) spatial location of each sample point. This dataset is published in "An Overview of Advances in Signal Processing Techniques for Classical and Quantum Wideband Synthetic Apertures" by Vouras, et al. in IEEE Selected Topics in Signal Processing Recent Advances in Wideband Signal Processing for Classical and Quantum Synthetic Apertures.
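
    A hedged, 1-D numpy sketch of the effect this dataset documents: an array sampled at 2λ produces grating lobes at sin θ = m/2, and random (of order < λ) position perturbations smear them out. The parameters below are assumptions for illustration; the actual dataset is a 2-D synthetic aperture at 40 GHz.

    ```python
    # Sketch: array factor of a unitary input for a regular 2*lambda-spaced array
    # versus the same array with random position perturbations (~< lambda).
    import numpy as np

    c, f = 3e8, 40e9
    lam = c / f
    n = 32
    rng = np.random.default_rng(0)
    x_regular = np.arange(n) * 2 * lam                       # 2*lambda spacing
    x_perturbed = x_regular + rng.uniform(-0.5, 0.5, n) * lam

    theta = np.linspace(-np.pi / 2, np.pi / 2, 2001)
    k = 2 * np.pi / lam

    def beampattern_db(x):
        # Array factor of an all-ones (unitary) input, normalized to 0 dB peak.
        af = np.exp(1j * k * np.outer(np.sin(theta), x)).sum(axis=1)
        return 20 * np.log10(np.abs(af) / np.abs(af).max())

    bp_reg = beampattern_db(x_regular)     # grating lobes at sin(theta) = m/2
    bp_pert = beampattern_db(x_perturbed)  # grating lobes partially suppressed
    print("samples above -3 dB, regular:", (bp_reg > -3).sum(),
          "perturbed:", (bp_pert > -3).sum())
    ```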

  10. sparse

    • huggingface.co
    Updated Sep 14, 2024
    Cite
    Diksha Shrivastava (2024). sparse [Dataset]. https://huggingface.co/datasets/diksha-shrivastava13/sparse
    Explore at:
    Croissant (a format for machine-learning datasets; learn more at mlcommons.org/croissant)
    Dataset updated
    Sep 14, 2024
    Authors
    Diksha Shrivastava
    License

    MIT License: https://opensource.org/licenses/MIT
    License information was derived automatically

    Description

    diksha-shrivastava13/sparse dataset hosted on Hugging Face and contributed by the HF Datasets community

  11. CITEseq22: Sparse RAW counts data

    • kaggle.com
    zip
    Updated Nov 20, 2022
    Share
    FacebookFacebook
    TwitterTwitter
    Email
    Click to copy link
    Link copied
    Close
    Cite
    Antonina Dolgorukova (2022). CITEseq22: Sparse RAW counts data [Dataset]. https://www.kaggle.com/datasets/antoninadolgorukova/sparse-raw-counts-data-open-problems-multimodal
    Explore at:
    Available download formats: zip (7575907904 bytes)
    Dataset updated
    Nov 20, 2022
    Authors
    Antonina Dolgorukova
    License

    https://cdla.io/sharing-1-0/

    Description

    Sparse matrices for raw counts data for open-problems-multimodal competition. The script for generating sparse matrices was shared by Wei Xie and can be found here.

    The similar dataset for normalized and log1p transformed counts for the same cells can be found here.

    Each h5 file contains 5 arrays (see the reassembly sketch after this list):

    axis0 (row index from the original h5 file)

    axis1 (column index from the original h5 file)

    value_i (the attribute i in a dgCMatrix in R, or the indices array of a csc_array in scipy.sparse)

    value_p (the attribute p in a dgCMatrix in R, or the indptr array of a csc_array in scipy.sparse)

    value_x (the attribute x in a dgCMatrix in R, or the data array of a csc_array in scipy.sparse)
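
    A hedged sketch of reassembling one of these files into a scipy.sparse.csc_array. The file name and the key paths inside the h5 file are assumptions (the list above only names the five arrays); adjust them to the real layout.

    ```python
    # Sketch: rebuild a csc_array from the five arrays stored in each h5 file.
    import h5py
    from scipy.sparse import csc_array

    with h5py.File("cite_raw_counts_sparse.h5", "r") as f:  # placeholder name/paths
        axis0 = f["axis0"][:]      # row labels from the original h5 file
        axis1 = f["axis1"][:]      # column labels from the original h5 file
        indices = f["value_i"][:]  # CSC 'indices': row index of each nonzero
        indptr = f["value_p"][:]   # CSC 'indptr': column pointer array
        data = f["value_x"][:]     # CSC 'data': nonzero values

    X = csc_array((data, indices, indptr), shape=(len(axis0), len(axis1)))
    print(X.shape, X.nnz)
    ```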

  12. sparse-eval-results-private

    • huggingface.co
    Updated Jun 8, 2025
    + more versions
    Share
    FacebookFacebook
    TwitterTwitter
    Email
    Click to copy link
    Link copied
    Close
    Cite
    Jie-Han Chen (2025). sparse-eval-results-private [Dataset]. https://huggingface.co/datasets/elichen-skymizer/sparse-eval-results-private
    Explore at:
    Dataset updated
    Jun 8, 2025
    Authors
    Jie-Han Chen
    Description

    elichen-skymizer/sparse-eval-results-private dataset hosted on Hugging Face and contributed by the HF Datasets community

  13. Sparse Models Serving Market Research Report 2033

    • growthmarketreports.com
    csv, pdf, pptx
    Updated Oct 7, 2025
    Share
    FacebookFacebook
    TwitterTwitter
    Email
    Click to copy link
    Link copied
    Close
    Cite
    Growth Market Reports (2025). Sparse Models Serving Market Research Report 2033 [Dataset]. https://growthmarketreports.com/report/sparse-models-serving-market
    Explore at:
    Available download formats: pdf, pptx, csv
    Dataset updated
    Oct 7, 2025
    Dataset authored and provided by
    Growth Market Reports
    Time period covered
    2024 - 2032
    Area covered
    Global
    Description

    Sparse Models Serving Market Outlook



    According to our latest research, the global sparse models serving market size reached USD 1.35 billion in 2024, reflecting the rapid adoption of efficient AI model deployment across industries. With a robust compound annual growth rate (CAGR) of 31.7% projected between 2025 and 2033, the market is forecasted to achieve a value of USD 14.52 billion by 2033. This remarkable growth trajectory is primarily driven by the increasing demand for scalable, low-latency AI inference solutions, as enterprises seek to optimize resource utilization and reduce operational costs in an era of data explosion and AI-driven digital transformation.




    A key growth factor for the sparse models serving market is the exponential increase in the deployment of artificial intelligence and machine learning solutions across diverse sectors. As organizations generate and process massive volumes of data, the need for highly efficient and scalable AI models has become paramount. Sparse models, which leverage techniques such as weight pruning and quantization, enable faster inference and reduced memory footprints without compromising accuracy. This capability is particularly valuable for real-time applications in industries such as finance, healthcare, and retail, where latency and resource efficiency are critical. The widespread adoption of edge computing and IoT devices further amplifies the demand for sparse model serving, as these environments often operate under stringent computational constraints.




    Another significant driver is the ongoing advancements in hardware accelerators and AI infrastructure. The evolution of specialized hardware, such as GPUs, TPUs, and custom AI chips, has enabled the efficient execution of sparse models at scale. Leading technology providers are investing heavily in the development of optimized software frameworks and libraries that support sparse computation, making it easier for organizations to deploy and manage these models in production environments. The integration of sparse model serving with cloud-native platforms and container orchestration systems has further streamlined the operationalization of AI workloads, allowing enterprises to achieve seamless scalability, high availability, and cost-effectiveness. This technological synergy is accelerating the adoption of sparse models serving across both on-premises and cloud deployments.




    The growing emphasis on sustainable AI and green computing is also propelling the market forward. Sparse models consume significantly less energy and computational resources compared to dense models, aligning with the global push towards environmentally responsible technology practices. Enterprises are increasingly prioritizing solutions that not only deliver high performance but also minimize their carbon footprint. Sparse model serving addresses this need by enabling efficient utilization of existing hardware, reducing the frequency of hardware upgrades, and lowering overall power consumption. This sustainability aspect is becoming a key differentiator for vendors in the sparse models serving market, as regulatory frameworks and corporate social responsibility initiatives place greater emphasis on eco-friendly AI deployments.




    From a regional perspective, North America currently dominates the sparse models serving market, accounting for the largest share in 2024. The region’s leadership can be attributed to its advanced digital infrastructure, strong presence of AI technology providers, and early adoption of cutting-edge AI solutions across sectors such as BFSI, healthcare, and IT. Europe and Asia Pacific are also witnessing rapid growth, driven by increasing investments in AI research, government initiatives supporting digital transformation, and the proliferation of data-centric industries. Emerging markets in Latin America and the Middle East & Africa are gradually catching up, as enterprises in these regions recognize the value of efficient AI model deployment in enhancing competitiveness and operational efficiency.





    Compone

  14. Sparse-Matrix Compression Engine Market Research Report 2033

    • growthmarketreports.com
    csv, pdf, pptx
    Updated Aug 29, 2025
    Cite
    Growth Market Reports (2025). Sparse-Matrix Compression Engine Market Research Report 2033 [Dataset]. https://growthmarketreports.com/report/sparse-matrix-compression-engine-market
    Explore at:
    Available download formats: pdf, pptx, csv
    Dataset updated
    Aug 29, 2025
    Dataset authored and provided by
    Growth Market Reports
    Time period covered
    2024 - 2032
    Area covered
    Global
    Description

    Sparse-Matrix Compression Engine Market Outlook



    According to our latest research, the global Sparse-Matrix Compression Engine market size reached USD 1.42 billion in 2024, reflecting robust adoption across high-performance computing and advanced analytics sectors. The market is poised for substantial expansion, with a projected CAGR of 15.8% during the forecast period. By 2033, the market is forecasted to achieve a value of USD 5.18 billion, driven by escalating data complexity, the proliferation of machine learning applications, and the imperative for efficient storage and computational solutions. The surge in demand for real-time analytics and the growing penetration of artificial intelligence across industries are primary factors fueling this remarkable growth trajectory.




    One of the key growth drivers for the Sparse-Matrix Compression Engine market is the exponential increase in data generation and the corresponding need for efficient data processing and storage. As organizations in sectors such as scientific computing, finance, and healthcare grapple with large-scale, high-dimensional datasets, the requirement for optimized storage solutions becomes paramount. Sparse-matrix compression engines enable significant reduction in data redundancy, leading to lower storage costs and faster data retrieval. This efficiency is particularly crucial in high-performance computing environments where memory bandwidth and storage limitations can hinder computational throughput. The adoption of these engines is further propelled by advancements in hardware accelerators and software algorithms that enhance compression ratios without compromising data integrity.




    Another significant factor contributing to market growth is the rising adoption of machine learning and artificial intelligence across diverse industry verticals. Modern AI and ML algorithms often operate on sparse datasets, especially in areas such as natural language processing, recommendation systems, and scientific simulations. Sparse-matrix compression engines play a pivotal role in minimizing memory footprint and optimizing computational resources, thereby accelerating model training and inference. The integration of these engines into cloud-based and on-premises solutions allows enterprises to scale their AI workloads efficiently, driving widespread deployment in both research and commercial applications. Additionally, the ongoing evolution of lossless and lossy compression techniques is expanding the applicability of these engines to new and emerging use cases.




    The market is also benefiting from the increasing emphasis on cost optimization and energy efficiency in data centers and enterprise IT infrastructure. As organizations strive to reduce operational expenses and carbon footprints, the adoption of compression technologies that minimize data movement and storage requirements becomes a strategic imperative. Sparse-matrix compression engines facilitate this by enabling higher data throughput and lower energy consumption, making them attractive for deployment in large-scale analytics, telecommunications, and industrial automation. Furthermore, the growing ecosystem of service providers and solution integrators is making these technologies more accessible to small and medium enterprises, contributing to broader market penetration.



    The development of High-Speed Hardware Compression Chip technology is revolutionizing the Sparse-Matrix Compression Engine market. These chips are designed to accelerate data compression processes, significantly enhancing the performance of high-performance computing systems. By integrating these chips, organizations can achieve faster data processing speeds, which is crucial for handling large-scale datasets in real-time analytics and AI applications. The chips offer a unique advantage by reducing latency and improving throughput, making them an essential component in modern data centers. As the demand for efficient data management solutions grows, the adoption of high-speed hardware compression chips is expected to rise, driving further innovation and competitiveness in the market.




    From a regional perspective, North America continues to dominate the Sparse-Matrix Compression Engine market, accounting for the largest revenue share in 2024 owing to the presence of leading technology companies, advanced research institutions, and

  15. Sparse Inverse Gaussian Process Regression with Application to Climate...

    • data.nasa.gov
    Updated Mar 31, 2025
    + more versions
    Cite
    nasa.gov (2025). Sparse Inverse Gaussian Process Regression with Application to Climate Network Discovery - Dataset - NASA Open Data Portal [Dataset]. https://data.nasa.gov/dataset/sparse-inverse-gaussian-process-regression-with-application-to-climate-network-discovery
    Explore at:
    Dataset updated
    Mar 31, 2025
    Dataset provided by
    NASA (http://nasa.gov/)
    Description

    Regression problems on massive data sets are ubiquitous in many application domains including the Internet, earth and space sciences, and finances. Gaussian Process regression is a popular technique for modeling the input-output relations of a set of variables under the assumption that the weight vector has a Gaussian prior. However, it is challenging to apply Gaussian Process regression to large data sets since prediction based on the learned model requires inversion of an order n kernel matrix. Approximate solutions for sparse Gaussian Processes have been proposed for sparse problems. However, in almost all cases, these solution techniques are agnostic to the input domain and do not preserve the similarity structure in the data. As a result, although these solutions sometimes provide excellent accuracy, the models do not have interpretability. Such interpretable sparsity patterns are very important for many applications. We propose a new technique for sparse Gaussian Process regression that allows us to compute a parsimonious model while preserving the interpretability of the sparsity structure in the data. We discuss how the inverse kernel matrix used in Gaussian Process prediction gives valuable domain information and then adapt the inverse covariance estimation from Gaussian graphical models to estimate the Gaussian kernel. We solve the optimization problem using the alternating direction method of multipliers that is amenable to parallel computation. We demonstrate the performance of our method in terms of accuracy, scalability and interpretability on a climate data set.
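
    To make concrete the order-n kernel inversion that motivates the sparse approach described above, here is a hedged numpy sketch of textbook Gaussian Process prediction (this is the standard formulation, not the paper's method; data and hyperparameters are placeholders).

    ```python
    # Sketch: plain GP regression prediction, mean = K_* (K + sigma^2 I)^{-1} y.
    # Solving against the full n x n kernel matrix costs O(n^3), which is what
    # sparse GP approximations try to sidestep.
    import numpy as np

    def rbf_kernel(A, B, length_scale=1.0):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-0.5 * d2 / length_scale**2)

    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(200, 1))
    y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)
    X_star = np.linspace(-3, 3, 50)[:, None]

    noise = 0.1**2
    K = rbf_kernel(X, X) + noise * np.eye(len(X))  # n x n kernel matrix
    K_star = rbf_kernel(X_star, X)

    alpha = np.linalg.solve(K, y)                  # the O(n^3) step
    mean = K_star @ alpha
    print(mean[:5])
    ```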

  16. Sparse Residential Dataset

    • universe.roboflow.com
    zip
    Updated Apr 3, 2023
    Share
    FacebookFacebook
    TwitterTwitter
    Email
    Click to copy link
    Link copied
    Close
    Cite
    Ahmed Iftikhar (2023). Sparse Residential Dataset [Dataset]. https://universe.roboflow.com/ahmed-iftikhar-rfiim/sparse-residential
    Explore at:
    Available download formats: zip
    Dataset updated
    Apr 3, 2023
    Dataset authored and provided by
    Ahmed Iftikhar
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Variables measured
    Sparse Residential Bounding Boxes
    Description

    Sparse Residential

    ## Overview
    
    Sparse Residential is a dataset for object detection tasks - it contains Sparse Residential annotations for 200 images.
    
    ## Getting Started
    
    You can download this dataset for use within your own projects, or fork it into a workspace on Roboflow to create your own model.
    
      ## License
    
      This dataset is available under the [CC BY 4.0 license](https://creativecommons.org/licenses/by/4.0/).
    
  17. Data from: Provable Low Rank Plus Sparse Matrix Separation Via Nonconvex...

    • resodate.org
    • service.tib.eu
    Updated Dec 16, 2024
    Cite
    April Sagan; John E. Mitchell (2024). Provable Low Rank Plus Sparse Matrix Separation Via Nonconvex Regularizers [Dataset]. https://resodate.org/resources/aHR0cHM6Ly9zZXJ2aWNlLnRpYi5ldS9sZG1zZXJ2aWNlL2RhdGFzZXQvcHJvdmFibGUtbG93LXJhbmstcGx1cy1zcGFyc2UtbWF0cml4LXNlcGFyYXRpb24tdmlhLW5vbmNvbnZleC1yZWd1bGFyaXplcnM=
    Explore at:
    Dataset updated
    Dec 16, 2024
    Dataset provided by
    Leibniz Data Manager
    Authors
    April Sagan; John E. Mitchell
    Description

    This paper considers a large class of problems where we seek to recover a low rank matrix and/or sparse vector from some set of measurements.
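
    For orientation only, a hedged sketch of the classic convex version of this separation problem, M ≈ L + S, solved by naive alternating proximal steps (singular value thresholding for the low-rank part, soft thresholding for the sparse part). The paper above studies nonconvex regularizers; this sketch does not implement that method.

    ```python
    # Sketch: low rank + sparse separation via naive alternating proximal steps.
    import numpy as np

    def svt(X, tau):
        # Singular value thresholding: prox of the nuclear norm.
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

    def soft(X, tau):
        # Soft thresholding: prox of the l1 norm.
        return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

    rng = np.random.default_rng(0)
    L_true = rng.normal(size=(50, 5)) @ rng.normal(size=(5, 50))           # rank-5 part
    S_true = (rng.random((50, 50)) < 0.05) * rng.normal(scale=10, size=(50, 50))
    M = L_true + S_true

    L, S = np.zeros_like(M), np.zeros_like(M)
    for _ in range(100):              # thresholds here are arbitrary choices
        L = svt(M - S, tau=1.0)
        S = soft(M - L, tau=0.5)

    print("relative residual:", np.linalg.norm(M - L - S) / np.linalg.norm(M))
    ```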

  18. Sparse Basic Linear Algebra Subprograms

    • datasets.ai
    • s.cnmilf.com
    • +1more
    21, 47, 57
    Updated Mar 11, 2021
    Cite
    National Institute of Standards and Technology (2021). Sparse Basic Linear Algebra Subprograms [Dataset]. https://datasets.ai/datasets/sparse-basic-linear-algebra-subprograms-6885a
    Explore at:
    Available download formats: 47, 57, 21
    Dataset updated
    Mar 11, 2021
    Dataset authored and provided by
    National Institute of Standards and Technology
    Description

    Sparse Basic Linear Algebra Subprograms (BLAS) comprise computational kernels for operations on sparse vectors and matrices.
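
    A hedged sketch of the archetypal Sparse BLAS kernel, sparse matrix-vector multiply over CSR storage, written as an explicit loop to show what such a kernel computes. This is illustrative only, not NIST's reference implementation.

    ```python
    # Sketch: CSR sparse matrix-vector multiply, the canonical Sparse BLAS kernel.
    import numpy as np
    from scipy.sparse import random as sparse_random

    A = sparse_random(1000, 1000, density=0.01, format="csr", random_state=0)
    x = np.ones(1000)

    def csr_matvec(data, indices, indptr, x):
        y = np.zeros(len(indptr) - 1)
        for i in range(len(y)):                        # one row at a time
            for k in range(indptr[i], indptr[i + 1]):  # only stored nonzeros
                y[i] += data[k] * x[indices[k]]
        return y

    y = csr_matvec(A.data, A.indices, A.indptr, x)
    assert np.allclose(y, A @ x)                       # matches scipy's kernel
    ```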

  19. SparseBeads Dataset

    • zenodo.org
    • data.niaid.nih.gov
    zip
    Updated Jan 24, 2020
    Share
    FacebookFacebook
    TwitterTwitter
    Email
    Click to copy link
    Link copied
    Close
    Cite
    J. S. Jørgensen; S. B. Coban; W. R. B. Lionheart; S. A. McDonald; P. J. Withers; J. S. Jørgensen; S. B. Coban; W. R. B. Lionheart; S. A. McDonald; P. J. Withers (2020). SparseBeads Dataset [Dataset]. http://doi.org/10.5281/zenodo.290117
    Explore at:
    Available download formats: zip
    Dataset updated
    Jan 24, 2020
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    J. S. Jørgensen; S. B. Coban; W. R. B. Lionheart; S. A. McDonald; P. J. Withers; J. S. Jørgensen; S. B. Coban; W. R. B. Lionheart; S. A. McDonald; P. J. Withers
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    The presented data set, inspired by the SophiaBeads Dataset Project for X-ray Computed Tomography, is collected for studies involving sparsity-regularised reconstruction. The aim is to provide tomographic data for various samples where the sparsity in the image varies.

    This dataset is made available as part of the publication

    "SparseBeads Data: Benchmarking Sparsity-Regularized Computed Tomography", Jakob S Jørgensen et al, 2017. Meas. Sci. Technol. 28 124005.

    Direct link: https://doi.org/10.1088/1361-6501/aa8c29.

    This manuscript is published as part of Special Feature on Advanced X-ray Tomography (open access). We refer the users to this publication for an extensive detail in the experimental planning and data acquisition.

    Each zipped data folder includes

    • The meta data for data acquisition and geometry parameters of the scan (.xtekct and .ctprofile.xml).

    • A sinogram of the central slice (CentreSlice > Sinograms > .tif) along with meta data for the 2D slice (.xtek2dct and .ct2dprofile.xml),

    • List of projection angles (.ang)

    • and a 2D FDK reconstruction using the CTPro reconstruction suite (RECON2D > .vol) with volume visualisation parameters (.vgi), added as a reference.

    We also include an extra script for those that wish to use the SophiaBeads Dataset Project Codes, which essentially replaces the main script provided, sophiaBeads.m (visit https://zenodo.org/record/16539). Please note that sparseBeads.m script will have to be placed in the same folder as the project codes. The latest version of this script can be found here: https://github.com/jakobsj/SparseBeads_code

    For more information, please contact

    • jakj [at] dtu.dk
    • jakob.jorgensen [at] manchester.ac.uk
  20. torch_scatter_sparse_wheels

    • kaggle.com
    zip
    Updated Jun 4, 2024
    Cite
    awxlong (2024). torch_scatter_sparse_wheels [Dataset]. https://www.kaggle.com/datasets/awxlong/torch-scatter-sparse-wheels
    Explore at:
    Available download formats: zip (62319681 bytes)
    Dataset updated
    Jun 4, 2024
    Authors
    awxlong
    License

    Apache License, v2.0: https://www.apache.org/licenses/LICENSE-2.0
    License information was derived automatically

    Description

    use pre-built wheels to avoid long build times

    !pip install -q torch-geometric==2.1.0  # torch-scatter torch-sparse
    !pip install -q /kaggle/input/torch-scatter-sparse-wheels/wheelhouse/torch_scatter-2.1.2-cp310-cp310-linux_x86_64.whl
    !pip install -q /kaggle/input/torch-scatter-sparse-wheels/wheelhouse/torch_sparse-0.6.18-cp310-cp310-linux_x86_64.whl
