License: unknown (https://choosealicense.com/licenses/unknown/)
Dataset Card for DUTS
This is a FiftyOne dataset with 15,572 samples.
Installation
If you haven't already, install FiftyOne: pip install -U fiftyone
Usage
import fiftyone as fo
import fiftyone.utils.huggingface as fouh
dataset = fouh.load_from_hub("Voxel51/DUTS")
session = fo.launch_app(dataset)
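After loading, a quick way to orient yourself is to print the dataset summary and browse a small random view before opening the App on all 15,572 samples. A minimal sketch using standard FiftyOne calls; check the printed summary for the actual field names that store the saliency masks:

```python
import fiftyone as fo
import fiftyone.utils.huggingface as fouh

dataset = fouh.load_from_hub("Voxel51/DUTS")

# Print the dataset summary, including the sample fields (image path, masks, tags)
print(dataset)

# Peek at one sample's image path
sample = dataset.first()
print(sample.filepath)

# Browse a reproducible random subset of 25 samples in the App
view = dataset.take(25, seed=51)
session = fo.launch_app(view)
```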
Dataset Details
Dataset Description… See the full description on the dataset page: https://huggingface.co/datasets/Voxel51/DUTS.
chitradrishti/duts: dataset hosted on Hugging Face and contributed by the HF Datasets community.
DUTS is a large-scale saliency detection dataset, containing 10,553 training images and 5,019 test images. All training images are collected from the ImageNet DET training/val sets, while test images are collected from the ImageNet DET test set and the SUN data set. Both the training and test sets contain very challenging scenarios for saliency detection. Accurate pixel-level ground truths were manually annotated by 50 subjects.
This dataset is obtained from the official DUTS dataset homepage. Any work based on this dataset should cite:
@inproceedings{wang2017,
title={Learning to Detect Salient Objects with Image-level Supervision},
author={Wang, Lijun and Lu, Huchuan and Wang, Yifan and Feng, Mengyang
and Wang, Dong and Yin, Baocai and Ruan, Xiang},
booktitle={CVPR},
year={2017}
}
All rights reserved by the original authors of the DUTS Image Dataset.
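If you work from the official archives rather than the FiftyOne export above, loading typically amounts to pairing each image with its mask by filename. A minimal sketch, assuming the usual DUTS-TR/DUTS-TR-Image (JPEG) and DUTS-TR/DUTS-TR-Mask (PNG) layout of the official release; adjust the paths to match your download:

```python
from pathlib import Path

from PIL import Image

# Assumed layout of the official DUTS-TR archive; DUTS-TE follows the same pattern
image_dir = Path("DUTS-TR/DUTS-TR-Image")
mask_dir = Path("DUTS-TR/DUTS-TR-Mask")

# Pair images with their ground-truth saliency masks by file stem
pairs = [
    (img, mask_dir / f"{img.stem}.png")
    for img in sorted(image_dir.glob("*.jpg"))
    if (mask_dir / f"{img.stem}.png").exists()
]
print(f"Found {len(pairs)} image/mask pairs")

# Masks are single-channel; salient pixels are (near) 255, background 0
image = Image.open(pairs[0][0]).convert("RGB")
mask = Image.open(pairs[0][1]).convert("L")
```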
DUTS_FM is a saliency detection dataset containing 37,420 images and their corresponding masks. All images were collected from the web, and the masks were created manually. The dataset was created to train U2Net for fashion apparel and products.
CC0 1.0 Universal Public Domain Dedication (https://creativecommons.org/publicdomain/zero/1.0/)
License information was derived automatically
NDGD Air Quality Forecast Guidance for vertical dust integration. Concentration units are micrograms per cubic meter. For further information, please visit our website at https://www.weather.gov/sti/stimodeling_airquality. This is a time-enabled image service with WMS and WCS capabilities enabled. Forecast guidance is issued twice daily based on the 6Z and 12Z model runs. Forecast information is provided in one-hour intervals for the subsequent 48-hour period. Client applications may require users to specify time range parameters to view time-sliced or loop data.
Link to graphical web page: https://airquality.weather.gov
Link to data download (grib2): https://tgftp.nws.noaa.gov/SL.us008001/ST.opnl/DF.gr2/DC.ndgd/GT.aq/
Link to metadata
Questions/concerns about the service: please contact the DISS GIS team.
Time Information: This map is time-enabled.
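For client applications, requesting a specific forecast hour works like any OGC WMS time dimension: add a TIME parameter to the GetMap request. A minimal sketch using standard WMS 1.3.0 parameters; the endpoint URL, layer name, bounding box, and timestamp below are placeholders, not values taken from the NDGD service:

```python
import urllib.parse
import urllib.request

# Placeholder endpoint; substitute the actual NDGD WMS service URL
WMS_ENDPOINT = "https://example.com/services/ndgd_dust/WMSServer"

params = {
    "SERVICE": "WMS",
    "VERSION": "1.3.0",
    "REQUEST": "GetMap",
    "LAYERS": "vertical_dust_integration",  # placeholder layer name
    "STYLES": "",
    "CRS": "EPSG:4326",
    "BBOX": "20,-130,55,-60",  # lat/lon order for EPSG:4326 in WMS 1.3.0
    "WIDTH": "1024",
    "HEIGHT": "512",
    "FORMAT": "image/png",
    "TIME": "2024-01-01T06:00:00Z",  # one forecast hour, e.g. from the 6Z run
}

url = WMS_ENDPOINT + "?" + urllib.parse.urlencode(params)
with urllib.request.urlopen(url) as resp, open("dust_forecast.png", "wb") as f:
    f.write(resp.read())
```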
Since May 1981, the National Aeronautics and Space Administration (NASA) has used aircraft to collect cosmic dust (CD) particles from Earth's stratosphere. Specially designed dust collectors are prepared for flight and processed after flight in an ultraclean (Class-100) laboratory constructed for this purpose at the Lyndon B. Johnson Space Center (JSC) in Houston, Texas. Particles are individually retrieved from the collectors, examined and cataloged, and then made available to the scientific community for research. Cosmic dust thereby joins lunar samples and meteorites as an additional source of extraterrestrial materials for scientific study.
U.S. Government Works (https://www.usa.gov/government-works)
License information was derived automatically
The Dust Impact Detection System (DIDSY) consists of six independent subsystems with the primary aim of registering the impacts of all particulates of significant mass incident on the probe during the post-perihelion encounter with Comet Halley in 1986. Mounted on Giotto's front dust shield, the detectors will determine the mass spectrum of the dust, with a limiting sensitivity of some 10^-17 g, increasing to the largest grain masses encountered along Giotto's trajectory through the cometary environment with an ultimate spatial resolution of some 70 km. An additional detector is located on the rear shield to monitor those dust particles (m > ~5 x 10^-7 g) that are able to penetrate the front dust shield. An ambient plasma monitor is also incorporated into DIDSY to measure the impact plasma generated by both dust and gas impacts on the spacecraft. The system is controlled and its data processed by a microprocessor-based system that allows the wide range of anticipated impact rates (varying from a few per minute to ~10^6 s^-1 at closest approach) to be handled. The instrument weighs 2.26 kg and consumes 1.9 W of power during normal operation.
Version 1 is the current version of the dataset. This collection, MYDFDS_MON_GLB_L3, provides Level 3 monthly frequency of dust storms (FDS) over land from 175°W to 175°E and 80°S to 80°N at a spatial resolution of 0.1° x 0.1°. It is derived from the Level 2 Moderate Resolution Imaging Spectroradiometer (MODIS) Deep Blue aerosol products, Collection 6.1, from Aqua (MYD04_L2). The dataset covers the monthly mean from 2003 to 2022. The FDS is calculated as the number of days per month when the daily dust optical depth is greater than a threshold optical depth (e.g., 0.025), with two quality flags: the lowest (1) and highest (3). It is advised to use flag 1, which is of lower quality, over dust source regions, and flag 3 over remote areas or polluted regions. Eight thresholds (0.025, 0.05, 0.1, 0.25, 0.5, 0.75, 1, 2) are saved separately in eight files. If you have any questions, please read the README document first and post your question to the NASA Earthdata Forum (forum.earthdata.nasa.gov) or email the GES DISC Help Desk (gsfc-dl-help-disc@mail.nasa.gov).
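The FDS definition above is straightforward to reproduce: for each grid cell, count the days in the month whose daily dust optical depth exceeds the chosen threshold. A minimal sketch with NumPy on a hypothetical daily array; the array shape and variable names are illustrative, not the ones used in the product files:

```python
import numpy as np

# Hypothetical daily dust optical depth (DOD) for one 30-day month on a
# coarse illustrative grid: (day, lat, lon)
daily_dod = np.random.rand(30, 180, 360)

threshold = 0.025  # one of the eight thresholds shipped with the product

# FDS = number of days in the month with DOD above the threshold, per grid cell
fds = (daily_dod > threshold).sum(axis=0)

print(fds.shape)  # (180, 360): one count per grid cell
print(fds.max())  # at most 30, the number of days in the month
```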
These data are annual aeolian dust deposition calculations from vertical deposition at seven locations in the vicinity of Moab, Utah, covering the period from 1999 to 2020. Data were collected by the U.S. Geological Survey Geosciences and Environmental Change Science Center (Denver, Colorado) and Southwest Biological Science Center (Moab, Utah) to "monitor sediment characteristics at sites selected to illuminate the relations between dust sources, present climate, and land use patterns" (Reheis 2003). The sites selected represent various land uses and land ownership, including private land, multiple-use public lands, and restricted-use National Parks. From 1999-2013, samples were sent to the Geosciences and Environmental Change Science Center in Denver to be processed. From 2014-2020, samples were processed at the Southwest Biological Science Center Canyonlands Research Station in Moab, Utah. The purpose of this data release is to make available annual aeolian dust deposition data collected 2009-2020 that have not been published. Data can be used to understand local and regional patterns of dust inputs from both dry and wet (rain and snow containing dust particles) deposition.
Data extracted from published scientific literature on soil and dust ingestion studies are included in the manuscript in Table S2. It includes age, age grouping, study type, study reference, N, mean, standard error, and 95% confidence intervals. This dataset is associated with the following publication: Cohen, J., H. Hubbard, H. Ozkaynak, K. Thomas, L. Phillips, and N. Tulve. Meta-analysis of soil and dust ingestion studies. ENVIRONMENTAL RESEARCH. Elsevier B.V., Amsterdam, NETHERLANDS, 261: 119649, (2024).
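Table S2 lists the mean, standard error, and 95% confidence interval side by side; under the usual normal approximation these are related as mean ± 1.96 × SE. A small illustrative check (the values below are made up, and the published intervals may have been derived differently):

```python
# Illustrative values only; not taken from Table S2
mean = 30.0  # hypothetical mean ingestion rate, mg/day
se = 5.0     # hypothetical standard error

# Normal-approximation 95% confidence interval
ci_low = mean - 1.96 * se
ci_high = mean + 1.96 * se
print(f"95% CI: ({ci_low:.1f}, {ci_high:.1f}) mg/day")
```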
## Overview
Dust is a dataset for object detection tasks - it contains Dust annotations for 19,420 images.
## Getting Started
You can download this dataset for use within your own projects, or fork it into a workspace on Roboflow to create your own model.
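Programmatic download typically goes through the roboflow Python package. A minimal sketch; the API key, workspace and project identifiers, version number, and export format below are placeholders you would replace with the values shown on the dataset's Roboflow page:

```python
from roboflow import Roboflow

# Placeholders: substitute your own API key and the identifiers from the dataset page
rf = Roboflow(api_key="YOUR_API_KEY")
project = rf.workspace("WORKSPACE_ID").project("PROJECT_ID")

# Download a specific version in a chosen export format (e.g., COCO JSON)
dataset = project.version(1).download("coco")
print(dataset.location)  # local directory containing images and annotations
```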
MIT License (https://opensource.org/licenses/MIT)
License information was derived automatically
## Overview
Dust Instance Seg is a dataset for instance segmentation tasks - it contains Dust ALvG annotations for 271 images.
## Getting Started
You can download this dataset for use within your own projects, or fork it into a workspace on Roboflow to create your own model.
## License
This dataset is available under the [MIT license](https://opensource.org/licenses/MIT).
The SP-1 experiment on the Vega spacecraft was intended for studying the spatial and mass distributions of dust particles in the cometary coma over the mass range 10^-16 to 10^-6 g. Covering such a broad mass range was made possible by using sensors of two types, namely impact-plasma and acoustic.
The SP-2 experiment on the Vega spacecraft was intended for studying the spatial and mass distributions of dust particles in the cometary coma over the mass range 10^-16 to 10^-6 g. Covering such a broad mass range was made possible by using sensors of two types, namely impact-plasma and acoustic.
Dataset Overview
================
This data set contains information on the dust environment in interplanetary space within the inner solar system, between Jupiter and the Sun, and at high polar latitudes of the Sun. Both interplanetary and interstellar dust particles have been detected. This information is collected with a dust impact experiment, from which the direction of motion, mass, velocity, and charge may be inferred (see ULYDINST.CAT). The data presented in this dataset include instrumental readouts, inferred metadata, calibration information, and a calendar of events. Specifically:
Detector responses and derived quantities from the Galileo dust detector as well as spacecraft geometry information for reliable impacts from launch through 2001. See Gruen et al. (Plan. Sp. Sci. 43, 953-969, 1995) and Krueger et al. (Plan. Sp. Sci. 47, 85-106, 1999; Plan. Sp. Sci. 49, 1285-1301, 2001) for more information.
Attribution 4.0 (CC BY 4.0) (https://creativecommons.org/licenses/by/4.0/)
License information was derived automatically
Data in 5A comprise the membership degree of each sample, total natural and artificial particulate matter per unit leaf area, PM2.5, PM2.5–10, PM > 10, and specific surface area (SSA), and elemental content for chromium (Cr), cobalt (Co), nickel (Ni), and arsenic (As). (XLSX)
The data from MPI for this dataset were received as text files, each containing spectra of a single instrument mode (there were several files for most modes). These spectra were reformatted into binary tables, and all spectra from each mode were combined into a single file. The original order of the spectra has been preserved. Spacecraft time, relative to switch-on of the instrument, is specified as 1 clock tick = 0.11852 seconds. The exact equation is:
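The exact equation from the original documentation is not reproduced above. From the stated tick length alone, the conversion would look like the sketch below, assuming a simple zero-offset relation (the original equation may include an offset or additional terms):

```python
TICK_SECONDS = 0.11852  # stated duration of one spacecraft clock tick

def ticks_to_seconds(ticks: int) -> float:
    """Elapsed time since instrument switch-on, assuming a zero offset."""
    return ticks * TICK_SECONDS

print(ticks_to_seconds(1000))  # 118.52 seconds after switch-on
```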
This dataset was created by Chen Yitong.
CC0 1.0 Universal Public Domain Dedication (https://creativecommons.org/publicdomain/zero/1.0/)
License information was derived automatically
The Mars Dust Storm Sequence Dataset (MDSSD) contains information on dust storm instances and dust storm sequences manually identified in the Mars Reconnaissance Orbiter (MRO) Mars Color Imager (MARCI) Mars Daily Global Maps (https://doi.org/10.7910/DVN/U3766S). The information includes the position of each dust storm instance at the pixel level, timing, storm ID, sequence ID (when applicable), and confidence level. Currently, the dataset covers selected time periods during Mars years 28 to 34. In time order, the MARCI phases covered are "P", "B", "G", "D", "F", "J", and "K". The MDSSD was originally documented and analyzed in Wang et al. [2023] to investigate the characteristics and annual cycle of Martian dust storms. Please refer to that paper for details. Updates are made as they become available. The appendix and readme files provide additional information. This release contains bug fixes for previous versions and a few new sub-phases for "P". This dataset is provided without any guarantee for accuracy or completeness.