100+ datasets found
  1. Global Sentinel-1 Burst ID Map - Dataset - NASA Open Data Portal

    • data.nasa.gov
    • s.cnmilf.com
    • +4more
    Updated Mar 31, 2025
    + more versions
    Cite
    nasa.gov (2025). Global Sentinel-1 Burst ID Map [Dataset]. https://data.nasa.gov/dataset/global-sentinel-1-burst-id-map
    Explore at:
    Dataset updated
    Mar 31, 2025
    Dataset provided by
    NASA (http://nasa.gov/)
    Description

    Sentinel-1 performs systematic acquisition of bursts in both IW and EW modes. The bursts overlap almost perfectly between different passes and are always located at the same place. With the deployment of the SAR processor S1-IPF 3.4, a new element has been added to the product annotations: the Burst ID, which should help the end user identify a burst area of interest and facilitate searches. The Burst ID map is a complementary auxiliary product. The maps are valid for the entire time span of the mission and are global, i.e., they also include information for areas where no SAR data is acquired. Each granule contains information about burst and sub-swath IDs, relative orbit and burst polygon, and should allow for an easier link between a given burst ID in a product and its corresponding geographic location.

  2. Sentinel-2 Satellite Images

    • eos.com
    geotiff
    + more versions
    Cite
    EOS Data Analytics, Sentinel-2 Satellite Images [Dataset]. https://eos.com/find-satellite/sentinel-2/
    Explore at:
    geotiff (available download formats)
    Dataset provided by
    EOS Data Analytics
    Description

    Multispectral imagery captured by Sentinel-2 satellites, featuring 13 spectral bands (visible, near-infrared, and short-wave infrared). Available globally since 2018 (Europe since 2017) with 10-60 m spatial resolution and revisit times of 2-3 days at mid-latitudes. Accessible through the EOSDA LandViewer platform for visualization, analysis, and download.

  3. Lymphatic Mapping for Sentinel Node Identification and Analysis

    • data-staging.niaid.nih.gov
    xml
    Updated May 15, 2013
    Cite
    (2013). Lymphatic Mapping for Sentinel Node Identification and Analysis [Dataset]. https://data-staging.niaid.nih.gov/resources?id=2130214
    Explore at:
    xml (available download formats)
    Dataset updated
    May 15, 2013
    Area covered
    United Kingdom
    Variables measured
    Clinical
    Description

    The main objective of this study is to determine whether the first (sentinel) lymph nodes in the drainage pathway of a colonic tumour can be detected at the time of surgery using a new technique. The detection method is to inject a fluorescent dye (indocyanine green) adjacent to the tumour. The dye can then be seen as it fluoresces under near-infrared light, which can be used at the time of laparoscopic (keyhole) surgery. An endoscope is placed in the colon (colonoscopy) during surgery and the fluorescent tracer agent is injected around the tumour. The mesentery in which the lymph nodes draining the tumour are located is then examined by laparoscopy, as fluorescence is expected to be identified within approximately 5 minutes of the injection. The first lymph node or nodes that take up the fluorescent dye are then marked by placing a clip or a stitch next to them. After the surgery has been completed and the colon removed, all lymph nodes can be examined microscopically by the pathologist, paying particular attention to whether any tumour cells are present in the sentinel lymph nodes and whether the presence or absence of tumour cells in those nodes accurately reflects the tumour status of the rest of the specimen. If this pilot demonstrates that sentinel lymph nodes can be reliably detected, this will complement a technique we have developed which allows us to remove a small area (less than 5 cm) of the colon; using this procedure should decrease complications compared with traditional surgery. However, we also need a method that allows accurate assessment of the lymph nodes draining the tumour. This pilot trial will examine our ability to detect such ‘sentinel’ lymph nodes so that we can use their status (positive for cancer cells or negative) to determine whether a smaller operation, such as full-thickness localised excision, is adequate treatment for the patient and whether they can avoid a larger operation.

  4. Finger Millet Crop Mapping Sentinel Dataset (2025)

    • kaggle.com
    zip
    Updated Jul 4, 2025
    Cite
    Safwan Mohammed (2025). Finger Millet Crop Mapping Sentinel Dataset (2025) [Dataset]. https://www.kaggle.com/datasets/safwanmohammed19/ragi-crop-mapping-sentinel-dataset-2025
    Explore at:
    zip (4,065,377,816 bytes; available download formats)
    Dataset updated
    Jul 4, 2025
    Authors
    Safwan Mohammed
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Finger Millet (Ragi) Crop Mapping - Sentinel Dataset

    Overview

    This dataset was curated for binary classification of Ragi and non-Ragi crop regions using satellite time series data. It comprises three major folders: Ground Data, Satellite Extracted Data, and Time Series Data. This dataset is designed to support agricultural research, crop mapping, and deep learning model development for precision farming in the Tumkuru District, Karnataka, India.

    1. Ground Data

    The Ground Data folder contains coordinate information of crop fields in the Tumkuru District, Karnataka, India, sourced from data.gov.in for the years 2018–2023.

    Processing Steps

    • Downloaded all JSON and XML files containing crop field records.
    • Split files into Ragi and non-Ragi categories based on crop labels.
    • Removed duplicates to retain only the final unique coordinate list per year.
    • Converted the cleaned data into CSV files with columns: longitude, latitude, year, and crop_type (Ragi or non-Ragi).

    2. Satellite Extracted Data

    The Satellite Extracted Data folder contains processed data from Sentinel-1 and Sentinel-2 satellites for ground coordinates in the Tumkuru District, covering July to December for each year (2018–2023).

    Processing Steps

    For each coordinate in the Ground Data, six months of satellite data were extracted for each year.

    Sentinel-1 (S1) Data

    • Bands extracted: VV, VH
    • Calculated the VH/VV ratio
    • Data stored in files named as: S1_[Month]_[Year]_Batch[Number].csv
      • Example: S1_September_2021_Batch5.csv
      • Each batch contains approximately 1000 coordinates.

    Sentinel-2 (S2) Data

    • Applied cloud masking to remove cloud-covered pixels.
    • If data was unavailable due to cloud masking, all indices were set to zero.
    • Calculated the following indices using the respective band values (a hedged computation sketch follows after this list):
      • NDVI (Normalized Difference Vegetation Index)
      • EVI (Enhanced Vegetation Index)
      • GNDVI (Green NDVI)
      • RENDVI (Red Edge NDVI)
      • SAVI (Soil Adjusted Vegetation Index)
      • NDMI (Normalized Difference Moisture Index)
      • NDWI (Normalized Difference Water Index)
    • Data stored in files named as: S2_[Month]_[Year]_Batch[Number].csv
      • Example: S2_September_2021_Batch5.csv
    • All satellite data extraction was performed in batch mode for computational efficiency.
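
    A minimal sketch of these index computations, assuming typical Sentinel-2 band choices (B2 blue, B3 green, B4 red, B5 red edge, B8 NIR, B11 SWIR1) and reflectance scaled to 0-1. The dataset description does not state the exact band combinations or which NDWI/RENDVI variants were used, so treat the formulas below as common conventions rather than the authors' exact recipe.

    ```python
    def s2_indices(b2, b3, b4, b5, b8, b11, eps=1e-6):
        """Vegetation/moisture/water indices from Sentinel-2 reflectance (0-1 scale).
        Band choices are common conventions, not taken from the dataset documentation."""
        ndvi   = (b8 - b4) / (b8 + b4 + eps)
        evi    = 2.5 * (b8 - b4) / (b8 + 6.0 * b4 - 7.5 * b2 + 1.0 + eps)
        gndvi  = (b8 - b3) / (b8 + b3 + eps)
        rendvi = (b8 - b5) / (b8 + b5 + eps)               # red-edge NDVI, assuming B5 as the red-edge band
        savi   = 1.5 * (b8 - b4) / (b8 + b4 + 0.5 + eps)   # soil adjustment factor L = 0.5
        ndmi   = (b8 - b11) / (b8 + b11 + eps)
        ndwi   = (b3 - b8) / (b3 + b8 + eps)               # McFeeters NDWI (green/NIR variant)
        return {"NDVI": ndvi, "EVI": evi, "GNDVI": gndvi, "RENDVI": rendvi,
                "SAVI": savi, "NDMI": ndmi, "NDWI": ndwi}

    # Works with plain floats or numpy arrays of band reflectances:
    print(s2_indices(b2=0.04, b3=0.06, b4=0.05, b5=0.12, b8=0.35, b11=0.18))
    ```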

    3. Time Series Data

    The Time Series Data folder contains the input data used for model training and is divided into:

    (a) Initial Time Series

    • Merged Sentinel-1 and Sentinel-2 data for each coordinate and month.
    • Each row contains:
      • ID
      • Batch
      • Month
      • Longitude
      • Latitude
      • 10 features, i.e., the processed band values and indices (e.g., VV, VH, VH/VV, NDVI, EVI, GNDVI, RENDVI, SAVI, NDMI, NDWI).

    (b) Final Time Series

    • Applied preprocessing steps such as:
      • Removing duplicate coordinates within files.
      • Removing duplicate coordinates across Ragi and non-Ragi files.
      • Removing null values.

    Final Balanced Dataset

    • Merged all years’ data (2018–2023) into a single balanced dataset with approximately 10,000 rows.
    • Each row contains:
      • 60 features (10 features per month × 6 months from July to December):
      • Month 1: VV, VH, VH/VV, NDVI, EVI, GNDVI, RENDVI, SAVI, NDMI, NDWI
      • Month 2: [same 10 features]
      • ...
      • Month 6: [same 10 features]
      • Coordinate information (Longitude, Latitude)
      • Label:
      • 1 = Ragi
      • 0 = non-Ragi
    • This final dataset was used for deep learning model training for Ragi classification.

    File Naming Conventions

    • Satellite Extracted Data: S1_September_2021_Batch5.csv or S2_September_2021_Batch5.csv; the batch number indicates sequential processing groups (e.g., Batch5 = 5th group of coordinates).
    • Time Series Data: structured naming based on month, year, batch, and Ragi class (e.g., TimeSeries_July_2021_Batch5_Ragi.csv).

    Citation

    If you use this dataset in your research, please cite:

    Hanif S, Bansal P, S S, N S. Finger millet crop mapping Sentinel dataset. Kaggle. (2025). Available from: https://www.kaggle.com/datasets/safwanmohammed19/ragi-crop-mapping-sentinel-dataset-2025

    Additionally, cite the original data sources:

    • Ground Data: data.gov.in, Open Government Data License – India.
    • Satellite Data: Sentinel-1 and Sentinel-2 data via Google Earth Engine API, [specify source terms if applicable].

    Licensing

    This dataset includes Ground Data obtained from data.gov.in under the Open Government Data License – India, which allow...

  5. Table_1_Robotic-Assisted Sentinel Lymph Node Mapping With Indocyanine Green...

    • datasetcatalog.nlm.nih.gov
    Updated Jul 2, 2019
    Cite
    Chen, Ming; Jing, Jibo; Xu, Bin; Du, Mulong; Wu, Yuqing; Wang, Jinfeng (2019). Table_1_Robotic-Assisted Sentinel Lymph Node Mapping With Indocyanine Green in Pelvic Malignancies: A Systematic Review and Meta-Analysis.DOCX [Dataset]. https://datasetcatalog.nlm.nih.gov/dataset?q=0000178118
    Explore at:
    Dataset updated
    Jul 2, 2019
    Authors
    Chen, Ming; Jing, Jibo; Xu, Bin; Du, Mulong; Wu, Yuqing; Wang, Jinfeng
    Description

    Objective: Newer technologies such as near-infrared (NIR) imaging of the fluorescent dye indocyanine green (ICG) and the daVinci Xi Surgical System have become promising tools for sentinel lymph node (SLN) mapping. This meta-analysis was conducted to comprehensively evaluate the diagnostic value of SLN in assessing lymph nodal metastasis in pelvic malignancies, using ICG with NIR imaging in robotic-assisted surgery.

    Materials and Methods: A literature search was conducted using PubMed for studies in English before April 2019. The detection rate, sensitivity of SLN detection of metastatic disease, and factors associated with successful mapping (sample size, study design, mean age, mean body mass index, type of cancer) were synthesized for meta-analysis.

    Results: A total of 17 articles including 1,059 patients were included. The reported detection rates of SLN ranged from 76 to 100%, with a pooled average rate of 95% (95% CI: 93–97; 17 studies). The sensitivity of SLN detection of metastatic disease ranged from 50 to 100%, and the pooled sensitivity was 86% (95% CI: 75–94; 8 studies). No complications related to ICG administration were reported.

    Conclusions: The NIR imaging system using ICG in robotic-assisted surgery is a feasible and safe method for SLN mapping. Given its promising performance, it is considered an alternative to complete pelvic lymph node dissection.

  6. Data from: Landsat and Sentinel-2 satellite data fusion-derived...

    • catalog.data.gov
    • data.usgs.gov
    Updated Nov 19, 2025
    Cite
    U.S. Geological Survey (2025). Landsat and Sentinel-2 satellite data fusion-derived evapotranspiration maps of Palo Verde Irrigation District, California, USA [Dataset]. https://catalog.data.gov/dataset/landsat-and-sentinel-2-satellite-data-fusion-derived-evapotranspiration-maps-of-palo-verde
    Explore at:
    Dataset updated
    Nov 19, 2025
    Dataset provided by
    United States Geological Survey (http://www.usgs.gov/)
    Area covered
    California, United States
    Description

    Three ET datasets were generated to evaluate the potential integration of Landsat and Sentinel-2 data for improved ET mapping. The first ET dataset was generated by linear interpolation (Lint) of Landsat-based ET fraction (ETf) images from before and after the selected image dates. The second ET dataset was generated using the regular SSEBop approach with the Landsat image only (Lonly). The third ET dataset was generated from the proposed Landsat-Sentinel data fusion (L-S) approach by applying ETf images from both Landsat and Sentinel. Two scripts used to generate these datasets are included: one for running the SSEBop model to generate ET maps from Lonly, and another for generating ET maps with the Lint and L-S approaches.
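
    The Lint approach described above amounts to a per-pixel linear interpolation of ET fraction between the two Landsat dates bracketing the target date. A minimal numpy sketch of that idea follows; the array names and dates are illustrative, and the actual USGS scripts distributed with the dataset should be used for real processing.

    ```python
    import numpy as np

    def interpolate_etf(etf_before, etf_after, t_before, t_after, t_target):
        """Linearly interpolate an ET-fraction (ETf) grid to a target date lying
        between two Landsat acquisition dates (the 'Lint' idea described above)."""
        w = (t_target - t_before) / (t_after - t_before)  # 0 at the 'before' date, 1 at the 'after' date
        return (1.0 - w) * etf_before + w * etf_after

    # Toy 2x2 ETf grids acquired 16 days apart, interpolated to day 8 (midpoint):
    etf_day0  = np.array([[0.2, 0.4], [0.6, 0.8]])
    etf_day16 = np.array([[0.4, 0.4], [0.5, 0.9]])
    print(interpolate_etf(etf_day0, etf_day16, t_before=0, t_after=16, t_target=8))
    ```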

  7. Additional potential factors influencing the SLN detection rate (n = 767).

    • plos.figshare.com
    xls
    Updated Jun 15, 2023
    Cite
    Julia Krammer; Anja Dutschke; Clemens G. Kaiser; Andreas Schnitzer; Axel Gerhardt; Julia C. Radosa; Joachim Brade; Stefan O. Schoenberg; Klaus Wasser (2023). Additional potential factors influencing the SLN detection rate (n = 767). [Dataset]. http://doi.org/10.1371/journal.pone.0149018.t003
    Explore at:
    xls (available download formats)
    Dataset updated
    Jun 15, 2023
    Dataset provided by
    PLOS (http://plos.org/)
    Authors
    Julia Krammer; Anja Dutschke; Clemens G. Kaiser; Andreas Schnitzer; Axel Gerhardt; Julia C. Radosa; Joachim Brade; Stefan O. Schoenberg; Klaus Wasser
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Additional potential factors influencing the SLN detection rate (n = 767).

  8. Data from: Sentinel2GlobalLULC: A dataset of Sentinel-2 georeferenced RGB...

    • zenodo.org
    • observatorio-cientifico.ua.es
    • +2more
    text/x-python, zip
    Updated Apr 24, 2025
    + more versions
    Cite
    Yassir Benhammou; Domingo Alcaraz-Segura; Emilio Guirado; Rohaifa Khaldi; Siham Tabik (2025). Sentinel2GlobalLULC: A dataset of Sentinel-2 georeferenced RGB imagery annotated for global land use/land cover mapping with deep learning (License CC BY 4.0) [Dataset]. http://doi.org/10.5281/zenodo.6941662
    Explore at:
    zip, text/x-python (available download formats)
    Dataset updated
    Apr 24, 2025
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Yassir Benhammou; Domingo Alcaraz-Segura; Emilio Guirado; Rohaifa Khaldi; Siham Tabik
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Sentinel2GlobalLULC is a deep learning-ready dataset of RGB images from the Sentinel-2 satellites designed for global land use and land cover (LULC) mapping. Sentinel2GlobalLULC v2.1 contains 194,877 images in GeoTiff and JPEG format corresponding to 29 broad LULC classes. Each image has 224 x 224 pixels at 10 m spatial resolution and was produced by assigning the 25th percentile of all available observations in the Sentinel-2 collection between June 2015 and October 2020 in order to remove atmospheric effects (i.e., clouds, aerosols, shadows, snow, etc.). A spatial purity value was assigned to each image based on the consensus across 15 different global LULC products available in Google Earth Engine (GEE).

    Our dataset is structured into 3 main zip-compressed folders, an Excel file with a dictionary for class names and descriptive statistics per LULC class, and a python script to convert RGB GeoTiff images into JPEG format. The first folder called "Sentinel2LULC_GeoTiff.zip" contains 29 zip-compressed subfolders where each one corresponds to a specific LULC class with hundreds to thousands of GeoTiff Sentinel-2 RGB images. The second folder called "Sentinel2LULC_JPEG.zip" contains 29 zip-compressed subfolders with a JPEG formatted version of the same images provided in the first main folder. The third folder called "Sentinel2LULC_CSV.zip" includes 29 zip-compressed CSV files with as many rows as provided images and with 12 columns containing the following metadata (this same metadata is provided in the image filenames):

    • Land Cover Class ID: is the identification number of each LULC class
    • Land Cover Class Short Name: is the short name of each LULC class
    • Image ID: is the identification number of each image within its corresponding LULC class
    • Pixel purity Value: is the spatial purity of each pixel for its corresponding LULC class calculated as the spatial consensus across up to 15 land-cover products
    • GHM Value: is the spatial average of the Global Human Modification index (gHM) for each image
    • Latitude: is the latitude of the center point of each image
    • Longitude: is the longitude of the center point of each image
    • Country Code: is the Alpha-2 country code of each image as described in the ISO 3166 international standard; the code for each country is listed at https://www.iban.com/country-codes
    • Administrative Department Level1: is the administrative level 1 name to which each image belongs
    • Administrative Department Level2: is the administrative level 2 name to which each image belongs
    • Locality: is the name of the locality to which each image belongs
    • Number of S2 images: is the number of observations found in the corresponding Sentinel-2 image collection between June 2015 and October 2020 when compositing and exporting the corresponding image tile

    For seven LULC classes, we could not export from GEE all images that fulfilled a spatial purity of 100% since there were millions of them. In this case, we exported a stratified random sample of 14,000 images and provided an additional CSV file with the images actually contained in our dataset. That is, for these seven LULC classes, we provide these 2 CSV files:

    • A CSV file that contains all exported images for this class
    • A CSV file that contains all images available for this class at spatial purity of 100%, both the ones exported and the ones not exported, in case the user wants to export them. These CSV filenames end with "including_non_downloaded_images".

    To clearly state the geographical coverage of images available in this dataset, we included in the version v2.1, a compressed folder called "Geographic_Representativeness.zip". This zip-compressed folder contains a csv file for each LULC class that provides the complete list of countries represented in that class. Each csv file has two columns, the first one gives the country code and the second one gives the number of images provided in that country for that LULC class. In addition to these 29 csv files, we provided another csv file that maps each ISO Alpha-2 country code to its original full country name.
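
    A short sketch of how the per-class metadata CSVs described above could be queried with pandas; the file name below is hypothetical and the column headers are taken from the metadata list, so they may need adjusting to the exact strings used in the released CSVs.

    ```python
    import pandas as pd

    # Hypothetical path to one per-class metadata CSV from Sentinel2LULC_CSV.zip;
    # the real filename encodes the LULC class and may differ.
    csv_path = "Sentinel2LULC_CSV/Sentinel2LULC_class_01.csv"

    df = pd.read_csv(csv_path)

    # Column names follow the metadata list above; adjust if the actual headers differ.
    high_purity = df[df["Pixel purity Value"] >= 95]        # keep near-pure tiles
    by_country = high_purity.groupby("Country Code").size()  # images per country for this class
    print(by_country.sort_values(ascending=False).head(10))
    ```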

    © Sentinel2GlobalLULC Dataset by Yassir Benhammou, Domingo Alcaraz-Segura, Emilio Guirado, Rohaifa Khaldi, Boujemâa Achchab, Francisco Herrera & Siham Tabik is marked with Attribution 4.0 International (CC-BY 4.0)

  9. Barest Earth Sentinel 2 Map Index

    • ecat.ga.gov.au
    • researchdata.edu.au
    esri:map-service +2
    Updated Dec 3, 2021
    Cite
    Barest Earth Sentinel 2 Map Index (2021). Barest Earth Sentinel 2 Map Index [Dataset]. https://ecat.ga.gov.au/geonetwork/srv/api/records/3aac2af4-ed0b-420a-ac19-3fd748ac6629
    Explore at:
    esri:map-service, ogc:wms, www:link-1.0-http--link (available download formats)
    Dataset updated
    Dec 3, 2021
    Dataset provided by
    Geoscience Australia (http://ga.gov.au/)
    Time period covered
    Nov 1, 2021
    Area covered
    Description

    The Barest Earth Sentinel-2 Map Index dataset depicts the 1:250 000 map sheet tile frames that have been used to generate individual tile downloads of the Barest Earth Sentinel-2 product. This web service is designed to be used in conjunction with the Barest Earth Sentinel-2 web service to provide users with direct links for imagery download.

  10. Sentinel-2 10m Land Use/Land Cover Map Blend-Copy

    • opendata.rcmrd.org
    Updated Nov 10, 2023
    + more versions
    Cite
    alvaroleal_eowilson (2023). Sentinel-2 10m Land Use/Land Cover Map Blend-Copy [Dataset]. https://opendata.rcmrd.org/maps/a155532815f64166820c9eef372df425
    Explore at:
    Dataset updated
    Nov 10, 2023
    Dataset authored and provided by
    alvaroleal_eowilson
    Area covered
    Description

    The Sentinel-2 10m Land Use/Land Cover Time Series displays a global map of land use/land cover (LULC) derived from ESA Sentinel-2 imagery at 10 m resolution. The World Imagery (Firefly) map is designed to be used as a neutral imagery basemap, with de-saturated colors, that is useful for overlaying other brightly styled layers.

  11. SEN12 Global Urban Mapping Dataset

    • zenodo.org
    application/gzip
    Updated Aug 14, 2022
    Cite
    Sebastian Hafner; Yifang Ban; Andrea Nascetti (2022). SEN12 Global Urban Mapping Dataset [Dataset]. http://doi.org/10.5281/zenodo.6914898
    Explore at:
    application/gzip (available download formats)
    Dataset updated
    Aug 14, 2022
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Sebastian Hafner; Yifang Ban; Andrea Nascetti
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    The SEN12 Global Urban Mapping (SEN12_GUM) dataset consists of Sentinel-1 SAR (VV + VH band) and Sentinel-2 MSI (10 spectral bands) satellite images acquired over the same area for 96 training and validation sites and an additional 60 test sites covering unique geographies across the globe. The satellite imagery was acquired as part of the European Space Agency's Earth observation program Copernicus and was preprocessed in Google Earth Engine. Built-up area labels for the 30 training and validation sites located in the United States, Canada, and Australia were obtained from Microsoft's open-access building footprints. The other 66 training sites located outside of the United States, Canada, and Australia are unlabeled but can be used for semi-supervised learning. Labels obtained from the SpaceNet7 dataset are provided for all 60 test sites.

  12. The Sentinel-1 Global Backscatter Model (S1GBM) - Mapping Earth's Land...

    • researchdata.tuwien.ac.at
    • researchdata.dl.hpc.tuwien.ac.at
    • +1more
    Updated Aug 23, 2021
    Cite
    Bernhard Bauer-Marschallinger; Senmao Cao; Claudio Navacchi; Vahid Freeman; Felix Reuß; Dirk Geudtner; Björn Rommen; Francisco Ceba Vega; Paul Snoeij; Evert Attema; Christoph Reimer; Wolfgang Wagner (2021). The Sentinel-1 Global Backscatter Model (S1GBM) - Mapping Earth's Land Surface with C-Band Microwaves [Dataset]. http://doi.org/10.48436/n2d1v-gqb91
    Explore at:
    Dataset updated
    Aug 23, 2021
    Dataset provided by
    datacite
    TU Wien
    Authors
    Bernhard Bauer-Marschallinger; Senmao Cao; Claudio Navacchi; Vahid Freeman; Felix Reuß; Dirk Geudtner; Björn Rommen; Francisco Ceba Vega; Paul Snoeij; Evert Attema; Christoph Reimer; Wolfgang Wagner
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Area covered
    Earth
    Dataset funded by
    European Space Agency (http://www.esa.int/)
    Description

    This dataset was generated by the Remote Sensing Group of the TU Wien Department of Geodesy and Geoinformation (https://mrs.geo.tuwien.ac.at/) within a dedicated project by the European Space Agency (ESA). Rights are reserved with ESA. Open use is granted under the CC BY 4.0 license. With this dataset publication, we open up a new perspective on Earth's land surface, providing a normalised microwave backscatter map from spaceborne Synthetic Aperture Radar (SAR) observations. The Sentinel-1 Global Backscatter Model (S1GBM) describes Earth for the period 2016-17 by the mean C-band radar cross section in VV- and VH-polarization at a 10 m sampling, giving a high-quality impression of surface structures and patterns. At TU Wien, we processed 0.5 million Sentinel-1 scenes totaling 1.1 PB and performed semi-automatic quality curation and backscatter harmonisation related to orbit geometry effects. The overall mosaic quality exceeds that of the (few) existing datasets, with minimised imprinting from orbit discontinuities and successful angle normalisation in large parts of the world. Supporting the design and verification of upcoming radar sensors, the obtained S1GBM data potentially also serve land cover classification and determination of vegetation and soil states, as well as water body mapping. We invite developers from the broader user community to exploit this novel data resource and to integrate S1GBM parameters in models for various variables of land cover, soil composition, or vegetation structure. Please refer to our peer-reviewed article at TODO: LINK TO BE PROVIDED for details, generation methods, and an in-depth dataset analysis. In this publication, we demonstrate – as an example of the S1GBM's potential use – the mapping of permanent water bodies and evaluate the results against the Global Surface Water (GSW) benchmark.

    Dataset Record: The VV and VH mosaics are sampled at 10 m pixel spacing, georeferenced to the Equi7Grid and divided into six continental zones (Africa, Asia, Europe, North America, Oceania, South America), which are further divided into square tiles of 100 km extent ("T1" tiles). With this setup, the S1GBM consists of 16,071 tiles over six continents, for VV and VH each, totaling a compressed data volume of 2.67 TB. The tiles' file format is LZW-compressed GeoTIFF holding 16-bit integer values, with tagged metadata on encoding and georeference. Compatibility with common geographic information systems such as QGIS or ArcGIS, and with geodata libraries such as GDAL, is given. In this repository, we provide each mosaic as tiles organised in a folder structure per continent; with this, twelve zipped dataset collections are available for download.

    Web-Based Data Viewer: In addition to the data provision here, a web-based data viewer is set up at the facilities of the Earth Observation Data Centre (EODC) under http://s1map.eodc.eu/. It offers an intuitive pan-and-zoom exploration of the full S1GBM VV and VH mosaics and is designed for quickly browsing the S1GBM, providing an easy and direct visual impression of the mosaics.

    Code Availability: We encourage users to use the open-source Python package yeoda, a datacube storage access layer that offers functions to read, write, search, filter, split and load data from the S1GBM datacube. The yeoda package is openly accessible on GitHub at https://github.com/TUW-GEO/yeoda. Furthermore, for the usage of the Equi7Grid we provide data and tools via the Python package available on GitHub at https://github.com/TUW-GEO/Equi7Grid. More details on the grid reference can be found in https://www.sciencedirect.com/science/article/pii/S0098300414001629.

    Acknowledgements: This study was partly funded by the project "Development of a Global Sentinel-1 Land Surface Backscatter Model", ESA Contract No. 4000122681/17/NL/MP for the European Union Copernicus Programme. The computational results presented have been achieved using the Vienna Scientific Cluster (VSC). We would further like to thank our colleagues at TU Wien and EODC for supporting us on technical tasks to cope with such a large and complex data set. Last but not least, we appreciate the kind assistance and swift support of the colleagues from the TU Wien Center for Research Data Management.
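
    A minimal sketch of opening one S1GBM tile with rasterio; the tile path below is hypothetical (actual names follow the Equi7Grid tiling and the project's own naming scheme), and the scaling of the 16-bit integers to backscatter in dB should be taken from the GeoTIFF metadata tags or the accompanying paper rather than assumed.

    ```python
    import rasterio

    # Hypothetical path to one 100 km "T1" tile of the VV mosaic.
    tile_path = "S1GBM_VV/EU/E048N012T1.tif"

    with rasterio.open(tile_path) as src:
        vv = src.read(1)                 # 16-bit integer backscatter values
        print(src.crs, src.transform)    # Equi7Grid georeference
        print(src.tags())                # tagged metadata on encoding (e.g., scaling to dB)
        print(vv.shape, vv.dtype, vv.min(), vv.max())
    ```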

  13. DataSheet_2_Operative and Oncological Outcomes Comparing Sentinel Node...

    • datasetcatalog.nlm.nih.gov
    • frontiersin.figshare.com
    • +1more
    Updated Jan 13, 2021
    + more versions
    Cite
    Cheng, Hongyan; Gu, Yu; Xiang, Yang; Kong, Yujia; Zong, Liju (2021). DataSheet_2_Operative and Oncological Outcomes Comparing Sentinel Node Mapping and Systematic Lymphadenectomy in Endometrial Cancer Staging: Meta-Analysis With Trial Sequential Analysis.xlsx [Dataset]. https://datasetcatalog.nlm.nih.gov/dataset?q=0000786069
    Explore at:
    Dataset updated
    Jan 13, 2021
    Authors
    Cheng, Hongyan; Gu, Yu; Xiang, Yang; Kong, Yujia; Zong, Liju
    Description

    Objective: To evaluate the utility of sentinel lymph node mapping (SLN) in endometrial cancer (EC) patients in comparison with lymphadenectomy (LND).

    Methods: A comprehensive search was performed in the MEDLINE, EMBASE, CENTRAL, OVID and Web of Science databases, and three clinical trial registration websites, from database inception to September 2020. The primary outcomes covered operative outcomes, nodal assessment, and oncological outcomes. The software Revman 5.3 was used. Trial sequential analysis (TSA) and Grading of Recommendations Assessment, Development, and Evaluation (GRADE) were performed.

    Results: Overall, 5,820 EC patients from 15 studies were pooled in the meta-analysis: SLN group (N = 2,152, 37.0%), LND group (N = 3,668, 63.0%). In the meta-analysis of blood loss, SLN offered an advantage over LND in reducing operative bleeding (I2 = 74%, P < 0.01). The Z-curve of blood loss crossed the trial sequential monitoring boundaries, though it did not reach the TSA sample size. There was no difference between SLN and LND in intra-operative complications (I2 = 7%, P = 0.12). SLN was superior to LND in detecting positive pelvic nodes (P-LN) (I2 = 36%, P < 0.001), even in high-risk patients (I2 = 36%, P = 0.001), while no difference was observed in detection of positive para-aortic nodes (PA-LN) (I2 = 47%, P = 0.76), even in high-risk patients (I2 = 62%, P = 0.34). Analysis showed no difference between the two groups in the number of resected pelvic nodes (I2 = 99%, P = 0.26). SLN was not associated with a statistically significant difference in overall survival (I2 = 79%, P = 0.94). There was no difference in progression-free survival between SLN and LND (I2 = 52%, P = 0.31). No difference was observed in recurrence. Based on the GRADE assessment, we considered the quality of current evidence to be moderate for P-LN biopsy and low for items such as blood loss and PA-LN positivity.

    Conclusion: The present meta-analysis underlines that SLN is capable of reducing blood loss during operation regardless of surgical approach, with firm evidence from TSA. SLN mapping is more targeted, with less node dissection and more detection of positive lymph nodes, even in high-risk patients, with conclusive evidence from TSA. Use of SLN yields no survival detriment in EC patients.

  14. Agricultural land use (vector) : National-scale crop type maps for Germany...

    • openagrar.de
    • data.niaid.nih.gov
    • +1more
    Updated Feb 5, 2024
    + more versions
    Cite
    Gideon Tetteh; Marcel Schwieder; Lukas Blickensdörfer; Alexander Gocht; Stefan Erasmi (2024). Agricultural land use (vector) : National-scale crop type maps for Germany from combined time series of Sentinel-1, Sentinel-2 and Landsat data (2022) [Dataset]. http://doi.org/10.5281/zenodo.10621629
    Explore at:
    Dataset updated
    Feb 5, 2024
    Authors
    Gideon Tetteh; Marcel Schwieder; Lukas Blickensdörfer; Alexander Gocht; Stefan Erasmi
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Area covered
    Germany
    Description

    The dataset contains a map of the main classes of agricultural land use (dominant crop types and other land use types) in Germany for the year 2022. It complements a series of maps that have been produced annually at the Thünen Institute since 2017 on the basis of satellite data. The maps cover the entire open landscape, i.e., the agriculturally used area (UAA) and, e.g., uncultivated areas. The map was derived from time series of Sentinel-1, Sentinel-2, Landsat 8 and additional environmental data. Map production is based on the methods described in Blickensdörfer et al. (2022). All optical satellite data were managed, pre-processed and structured in an analysis-ready data (ARD) cube using the open-source software FORCE - Framework for Operational Radiometric Correction for Environmental monitoring (Frantz, D., 2019), in which SAR and environmental data were integrated. The map extent covers all areas in Germany that are defined as agricultural land, grassland, small woody features, heathland, peatland or unvegetated areas according to ATKIS Basis-DLM (Geobasisdaten: © GeoBasis-DE / BKG, 2020).

    Version v201: Post-processing of the maps included a sieve filter as well as a ruleset for the reduction of non-plausible areas using the Basis-DLM and the digital terrain model of Germany (Geobasisdaten: © GeoBasis-DE / BKG, 2015). The final post-processing step comprises the aggregation of the gridded data to homogeneous objects (fields) based on the approach described in Tetteh et al. (2021) and Tetteh et al. (2023).

    The maps are available in FlatGeobuf format, which makes downloading the full dataset optional. All data can be accessed directly in QGIS, R, Python or any supported software of your choice using the URL to the datasets, which will be provided on request; in this way either the entire map area or only regions of interest can be accessed. QGIS legend files for data visualization can be downloaded separately. Class-specific accuracies for each year are provided in the respective tables. We provide this dataset "as is" without any warranty regarding accuracy or completeness and exclude all liability.

    References:
    Blickensdörfer, L., Schwieder, M., Pflugmacher, D., Nendel, C., Erasmi, S., & Hostert, P. (2022). Mapping of crop types and crop sequences with combined time series of Sentinel-1, Sentinel-2 and Landsat 8 data for Germany. Remote Sensing of Environment, 269, 112831.
    BKG, Bundesamt für Kartographie und Geodäsie (2015). Digitales Geländemodell Gitterweite 10 m. DGM10. https://sg.geodatenzentrum.de/web_public/gdz/dokumentation/deu/dgm10.pdf (last accessed: 28 April 2022).
    BKG, Bundesamt für Kartographie und Geodäsie (2020). Digitales Basis-Landschaftsmodell. https://sg.geodatenzentrum.de/web_public/gdz/dokumentation/deu/basis-dlm.pdf (last accessed: 28 April 2022).
    Frantz, D. (2019). FORCE—Landsat + Sentinel-2 Analysis Ready Data and Beyond. Remote Sensing, 11, 1124.
    Tetteh, G.O., Gocht, A., Erasmi, S., Schwieder, M., & Conrad, C. (2021). Evaluation of Sentinel-1 and Sentinel-2 Feature Sets for Delineating Agricultural Fields in Heterogeneous Landscapes. IEEE Access, 9, 116702-116719.
    Tetteh, G.O., Schwieder, M., Erasmi, S., Conrad, C., & Gocht, A. (2023). Comparison of an Optimised Multiresolution Segmentation Approach with Deep Neural Networks for Delineating Agricultural Fields from Sentinel-2 Images. PFG – Journal of Photogrammetry, Remote Sensing and Geoinformation Science
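
    Since the maps are distributed as FlatGeobuf and can be read directly over HTTP, a hedged geopandas sketch follows; the URL is a placeholder (the real links are provided on request), the bounding-box coordinates are illustrative, and the attribute column holding the crop class is an assumption.

    ```python
    import geopandas as gpd

    # Placeholder URL: the real FlatGeobuf links are provided by the authors on request.
    url = "https://example.org/thuenen_crop_types_2022.fgb"

    # Read only features intersecting a small window (coordinates in the layer's CRS).
    bbox = (4450000, 3250000, 4460000, 3260000)   # illustrative 10 km x 10 km extent
    fields = gpd.read_file(url, bbox=bbox)

    print(fields.crs)
    print(fields["crop_class"].value_counts())    # column name is assumed, not documented here
    ```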

  15. Sentinel-2 10m Land Use/Land Cover Timeseries

    • hub.arcgis.com
    • supply-chain-data-hub-nmcdc.hub.arcgis.com
    • +1more
    Updated May 19, 2022
    + more versions
    Cite
    New Mexico Community Data Collaborative (2022). Sentinel-2 10m Land Use/Land Cover Timeseries [Dataset]. https://hub.arcgis.com/maps/NMCDC::sentinel-2-10m-land-use-land-cover-timeseries/about
    Explore at:
    Dataset updated
    May 19, 2022
    Dataset authored and provided by
    New Mexico Community Data Collaborative
    Area covered
    Description

    This layer displays a global map of land use/land cover (LULC) derived from ESA Sentinel-2 imagery at 10 m resolution. Each year is generated from Impact Observatory's deep learning AI land classification model, trained on a massive dataset of billions of human-labeled image pixels developed by the National Geographic Society. The global maps were produced by applying this model to the Sentinel-2 scene collection on Microsoft's Planetary Computer, processing over 400,000 Earth observations per year. The algorithm generates LULC predictions for 10 classes, described in detail below. The year 2017 has a land cover class assigned for every pixel, but its class is based upon fewer images than the other years; the years 2018-2021 are based upon a more complete set of imagery. For this reason, the year 2017 may have less accurate land cover class assignments than the years 2018-2021.

    • Variable mapped: Land use/land cover in 2017, 2018, 2019, 2020, 2021
    • Data projection: Universal Transverse Mercator (UTM)
    • Mosaic projection: WGS84
    • Extent: Global
    • Source imagery: Sentinel-2
    • Cell size: 10 m (0.00008983152098239751 degrees)
    • Type: Thematic
    • Source: Esri Inc.
    • Publication date: January 2022

    What can you do with this layer? Global land use/land cover maps provide information for conservation planning, food security, and hydrologic modeling, among other things. This dataset can be used to visualize land use/land cover anywhere on Earth. Note that because the layer focuses on land use, it does not provide the spatial detail of a land cover map for the built area classification: yards, parks, and small groves will appear as built area rather than trees or rangeland classes. This layer can also be used in analyses that require land use/land cover input. For example, the Zonal Statistics tools allow a user to understand the composition of a specified area by reporting the total estimates for each of the classes.

    Land cover processing: This map was produced by a deep learning model trained using over 5 billion hand-labeled Sentinel-2 pixels, sampled from over 20,000 sites distributed across all major biomes of the world. The underlying deep learning model uses 6 bands of Sentinel-2 surface reflectance data: visible blue, green, red, near infrared, and two shortwave infrared bands. To create the final map, the model is run on multiple dates of imagery throughout the year, and the outputs are composited into a final representative map for each year.

    Processing platform: Sentinel-2 L2A/B data was accessed via Microsoft's Planetary Computer and scaled using Microsoft Azure Batch.

    Class definitions:

    • 1. Water: Areas where water was predominantly present throughout the year; may not cover areas with sporadic or ephemeral water; contains little to no sparse vegetation, no rock outcrop nor built-up features like docks; examples: rivers, ponds, lakes, oceans, flooded salt plains.
    • 2. Trees: Any significant clustering of tall (~15 feet or higher) dense vegetation, typically with a closed or dense canopy; examples: wooded vegetation, clusters of dense tall vegetation within savannas, plantations, swamp or mangroves (dense/tall vegetation with ephemeral water or canopy too thick to detect water underneath).
    • 4. Flooded vegetation: Areas of any type of vegetation with obvious intermixing of water throughout a majority of the year; seasonally flooded area that is a mix of grass/shrub/trees/bare ground; examples: flooded mangroves, emergent vegetation, rice paddies and other heavily irrigated and inundated agriculture.
    • 5. Crops: Human planted/plotted cereals, grasses, and crops not at tree height; examples: corn, wheat, soy, fallow plots of structured land.
    • 7. Built area: Human-made structures; major road and rail networks; large homogeneous impervious surfaces including parking structures, office buildings and residential housing; examples: houses, dense villages/towns/cities, paved roads, asphalt.
    • 8. Bare ground: Areas of rock or soil with very sparse to no vegetation for the entire year; large areas of sand and deserts with no to little vegetation; examples: exposed rock or soil, desert and sand dunes, dry salt flats/pans, dried lake beds, mines.
    • 9. Snow/Ice: Large homogeneous areas of permanent snow or ice, typically only in mountain areas or highest latitudes; examples: glaciers, permanent snowpack, snow fields.
    • 10. Clouds: No land cover information due to persistent cloud cover.
    • 11. Rangeland: Open areas covered in homogeneous grasses with little to no taller vegetation; wild cereals and grasses with no obvious human plotting (i.e., not a plotted field); examples: natural meadows and fields with sparse to no tree cover, open savanna with few to no trees, parks/golf courses/lawns, pastures. Also a mix of small clusters of plants or single plants dispersed on a landscape that shows exposed soil or rock, and scrub-filled clearings within dense forests that are clearly not taller than trees; examples: moderate to sparse cover of bushes, shrubs and tufts of grass, savannas with very sparse grasses, trees or other plants.

    Citation: Karra, Kontgis, et al. "Global land use/land cover with Sentinel-2 and deep learning." IGARSS 2021 - 2021 IEEE International Geoscience and Remote Sensing Symposium. IEEE, 2021.

    Acknowledgements: Training data for this project makes use of the National Geographic Society Dynamic World training dataset, produced for the Dynamic World Project by National Geographic Society in partnership with Google and the World Resources Institute. For questions please email environment@esri.com.
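
    As a rough illustration of the zonal-composition idea mentioned above, the sketch below tallies class frequencies inside a rectangular area of interest using rasterio and numpy on a locally exported clip of the layer; the file path, AOI coordinates, and the use of 0 as a nodata fill are assumptions, while the class codes follow the class definitions listed above.

    ```python
    import numpy as np
    import rasterio
    from rasterio.mask import mask
    from shapely.geometry import box, mapping

    # Hypothetical local clip of the 10 m LULC raster exported from the service.
    lulc_path = "sentinel2_lulc_2021_clip.tif"

    # Area of interest as a simple rectangle in the raster's CRS (illustrative coordinates).
    aoi = box(500000, 4100000, 510000, 4110000)

    class_names = {1: "Water", 2: "Trees", 4: "Flooded vegetation", 5: "Crops",
                   7: "Built Area", 8: "Bare ground", 9: "Snow/Ice", 10: "Clouds", 11: "Rangeland"}

    with rasterio.open(lulc_path) as src:
        clipped, _ = mask(src, [mapping(aoi)], crop=True, nodata=0)

    # Count pixels per class inside the AOI and report the composition in percent.
    values, counts = np.unique(clipped[clipped != 0], return_counts=True)
    total = counts.sum()
    for v, c in zip(values, counts):
        print(f"{class_names.get(int(v), v)}: {100 * c / total:.1f}%")
    ```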

  16. Crop classification dataset for testing domain adaptation or distributional...

    • zenodo.org
    • data.niaid.nih.gov
    bin, csv
    Updated May 13, 2022
    Cite
    Dan M. Kluger; Sherrie Wang; David B. Lobell (2022). Crop classification dataset for testing domain adaptation or distributional shift methods [Dataset]. http://doi.org/10.5281/zenodo.6376160
    Explore at:
    bin, csv (available download formats)
    Dataset updated
    May 13, 2022
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Dan M. Kluger; Sherrie Wang; David B. Lobell
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    In this upload we share processed crop type datasets from both France and Kenya. These datasets can be helpful for testing and comparing various domain adaptation methods. The datasets are processed, used, and described in this paper: https://doi.org/10.1016/j.rse.2021.112488 (arXiv version: https://arxiv.org/pdf/2109.01246.pdf).

    In summary, each point in the uploaded datasets corresponds to a particular location. The label is the crop type grown at that location in 2017. The 70 processed features are based on Sentinel-2 satellite measurements at that location in 2017. The points in the France dataset come from 11 different departments (regions) in Occitanie, France, and the points in the Kenya dataset come from 3 different regions in Western Province, Kenya. Within each dataset there are notable shifts in the distribution of the labels and in the distribution of the features between regions. Therefore, these datasets can be helpful for testing and comparing methods that are designed to address such distributional shifts.

    More details on the dataset and processing steps can be found in Kluger et al. (2021). Many of the processing steps were taken to deal with Sentinel-2 measurements that were corrupted by cloud cover. For users interested in the raw multi-spectral time series data and in dealing with cloud cover issues on their own (rather than using the 70 processed features provided here), the raw dataset from Kenya can be found in Yeh et al. (2021), and the raw dataset from France can be made available upon request from the authors of this Zenodo upload.

    All of the data uploaded here can be found in "CropTypeDatasetProcessed.RData". We also post the dataframes and tables within that .RData file as separate .csv files for users who do not have R. The contents of each R object (or .csv file) is described in the file "Metadata.rtf".
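
    A hedged pandas sketch of how the processed CSVs might be split into source and target domains for a domain-adaptation experiment; the file name, the region column, and the feature-column prefix are placeholders, since the actual layout is documented in the accompanying Metadata.rtf.

    ```python
    import pandas as pd

    # Placeholder file name: the actual .csv exports and their column layout are
    # described in Metadata.rtf that accompanies the upload.
    france = pd.read_csv("france_features_labels.csv")

    # Typical domain-shift split: hold out one region (department) as the target domain.
    target_region = france["region"].unique()[0]             # 'region' column name is assumed
    source = france[france["region"] != target_region]
    target = france[france["region"] == target_region]

    feature_cols = [c for c in france.columns if c.startswith("feature")]  # 70 processed features (assumed prefix)
    print(len(feature_cols), "features;", len(source), "source rows;", len(target), "target rows")
    ```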

    Preferred Citation:

    -Kluger, D.M., Wang, S., Lobell, D.B., 2021. Two shifts for crop mapping: Leveraging aggregate crop statistics to improve satellite-based maps in new regions. Remote Sens. Environ. 262, 112488. https://doi.org/10.1016/j.rse.2021.112488.

    -URL to this Zenodo post https://zenodo.org/record/6376160

  17. Patients and tumor characteristics (n = 767).

    • plos.figshare.com
    xls
    Updated Jun 1, 2023
    + more versions
    Cite
    Julia Krammer; Anja Dutschke; Clemens G. Kaiser; Andreas Schnitzer; Axel Gerhardt; Julia C. Radosa; Joachim Brade; Stefan O. Schoenberg; Klaus Wasser (2023). Patients and tumor characteristics (n = 767). [Dataset]. http://doi.org/10.1371/journal.pone.0149018.t001
    Explore at:
    xls (available download formats)
    Dataset updated
    Jun 1, 2023
    Dataset provided by
    PLOS ONE
    Authors
    Julia Krammer; Anja Dutschke; Clemens G. Kaiser; Andreas Schnitzer; Axel Gerhardt; Julia C. Radosa; Joachim Brade; Stefan O. Schoenberg; Klaus Wasser
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Patients and tumor characteristics (n = 767).

  18. RBC-SatImg: Sentinel-2 Imagery and WatData Labels for Water Mapping

    • zenodo.org
    • data.niaid.nih.gov
    • +1more
    zip
    Updated Aug 19, 2024
    Cite
    Helena Calatrava; Bhavya Duvvuri; Haoqing Li; Ricardo Borsoi; Tales Imbiriba; Edward Beighley; Deniz Erdogmus; Pau Closas (2024). RBC-SatImg: Sentinel-2 Imagery and WatData Labels for Water Mapping [Dataset]. http://doi.org/10.5281/zenodo.13345343
    Explore at:
    zip (available download formats)
    Dataset updated
    Aug 19, 2024
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Helena Calatrava; Bhavya Duvvuri; Haoqing Li; Ricardo Borsoi; Tales Imbiriba; Edward Beighley; Deniz Erdogmus; Pau Closas
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Data Description

    This dataset is linked to the publication "Recursive classification of satellite imaging time-series: An application to land cover mapping". In this paper, we introduce the recursive Bayesian classifier (RBC), which converts any instantaneous classifier into a robust online method through a probabilistic framework that is resilient to non-informative image variations. To reproduce the results presented in the paper, the RBC-SatImg folder and the code in the GitHub repository RBC-SatImg are required.

    The RBC-SatImg folder contains:

    • Sentinel-2 time-series imagery from three key regions: Oroville Dam (CA, USA) and Charles River (Boston, MA, USA) for water mapping, and the Amazon Rainforest (Brazil) for deforestation detection.
    • The RBC-WatData dataset with manually generated water mapping labels for the Oroville Dam and Charles River regions. This dataset is well-suited for multitemporal land cover and water mapping research, as it accounts for the dynamic evolution of true class labels over time.
    • Pickle files with output to reproduce the results in the paper, including:
      • Instantaneous classification results for GMM, LR, SIC, WN, DWM
      • Posterior results obtained with the RBC framework

    The Sentinel-2 images and forest labels used in the deforestation detection experiment for the Amazon Rainforest have been obtained from the MultiEarth Challenge dataset.

    Folder Structure

    The following paths can be changed in the configuration file from the GitHub repository as desired. The RBC-SatImg folder is organized as follows (a loading sketch follows after this list):

    • `./log/` (EMPTY): Default path for storing log files generated during code execution.
    • `./evaluation_results/`: Contains the results to reproduce the findings in the paper, including two sub-folders:
      • `./classification/`: For each test site, four sub-folders are included as:
        • `./accuracy/`: Each sub-folder corresponding to an experimental configuration contains pickle files with balanced classification accuracy results and information about the models. The default configuration used in the paper is "conf_00."
        • `./figures/`: Includes result figures from the manuscript in SVG format.
        • `./likelihoods/`: Contains pickle files with instantaneous classification results.
        • `./posteriors/`: Contains pickle files with posterior results generated by the RBC framework.
      • `./sensitivity_analysis/`: Contains sensitivity analysis results, organized by different test sites and epsilon values.
    • `./Sentinel2_data/`: Contains Sentinel-2 images used for training and evaluation, organized by scenarios (Oroville Dam, Charles River, Amazon Rainforest). Selected images have been filtered and processed as explained in the manuscript. The Amazon Rainforest images and labels have been obtained from the MultiEarth dataset, and consequently, the labels are included in this folder instead of the RBC-WatData folder.
    • `./RBC-WatData/`: Contains the water labels that we manually generated with the LabelStudio tool.
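
    A minimal loading sketch for the pickled results, following the folder structure above; the exact file names depend on the experiment configuration (e.g., "conf_00") defined in the RBC-SatImg GitHub repository, so the path used here is a placeholder.

    ```python
    import pickle
    from pathlib import Path

    # Path follows the folder structure above; the file name itself is hypothetical.
    posterior_file = Path("evaluation_results/classification/oroville_dam/posteriors/conf_00_posteriors.pkl")

    with posterior_file.open("rb") as f:
        posteriors = pickle.load(f)

    # The object layout is defined by the RBC-SatImg code; inspect it before use.
    print(type(posteriors))
    ```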

  19. Sentinel-1 Flood Maps Using Exponential Filter as No-Flood Reference

    • researchdata.tuwien.at
    • resodate.org
    application/gzip
    Updated Dec 2, 2024
    Cite
    Mark Edwin Tupas; Florian Roth; Bernhard Bauer-Marschallinger; Wolfgang Wagner (2024). Sentinel-1 Flood Maps Using Exponential Filter as No-Flood Reference [Dataset]. http://doi.org/10.48436/3dd60-ydz51
    Explore at:
    application/gzip (available download formats)
    Dataset updated
    Dec 2, 2024
    Dataset provided by
    TU Wien
    Authors
    Mark Edwin Tupas; Florian Roth; Bernhard Bauer-Marschallinger; Wolfgang Wagner
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Time period covered
    Apr 29, 2024
    Description

    Background

    The TU Wien flood mapping algorithm is a Sentinel-1-based workflow using Bayes inference at the pixel level. The algorithm is currently deployed in global operations under the Copernicus GFM project and has been shown to work generally well. However, the current approach has overestimation issues related to imperfect no-flood probability modeling. In a recent study, we proposed a no-flood reference derived with an Exponential Filter and compared it against the original Harmonic Model. We conducted experiments on seven study sites for flooded and no-flood scenarios. A full description and discussion are found in the paper: Assessment of Time-Series-Derived No-Flood Reference for SAR-based Bayesian Flood Mapping.

    Methodology

    • We generated no-flood references using the Exponential Filter at various T-parameter values and the original Harmonic Model as a baseline.
    • Flood maps were generated using the Bayes inference-based SAR flood mapping algorithm, implemented in Python with the Yeoda software package, for each of the no-flood references and for all available Sentinel-1 image acquisitions of a selected relative orbit per study site.
    • Each flood map is compared with the reference CEMS Rapid Mapping or Sentinel Asia reference dataset to generate validation/confusion maps.

    Technical details

    • Datasets are stored in GeoTiff format using LZW Compression.
    • Files are compressed in three bundles: 1) flood maps, 2) false positive count maps, and 3) validation results.
    • Files are organized and tiled following the T3 Equi7Grid tilling system at 20m x 20m resolution.
      • Folder structure: dataset / map product > (continental) subgrid > tile > files.
      • The study covers the following study sites:
        • EU E039N027T: Scotland
        • AS E054N015T3: Vietnam
        • EU E054N006T3: Greece
        • EU E051N012T3: Slovenia
        • AS E024N027T3: India
        • OC E057N117T3: Philippines
        • EU E057N024T3: Latvia
    • Files are named following the Yeoda file naming convention.
    • Summary Accuracy Assessment Metrics are in CSV format.

    Datasets:

    • Flood: flood maps generated using different parameterizations of no-flood reference.
    • FP_Count: false positive count maps.
    • Validation results include:
      • Confusion maps generated from the difference between the flood maps and the rasterized CEMS Rapid Mapping or Sentinel Asia reference datasets, plus summary accuracy assessment metrics in CSV format (a comparison sketch follows after this list).
      • ERA5-LAND daily aggregates in CSV format.
      • Root Mean Square Error time-series analysis in CSV format.
      • False Positive Rate time-series analysis in CSV format.
    • Due to storage constraints, the no-flood reference data are not uploaded here; they are available upon request.
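
    A hedged sketch of the kind of pixel-wise comparison behind the confusion maps and summary metrics: it assumes a flood-map tile and a reference raster on the same Equi7Grid tile and an encoding of 1 = flood, 0 = no flood, neither of which is stated explicitly above, and the file names are placeholders.

    ```python
    import numpy as np
    import rasterio

    # Placeholder tile paths: actual names follow the Yeoda naming convention and Equi7Grid tiling.
    flood_map_path = "Flood/EU/E039N027T3/flood_map.tif"
    reference_path = "reference/EU/E039N027T3/cems_reference.tif"

    with rasterio.open(flood_map_path) as f, rasterio.open(reference_path) as r:
        pred = f.read(1)
        ref = r.read(1)

    # Assumed encoding: 1 = flood, 0 = no flood; anything else treated as no-data.
    valid = np.isin(pred, (0, 1)) & np.isin(ref, (0, 1))
    tp = np.sum((pred == 1) & (ref == 1) & valid)
    fp = np.sum((pred == 1) & (ref == 0) & valid)
    fn = np.sum((pred == 0) & (ref == 1) & valid)

    print("User's accuracy (precision):", tp / (tp + fp))
    print("Producer's accuracy (recall):", tp / (tp + fn))
    print("Critical success index:", tp / (tp + fp + fn))
    ```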

  20. Data from: A 10 m resolution urban green space map for major Latin American...

    • figshare.com
    zip
    Updated Aug 14, 2025
    Cite
    Yang Ju; Iryna Dronova; Xavier Delclòs-Alió (2025). A 10 m resolution urban green space map for major Latin American cities from Sentinel-2 remote sensing images and OpenStreetMap [Dataset]. http://doi.org/10.6084/m9.figshare.19803790.v4
    Explore at:
    zip (available download formats)
    Dataset updated
    Aug 14, 2025
    Dataset provided by
    Figshare (http://figshare.com/)
    Authors
    Yang Ju; Iryna Dronova; Xavier Delclòs-Alió
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Area covered
    Latin America
    Description

    Here we produced the first 10 m resolution urban green space (UGS) map for the main urban clusters across 371 major Latin American cities as of 2017. Our approach applied a supervised classification of Sentinel-2 satellite imagery and UGS samples derived from OpenStreetMap (OSM). The overall accuracy of this UGS map in 11 randomly selected cities was 0.87, evaluated with independently collected validation samples (‘ground truth’). We further improved mapping quality through visual inspection and additional sample collection. The resulting UGS map enables studies to measure area, spatial configuration, and human exposure to UGS, facilitating research on the relationship between UGS and human exposure to environmental hazards, public health outcomes, and environmental justice issues in Latin American cities. UGS in this map series includes grass, shrub, forest, and farmland; non-UGS includes buildings, pavement, roads, barren land, and dry vegetation.

    The UGS map series includes three sets of files:

    (1) Binary UGS maps at 10 m spatial resolution in GeoTIFF format (UGS.zip), with each of the 371 cities being an individual map. A mapped value of 1 indicates UGS, 0 indicates non-UGS, and no data (value -32768) indicates areas outside the mapped boundary or water bodies.

    (2) A shapefile of mapped boundaries (Boundaries.zip). The boundary file contains city name, country name and its ISO-2 country code, and an ID field linking each city's boundary to the corresponding UGS map.

    (3) .prj files containing projection information for the binary UGS maps and the boundary shapefile. The binary UGS maps are projected in the World Geodetic System (WGS) 84 / Pseudo-Mercator projected coordinate system (EPSG: 3857), and the boundary shapefile is projected in the WGS 1984 geographic coordinate system (EPSG: 4326).

    Reference: A 10 m resolution urban green space map for major Latin American cities from Sentinel-2 remote sensing images and OpenStreetMap, published in Scientific Data [link].

    Citation: Ju, Y., Dronova, I., & Delclòs-Alió, X. (2022). A 10 m resolution urban green space map for major Latin American cities from Sentinel-2 remote sensing images and OpenStreetMap. Scientific Data, 9, Article 1. https://doi.org/10.1038/s41597-022-01701-y
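
    A short sketch of computing the UGS share and approximate area for one city from its binary map; the file name is a placeholder, and because the maps are in EPSG:3857 the nominal 10 m x 10 m pixel size overstates true ground area away from the equator, so the area figure is only approximate.

    ```python
    import numpy as np
    import rasterio

    # Placeholder file name: each of the 371 cities has its own GeoTIFF inside UGS.zip.
    city_map = "UGS/Bogota.tif"

    with rasterio.open(city_map) as src:
        ugs = src.read(1)

    nodata = -32768                      # outside the mapped boundary or water bodies
    mapped = ugs != nodata
    ugs_pixels = np.sum(ugs == 1)

    # Nominal 10 m x 10 m pixels; EPSG:3857 pixel sizes are only nominal, so treat this as approximate.
    ugs_km2 = ugs_pixels * 100 / 1e6
    share = ugs_pixels / mapped.sum()
    print(f"UGS area ~ {ugs_km2:.1f} km^2; share of mapped area ~ {100 * share:.1f}%")
    ```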
