100+ datasets found
  1. Data for detailed temporal mapping of global human modification from 1990 to...

    • data.niaid.nih.gov
    • zenodo.org
    Updated Jun 18, 2023
    + more versions
    Cite
    Baruch-Mordo, Sharon (2023). Data for detailed temporal mapping of global human modification from 1990 to 2017 [Dataset]. https://data.niaid.nih.gov/resources?id=zenodo_3963012
    Explore at:
    Dataset updated
    Jun 18, 2023
    Dataset provided by
    Oakleaf, James
    Chen, Bin
    Theobald, David M.
    Kennedy, Christina
    Baruch-Mordo, Sharon
    Kiesecker, Joe
    License

    Attribution 1.0 (CC BY 1.0) https://creativecommons.org/licenses/by/1.0/
    License information was derived automatically

    Description

    Data on the extent, patterns, and trends of human land use are critically important to support global and national priorities for conservation and sustainable development. To inform these issues, we created a series of detailed global datasets for 1990, 1995, 2000, 2005, 2010, 2015, and 2017 to evaluate temporal changes and spatial patterns of land use modification of terrestrial lands (excluding Antarctica). These data were calculated using the degree of human modification approach, which multiplies the proportion of a pixel covered by a given stressor (i.e. its footprint) by the intensity of that stressor (ranging from 0 to 1.0). Our novel datasets are detailed (0.09 km² resolution), temporally consistent (for 1990-2015, every 5 years), comprehensive (11 change stressors, 14 current), robust (using an established framework and incorporating classification errors and parameter uncertainty), and strongly validated. We also provide a dataset that represents ~2017 conditions and has 14 stressors for an even more comprehensive picture, but the 2017 results should not be used to calculate change against the other datasets (1990-2015). Note that because of repository file size limits, the datasets for the overall HM for 1990 and 1995, as well as the major stressors for all years, are located in this Google Drive.
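    The per-stressor calculation described above amounts to simple raster arithmetic. The following sketch is an illustrative reconstruction only (the arrays and values are hypothetical, and combining stressors into the overall HM value follows the cited Theobald et al. method, which is not shown):

    import numpy as np

    # Hypothetical footprint raster: fraction of each pixel occupied by one stressor.
    footprint = np.array([[0.10, 0.50],
                          [0.00, 1.00]])
    intensity = 0.8  # assumed intensity of this stressor, in [0, 1]

    # Per-pixel contribution of this stressor to human modification, kept in [0, 1].
    hm_stressor = np.clip(footprint * intensity, 0.0, 1.0)
    print(hm_stressor)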

    This version 1.5 provides the following updates:

    Datasets are provided for each of the 6 stressor groups: built-up areas (BU), agricultural/timber harvest (AG), extractive energy and mining (EX), human intrusions (HI), natural system modifications (NS), and transportation & infrastructure (TI), available now at 300 m resolution for each of the time steps in the 1990-2015 time series.

    It adds datasets for the years 1995 and 2005, calculated using linear interpolation where the underlying stressor data do not provide values for those specific years.

    The ESA 150 m water-mask dataset (Lamarche et al. 2017) was used to provide better and more consistent alignment of datasets at the ocean-land-inland water interfaces.

    The built-up stressor uses an updated version of the Global Human Settlement Layer (v2022A).

    Values provided are 32-bit floating point values, with human modification values ranging from 0.0 to 1.0.

    For more details on the approach and methods, please see: Theobald, D. M., Kennedy, C., Chen, B., Oakleaf, J., Baruch-Mordo, S., and Kiesecker, J.: Earth transformed: detailed mapping of global human modification from 1990 to 2017, Earth Syst. Sci. Data., https://doi.org/10.5194/essd-2019-252, 2020.

    Version 1.5 was completed in collaboration with the Center for Biodiversity and Global Change at Yale University and supported by the E.O. Wilson Biodiversity Foundation.

  2. Resource of "TemporalVAE: atlas-assisted temporal mapping of time-series...

    • zenodo.org
    bin
    Updated May 8, 2025
    Cite
    Yi-Jun Liu; Yi-Jun Liu (2025). Resource of "TemporalVAE: atlas-assisted temporal mapping of time-series single-cell transcriptomes during embryogenesis" [Dataset]. http://doi.org/10.5281/zenodo.15366361
    Explore at:
    Available download formats: bin
    Dataset updated
    May 8, 2025
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Yi-Jun Liu; Yi-Jun Liu
    License

    Attribution 4.0 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    rawCount_Z_C_Xiao_M_P_Liu_Tyser_Xiang.h5ad is an integration of 8 human embryo datasets:

    • Xiang et al. (original dataset downloaded from GEO: GSE136447);
    • Petropoulos et al. (original dataset downloaded from ArrayExpress: E-MTAB-3929);
    • Molè et al. (original dataset downloaded from ArrayExpress: E-MTAB-8060);
    • Zhou et al. (original dataset downloaded from GEO: GSE109555);
    • Liu et al. (original dataset downloaded from GEO: sample GSM3901995-GSM3902021, GSM3902030-GSM3902031, GSM4058370-GSM4058375 from GSE133200);
    • Tyser et al. (original dataset downloaded from ArrayExpress: E-MTAB-9388);
    • Xiao et al. (original dataset downloaded from https://cs8.3dembryo.com/#/download);
    • Cui et al. (original dataset downloaded from https://cs7.3dembryo.com/#/download).

    human_atlas.ckpt is the pre-trained TemporalVAE by human embryo data.

    rawCount_Z_C_Xiao_M_P_Liu_Tyser_Xiang.filtered.h5ad adds the processed_matrix of raw-count gene expression and the predictions from TemporalVAE.
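    A minimal sketch for inspecting the .h5ad files in Python, assuming the anndata package is installed (no assumptions are made here about the field names inside the files):

    import anndata as ad

    # Load the integrated raw-count matrix (cells x genes) and print a summary
    # of the available obs/var annotations and layers.
    adata = ad.read_h5ad("rawCount_Z_C_Xiao_M_P_Liu_Tyser_Xiang.h5ad")
    print(adata)
    print(adata.obs.head())  # per-cell metadata; column names depend on the deposit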

  3. SDNist v1.3: Temporal Map Challenge Environment

    • datasets.ai
    • data.nist.gov
    • +1more
    Updated Aug 6, 2024
    + more versions
    Cite
    National Institute of Standards and Technology (2024). SDNist v1.3: Temporal Map Challenge Environment [Dataset]. https://datasets.ai/datasets/sdnist-benchmark-data-and-evaluation-tools-for-data-synthesizers
    Explore at:
    Available download formats
    Dataset updated
    Aug 6, 2024
    Dataset authored and provided by
    National Institute of Standards and Technology (http://www.nist.gov/)
    Description

    SDNist (v1.3) is a set of benchmark data and metrics for the evaluation of synthetic data generators on structured tabular data. This version (1.3) reproduces the challenge environment from Sprints 2 and 3 of the Temporal Map Challenge. These benchmarks are distributed as a simple open-source Python package to allow standardized and reproducible comparison of synthetic generator models on real-world data and use cases. The data and metrics were developed for and vetted through the NIST PSCR Differential Privacy Temporal Map Challenge, where the evaluation tools, k-marginal and Higher Order Conjunction, proved effective in distinguishing competing models in the competition environment. SDNist is available via pip (pip install sdnist==1.2.8, for Python >= 3.6) or from the USNIST GitHub. The sdnist Python module downloads data from NIST as necessary, so users are not required to download data manually.
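    As a rough illustration of the k-marginal idea (this is not the official SDNist scoring code, and the handling of columns is a simplifying assumption), each k-column marginal of the real and synthetic tables can be compared with total variation distance:

    import itertools
    import pandas as pd

    def k_marginal_tvd(real: pd.DataFrame, synth: pd.DataFrame, k: int = 2) -> float:
        """Mean total-variation distance over all k-column marginals (simplified sketch)."""
        scores = []
        for cols in itertools.combinations(real.columns, k):
            p = real.groupby(list(cols)).size() / len(real)    # real marginal distribution
            q = synth.groupby(list(cols)).size() / len(synth)  # synthetic marginal distribution
            p, q = p.align(q, fill_value=0.0)                  # align categories from either table
            scores.append(0.5 * (p - q).abs().sum())           # total variation distance in [0, 1]
        return sum(scores) / len(scores)

    Lower values indicate synthetic marginals closer to the real data; the official SDNist metrics operate on the challenge schema and scoring scale rather than this simplified form.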

  4. SEN12 Multi-Temporal Urban Mapping Dataset

    • data.niaid.nih.gov
    • zenodo.org
    Updated Apr 4, 2023
    Cite
    Sebastian Hafner (2023). SEN12 Multi-Temporal Urban Mapping Dataset [Dataset]. https://data.niaid.nih.gov/resources?id=zenodo_7794692
    Explore at:
    Dataset updated
    Apr 4, 2023
    Dataset authored and provided by
    Sebastian Hafner
    License

    Attribution 4.0 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Monthly mean Sentinel-1 SAR and cloud-free Sentinel-2 MSI images for the SpaceNet 7 training and test sites. Our dataset also includes monthly rasterized built-up area labels for the 60 training sites.

  5. Roelfsema, Christiaan M, Lyons, Mitchell B, Kovacs, Eva M, Maxwell, Paul,...

    • service.tib.eu
    Updated Nov 29, 2024
    + more versions
    Cite
    (2024). Roelfsema, Christiaan M, Lyons, Mitchell B, Kovacs, Eva M, Maxwell, Paul, Saunders, Megan I, Samper-Villarreal, Jimena, Phinn, Stuart R (2014). Dataset: Multi-temporal mapping of seagrass cover, species and biomass of the Eastern Banks, Moreton Bay, Australia, with links to shapefiles. https://doi.org/10.1594/PANGAEA.833767 [Dataset]. https://service.tib.eu/ldmservice/dataset/png-doi-10-1594-pangaea-833767
    Explore at:
    Dataset updated
    Nov 29, 2024
    License

    Attribution 3.0 (CC BY 3.0) https://creativecommons.org/licenses/by/3.0/
    License information was derived automatically

    Area covered
    City of Moreton Bay, Australia
    Description

    The spatial and temporal dynamics of seagrasses have been studied from the leaf to patch (100 m²) scales. However, landscape scale (> 100 km²) seagrass population dynamics are unresolved in seagrass ecology. Previous remote sensing approaches have lacked the temporal or spatial resolution, or ecologically appropriate mapping, to fully address this issue. This paper presents a robust, semi-automated object-based image analysis approach for mapping dominant seagrass species, percentage cover and above ground biomass using a time series of field data and coincident high spatial resolution satellite imagery. The study area was a 142 km² shallow, clear water seagrass habitat (the Eastern Banks, Moreton Bay, Australia). Nine data sets acquired between 2004 and 2013 were used to create seagrass species and percentage cover maps through the integration of seagrass photo transect field data, and atmospherically and geometrically corrected high spatial resolution satellite image data (WorldView-2, IKONOS and Quickbird-2) using an object-based image analysis approach. Biomass maps were derived using empirical models trained with in-situ above ground biomass data per seagrass species. Maps and summary plots identified inter- and intra-annual variation of seagrass species composition, percentage cover level and above ground biomass. The methods provide a rigorous approach for field and image data collection and pre-processing, a semi-automated approach to extract seagrass species and cover maps and assess accuracy, and the subsequent empirical modelling of seagrass biomass. The resultant maps provide a fundamental data set for understanding landscape scale seagrass dynamics in a shallow water environment. Our findings provide proof of concept for the use of time-series analysis of remotely sensed seagrass products for use in seagrass ecology and management.

  6. Additional file 1: of Mapping areas of spatial-temporal overlap from...

    • springernature.figshare.com
    zip
    Updated May 31, 2023
    Cite
    Jed Long; Stephen Webb; Trisalyn Nelson; Kenneth Gee (2023). Additional file 1: of Mapping areas of spatial-temporal overlap from wildlife tracking data [Dataset]. http://doi.org/10.6084/m9.figshare.c.3645392_D1.v1
    Explore at:
    Available download formats: zip
    Dataset updated
    May 31, 2023
    Dataset provided by
    Figshare (http://figshare.com/)
    Authors
    Jed Long; Stephen Webb; Trisalyn Nelson; Kenneth Gee
    License

    Attribution 4.0 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Derivation of biased correlated random walks (BCRW) as implemented in the simulation study. (ZIP 26.5 kb)

  7. Data from: Mapping spatial-temporal sediment dynamics of river-floodplains...

    • data.mendeley.com
    Updated Nov 28, 2018
    Cite
    Alice Fassoni-Andrade (2018). Mapping spatial-temporal sediment dynamics of river-floodplains in the Amazon [Dataset]. http://doi.org/10.17632/wy2mz3nm7p.1
    Explore at:
    Dataset updated
    Nov 28, 2018
    Authors
    Alice Fassoni-Andrade
    License

    Attribution 4.0 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Area covered
    Amazon Rainforest
    Description

    This directory contains the following maps in GeoTIFF format: open-water frequency, time-series of reflectance for each band (Red and NIR), and frequency of SSSC classes.

    Fassoni-Andrade, A. C., & Paiva, R. C. D. (2019). Mapping spatial-temporal sediment dynamics of river-floodplains in the Amazon. Remote Sensing of Environment, 221, 94-107.

    https://www.sciencedirect.com/science/article/abs/pii/S0034425718305005

    Corresponding author: alice.fassoni@ufrgs.br

    1) Data Description:

    1.1) Spatial Representation Type: Raster Format: TIFF Columns and rows: 7661, 2405 Cell Size (X,Y): 0.002245, 0.002245 (~250m)

    1.2) Extent (coordinate system): Top: -0.025 Left: -67.28 Right: -50.081055 Bottom: -5.424225

    1.3) Spatial Reference Properties (GCS_WGS_1984.prj file): Type: Geographic Geographic Coordinate Reference: WGS 1984 Open Geospatial Consortium (OGC) Well Known Text (WKT): GEOGCS["GCS_WGS_1984", DATUM["D_WGS_1984", SPHEROID["WGS_1984",6378137.0,298.257223563]], PRIMEM["Greenwich",0.0], UNIT["Degree",0.0174532925199433], AUTHORITY["EPSG",4326]]

    2) Data description for individual files:

    2.1) File: open_water_frequency.tif This map indicates how often, over 15 years (2003-2017) at a four-day time step, each pixel in rivers and lakes of the central Amazon basin remained open water. Number of Bands: 1. Values between 0 and 100. Pixel Type: floating point. Pixel Depth: 32 bit. No Data Value: 0

    2.2) File: class_SSC_frequency.tif This map represents the 15-year frequency (2003-2017) at which each pixel in rivers and lakes of the central Amazon basin remains in one of the surface suspended sediment concentration (SSSC) classes: high, moderate, or low. The open-water frequency map must be considered when interpreting the temporal sediment dynamics in the class frequency map. For example, a pixel in a floodplain lake with frequencies of 10, 30, and 20% in the low, medium, and high SSSC classes, respectively, is not open water for the remaining 40% of the time. Number of Bands: 3 (band 1: low SSSC class; band 2: moderate SSSC class; band 3: high SSSC class). Composition of bands for best visualization: R(3)G(2)B(1) without contrast. Values between 0 and 100. Pixel Type: double precision. Pixel Depth: 64 bit. No Data Value: 0

    2.3) File: time_series_nir.tif This map represents the climatology time series of near-infrared (NIR) reflectance, at a four-day time step, in rivers and lakes of the central Amazon basin between 2003 and 2017 (15 years). Number of Bands: 92 (each band represents a date identified in the dates.txt file). Values between 0 and 10000. Pixel Type: floating point. Pixel Depth: 32 bit. No Data Value: 0

    2.4) File: time_series_red.tif This map represents the climatology time series of red reflectance, at a four-day time step, in rivers and lakes of the central Amazon basin between 2003 and 2017 (15 years). Number of Bands: 92 (each band represents a date identified in the dates.txt file). Values between 0 and 10000. Pixel Type: floating point. Pixel Depth: 32 bit. No Data Value: 0
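    A short reading sketch for these GeoTIFFs, assuming the rasterio package is installed (band order as documented above):

    import rasterio

    # Open-water frequency: single band, values 0-100, NoData = 0.
    with rasterio.open("open_water_frequency.tif") as src:
        freq = src.read(1)

    # SSSC class frequency: band 1 = low, band 2 = moderate, band 3 = high.
    with rasterio.open("class_SSC_frequency.tif") as src:
        low, moderate, high = src.read(1), src.read(2), src.read(3)

    # Pixels with open-water frequency 0 (NoData) carry no SSSC information.
    print(freq.shape, float((freq > 0).mean()))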

  8. Vision and GPS data for testing Open FABMAP's application to ground based,...

    • researchdata.edu.au
    Updated Jul 11, 2014
    Cite
    Arren Glover; Will Maddern; Will Maddern; Arren Glover (2014). Vision and GPS data for testing Open FABMAP's application to ground based, aerial and temporal mapping [Dataset]. https://researchdata.edu.au/vision-gps-testing-temporal-mapping/448380
    Explore at:
    Dataset updated
    Jul 11, 2014
    Dataset provided by
    Queensland University of Technology
    Authors
    Arren Glover; Will Maddern; Will Maddern; Arren Glover
    License

    Attribution-NonCommercial 3.0 (CC BY-NC 3.0) https://creativecommons.org/licenses/by-nc/3.0/
    License information was derived automatically

    Time period covered
    Aug 19, 2009 - Sep 11, 2009
    Area covered
    Description

    The data records a single route taken through the suburb of St Lucia, Queensland, Australia. The route was traversed at five different times of the day to capture the difference in appearance between early morning and late afternoon. The route was traversed again, another five times, two weeks later for a total of ten datasets.

    The data was recorded with a forward facing webcam attached to the roof of a car.

    GPS data is included for each dataset.

    Each dataset is labelled with the date and time it was collected in the following format DD/MM/YY_24HOUR. Each dataset has 5 files; a Python loading sketch follows the list below.

    • (webcam_video.avi) video imagery. This file is compressed and no uncompressed version is available.
    • (gps_log.txt) raw GPS data as logged
    • (frame_log.txt) time stamps of each video frame compatible with GPS data
    • (fGPS.txt) processed version of GPS and frame time stamps providing the GPS point for each frame in the video file
    • (fGPS.mat) as above, saved for easy importation to MATLAB
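    The processed files can be inspected in Python as sketched below; this assumes whitespace-delimited numeric columns in fGPS.txt and uses SciPy for the MATLAB file (the exact column layout is not documented here, so treat the variable names as placeholders):

    import numpy as np
    from scipy.io import loadmat

    # Per-frame GPS fixes; column meanings follow the dataset's own documentation.
    fgps = np.loadtxt("fGPS.txt")  # assumption: plain whitespace-delimited numeric rows

    # The same content saved for MATLAB; loadmat returns a dict of named variables.
    fgps_mat = loadmat("fGPS.mat")
    print(fgps.shape, [k for k in fgps_mat if not k.startswith("__")])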

  9. A global map of travel time to cities

    • narcis.nl
    • phys-techsciences.datastations.nl
    geotiff
    Updated Oct 1, 2018
    Cite
    Weiss, D. (University of Oxford) (2018). A global map of travel time to cities [Dataset]. http://doi.org/10.17026/dans-ztx-2sd2
    Explore at:
    Available download formats: geotiff
    Dataset updated
    Oct 1, 2018
    Dataset provided by
    Data Archiving and Networked Services (DANS)
    Authors
    Weiss, D. (University of Oxford)
    Area covered
    Earth, (n: 80 e: 180 s: -65 w: -180)
    Description

    A global analysis of accessibility to high-density urban centres at a resolution of 1×1 kilometre for 2015, as measured by travel time.

    To model the time required for individuals to reach their most accessible city, we first quantified the speed at which humans move through the landscape. The principle underlying this work was that all areas on Earth, represented as pixels within a 2D grid, had a cost (that is, time) associated with moving through them that we quantified as a movement speed within a cost or ‘friction’ surface. We then applied a least-cost-path algorithm to the friction surface in relation to a set of high-density urban points. The algorithm calculated pixel-level travel times for the optimal path between each pixel and its nearest city (that is, with the shortest journey time). From this work we ultimately produced two products: (a) an accessibility map showing travel time to urban centres, as cities are proxies for access to many goods and services that affect human wellbeing; and (b) a friction surface that underpins the accessibility map and enables the creation of custom accessibility maps from other point datasets of interest. The map products are in GeoTIFF format in the EPSG:4326 (WGS84) projection with a spatial resolution of 30 arc seconds. The accessibility map pixel values represent travel time in minutes. The friction surface map pixels represent the time, in minutes, required to travel one metre. This DANS data record contains these two map products.
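    A minimal sketch of the least-cost-path step using scikit-image, assuming a friction raster expressed as minutes per cell and a list of city pixel coordinates (both hypothetical here; the published friction surface is in minutes per metre):

    import numpy as np
    from skimage.graph import MCP_Geometric

    # Hypothetical friction surface: minutes needed to cross each cell.
    friction = np.full((200, 200), 2.0)
    friction[80:120, :] = 0.5                # e.g. a faster road corridor

    city_pixels = [(10, 10), (150, 180)]     # (row, col) of high-density urban centres

    # Cumulative least-cost travel time from every pixel to its nearest city.
    mcp = MCP_Geometric(friction)
    travel_time, _ = mcp.find_costs(city_pixels)
    print(float(travel_time.max()), float(travel_time[10, 10]))  # 0 at a city pixel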

  10. Data from: Temporal reliability of ultra-high field resting-state MRI for...

    • neurovault.org
    zip
    Updated Jun 30, 2018
    Cite
    (2018). Temporal reliability of ultra-high field resting-state MRI for single-subject sensorimotor and language mapping [Dataset]. http://identifiers.org/neurovault.collection:1949
    Explore at:
    Available download formats: zip
    Dataset updated
    Jun 30, 2018
    License

    CC0 1.0 Universal Public Domain Dedication https://creativecommons.org/publicdomain/zero/1.0/
    License information was derived automatically

    Description

    A collection of 2 brain maps. Each brain map is a 3D array of values representing properties of the brain at different locations.

    Collection description

  11. Dataset for "Enhancing Cloud Detection in Sentinel-2 Imagery: A...

    • zenodo.org
    • data.niaid.nih.gov
    bin
    Updated Feb 4, 2024
    Cite
    Gong Chengjuan; Yin Ranyu; Yin Ranyu; Long Tengfei; Long Tengfei; He Guojin; Jiao Weili; Wang Guizhou; Gong Chengjuan; He Guojin; Jiao Weili; Wang Guizhou (2024). Dataset for "Enhancing Cloud Detection in Sentinel-2 Imagery: A Spatial-Temporal Approach and Dataset" [Dataset]. http://doi.org/10.5281/zenodo.10613705
    Explore at:
    Available download formats: bin
    Dataset updated
    Feb 4, 2024
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Gong Chengjuan; Yin Ranyu; Yin Ranyu; Long Tengfei; Long Tengfei; He Guojin; Jiao Weili; Wang Guizhou; Gong Chengjuan; He Guojin; Jiao Weili; Wang Guizhou
    License

    Attribution-NonCommercial-ShareAlike 4.0 (CC BY-NC-SA 4.0) https://creativecommons.org/licenses/by-nc-sa/4.0/
    License information was derived automatically

    Description

    This dataset is built for time-series Sentinel-2 cloud detection and is stored in TensorFlow TFRecord format (refer to https://www.tensorflow.org/tutorials/load_data/tfrecord).

    Each file is compressed in 7z format and can be decompressed using Bandizip or 7-Zip.

    Dataset Structure:

    Each filename can be split into three parts using underscores. The first part indicates whether it is designated for training or validation ('train' or 'val'); the second part indicates the Sentinel-2 tile name, and the last part indicates the number of samples in this file.

    For each sample, it includes:

    1. Sample ID;
    2. Array of time-series 4-band image patches at 10 m resolution, shaped as (n_timestamps, 4, 42, 42);
    3. Label list indicating cloud cover status for the center 6 × 6 pixels of each timestamp;
    4. Ordinal list for each timestamp;
    5. Sample weight list (reserved);

    Here is a demonstration function for parsing the TFRecord file:

    import tensorflow as tf

    # Feature schema of each serialized tf.Example in the TFRecord files.
    keys_to_features_direct = {
      'localid': tf.io.FixedLenFeature([], tf.int64, -1),              # sample ID
      'image_raw_ldseries': tf.io.FixedLenFeature((), tf.string, ''),  # raw uint16 image series
      'labels': tf.io.FixedLenFeature((), tf.string, ''),              # raw int8 label list
      'dates': tf.io.FixedLenFeature((), tf.string, ''),               # raw int64 ordinal dates
      'weights': tf.io.FixedLenFeature((), tf.string, '')              # raw float32 sample weights
    }

    # Init a TensorFlow Dataset from a file name: the tile name and the number of
    # samples are recovered from the last two underscore-separated filename parts.
    def parseRecordDirect(fname):
      sep = '/'
      parts = tf.strings.split(fname, sep)
      tn = tf.strings.split(parts[-1], sep='_')[-2]
      nn = tf.strings.to_number(tf.strings.split(parts[-1], sep='_')[-1], tf.dtypes.int64)
      t = tf.data.Dataset.from_tensors(tn).repeat().take(nn)  # tile name repeated per sample
      t1 = tf.data.TFRecordDataset(fname)                     # serialized samples
      return tf.data.Dataset.zip((t, t1))

    # The Decoder (optional): the base class is assumed to come from an external
    # package, e.g. import tensorflow_datasets as tfds; decoder = tfds.decode
    class SeriesClassificationDirectDecorder(decoder.Decoder):
      """A tf.Example decoder for tfds classification datasets."""

      def __init__(self) -> None:
        super().__init__()

      def decode(self, tid, ds):
        parsed = tf.io.parse_single_example(ds, keys_to_features_direct)
        decoded = tf.io.decode_raw(parsed['image_raw_ldseries'], tf.uint16)
        label = tf.io.decode_raw(parsed['labels'], tf.int8)
        dates = tf.io.decode_raw(parsed['dates'], tf.int64)
        weight = tf.io.decode_raw(parsed['weights'], tf.float32)
        decoded = tf.reshape(decoded, [-1, 4, 42, 42])
        return {
          'tid': tid,                    # tile ID
          'dates': dates,                # date list
          'localid': parsed['localid'],  # sample ID
          'imgs': decoded,               # image array (n_timestamps, 4, 42, 42)
          'labels': label,               # label list
          'weights': weight              # sample weights (reserved)
        }

    # Simple parsing function for use with Dataset.map.
    def preprocessDirect(tid, record):
      parsed = tf.io.parse_single_example(record, keys_to_features_direct)
      decoded = tf.io.decode_raw(parsed['image_raw_ldseries'], tf.uint16)
      label = tf.io.decode_raw(parsed['labels'], tf.int8)
      dates = tf.io.decode_raw(parsed['dates'], tf.int64)
      weight = tf.io.decode_raw(parsed['weights'], tf.float32)
      decoded = tf.reshape(decoded, [-1, 4, 42, 42])
      return tid, dates, parsed['localid'], decoded, label, weight

    t1 = parseRecordDirect('filename here')
    dataset = t1.map(preprocessDirect, num_parallel_calls=tf.data.experimental.AUTOTUNE)

    Class Definition:

    • 0: clear
    • 1: opaque cloud
    • 2: thin cloud
    • 3: haze
    • 4: cloud shadow
    • 5: snow

    Dataset Construction:

    First, we randomly generate 500 points for each tile; all points are aligned to the pixel-grid centers of the 60 m resolution subdatasets (e.g. B10) for consistency when comparing with other products, because other cloud detection methods may use the cirrus band, which is at 60 m resolution, as a feature.

    Then, time-series image patches of two shapes are cropped with each point as the center.
    The patches of shape 42 × 42 are cropped from the bands at 10 m resolution (B2, B3, B4, B8) and are used to construct this dataset.
    The patches of shape 348 × 348 are cropped from the True Colour Image (TCI; see the Sentinel-2 user guide for details) file and are used for interpreting the class labels.

    Samples with a large number of timestamps can be slow to read, so the time-series patches are divided into groups with at most 100 timestamps per group.

  12. Supplementary material: Detailed temporal analysis of CE–SDG mapping...

    • usn.figshare.com
    txt
    Updated May 16, 2025
    Cite
    Zahir Barahmand (2025). Supplementary material: Detailed temporal analysis of CE–SDG mapping dynamics [Dataset]. http://doi.org/10.23642/usn.28920209.v1
    Explore at:
    Available download formats: txt
    Dataset updated
    May 16, 2025
    Dataset provided by
    University of South-Eastern Norway
    Authors
    Zahir Barahmand
    License

    Attribution 4.0 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This supplementary material provides detailed temporal data on the mapping of 2,700 circular economy (CE) indicators to the 17 Sustainable Development Goals (SDGs), covering the period 2011–2024. It includes year-by-year distributions for each SDG, identification of emergence years based on a defined threshold method, and supporting data for temporal clustering and pattern analysis. These results offer additional insights into the evolution of CE–SDG alignment and highlight emerging trends, gaps, and shifts in thematic emphasis over time.

  13. Detailed temporal mapping of global human modification from 1990 to 2017

    • datadryad.org
    • data.niaid.nih.gov
    • +2more
    zip
    Updated Jan 31, 2020
    Cite
    David Theobald; Christina Kennedy; Bin Chen; James Oakleaf; Joe Kiesecker; Sharon Baruch-Mordo (2020). Detailed temporal mapping of global human modification from 1990 to 2017 [Dataset]. http://doi.org/10.5061/dryad.n5tb2rbs1
    Explore at:
    Available download formats: zip
    Dataset updated
    Jan 31, 2020
    Dataset provided by
    Dryad
    Authors
    David Theobald; Christina Kennedy; Bin Chen; James Oakleaf; Joe Kiesecker; Sharon Baruch-Mordo
    Time period covered
    2020
    Description

    The file naming convention for the zip-files that contain the datasets provided here is as follows:

    gHM_landLakeReservoirOcean300m.zip - contains TIFs at 300 m resolution to represent the following classes: 1=land, 2=lake (natural water bodies), 3=reservoirs (water bodies created by dams), and 4=ocean.

    gHMv1_300m_1990_change.zip - contains TIFs at 300 m resolution for calculating change between 1990 and 2000, 2010, or 2015.

    gHMv1_300m_2000_change.zip - contains TIFs at 300 m resolution for calculating change between 2000 and 1990, 2010, or 2015.

    gHMv1_300m_2010_change.zip - contains TIFs at 300 m resolution for calculating change between 2010 and 1990, 2000, or 2015.

    gHMv1_300m_2015_change.zip - contains TIFs at 300 m resolution for calculating change between 2015 and 1990, 2000, or 2010.

    gHMv1_300m_2017_static.zip - contains TIFs at 300 m resolution with all stressors representing ~2017 conditions. This is NOT to be used for comparison with the other "change" datasets.

    gH...
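    Given the change-ready TIFs above, a per-pixel change map between two epochs is a simple raster difference. The sketch below assumes the rasterio package and hypothetical file names for rasters extracted from the 1990 and 2015 archives:

    import rasterio

    # Hypothetical file names after extracting the 1990 and 2015 "change" archives.
    with rasterio.open("gHMv1_300m_1990_change.tif") as a, \
         rasterio.open("gHMv1_300m_2015_change.tif") as b:
        hm_1990 = a.read(1, masked=True)
        hm_2015 = b.read(1, masked=True)

    delta = hm_2015 - hm_1990  # positive values indicate increased human modification
    print(float(delta.mean()), float(delta.max()))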

  14. Data from: Time-series China urban land use mapping (2016–2022): An approach...

    • figshare.com
    zip
    Updated Dec 27, 2024
    Cite
    Xiong Shuping (2024). Time-series China urban land use mapping (2016–2022): An approach for achieving spatial-consistency and semantic-transition rationality in temporal domain [Dataset]. http://doi.org/10.6084/m9.figshare.27610683.v3
    Explore at:
    Available download formats: zip
    Dataset updated
    Dec 27, 2024
    Dataset provided by
    figshare
    Authors
    Xiong Shuping
    License

    Attribution 4.0 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    If you want to use this data, please cite our article: Xiong, S., Zhang, X., Lei, Y., Tan, G., Wang, H., & Du, S. (2024). Time-series China urban land use mapping (2016–2022): An approach for achieving spatial-consistency and semantic-transition rationality in temporal domain. Remote Sensing of Environment, 312, 114344. The global urbanization trend is geographically manifested through city expansion and the renewal of internal urban structures and functions. Time-series urban land use (ULU) maps are vital for capturing dynamic land changes in the urbanization process, giving valuable insights into urban development and its environmental consequences. Recent studies have mapped ULU in some cities with a unified model but ignored the regional differences among cities, and they generated ULU maps year by year but ignored temporal correlations between years; thus, they can be weak for large-scale and long time-series ULU monitoring. Accordingly, we introduce a temporal-spatial-semantic collaborative (TSS) mapping framework to generate accurate ULU maps while considering regional differences and temporal correlations. Firstly, to support model training, a large-scale ULU sample dataset based on OpenStreetMap (OSM) and Sentinel-2 imagery is automatically constructed, providing a total of 56,412 samples of size 512 × 512, which are divided into six sub-regions in China and used for training different classification models. Then, an urban land use mapping network (ULUNet) is proposed to recognize ULU. This model utilizes a primary and an auxiliary encoder to process noisy OSM samples and can enhance the model's robustness under noisy labels. Finally, taking the temporal correlations of ULU into consideration, the recognized ULU maps are optimized: their boundaries are unified by a time-series co-segmentation, and their categories are corrected by a knowledge- and data-driven method. To verify the effectiveness of the proposed method, we consider all urban areas in China (254,566 km²) and produce a time-series China urban land use dataset (CULU) at 10 m resolution, spanning from 2016 to 2022, with an overall accuracy of 82.42%. Through comparison, it can be found that CULU outperforms existing datasets such as EULUC-China and UFZ-31cities in data accuracy, spatial boundary consistency, and land-use transition logic. The results indicate that the proposed method and the generated dataset can play important roles in land use change monitoring, ecological-environmental evolution analysis, and sustainable city development.

  15. Data from: Mapping multi-temporal population distribution in China from 1985...

    • figshare.com
    zip
    Updated Sep 4, 2021
    Cite
    Haoming Zhuang (2021). Mapping multi-temporal population distribution in China from 1985 to 2010 using Landsat images via deep learning [Dataset]. http://doi.org/10.6084/m9.figshare.15095748.v1
    Explore at:
    Available download formats: zip
    Dataset updated
    Sep 4, 2021
    Dataset provided by
    Figshare (http://figshare.com/)
    Authors
    Haoming Zhuang
    License

    Attribution 4.0 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Area covered
    China
    Description

    This dataset includes four parts:
    (1) The produced multi-temporal gridded population data (1 km × 1 km) of China from 1985-2010 with a five-year interval. The data is stored in the "population_density" folder. The values in the TIFF files represent the population density of cells. The cell area TIFF is provided to calculate the total population count.
    (2) The samples for training, validating and testing the ResNet-N model. The data is stored in the "samples_for_resnet_n" folder.
    (3) The town-scale validation data for 2010. The data is stored in the "town_level_validation" folder.
    (4) The county-scale validation data from 1990-2010. The data is stored in the "county_level_validation" folder.
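    A short sketch for turning a density grid into a total count, assuming the rasterio package and hypothetical file names for one density raster and the cell-area raster described above:

    import rasterio

    # Population density and cell area grids (units as documented by the dataset).
    with rasterio.open("population_density/pop_density_2010.tif") as d, \
         rasterio.open("cell_area.tif") as a:
        density = d.read(1, masked=True)
        area = a.read(1, masked=True)

    population = density * area     # persons per cell
    print(float(population.sum()))  # total population count over the grid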

  16. Spatio-temporal Ecohydrologic Mapping Tutorial

    • hydroshare.org
    • dataone.org
    zip
    Updated May 23, 2017
    + more versions
    Cite
    Sai Nudurupati; Erkan Istanbulluoglu (2017). Spatio-temporal Ecohydrologic Mapping Tutorial [Dataset]. https://www.hydroshare.org/resource/50125a4c19d54b5dab8e49b75e7b415e
    Explore at:
    Available download formats: zip (29.4 KB)
    Dataset updated
    May 23, 2017
    Dataset provided by
    HydroShare
    Authors
    Sai Nudurupati; Erkan Istanbulluoglu
    License

    Attribution 4.0 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This Landlab driver illustrates the use of Landlab ecohydrology components to model semi-arid ecohydrological dynamics driven by a storm pulse and solar radiation. The components used (class names given in parentheses) are:

    • Precipitation Distribution (PrecipitationDistribution)
    • Solar radiation (Radiation)
    • Potential Evapotranspiration (PotentialEvapotranspiration)
    • Soil Moisture (SoilMoisture)
    • Vegetation (Vegetation)

    A digital elevation model (DEM) of a headwater region in central New Mexico (latitude 34° N) is used as input.

    This tutorial is an extension of Ecohydrologic Mapping Tutorial (Istanbulluoglu, E., S. S. Nudurupati (2016). Landlab Ecohydrologic Mapping Tutorial, HydroShare, http://www.hydroshare.org/resource/15d0a79514c44a59b41b68ad74496d0f)
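    A minimal sketch of how the named components are typically assembled, assuming a recent Landlab release and a hypothetical DEM file name; parameter values and the storm-by-storm update loop follow the tutorial notebook itself and are not reproduced here:

    from landlab.io import read_esri_ascii
    from landlab.components import (
        PrecipitationDistribution,
        Radiation,
        PotentialEvapotranspiration,
        SoilMoisture,
        Vegetation,
    )

    # Load the headwater DEM onto a Landlab raster grid (file name hypothetical).
    grid, elevation = read_esri_ascii("dem_nm_headwater.asc", name="topographic__elevation")
    print(grid.shape)

    # The driver instantiates each of the imported components on this grid and
    # advances them through storm/interstorm cycles, as in the tutorial notebook.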

  17. Data from: Mapping tick dynamics and tick bite risk using data-driven...

    • lifesciences.datastations.nl
    bin, csv, png +7
    Updated Sep 11, 2019
    Cite
    I. Garcia-Marti; I. Garcia-Marti (2019). Mapping tick dynamics and tick bite risk using data-driven approaches and volunteered observations [Dataset]. http://doi.org/10.17026/DANS-ZRE-TGGD
    Explore at:
    Available download formats: bin, csv, png, text/x-python, tiff, tsv, txt, xml, zip (individual file sizes omitted)
    Dataset updated
    Sep 11, 2019
    Dataset provided by
    DANS Data Station Life Sciences
    Authors
    I. Garcia-Marti; I. Garcia-Marti
    License

    https://doi.org/10.17026/fp39-0x58

    Description

    This deposit contains the materials used during the development of this PhD thesis. During this research, we applied machine learning methods to obtain new insights about tick dynamics and tick bite risk in the Netherlands. We combined volunteered data sources coming from two citizen science projects with a wide array of environmental variables (e.g. weather, remote sensing, official geodata) to devise models capable of predicting the risk of tick bite or daily tick activity at the national level. We hope that this research and the associated materials can be inspiring for future researchers.

  18. Landsat Time Enabled Imagery

    • data.amerigeoss.org
    • amerigeo.org
    • +3more
    Updated Jul 9, 2021
    + more versions
    Cite
    AmeriGEOSS (2021). Landsat Time Enabled Imagery [Dataset]. https://data.amerigeoss.org/fi/dataset/landsat-time-enabled-imagery
    Explore at:
    Available download formats: arcgis geoservices rest api, html
    Dataset updated
    Jul 9, 2021
    Dataset provided by
    AmeriGEOSS
    Description

    This map includes a variety of Landsat services which have been time enabled and can be explored using the Time Slider in the ArcGIS.COM Map Viewer or Explorer. Each layer has a predefined useful band combination already set on the services.


    For more information about each layer, click on the hyperlink below.

    Data Source: This map includes image services compiled from the following Global Land Survey (GLS) datasets: GLS 2005, GLS 2000, GLS 1990, and GLS 1975. GLS datasets are created by the U.S. Geological Survey (USGS) and the National Aeronautics and Space Administration (NASA) using Landsat images. These global minimal-cloud cover, orthorectified Landsat data products support global assessments of land-cover, land cover-change, and ecosystem dynamics such as disturbance and vegetation health.

  19. Temporal mapping of C/EBPα and –β binding during liver regeneration

    • omicsdi.org
    xml
    Cite
    Johannes E Waage, Fin S Larsen, Bo T Porse, Nicolas Rapin, Hanne C Bisgaard, Janus S Jakobsen, Johannes Eichler Waage, Temporal mapping of C/EBPα and –β binding during liver regeneration [Dataset]. https://www.omicsdi.org/dataset/arrayexpress-repository/E-GEOD-42321
    Explore at:
    Available download formats: xml
    Authors
    Johannes E Waage, Fin S Larsen, Bo T Porse, Nicolas Rapin, Hanne C Bisgaard, Janus S Jakobsen, Johannes Eichler Waage
    Variables measured
    Genomics
    Description

    Temporal mapping of C/EBPα and –β binding during liver regeneration reveals dynamic occupancy and specific regulatory codes for homeostatic and cell-cycle gene batteries. Time-series ChIP-seq of the transcription factors C/EBPα and C/EBPβ, and of Pol II, at time points 0, 3, 8, 16, 24, 36, 48, and 168 hours after partial hepatectomy. Additionally, an IgG mock at 0 h was sequenced, as well as the transcription factor Egr1.

  20. GapMap Frontal to Temporal

    • search.kg.ebrains.eu
    Updated Jul 11, 2020
    + more versions
    Cite
    Katrin Amunts; Hartmut Mohlberg; Sebastian Bludau; Peter Pieperhoff (2020). GapMap Frontal to Temporal [Dataset]. http://doi.org/10.25493/468P-A99
    Explore at:
    Dataset updated
    Jul 11, 2020
    Authors
    Katrin Amunts; Hartmut Mohlberg; Sebastian Bludau; Peter Pieperhoff
    Description

    This dataset contains the “GapMap Frontal to Temporal” in the individual, single-subject template of the MNI Colin 27 as well as the MNI ICBM 152 2009c nonlinear asymmetric reference space. In order to provide whole-brain coverage for the cortex within the Julich-Brain Atlas, as-yet uncharted parts of the frontal cortex have been combined into the brain region “GapMap Frontal to Temporal”. The distributions were modeled so that probabilistic gap maps were computed in analogy to other maps of the Julich-Brain Atlas. The probabilistic map of “GapMap Frontal to Temporal” is provided in NIfTI format for each hemisphere in the reference space. The Julich-Brain Atlas relies on a modular, flexible and adaptive framework containing workflows to create the probabilistic brain maps for these structures. New maps are continuously replacing parts of “GapMap Frontal to Temporal” as mapping progresses.
