Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
This web map displays the land use/land cover (LULC) time series layer derived from ESA Sentinel-2 imagery at 10m resolution. The visualization uses blend modes and is best used in the new Map Viewer. The time slider can be used to advance through the five years of data from 2017-2021. There are also a series of bookmarks for the locations below:
Urban growth examples: Ouagadougou; Cairo/Giza; Dubai, UAE; Katy, Texas, USA; Loudoun County, Virginia
Infrastructure: Istanbul International Airport, Turkey; Grand Ethiopian Renaissance Dam, Ethiopia
Deforestation: Border of Acre and Rondonia states, Brazil; Harz Mountains, Germany
Wetlands loss: Pantanal, Brazil; Parana river, Argentina
Vegetation changing after fire: Northern California (Paradise, Redding, Clear Lake, Santa Rosa, Mendocino National Forest); Kangaroo Island, Australia; Victoria and NSW, Australia; Yakutia, Russia
Hurricane impact: Abaco Island, Bahamas
Recent lava flow: Hawaii Island
Surface mining: Brown coal, Cottbus, Germany
Land reclamation: Markermeer, Netherlands
Economic development: North vs South Korea
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
About the data
Land use land cover (LULC) maps are an increasingly important tool for decision-makers in many industry sectors and developing nations around the world. The information provided by these maps helps inform policy and land management decisions by improving our understanding and quantification of the impacts of earth processes and human activity. ArcGIS Living Atlas of the World provides a detailed, accurate, and timely LULC map of the world. The data is the result of a three-way collaboration among Esri, Impact Observatory, and Microsoft. For more information about the data, see Sentinel-2 10m Land Use/Land Cover Time Series.
About the app
One of the foremost capabilities of this app is dynamic change analysis. The app provides dynamic visual and statistical change by comparing annual slices of the Sentinel-2 10m Land Use/Land Cover data as you explore the map.
Overview of capabilities:
- Visual change analysis with either 'Step Mode' or 'Swipe Mode'
- Dynamic statistical change analysis by year, map extent, and class
- Filter by selected land cover class
- Regional class statistics summarized by administrative boundaries
- Imagery mode for visual investigation and validation of land cover
- Select imagery renderings (e.g. SWIR to visualize forest burn scars)
- Data download for offline use
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
This layer displays a global map of land use/land cover (LULC) derived from ESA Sentinel-2 imagery at 10m resolution. Each year is generated with Impact Observatory's deep learning AI land classification model, trained using billions of human-labeled image pixels from the National Geographic Society. The global maps are produced by applying this model to the Sentinel-2 Level-2A image collection on Microsoft's Planetary Computer, processing over 400,000 Earth observations per year.
The algorithm generates LULC predictions for nine classes, described in detail below. The year 2017 has a land cover class assigned for every pixel, but its class is based upon fewer images than the other years. The years 2018-2023 are based upon a more complete set of imagery. For this reason, the year 2017 may have less accurate land cover class assignments than the years 2018-2023.
Variable mapped: Land use/land cover in 2017, 2018, 2019, 2020, 2021, 2022, 2023
Source Data Coordinate System: Universal Transverse Mercator (UTM) WGS84
Service Coordinate System: Web Mercator Auxiliary Sphere WGS84 (EPSG:3857)
Extent: Global
Source imagery: Sentinel-2 L2A
Cell Size: 10 meters
Type: Thematic
Attribution: Esri, Impact Observatory
What can you do with this layer?
Global land use/land cover maps provide information for conservation planning, food security, and hydrologic modeling, among other things. This dataset can be used to visualize land use/land cover anywhere on Earth. This layer can also be used in analyses that require land use/land cover input. For example, the Zonal toolset allows a user to understand the composition of a specified area by reporting the total estimates for each of the classes. NOTE: A land use focus does not provide the spatial detail of a land cover map. As such, for the built area classification, yards, parks, and groves will appear as built area rather than trees or rangeland classes.
Class definitions (value, name, description):
1. Water: Areas where water was predominantly present throughout the year; may not cover areas with sporadic or ephemeral water; contains little to no sparse vegetation, no rock outcrop nor built up features like docks; examples: rivers, ponds, lakes, oceans, flooded salt plains.
2. Trees: Any significant clustering of tall (~15 feet or higher) dense vegetation, typically with a closed or dense canopy; examples: wooded vegetation, clusters of dense tall vegetation within savannas, plantations, swamp or mangroves (dense/tall vegetation with ephemeral water or canopy too thick to detect water underneath).
4. Flooded vegetation: Areas of any type of vegetation with obvious intermixing of water throughout a majority of the year; seasonally flooded area that is a mix of grass/shrub/trees/bare ground; examples: flooded mangroves, emergent vegetation, rice paddies and other heavily irrigated and inundated agriculture.
5. Crops: Human planted/plotted cereals, grasses, and crops not at tree height; examples: corn, wheat, soy, fallow plots of structured land.
7. Built Area: Human made structures; major road and rail networks; large homogenous impervious surfaces including parking structures, office buildings and residential housing; examples: houses, dense villages / towns / cities, paved roads, asphalt.
8. Bare ground: Areas of rock or soil with very sparse to no vegetation for the entire year; large areas of sand and deserts with no to little vegetation; examples: exposed rock or soil, desert and sand dunes, dry salt flats/pans, dried lake beds, mines.
9. Snow/Ice: Large homogenous areas of permanent snow or ice, typically only in mountain areas or highest latitudes; examples: glaciers, permanent snowpack, snow fields.
10. Clouds: No land cover information due to persistent cloud cover.
11. Rangeland: Open areas covered in homogenous grasses with little to no taller vegetation; wild cereals and grasses with no obvious human plotting (i.e., not a plotted field); examples: natural meadows and fields with sparse to no tree cover, open savanna with few to no trees, parks/golf courses/lawns, pastures. Mix of small clusters of plants or single plants dispersed on a landscape that shows exposed soil or rock; scrub-filled clearings within dense forests that are clearly not taller than trees; examples: moderate to sparse cover of bushes, shrubs and tufts of grass, savannas with very sparse grasses, trees or other plants.
Classification process
These maps include Version 003 of the global Sentinel-2 land use/land cover data product. It is produced by a deep learning model trained using over five billion hand-labeled Sentinel-2 pixels, sampled from over 20,000 sites distributed across all major biomes of the world. The underlying deep learning model uses six bands of Sentinel-2 L2A surface reflectance data: visible blue, green, red, near infrared, and two shortwave infrared bands. To create the final map, the model is run on multiple dates of imagery throughout the year, and the outputs are composited into a final representative map for each year. The input Sentinel-2 L2A data was accessed via Microsoft's Planetary Computer and scaled using Microsoft Azure Batch.
Citation
Karra, Kontgis, et al. "Global land use/land cover with Sentinel-2 and deep learning." IGARSS 2021-2021 IEEE International Geoscience and Remote Sensing Symposium. IEEE, 2021.
Acknowledgements
Training data for this project makes use of the National Geographic Society Dynamic World training dataset, produced for the Dynamic World Project by National Geographic Society in partnership with Google and the World Resources Institute.
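As a convenience when working with the raster values described above, the class codes can be mapped to names in a small lookup table. The snippet below is a minimal illustrative sketch (the file name is hypothetical, and assumes a locally downloaded GeoTIFF tile of this layer); it tallies how many pixels of each class fall in an area of interest, which is the same kind of summary the Zonal toolset reports.

import numpy as np
import rasterio  # assumes rasterio is installed and a local GeoTIFF export of the LULC layer

# Class values and names as documented for the Sentinel-2 10m LULC product
LULC_CLASSES = {
    1: "Water", 2: "Trees", 4: "Flooded vegetation", 5: "Crops",
    7: "Built Area", 8: "Bare ground", 9: "Snow/Ice",
    10: "Clouds", 11: "Rangeland",
}

# Hypothetical file name; substitute a tile downloaded for your area of interest
with rasterio.open("lulc_2023_tile.tif") as src:
    data = src.read(1)

values, counts = np.unique(data, return_counts=True)
for value, count in zip(values, counts):
    name = LULC_CLASSES.get(int(value), "NoData/other")
    print(f"{value:>2} {name:<20} {count} pixels")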
Attribution 3.0 (CC BY 3.0): https://creativecommons.org/licenses/by/3.0/
License information was derived automatically
This dataset shows the tiling grid and the tile IDs for Sentinel-2 satellite imagery. The tiling grid IDs are useful for selecting imagery of an area of interest.
Sentinel-2 is an Earth observation satellite mission developed and operated by the European Space Agency (ESA). Its imagery has 13 bands in the visible, near infrared and shortwave infrared parts of the spectrum, with a spatial resolution of 10 m, 20 m or 60 m depending on the spectral band.
Sentinel-2 has a 290 km field of view when capturing its imagery. This imagery is then projected onto a UTM grid and made publicly available as 100 km x 100 km tiles. Each tile has a unique ID, and this ID scheme allows all imagery for a given tile to be located.
Provenance:
ESA makes the tiling grid available as a KML file (see links). We were, however, unable to convert this KML into a shapefile for deployment on the eAtlas. The shapefile used for this layer was instead sourced from the Git repository developed by Justin Meyers (https://github.com/justinelliotmeyers/Sentinel-2-Shapefile-Index).
Why is this dataset in the eAtlas?:
Sentinel-2 imagery is very useful for studying and mapping reef systems. Selecting imagery for a study often requires knowing the tile grid IDs for the area of interest, so this dataset is intended as a reference layer. The eAtlas is not a custodian of this dataset, and copies of the data should be obtained from the original sources.
Data Dictionary:
The Sentinel-2 mission is a land monitoring constellation of two satellites that provide high resolution optical imagery and continuity for the SPOT and Landsat missions. The mission provides global coverage of the Earth's land surface every 5 days, making the data of great use in ongoing studies. L2A data are available from November 2016 over the Europe region and globally since January 2017. L2A data provide bottom-of-atmosphere (BOA) reflectance.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
This dataset contains large (A0) printable maps of the Torres Strait, broken into six overlapping regions and based on a clear-sky, clear-water Sentinel-2 composite image, along with the imagery used to create these maps. The maps show satellite imagery of the region overlaid with reef and island boundaries and names. Not all features are named, just the more prominent ones. The dataset also includes a vector map of Ashmore Reef and Boot Reef in the Coral Sea, as these were used in the same discussions for which these maps were developed. The map of Ashmore Reef includes the atoll platform, reef boundaries and depth polygons for 5 m and 10 m.
This dataset contains all working files used in the development of these maps. This includes a copy of all the source datasets, all derived satellite image tiles, and the QGIS files used to create the maps. It also includes cloud-free Sentinel-2 composite imagery of the Torres Strait region with alpha-blended edges to allow the creation of a smooth, high resolution basemap of the region.
The base imagery is similar to the older base imagery dataset: Torres Strait clear sky, clear water Landsat 5 satellite composite (NERP TE 13.1 eAtlas, AIMS, source: NASA).
Most of the imagery in the composite is from 2017-2021.
Method: The Sentinel-2 basemap was produced by processing imagery from the World_AIMS_Marine-satellite-imagery dataset (not yet published) for the Torres Strait region. The TrueColour imagery for the scenes covering the mapped area was downloaded. Both the reference 1 imagery (R1) and reference 2 imagery (R2) were copied for processing. R1 imagery contains the lowest-noise, most cloud-free imagery, while R2 contains the next best set of imagery. Both R1 and R2 are typically composite images from multiple dates.
The R2 images were selectively blended with the R1 images using manually created masks. This was done to get the best combination of both images and typically reduced some of the cloud artefacts in the R1 images. The mask creation and previewing of the blending were performed in Photoshop, and the created masks were saved in 01-data/R2-R1-masks. To help with the blending of neighbouring images, a feathered alpha channel was added to the imagery. The merging (using the masks) and the creation of the feathered borders on the images were performed with a Python script (src/local/03-merge-R2-R1-images.py) using the Pillow library and GDAL. The neighbouring-image blending mask was created by blurring the original hard image mask, which allowed neighbouring image tiles to merge into each other.
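The script named above is not reproduced here, but the core of the masked blend can be sketched with Pillow. This is a simplified illustration assuming 8-bit RGBA tiles of identical size and a greyscale mask (the file names are hypothetical); the real processing also handles georeferencing via GDAL.

from PIL import Image, ImageFilter

# Hypothetical file names standing in for one R1/R2 tile pair and its hand-drawn mask
r1 = Image.open("R1_tile.png").convert("RGBA")      # lowest-noise composite
r2 = Image.open("R2_tile.png").convert("RGBA")      # next-best composite
mask = Image.open("R2-R1-mask.png").convert("L")    # white where R2 should replace R1

# Take R2 where the mask is white, R1 elsewhere
blended = Image.composite(r2, r1, mask)

# Feather the tile edge so neighbouring tiles merge smoothly: blur a hard
# edge mask and use it as the alpha channel of the blended tile
edge_mask = Image.open("tile-edge-mask.png").convert("L")
feathered_alpha = edge_mask.filter(ImageFilter.GaussianBlur(radius=20))
blended.putalpha(feathered_alpha)
blended.save("blended_tile.png")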
The imagery and reference datasets (reef boundaries, EEZ) were loaded into QGIS for the creation of the printable maps.
To optimise the matching of the resulting map, slight brightness adjustments were applied to each scene tile to match its neighbours. This was done in the setup of each image in QGIS. The adjustment was imperfect, as each tile was made from a different combination of days (to remove clouds), resulting in each scene having a different tonal gradient across the scene than its neighbours. Additionally, Sentinel-2 imagery has slight stripes (at 13 degrees off vertical) due to each sensor swath having a slightly different sensitivity. This effect was not corrected in this imagery.
Single merged composite GeoTiff: The image tiles with alpha-blended edges work well in QGIS, but not in ArcGIS Pro. To allow this imagery to be used across tools that don't support the alpha blending, we merged and flattened the tiles into a single large GeoTiff with no alpha channel. This was done by rendering the map created in QGIS into a single large image, in multiple steps to keep the process manageable.
The rendered map was cut into twenty 1 x 1 degree georeferenced PNG images using the Atlas feature of QGIS. This process baked in the alpha blending across neighbouring Sentinel-2 scenes. The PNG images were then merged back into a large GeoTiff image using GDAL (via QGIS), removing the alpha channel. The brightness of the image was adjusted so that the darkest pixels in the image were 1, reserving the value 0 for nodata masking, and the boundary was clipped using a polygon to trim off the outer feathering. The image was then optimised for performance by using internal tiling and adding overviews. A full breakdown of these steps is provided in the README.md in the 'Browse and download all data files' link.
The merged final image is available in export\TS_AIMS_Torres Strait-Sentinel-2_Composite.tif.
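A condensed sketch of the flattening and optimisation steps, using the GDAL Python bindings, is shown below. It assumes the rendered tiles have already been merged into a single GeoTiff (the paths are illustrative); refer to the README.md in the dataset download for the exact steps used.

from osgeo import gdal

# Illustrative paths; the real file names are described in the dataset README.md
src_path = "merged_basemap_with_alpha.tif"
dst_path = "TS_basemap_flattened.tif"

# Drop the alpha channel and write an internally tiled GeoTiff
gdal.Translate(
    dst_path,
    src_path,
    bandList=[1, 2, 3],                       # keep RGB, discard alpha
    creationOptions=["TILED=YES", "COMPRESS=DEFLATE"],
)

# Add overviews so the basemap renders quickly at small scales
ds = gdal.Open(dst_path, gdal.GA_Update)
ds.BuildOverviews("AVERAGE", [2, 4, 8, 16, 32])
ds = None  # flush and close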
Change Log:
2023-03-02: Eric Lawrey. Created a merged version of the satellite imagery, with no alpha blending, so that it can be used in ArcGIS Pro. It is now a single large GeoTiff image. The Google Earth Engine source code for World_AIMS_Marine-satellite-imagery was included to improve the reproducibility and provenance of the dataset, along with a calculation of the distribution of image dates that went into the final composite image. A WMS service for the imagery was also set up and linked to from the metadata. A cross reference to the older Torres Strait clear sky, clear water Landsat composite imagery was also added to the record.
22 Nov 2023: Eric Lawrey. Added the data and maps for a close-up of Mer:
- 01-data/TS_DNRM_Mer-aerial-imagery/
- preview/Torres-Strait-Mer-Map-Landscape-A0.jpeg
- exports/Torres-Strait-Mer-Map-Landscape-A0.pdf
Updated 02-Torres-Strait-regional-maps.qgz to include the layout for the new map.
Source datasets: Complete Great Barrier Reef (GBR) Island and Reef Feature boundaries including Torres Strait Version 1b (NESP TWQ 3.13, AIMS, TSRA, GBRMPA), https://eatlas.org.au/data/uuid/d2396b2c-68d4-4f4b-aab0-52f7bc4a81f5
Geoscience Australia (2014b), Seas and Submerged Lands Act 1973 - Australian Maritime Boundaries 2014a - Geodatabase [Dataset]. Canberra, Australia: Author. https://creativecommons.org/licenses/by/4.0/ [license]. Sourced on 12 July 2017, https://dx.doi.org/10.4225/25/5539DFE87D895
Basemap/AU_GA_AMB_2014a/Exclusive_Economic_Zone_AMB2014a_Limit.shp The original data was obtained from GA (Geoscience Australia, 2014a). The Geodatabase was loaded in ArcMap. The Exclusive_Economic_Zone_AMB2014a_Limit layer was loaded and exported as a shapefile. Since this file was small no clipping was applied to the data.
Geoscience Australia (2014a), Treaties - Australian Maritime Boundaries (AMB) 2014a [Dataset]. Canberra, Australia: Author. https://creativecommons.org/licenses/by/4.0/ [license]. Sourced on 12 July 2017, http://dx.doi.org/10.4225/25/5539E01878302 Basemap/AU_GA_Treaties-AMB_2014a/Papua_New_Guinea_TSPZ_AMB2014a_Limit.shp The original data was obtained from GA (Geoscience Australia, 2014b). The Geodatabase was loaded in ArcMap. The Papua_New_Guinea_TSPZ_AMB2014a_Limit layer was loaded and exported as a shapefile. Since this file was small no clipping was applied to the data.
AIMS Coral Sea Features (2022) - DRAFT This is a draft version of this dataset. The region for Ashmore and Boot reef was checked. The attributes in these datasets haven't been cleaned up. Note these files should not be considered finalised and are only suitable for maps around Ashmore Reef. Please source an updated version of this dataset for any other purpose. CS_AIMS_Coral-Sea-Features/CS_Names/Names.shp CS_AIMS_Coral-Sea-Features/CS_Platform_adj/CS_Platform.shp CS_AIMS_Coral-Sea-Features/CS_Reef_Boundaries_adj/CS_Reef_Boundaries.shp CS_AIMS_Coral-Sea-Features/CS_Depth/CS_AIMS_Coral-Sea-Features_Img_S2_R1_Depth5m_Coral-Sea.shp CS_AIMS_Coral-Sea-Features/CS_Depth/CS_AIMS_Coral-Sea-Features_Img_S2_R1_Depth10m_Coral-Sea.shp
Murray Island 20 Sept 2011 15cm SISP aerial imagery, Queensland Spatial Imagery Services Program, Department of Resources, Queensland This is the high resolution imagery used to create the map of Mer.
Marine satellite imagery (Sentinel 2 and Landsat 8) (AIMS), https://eatlas.org.au/data/uuid/5d67aa4d-a983-45d0-8cc1-187596fa9c0c - World_AIMS_Marine-satellite-imagery
Data Location: This dataset is filed in the eAtlas enduring data repository at: data\custodian\2020-2029-AIMS\TS_AIMS_Torres-Strait-Sentinel-2-regional-maps. On the eAtlas server it is stored at eAtlas GeoServer\data\2020-2029-AIMS.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Sentinel2GlobalLULC is a deep learning-ready dataset of RGB images from the Sentinel-2 satellites designed for global land use and land cover (LULC) mapping. Sentinel2GlobalLULC v2.1 contains 194,877 images in GeoTiff and JPEG format corresponding to 29 broad LULC classes. Each image has 224 x 224 pixels at 10 m spatial resolution and was produced by assigning the 25th percentile of all available observations in the Sentinel-2 collection between June 2015 and October 2020 in order to remove atmospheric effects (e.g., clouds, aerosols, shadows, snow). A spatial purity value was assigned to each image based on the consensus across 15 different global LULC products available in Google Earth Engine (GEE).
Our dataset is structured into three main zip-compressed folders, an Excel file with a dictionary for class names and descriptive statistics per LULC class, and a Python script to convert RGB GeoTiff images into JPEG format. The first folder, "Sentinel2LULC_GeoTiff.zip", contains 29 zip-compressed subfolders, each corresponding to a specific LULC class with hundreds to thousands of GeoTiff Sentinel-2 RGB images. The second folder, "Sentinel2LULC_JPEG.zip", contains 29 zip-compressed subfolders with a JPEG-formatted version of the same images provided in the first main folder. The third folder, "Sentinel2LULC_CSV.zip", includes 29 zip-compressed CSV files with as many rows as provided images and with 12 columns containing the following metadata (this same metadata is provided in the image filenames):
For seven LULC classes, we could not export from GEE all images that fulfilled a spatial purity of 100% since there were millions of them. In this case, we exported a stratified random sample of 14,000 images and provided an additional CSV file with the images actually contained in our dataset. That is, for these seven LULC classes, we provide these 2 CSV files:
To clearly state the geographical coverage of images available in this dataset, we included in version v2.1 a compressed folder called "Geographic_Representativeness.zip". This zip-compressed folder contains a CSV file for each LULC class providing the complete list of countries represented in that class. Each CSV file has two columns: the first gives the country code and the second gives the number of images provided for that country and LULC class. In addition to these 29 CSV files, we provide another CSV file that maps each ISO Alpha-2 country code to its full country name.
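As an illustration of how the per-class country listings can be combined with the country-name lookup, a minimal pandas sketch follows. The CSV file names and column labels here are assumptions; check the files in Geographic_Representativeness.zip for the actual names.

import pandas as pd

# Assumed file and column names; adjust to the actual CSVs in Geographic_Representativeness.zip
counts = pd.read_csv("Forest_countries.csv", names=["country_code", "n_images"], header=0)
names = pd.read_csv("country_codes.csv", names=["country_code", "country_name"], header=0)

# Attach full country names and rank countries by number of images for this class
summary = counts.merge(names, on="country_code", how="left")
print(summary.sort_values("n_images", ascending=False).head(10))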
© Sentinel2GlobalLULC Dataset by Yassir Benhammou, Domingo Alcaraz-Segura, Emilio Guirado, Rohaifa Khaldi, Boujemâa Achchab, Francisco Herrera & Siham Tabik is marked with Attribution 4.0 International (CC-BY 4.0)
The Barest Earth Sentinel-2 Map Index dataset depicts the 1:250,000 map sheet tile frames that have been used to generate individual tile downloads of the Barest Earth Sentinel-2 product. This web service is designed to be used in conjunction with the Barest Earth Sentinel-2 web service to provide users with direct links for imagery download.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Important Note: This item is in mature support as of February 2023 and will be retired in December 2025. A new version of this item is available for your use. Esri recommends updating your maps and apps to use the new version.
This layer displays change in pixels of the Sentinel-2 10m Land Use/Land Cover product developed by Esri, Impact Observatory, and Microsoft. Available years to compare with 2021 are 2018, 2019 and 2020. By default, the layer shows all comparisons together, in effect showing what changed 2018-2021. But the layer may be changed to show one of three specific pairs of years: 2018-2021, 2019-2021, or 2020-2021.
Showing just one pair of years in ArcGIS Online Map Viewer
To show just one pair of years in ArcGIS Online Map Viewer, create a filter.
1. Click the filter button.
2. Next, click add expression.
3. In the expression dialogue, specify a pair of years with the ProductName attribute. Use the following example in your expression dialogue to show only places that changed between 2020 and 2021: ProductName is 2020-2021.
By default, places that do not change appear as a transparent symbol in ArcGIS Pro. But in ArcGIS Online Map Viewer, a transparent symbol may need to be set for these places after a filter is chosen. To do this:
4. Click the styles button.
5. Under unique values click style options.
6. Click the symbol next to No Change at the bottom of the legend.
7. Click the slider next to "enable fill" to turn the symbol off.
Showing just one pair of years in ArcGIS Pro
To show just one pair of years in ArcGIS Pro, choose one of the layer's processing templates to single out a particular pair of years. The processing template applies a definition query that works in ArcGIS Pro.
1. To choose a processing template, right click the layer in the table of contents for ArcGIS Pro and choose properties.
2. In the dialogue that comes up, choose the tab that says processing templates.
3. On the right where it says processing template, choose the pair of years you would like to display.
The processing template will stay applied for any analysis you may want to perform as well.
How the change layer was created, combining LULC classes from two years
Impact Observatory, Esri, and Microsoft used artificial intelligence to classify the world in 10 Land Use/Land Cover (LULC) classes for the years 2017-2021. Mosaics serve the following sets of change rasters in a single global layer:
Change between 2018 and 2021
Change between 2019 and 2021
Change between 2020 and 2021
To make this change layer, Esri used an arithmetic operation combining the cells from a source year and 2021 to make a change index value: ((from year * 16) + to year). In the example of the change between 2020 and 2021, the from year (2020) was multiplied by 16, then added to the to year (2021). The combined number is then served as an index in an 8-bit unsigned mosaic with an attribute table which describes what changed or did not change in that timeframe. A worked example of this encoding is sketched at the end of this entry.
Variable mapped: Change in land cover between 2018, 2019, or 2020 and 2021
Data Projection: Universal Transverse Mercator (UTM)
Mosaic Projection: WGS84
Extent: Global
Source imagery: Sentinel-2
Cell Size: 10m (0.00008983152098239751 degrees)
Type: Thematic
Source: Esri Inc.
Publication date: January 2022
What can you do with this layer?
Global LULC maps provide information on conservation planning, food security, and hydrologic modeling, among other things. This dataset can be used to visualize land cover anywhere on Earth. This layer can also be used in analyses that require land cover input. For example, the Zonal Statistics tools allow a user to understand the composition of a specified area by reporting the total estimates for each of the classes.
Land Cover processing
This map was produced by a deep learning model trained using over 5 billion hand-labeled Sentinel-2 pixels, sampled from over 20,000 sites distributed across all major biomes of the world. The underlying deep learning model uses 6 bands of Sentinel-2 surface reflectance data: visible blue, green, red, near infrared, and two shortwave infrared bands. To create the final map, the model is run on multiple dates of imagery throughout the year, and the outputs are composited into a final representative map.
Processing platform
Sentinel-2 L2A/B data was accessed via Microsoft's Planetary Computer and scaled using Microsoft Azure Batch.
Class definitions
1. Water: Areas where water was predominantly present throughout the year; may not cover areas with sporadic or ephemeral water; contains little to no sparse vegetation, no rock outcrop nor built up features like docks; examples: rivers, ponds, lakes, oceans, flooded salt plains.
2. Trees: Any significant clustering of tall (~15 m or higher) dense vegetation, typically with a closed or dense canopy; examples: wooded vegetation, clusters of dense tall vegetation within savannas, plantations, swamp or mangroves (dense/tall vegetation with ephemeral water or canopy too thick to detect water underneath).
4. Flooded vegetation: Areas of any type of vegetation with obvious intermixing of water throughout a majority of the year; seasonally flooded area that is a mix of grass/shrub/trees/bare ground; examples: flooded mangroves, emergent vegetation, rice paddies and other heavily irrigated and inundated agriculture.
5. Crops: Human planted/plotted cereals, grasses, and crops not at tree height; examples: corn, wheat, soy, fallow plots of structured land.
7. Built Area: Human made structures; major road and rail networks; large homogenous impervious surfaces including parking structures, office buildings and residential housing; examples: houses, dense villages / towns / cities, paved roads, asphalt.
8. Bare ground: Areas of rock or soil with very sparse to no vegetation for the entire year; large areas of sand and deserts with no to little vegetation; examples: exposed rock or soil, desert and sand dunes, dry salt flats/pans, dried lake beds, mines.
9. Snow/Ice: Large homogenous areas of permanent snow or ice, typically only in mountain areas or highest latitudes; examples: glaciers, permanent snowpack, snow fields.
10. Clouds: No land cover information due to persistent cloud cover.
11. Rangeland: Open areas covered in homogenous grasses with little to no taller vegetation; wild cereals and grasses with no obvious human plotting (i.e., not a plotted field); examples: natural meadows and fields with sparse to no tree cover, open savanna with few to no trees, parks/golf courses/lawns, pastures. Mix of small clusters of plants or single plants dispersed on a landscape that shows exposed soil or rock; scrub-filled clearings within dense forests that are clearly not taller than trees; examples: moderate to sparse cover of bushes, shrubs and tufts of grass, savannas with very sparse grasses, trees or other plants.
Citation
Karra, Kontgis, et al. "Global land use/land cover with Sentinel-2 and deep learning." IGARSS 2021-2021 IEEE International Geoscience and Remote Sensing Symposium. IEEE, 2021.
Acknowledgements
Training data for this project makes use of the National Geographic Society Dynamic World training dataset, produced for the Dynamic World Project by National Geographic Society in partnership with Google and the World Resources Institute.
For questions please email environment@esri.com
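The arithmetic above operates on the per-pixel values of the two rasters. Assuming those values are the LULC class codes listed under 'Class definitions' (which keeps every combined index within the 8-bit range of the mosaic), the encoding and its inverse can be sketched as below; the small arrays are purely illustrative.

import numpy as np

# Hypothetical 2x2 LULC rasters for the from-year (e.g. 2020) and to-year (2021),
# holding class codes documented above (1=Water, 2=Trees, 5=Crops, 7=Built Area, ...)
from_year = np.array([[2, 2], [5, 1]], dtype=np.uint8)
to_year   = np.array([[2, 7], [7, 1]], dtype=np.uint8)

# Change index as described: (from value * 16) + to value, stored as 8-bit unsigned
change_index = (from_year.astype(np.uint16) * 16 + to_year).astype(np.uint8)
print(change_index)            # 34, 39, 87, 17

# Decoding an index back into the pair of classes
decoded_from, decoded_to = change_index // 16, change_index % 16
print(decoded_from, decoded_to)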
Mosaic of two images acquired by the Sentinel-2 satellite on 19 and 22 April 2016. The "Sentinels" are a fleet of satellites designed to deliver Earth observation data and images to the European Commission's Copernicus programme. The Sentinel-2 mission, launched in June 2015, is dedicated to land monitoring. The Sentinel-2 satellite image is also available as a background map in the Geoportal viewer. To use this service in other viewing software, copy the address from the "online resources" field. The WMS version is 1.3.0.
This web map is a subset of Sentinel-2 Views. Sentinel-2 10, 20, and 60m multispectral, multitemporal, 13-band imagery is rendered on-the-fly and available for visualization and analytics. This imagery layer pulls directly from the Sentinel-2 on AWS collection and is updated daily with new imagery.
This imagery layer can be applied across a number of industries, scientific disciplines, and management practices. Some applications include, but are not limited to: land cover and environmental monitoring, climate change, deforestation, disaster and emergency management, national security, plant health and precision agriculture, forest monitoring, watershed analysis and runoff predictions, land-use planning, tracking urban expansion, highlighting burned areas and estimating fire severity.
Geographic Coverage
Global: continental land masses from 65.4° South to 72.1° North, with these special guidelines:
All coastal waters up to 20 km from the shore
All islands greater than 100 km2
All EU islands
All closed seas (e.g. Caspian Sea)
The Mediterranean Sea
Note: Areas of interest going beyond the Mission baseline (as laid out in the Mission Requirements Document) will be assessed, and may be added to the baseline if sufficient resources are identified.
Temporal Coverage
The revisit time for each point on Earth is every 5 days. This layer is updated daily with new imagery and is designed to include imagery collected within the past 14 months. Custom Image Services can be created for access to images older than 14 months. The number of images available will vary depending on location.
Image Selection/Filtering
The most recent and cloud-free images are displayed by default. Any image available within the past 14 months can be displayed via custom filtering. Filtering can be done based on attributes such as Acquisition Date, Estimated Cloud Cover, and Tile ID. Tile_ID is computed as [year][month][day]T[hours][minutes][seconds]_[UTMcode][latitudeband][square]_[sequence]. NOTE: Not using filters, and loading the entire archive, may affect performance.
Analysis Ready
This imagery layer is analysis ready with TOA correction applied.
Visual Rendering
Default rendering is Natural Color (bands 4,3,2) with Dynamic Range Adjustment (DRA). The DRA version of each layer enables visualization of the full dynamic range of the images. Rendering (or display) of band combinations and calculated indices is done on-the-fly from the source images via Raster Functions. Various pre-defined Raster Functions can be selected or custom functions created. Available renderings include: Agriculture with DRA, Bathymetric with DRA, Color-Infrared with DRA, Natural Color with DRA, Short-wave Infrared with DRA, Geology with DRA, NDMI Colorized, Normalized Difference Built-Up Index (NDBI), NDWI Raw, NDWI with VRE Raw, NDVI with VRE Raw (NDRE), NDVI VRE-only Raw, NDVI Raw, Normalized Burn Ratio, NDVI Colormap.
Multispectral Bands (band: description, wavelength in µm, resolution in m)
1: Coastal aerosol, 0.433-0.453, 60
2: Blue, 0.458-0.523, 10
3: Green, 0.543-0.578, 10
4: Red, 0.650-0.680, 10
5: Vegetation Red Edge, 0.698-0.713, 20
6: Vegetation Red Edge, 0.733-0.748, 20
7: Vegetation Red Edge, 0.773-0.793, 20
8: NIR, 0.785-0.900, 10
8A: Narrow NIR, 0.855-0.875, 20
9: Water vapour, 0.935-0.955, 60
10: SWIR - Cirrus, 1.365-1.385, 60
11: SWIR-1, 1.565-1.655, 20
12: SWIR-2, 2.100-2.280, 20
Additional Notes
Overviews exist with a spatial resolution of 150m and are updated every quarter based on the best and latest imagery available at that time. To work with source images at all scales, the 'Lock Raster' functionality is available. NOTE: 'Lock Raster' should only be used on the layer for short periods of time, as the imagery and associated record Object IDs may change daily. This ArcGIS Server dynamic imagery layer can be used in Web Maps and ArcGIS Desktop as well as Web and Mobile applications using the REST-based Image Services API. Images can be exported up to a maximum of 4,000 columns x 4,000 rows per request.
Data Source
Sentinel-2 imagery is the result of close collaboration between the European Space Agency (ESA), the European Commission and USGS. Data is hosted by Amazon Web Services as part of their Registry of Open Data. Users can access the imagery from Sentinel-2 on AWS, or alternatively access the Sentinel2Look Viewer, EarthExplorer or the Copernicus Open Access Hub to download the scenes. For information on Sentinel-2 imagery, see Sentinel-2.
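A small sketch of splitting such a Tile_ID into its parts follows; the example ID is hypothetical and only intended to match the documented pattern.

import re
from datetime import datetime

# Pattern from the documented format:
# [year][month][day]T[hours][minutes][seconds]_[UTMcode][latitudeband][square]_[sequence]
TILE_ID_RE = re.compile(r"^(\d{8}T\d{6})_(\d{2})([A-Z])([A-Z]{2})_(\d+)$")

def split_tile_id(tile_id: str):
    match = TILE_ID_RE.match(tile_id)
    if not match:
        raise ValueError(f"Unexpected Tile_ID format: {tile_id!r}")
    timestamp, utm_code, lat_band, square, sequence = match.groups()
    return {
        "acquired": datetime.strptime(timestamp, "%Y%m%dT%H%M%S"),
        "utm_zone": int(utm_code),
        "latitude_band": lat_band,
        "grid_square": square,
        "sequence": int(sequence),
    }

# Hypothetical example matching the documented pattern
print(split_tile_id("20231220T154911_19TCH_0"))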
Dates of Images:
Pre-Event: None
Post-Event: 12/20/2023
Date of Next Image: Unknown
Summary: The True Color RGB composite provides a product of how the surface would look to the naked eye from space. The RGB is created using the red, green, and blue channels of the respective instrument.
Suggested Use: The True Color RGB provides a product of how the surface would look to the naked eye from space. The True Color RGB is produced using the 3 visible wavelength bands (red, green, and blue) from the respective sensor. Some minor atmospheric corrections have occurred.
Satellite/Sensor: MultiSpectral Instrument (MSI) on European Space Agency's (ESA) Copernicus Sentinel-2A/2B satellites
Resolution: 10 meters
Credits: NASA/MSFC, USGS, ESA Copernicus
Esri REST Endpoint: See URL section on right side of page
WMS Endpoint: https://maps.disasters.nasa.gov/ags03/services/newengland_flooding_202312/sentinel2_truecolor/MapServer/WMSServer
Data Download: https://maps.disasters.nasa.gov/download/gis_products/event_specific/2023/newengland_flooding_202312/sentinel2/
Dates of Images:
Pre-Event: None
Post-Event: 12/20/2023
Date of Next Image: Unknown
Summary: The Color Infrared composite is created using the near-infrared, red, and green channels, allowing for the ability to see areas impacted by the event. The near-infrared gives the ability to see through thin clouds. Healthy vegetation is shown as red, water is in blue.
Suggested Use: A Color Infrared composite depicts healthy vegetation as red, water as blue. Some minor atmospheric corrections have occurred.
Satellite/Sensor: MultiSpectral Instrument (MSI) on European Space Agency's (ESA) Copernicus Sentinel-2A/2B satellites
Resolution: 10 meters
Credits: NASA/MSFC, USGS, ESA Copernicus
Esri REST Endpoint: See URL section on right side of page
WMS Endpoint: https://maps.disasters.nasa.gov/ags03/services/newengland_flooding_202312/sentinel2_colorinfrared/MapServer/WMSServer
Data Download: https://maps.disasters.nasa.gov/download/gis_products/event_specific/2023/newengland_flooding_202312/sentinel2/
Three ET datasets were generated to evaluate the potential integration of Landsat and Sentinel-2 data for improved ET mapping. The first ET dataset was generated by linear interpolation (Lint) of Landsat-based ET fraction (ETf) images from before and after the selected image dates. The second ET dataset was generated using the regular SSEBop approach with Landsat imagery only (Lonly). The third ET dataset was generated with the proposed Landsat-Sentinel data fusion (L-S) approach by applying ETf images from both Landsat and Sentinel. The two scripts used to generate these three ET datasets are included: one script for running the SSEBop model to generate ET maps from Lonly, and another script for generating ET maps with the Lint and L-S approaches.
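The linear interpolation step for the Lint dataset can be illustrated with a short sketch. It assumes ETf grids from the Landsat overpasses before and after a target date and simply weights them by time; the scripts accompanying the dataset contain the actual implementation.

import numpy as np

def interpolate_etf(etf_before, etf_after, days_before, days_after):
    """Linearly interpolate an ET fraction grid for a date between two Landsat ETf grids.

    days_before: days from the earlier Landsat image to the target date
    days_after:  days from the target date to the later Landsat image
    """
    total = days_before + days_after
    weight_after = days_before / total      # closer to the later image -> larger weight
    return etf_before * (1.0 - weight_after) + etf_after * weight_after

# Illustrative 2x2 ETf grids bracketing a target date
etf_before = np.array([[0.60, 0.55], [0.40, 0.70]])
etf_after  = np.array([[0.80, 0.65], [0.50, 0.66]])
etf_target = interpolate_etf(etf_before, etf_after, days_before=8, days_after=8)
print(etf_target)   # midway between the two grids

# Actual ET is then the interpolated fraction times reference ET for the target date
# (SSEBop: ET = ETf * ETo), e.g. et = etf_target * eto_grid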
Data licence Germany – Attribution – Version 2.0: https://www.govdata.de/dl-de/by-2-0
License information was derived automatically
This landcover map was produced with a classification method developed in the project incora (Inwertsetzung von Copernicus-Daten für die Raumbeobachtung; mFUND funding code 19F2079C) in cooperation with ILS (Institut für Landes- und Stadtentwicklungsforschung gGmbH) and BBSR (Bundesinstitut für Bau-, Stadt- und Raumforschung), funded by BMVI (Federal Ministry of Transport and Digital Infrastructure). The goal of incora is an analysis of settlement and infrastructure dynamics in Germany based on Copernicus Sentinel data. This classification is based on a time series of monthly averaged, atmospherically corrected Sentinel-2 tiles (MAJA L3A-WASP: https://geoservice.dlr.de/web/maps/sentinel2:l3a:wasp; DLR (2019): Sentinel-2 MSI - Level 2A (MAJA-Tiles) - Germany).
It consists of the following landcover classes:
10: forest
20: low vegetation
30: water
40: built-up
50: bare soil
60: agriculture
Potential training and validation areas were automatically extracted using spectral indices and their temporal variability from the Sentinel-2 data itself as well as the following auxiliary datasets:
- OpenStreetMap (map data © OpenStreetMap contributors, available from https://www.openstreetmap.org)
- Copernicus HRL Imperviousness Status Map 2018 (© European Union, Copernicus Land Monitoring Service 2018, European Environment Agency (EEA))
- S2GLC Land Cover Map of Europe 2017 (Malinowski et al. 2020: Automated Production of Land Cover/Use Map of Europe Based on Sentinel-2 Imagery. Remote Sens. 2020, 12(21), 3523; https://doi.org/10.3390/rs12213523)
- Germany NUTS administrative areas 1:250000 (© GeoBasis-DE / BKG 2020 / dl-de/by-2-0 / https://gdz.bkg.bund.de/index.php/default/nuts-gebiete-1-250-000-stand-31-12-nuts250-31-12.html)
- Contains modified Copernicus Sentinel data (2020), processed by mundialis
Processing was performed for blocks of federal states and the individual maps were mosaicked afterwards. For each class, 100,000 pixels from the potential training areas were extracted as training data. An exemplary validation of the classification results was performed for the federal state of North Rhine-Westphalia, as its open data policy allows direct access to official data to be used as reference. Rules to convert relevant ATKIS Basis-DLM object classes to the incora nomenclature were defined. Subsequently, 5,000 reference points were randomly sampled and their classification in each case visually examined and, if necessary, revised to obtain a robust reference data set. The comparison of this reference data set with the incora classification yielded the following results (class: user's accuracy / producer's accuracy, number of reference points n):
overall accuracy: 88.4%
forest: 95.0% / 93.8% (n = 1410)
low vegetation: 73.4% / 86.5% (n = 844)
water: 98.5% / 92.8% (n = 69)
built-up: 98.9% / 95.8% (n = 983)
bare soil: 23.9% / 82.9% (n = 41)
agriculture: 94.6% / 83.2% (n = 1653)
Incora report with details on methods and results: pending
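For readers less familiar with these validation terms, the sketch below shows how overall, user's and producer's accuracy are derived from a confusion matrix. The numbers are made up for illustration and are not the incora validation data.

import numpy as np

classes = ["forest", "low vegetation", "water"]

# Illustrative confusion matrix: rows = map (classified) class, columns = reference class
confusion = np.array([
    [90,  5,  1],
    [ 8, 70,  2],
    [ 1,  3, 20],
])

overall = np.trace(confusion) / confusion.sum()
users = np.diag(confusion) / confusion.sum(axis=1)      # correct / all pixels mapped as the class
producers = np.diag(confusion) / confusion.sum(axis=0)  # correct / all reference pixels of the class

print(f"overall accuracy: {overall:.1%}")
for name, ua, pa in zip(classes, users, producers):
    print(f"{name}: user's {ua:.1%} / producer's {pa:.1%}")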
Dates of Images:
Pre-Event: 8/24/2022, 8/31/2022, 9/5/2022, 9/15/2022, 9/19/2022, 9/20/2022, 9/23/2022, 9/25/2022, 9/27/2022, 9/28/2022
Post-Event: 9/30/2022
Date of Next Image: Unknown
Summary: The True Color RGB composite provides a product of how the surface would look to the naked eye from space. The RGB is created using the red, green, and blue channels of the respective instrument.
Suggested Use: The True Color RGB provides a product of how the surface would look to the naked eye from space. The True Color RGB is produced using the 3 visible wavelength bands (red, green, and blue) from the respective sensor. Some minor atmospheric corrections have occurred.
Satellite/Sensor: MultiSpectral Instrument (MSI) on European Space Agency's (ESA) Copernicus Sentinel-2A/2B satellites
Resolution: 10 meters
Credits: NASA/MSFC, USGS, ESA Copernicus
Esri REST Endpoint: See URL section on right side of page
WMS Endpoint: https://maps.disasters.nasa.gov/ags04/services/hurricane_ian_2022/sentinel2_truecolor/MapServer/WMSServer
Data Download: https://maps.disasters.nasa.gov/download/gis_products/event_specific/2022/hurricane_ian_2022/sentinel2/trueColor/
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
This dataset contains a list of Sentinel-2 tiles covering Italy for the year 2017. For each tile, a corresponding ground-truth GeoTIFF is provided, containing a clip of the soil consumption data provided by ISPRA (https://www.isprambiente.gov.it).
The dataset can be used to train a machine learning model to extract imperviousness maps from Sentinel-2 satellite images.
More details can be found in the paper:
Giacco, G., Marrone, S., Langella, G., & Sansone, C. (2022). ReFuse: Generating Imperviousness Maps from Multi-Spectral Sentinel-2 Satellite Imagery. Future Internet, 14(10), 278.
Dates of Images:
Pre-Event: 8/24/2022, 8/31/2022, 9/5/2022, 9/15/2022, 9/19/2022, 9/20/2022, 9/23/2022, 9/25/2022, 9/27/2022, 9/28/2022
Post-Event: 9/30/2022
Date of Next Image: Unknown
Summary: The Natural Color RGB provides a false-colour composite look at the surface. This RGB uses a shortwave infrared, the near-infrared, and red channels from the instrument.
Suggested Use: For the Natural Color RGB, areas of water will appear blue, healthy green vegetation will appear as a bright green, and urban areas in various shades of magenta.
Satellite/Sensor: MultiSpectral Instrument (MSI) on European Space Agency's (ESA) Copernicus Sentinel-2A/2B satellites
Resolution: 20 meters
Credits: NASA/MSFC, USGS, ESA Copernicus
Esri REST Endpoint: See URL section on right side of page
WMS Endpoint: https://maps.disasters.nasa.gov/ags04/services/hurricane_ian_2022/sentinel2_naturalcolor/MapServer/WMSServer
Data Download: https://maps.disasters.nasa.gov/download/gis_products/event_specific/2022/hurricane_ian_2022/sentinel2/naturalColor/
Dates of Images:
Pre-Event: 12/28/2022, 1/10/2023, 1/22/2023, 1/25/2023, 1/28/2023
Post-Event: 2/7/2023, 2/9/2023, 2/11/2023, 2/12/2023, 2/14/2023, 2/16/2023, 2/17/2023, 2/19/2023, 2/21/2023, 2/22/2023, 2/24/2023
Date of Next Image: Unknown
Summary: The Natural Color RGB provides a false-colour composite look at the surface. This RGB uses a shortwave infrared, the near-infrared, and red channels from the instrument.
Suggested Use: For the Natural Color RGB, areas of water will appear blue, healthy green vegetation will appear as a bright green, urban areas in various shades of magenta, and snow-covered areas in cyan.
Satellite/Sensor: MultiSpectral Instrument (MSI) on European Space Agency's (ESA) Copernicus Sentinel-2A/2B satellites
Resolution: 20 meters
Credits: NASA/MSFC, USGS, ESA Copernicus
Esri REST Endpoint: See URL section on the right side of page.
WMS Endpoint: https://maps.disasters.nasa.gov/ags04/services/turkey_earthquake_2023/Map1/MapServer/WMSServer
Data Download: https://maps.disasters.nasa.gov/download/gis_products/event_specific/2023/turkiye_earthquakes_202302/sentinel2/
Human settlements maps are useful for understanding growth patterns, population distribution, resource management, change detection, and a variety of other applications where information related to the earth's surface is required. Human settlement classification is a complex exercise and is hard to capture using traditional means. Deep learning models are highly capable of learning these complex semantics and can produce superior results.
Using the model
Follow the guide to use the model. Before using this model, ensure that the supported deep learning libraries are installed. For more details, check Deep Learning Libraries Installer for ArcGIS.
Fine-tuning the model
This model can be fine-tuned using the Train Deep Learning Model tool. Follow the guide to fine-tune this model.
Input
Raster, mosaic dataset, or image service. (Preferred cell size is 10 meters.)
Note: This model is trained to work on Sentinel-2 imagery datasets which are in the WGS 1984 Web Mercator (auxiliary sphere) coordinate system (WKID 3857).
Output
Classified raster containing two classes: settlement and other.
Applicable geographies
This model is expected to work well in Europe.
Model architecture
This model uses the UNet model architecture implemented in ArcGIS API for Python.
Accuracy metrics
This model has an overall accuracy of 94.1 percent.
Sample results
Here are a few results from the model.
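The record notes that the model uses the UNet architecture from the ArcGIS API for Python and can be fine-tuned. A hedged sketch of such fine-tuning with arcgis.learn is shown below; the data path, model path and hyperparameters are placeholders, and the documented Train Deep Learning Model geoprocessing tool remains the supported route.

# Sketch only: assumes the arcgis Python API with its deep learning dependencies installed,
# and exported training chips (e.g. "Classified Tiles" format) from Sentinel-2 imagery.
from arcgis.learn import prepare_data, UnetClassifier

# Placeholder paths to exported training data and to the downloaded pretrained model definition
data = prepare_data(r"C:\data\settlement_chips", batch_size=8)
model = UnetClassifier.from_model(r"C:\models\HumanSettlements\HumanSettlements.emd", data)

# Fine-tune on the new chips and save a new deep learning package for use in ArcGIS
model.fit(epochs=10, lr=1e-4)
model.save("settlement_unet_finetuned")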