This shaded relief image was generated from the lidar-based bare-earth digital elevation model (DEM). A shaded relief image provides an illustration of variations in elevation using artificial shadows. Based on a specified position of the sun, areas that would be in sunlight are highlighted and areas that would be in shadow are shaded. In this instance, the position of the sun was assumed to be 45 degrees above the northwest horizon.

The shaded relief image shows areas that are not in direct sunlight as shadowed. It does not show shadows that would be cast by topographic features onto the surrounding surface.

Using ERDAS IMAGINE, a 3x3 neighborhood around each pixel in the DEM was analyzed, and a comparison was made between the sun's position and the angle that each pixel faces. The pixel was then assigned a value between -1 and +1 to represent the amount of light reflected. Negative numbers and zero values represent shadowed areas, and positive numbers represent sunny areas. In ArcGIS Desktop 10.7.1, the image was converted to JPEG 2000 format with values from 0 (black) to 255 (white).

See the MassGIS datalayer page to download the data as a JPEG 2000 image file. View this service in the Massachusetts Elevation Finder. MassGIS has also published a Lidar Shaded Relief tile service (cache) hosted in ArcGIS Online.
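As a rough illustration of that per-pixel computation (a sketch, not the ERDAS IMAGINE implementation), the code below derives slope and aspect from finite differences over each pixel's neighborhood and assigns the cosine of the angle between the surface normal and the sun vector, a value in [-1, +1]. The linear rescale to 0-255 at the end is an assumption; the exact conversion used for the published image is not documented here.

```python
import numpy as np

def shaded_relief(dem, cellsize, azimuth_deg=315.0, altitude_deg=45.0):
    """Sketch of a shaded-relief (hillshade) computation.

    Defaults match the description above: sun 45 degrees above the
    northwest horizon (azimuth 315). Assumes rows run north to south.
    """
    az = np.radians(azimuth_deg)
    alt = np.radians(altitude_deg)
    # Finite-difference gradients over each pixel's neighborhood.
    dz_dy, dz_dx = np.gradient(dem, cellsize)
    slope = np.arctan(np.hypot(dz_dx, dz_dy))
    aspect = np.arctan2(dz_dx, -dz_dy)
    # Cosine of the angle between surface normal and sun direction:
    # a value in [-1, +1]; values <= 0 mean the pixel faces away (shadow).
    shade = (np.sin(alt) * np.cos(slope) +
             np.cos(alt) * np.sin(slope) * np.cos(az - aspect))
    # Assumed linear rescale of [-1, +1] to [0 (black), 255 (white)].
    return np.round((shade + 1.0) / 2.0 * 255.0).astype(np.uint8)
```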
Notice: this is not the latest Heat Island Severity image service.

This layer contains the relative heat severity for every pixel for every city in the United States, including Alaska, Hawaii, and Puerto Rico. Heat Severity is a reclassified version of the Heat Anomalies raster, which is also published on this site. This data is generated from 30-meter Landsat 8 imagery band 10 (ground-level thermal sensor) from the summer of 2023.

To explore previous versions of the data, visit the links below:
Heat Severity - USA 2022
Heat Severity - USA 2021
Heat Severity - USA 2020
Heat Severity - USA 2019

Federal statistics over a 30-year period show extreme heat is the leading cause of weather-related deaths in the United States. Extreme heat exacerbated by urban heat islands can lead to increased respiratory difficulties, heat exhaustion, and heat stroke. These heat impacts significantly affect the most vulnerable—children, the elderly, and those with preexisting conditions.

The purpose of this layer is to show where certain areas of cities are hotter than the average temperature for that same city as a whole. Severity is measured on a scale of 1 to 5, with 1 being a relatively mild heat area (slightly above the mean for the city), and 5 being a severe heat area (significantly above the mean for the city). The absolute heat above mean values are classified into these 5 classes using the Jenks Natural Breaks classification method, which seeks to reduce the variance within classes and maximize the variance between classes. Knowing where areas of high heat are located can help a city government plan for mitigation strategies.

This dataset represents a snapshot in time. It will be updated yearly, but is static between updates. It does not take into account changes in heat during a single day, for example, from building shadows moving. The thermal readings detected by the Landsat 8 sensor are surface-level, whether that surface is the ground or the top of a building. Although there is strong correlation between surface temperature and air temperature, they are not the same. We believe that this is useful at the national level, and for cities that don't have the ability to conduct their own hyper-local temperature survey. Where local data is available, it may be more accurate than this dataset.

Dataset Summary
This dataset was developed using proprietary Python code developed at Trust for Public Land, running on the Descartes Labs platform through the Descartes Labs API for Python. The Descartes Labs platform allows for extremely fast retrieval and processing of imagery, which makes it possible to produce heat island data for all cities in the United States in a relatively short amount of time.

What can you do with this layer?
This layer has query, identify, and export image services available. Since it is served as an image service, it is not necessary to download the data; the service itself is data that can be used directly in any Esri geoprocessing tool that accepts raster data as input. In order to click on the image service and see the raw pixel values in a map viewer, you must be signed in to ArcGIS Online, then Enable Pop-Ups and Configure Pop-Ups.
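For readers curious about the classification step described above, here is a minimal sketch of binning heat-above-mean values into the five severity classes with Jenks natural breaks. This is not the Trust for Public Land code; it assumes the third-party jenkspy package and uses synthetic sample values.

```python
import numpy as np
import jenkspy  # third-party package: pip install jenkspy

# Synthetic stand-in for per-pixel "heat above city mean" values.
heat_above_mean = np.random.gamma(shape=2.0, scale=1.5, size=10_000)

# Jenks natural breaks: 5 classes chosen to minimize within-class
# variance and maximize between-class variance, as described above.
breaks = jenkspy.jenks_breaks(heat_above_mean.tolist(), 5)

# Assign severity 1 (mild) through 5 (severe) using the inner breaks.
severity = np.digitize(heat_above_mean, breaks[1:-1], right=True) + 1
```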
Using the Urban Heat Island (UHI) Image Services
The data is made available as an image service. There is a processing template applied that supplies the yellow-to-red or blue-to-red color ramp, but once this processing template is removed (you can do this in ArcGIS Pro or ArcGIS Desktop, or in QGIS), the actual data values come through the service and can be used directly in a geoprocessing tool (for example, to extract an area of interest). Following are instructions for doing this in Pro.

In ArcGIS Pro, in a Map view, in the Catalog window, click on Portal. In the Portal window, click on the far-right icon representing Living Atlas. Search on the acronyms "tpl" and "uhi". The results returned will be the UHI image services. Right-click on a result and select "Add to current map" from the context menu. When the image service is added to the map, right-click on it in the map view and select Properties. In the Properties window, select Processing Templates. On the drop-down menu at the top of the window, the default Processing Template is either a yellow-to-red ramp or a blue-to-red ramp. Click the drop-down, select "None", then "OK". Now you will have the actual pixel values displayed in the map, and available to any geoprocessing tool that takes a raster as input. Below is a screenshot of ArcGIS Pro with a UHI image service loaded, color ramp removed, and symbology changed back to a yellow-to-red ramp (a classified renderer can also be used).

A typical operation at this point is to clip out your area of interest. To do this, add your polygon shapefile or feature class to the map view, and use the Clip Raster tool to export your area of interest as a GeoTIFF raster (file extension ".tif"). In the Environments tab for the Clip Raster tool, click the dropdown for "Extent", select "Same as Layer:", and select the name of your polygon. If you then need to convert the output raster to a polygon shapefile or feature class, run the Raster to Polygon tool and select "Value" as the field. A scripted version of this workflow is sketched below.

Other Sources of Heat Island Information
Please see these websites for valuable information on heat islands and to learn about exciting new heat island research being led by scientists across the country:
EPA's Heat Island Resource Center
Dr. Ladd Keith, University of Arizona
Dr. Ben McMahan, University of Arizona
Dr. Jeremy Hoffman, Science Museum of Virginia
Dr. Hunter Jones, NOAA
Daphne Lundi, Senior Policy Advisor, NYC Mayor's Office of Recovery and Resiliency

Disclaimer/Feedback
With nearly 14,000 cities represented, checking each city's heat island raster for quality assurance would be prohibitively time-consuming, so Trust for Public Land checked a statistically significant sample for data quality. The sample passed all quality checks, with about 98.5% of the output cities error-free, but there could be instances where the user finds errors in the data. These errors will most likely take the form of a line of discontinuity where there is no city boundary; this type of error is caused by large temperature differences in two adjacent Landsat scenes, so the discontinuity occurs along scene boundaries (see figure below). Trust for Public Land would appreciate feedback on these errors so that version 2 of the national UHI dataset can be improved. Contact Dale.Watt@tpl.org with feedback.
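For users who prefer scripting, here is a hedged ArcPy sketch of that clip-and-convert workflow. The layer name "uhi_service" and all file paths are placeholders; the image service must already be added to the project (or referenced by URL) for the Clip tool to read it.

```python
import arcpy

# Setting the Extent environment to the AOI dataset mirrors the
# "Same as Layer:" choice in the tool's Environments tab.
arcpy.env.extent = "aoi.shp"

# Clip the UHI image service to the area-of-interest polygon and
# export as GeoTIFF. "ClippingGeometry" clips to the polygon outline,
# not just its bounding box.
arcpy.management.Clip(
    in_raster="uhi_service",
    rectangle="#",
    out_raster=r"C:\data\uhi_aoi.tif",
    in_template_dataset="aoi.shp",
    clipping_geometry="ClippingGeometry",
)

# Optionally convert the clipped (integer severity) raster to polygons
# on the pixel value, as described above.
arcpy.conversion.RasterToPolygon(
    in_raster=r"C:\data\uhi_aoi.tif",
    out_polygon_features=r"C:\data\uhi_aoi_poly.shp",
    simplify="NO_SIMPLIFY",
    raster_field="Value",
)
```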
CC0 1.0 Universal Public Domain Dedication: https://creativecommons.org/publicdomain/zero/1.0/
License information was derived automatically
This data layer references data from a high-resolution tree canopy change-detection layer for Seattle, Washington. Tree canopy change was mapped by using remotely sensed data from two time periods (2016 and 2021). Tree canopy was assigned to three classes: 1) no change, 2) gain, and 3) loss. No change represents tree canopy that remained the same from one time period to the next. Gain represents tree canopy that increased or was newly added, from one time period to the next. Loss represents the tree canopy that was removed from one time period to the next.

Mapping was carried out using an approach that integrated automated feature extraction with manual edits. Care was taken to ensure that changes to the tree canopy were due to actual change in the land cover as opposed to differences in the remotely sensed data stemming from lighting conditions or image parallax. Direct comparison was possible because land-cover maps from both time periods were created using object-based image analysis (OBIA) and included similar source datasets (LiDAR-derived surface models, multispectral imagery, and thematic GIS inputs). OBIA systems work by grouping pixels into meaningful objects based on their spectral and spatial properties, while taking into account boundaries imposed by existing vector datasets. Within the OBIA environment a rule-based expert system was designed to effectively mimic the process of manual image analysis by incorporating the elements of image interpretation (color/tone, texture, pattern, location, size, and shape) into the classification process. A series of morphological procedures were employed to ensure that the end product is both accurate and cartographically pleasing. No accuracy assessment was conducted, but the dataset was subjected to manual review and correction. (University of Vermont Spatial Analysis Laboratory)

The dataset covers the following tree canopy categories:
- Environmental Justice Priority Areas
- Census tracts composite / quintile
- Existing tree canopy percentage & environmental justice priority level
- Existing tree canopy
- Possible tree canopy
- Relative percentage change

For more information, please see the 2021 Tree Canopy Assessment.
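The UVM rule-based OBIA system itself is not published with this layer, but the core idea of grouping pixels into spectrally and spatially coherent objects can be illustrated with an off-the-shelf segmentation. The following is a conceptual sketch only, assuming scikit-image; "ortho.tif" is a placeholder image.

```python
import numpy as np
from skimage.io import imread
from skimage.segmentation import slic

# Conceptual illustration only, not UVM's production system.
image = imread("ortho.tif")  # placeholder RGB/multispectral image

# Group pixels into ~5000 objects by color similarity and proximity:
# the OBIA step that precedes rule-based classification.
objects = slic(image, n_segments=5000, compactness=10.0)

# Example per-object attribute (mean brightness) of the kind a
# rule-based expert system could test against thresholds.
obj_means = {int(i): float(image[objects == i].mean())
             for i in np.unique(objects)[:5]}
```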
CC0 1.0 Universal Public Domain Dedication: https://creativecommons.org/publicdomain/zero/1.0/
License information was derived automatically
This ArcGIS Online hosted feature service displays perimeters from the National Incident Feature Service (NIFS) that meet ALL of the following criteria:
This layer contains the relative heat severity for every pixel for every city in the United States. This 30-meter raster was derived from Landsat 8 imagery band 10 (ground-level thermal sensor) from the summers of 2019 and 2020.

Federal statistics over a 30-year period show extreme heat is the leading cause of weather-related deaths in the United States. Extreme heat exacerbated by urban heat islands can lead to increased respiratory difficulties, heat exhaustion, and heat stroke. These heat impacts significantly affect the most vulnerable—children, the elderly, and those with preexisting conditions.

The purpose of this layer is to show where certain areas of cities are hotter than the average temperature for that same city as a whole. Severity is measured on a scale of 1 to 5, with 1 being a relatively mild heat area (slightly above the mean for the city), and 5 being a severe heat area (significantly above the mean for the city). The absolute heat above mean values are classified into these 5 classes using the Jenks Natural Breaks classification method, which seeks to reduce the variance within classes and maximize the variance between classes. Knowing where areas of high heat are located can help a city government plan for mitigation strategies.

This dataset represents a snapshot in time. It will be updated yearly, but is static between updates. It does not take into account changes in heat during a single day, for example, from building shadows moving. The thermal readings detected by the Landsat 8 sensor are surface-level, whether that surface is the ground or the top of a building. Although there is strong correlation between surface temperature and air temperature, they are not the same. We believe that this is useful at the national level, and for cities that don't have the ability to conduct their own hyper-local temperature survey. Where local data is available, it may be more accurate than this dataset.

Dataset Summary
This dataset was developed using proprietary Python code developed at The Trust for Public Land, running on the Descartes Labs platform through the Descartes Labs API for Python. The Descartes Labs platform allows for extremely fast retrieval and processing of imagery, which makes it possible to produce heat island data for all cities in the United States in a relatively short amount of time.

What can you do with this layer?
This layer has query, identify, and export image services available. Since it is served as an image service, it is not necessary to download the data; the service itself is data that can be used directly in any Esri geoprocessing tool that accepts raster data as input. In order to click on the image service and see the raw pixel values in a map viewer, you must be signed in to ArcGIS Online, then Enable Pop-Ups and Configure Pop-Ups.

Using the Urban Heat Island (UHI) Image Services
The data is made available as an image service. There is a processing template applied that supplies the yellow-to-red or blue-to-red color ramp, but once this processing template is removed (you can do this in ArcGIS Pro or ArcGIS Desktop, or in QGIS), the actual data values come through the service and can be used directly in a geoprocessing tool (for example, to extract an area of interest). Following are instructions for doing this in Pro.

In ArcGIS Pro, in a Map view, in the Catalog window, click on Portal. In the Portal window, click on the far-right icon representing Living Atlas. Search on the acronyms "tpl" and "uhi".
The results returned will be the UHI image services. Right-click on a result and select "Add to current map" from the context menu. When the image service is added to the map, right-click on it in the map view and select Properties. In the Properties window, select Processing Templates. On the drop-down menu at the top of the window, the default Processing Template is either a yellow-to-red ramp or a blue-to-red ramp. Click the drop-down, select "None", then "OK". Now you will have the actual pixel values displayed in the map, and available to any geoprocessing tool that takes a raster as input. Below is a screenshot of ArcGIS Pro with a UHI image service loaded, color ramp removed, and symbology changed back to a yellow-to-red ramp (a classified renderer can also be used).

Other Sources of Heat Island Information
Please see these websites for valuable information on heat islands and to learn about exciting new heat island research being led by scientists across the country:
EPA's Heat Island Resource Center
Dr. Ladd Keith, University of Arizona
Dr. Ben McMahan, University of Arizona
Dr. Jeremy Hoffman, Science Museum of Virginia
Dr. Hunter Jones, NOAA
Daphne Lundi, Senior Policy Advisor, NYC Mayor's Office of Recovery and Resiliency

Disclaimer/Feedback
With nearly 14,000 cities represented, checking each city's heat island raster for quality assurance would be prohibitively time-consuming, so The Trust for Public Land checked a statistically significant sample for data quality. The sample passed all quality checks, with about 98.5% of the output cities error-free, but there could be instances where the user finds errors in the data. These errors will most likely take the form of a line of discontinuity where there is no city boundary; this type of error is caused by large temperature differences in two adjacent Landsat scenes, so the discontinuity occurs along scene boundaries (see figure below). The Trust for Public Land would appreciate feedback on these errors so that version 2 of the national UHI dataset can be improved. Contact Pete.Aniello@tpl.org with feedback.

Terms of Use
You understand and agree, and will advise any third party to whom you give any or all of the data, that The Trust for Public Land is neither responsible nor liable for any viruses or other contamination of your system arising from use of The Trust for Public Land's data nor for any delays, inaccuracies, errors or omissions arising out of the use of the data. The Trust for Public Land's data is distributed and transmitted "as is" without warranties of any kind, either express or implied, including without limitation, warranties of title or implied warranties of merchantability or fitness for a particular purpose. The Trust for Public Land is not responsible for any claim of loss of profit or any special, direct, indirect, incidental, consequential, and/or punitive damages that may arise from the use of the data. If you or any person to whom you make the data available are downloading or using the data for any visual output, attribution for same will be given in the following format: "This [document, map, diagram, report, etc.] was produced using data, in whole or in part, provided by The Trust for Public Land."
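As a supplement to the map-viewer pop-up approach above, raw pixel values can also be read programmatically through the image service's standard REST identify operation. A minimal sketch follows, with a placeholder service URL and no authentication shown (signing in or supplying a token may be required, as noted above).

```python
import requests

# Placeholder endpoint; substitute the actual UHI ImageServer URL.
SERVICE_URL = "https://example.org/arcgis/rest/services/uhi/ImageServer"

params = {
    "geometry": '{"x": -77.03, "y": 38.89, "spatialReference": {"wkid": 4326}}',
    "geometryType": "esriGeometryPoint",
    "returnGeometry": "false",
    "f": "json",
}
resp = requests.get(f"{SERVICE_URL}/identify", params=params, timeout=30)
print(resp.json().get("value"))  # raw pixel value at the query point
```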
This database provides the pertinent details of the color infrared digital orthophoto quarter quad (CIR DOQQ) maps and wetlands (DNR_Wetlands) maps produced by the Maryland Department of Natural Resources. It provides a simple and effective means of documenting the exact dates of the source material and other quality and reference information, and is a supplement to the metadata files for the DOQQs and wetlands files.

This is a MD iMAP hosted service. Find more information on https://imap.maryland.gov.
Map Service Layer: https://imagery.geodata.md.gov/imap/rest/services/DOQs/DOQIndexGrids/FeatureServer/0
This data layer references data from a high-resolution tree canopy change-detection layer for Seattle, Washington. Tree canopy change was mapped by using remotely sensed data from two time periods (2016 and 2021). Tree canopy was assigned to three classes: 1) no change, 2) gain, and 3) loss. No change represents tree canopy that remained the same from one time period to the next. Gain represents tree canopy that increased or was newly added, from one time period to the next. Loss represents the tree canopy that was removed from one time period to the next.

Mapping was carried out using an approach that integrated automated feature extraction with manual edits. Care was taken to ensure that changes to the tree canopy were due to actual change in the land cover as opposed to differences in the remotely sensed data stemming from lighting conditions or image parallax. Direct comparison was possible because land-cover maps from both time periods were created using object-based image analysis (OBIA) and included similar source datasets (LiDAR-derived surface models, multispectral imagery, and thematic GIS inputs). OBIA systems work by grouping pixels into meaningful objects based on their spectral and spatial properties, while taking into account boundaries imposed by existing vector datasets. Within the OBIA environment a rule-based expert system was designed to effectively mimic the process of manual image analysis by incorporating the elements of image interpretation (color/tone, texture, pattern, location, size, and shape) into the classification process. A series of morphological procedures were employed to ensure that the end product is both accurate and cartographically pleasing. No accuracy assessment was conducted, but the dataset was subjected to manual review and correction. (University of Vermont Spatial Analysis Laboratory)

This dataset consists of hexagons 50 acres in area, or several city blocks. The dataset covers the following tree canopy categories:
- Existing tree canopy percent
- Possible tree canopy - vegetation percent
- Relative percent change
- Absolute percent change
- Average maximum afternoon temperature (F)
- Tree canopy percentage & average afternoon temperature (F)

For more information, please see the 2021 Tree Canopy Assessment.
This map features forests for the Caribbean, which are provided here as an excerpted subset of the World Land Cover 30m BaseVue 2013 layer. Separating forests is useful as a cartographic layer on environmentally oriented maps and analytically as a basis for ecosystem and habitat definition.
This data layer references data from a high-resolution tree canopy change-detection layer for Seattle, Washington. Tree canopy change was mapped by using remotely sensed data from two time periods (2016 and 2021). Tree canopy was assigned to three classes: 1) no change, 2) gain, and 3) loss. No change represents tree canopy that remained the same from one time period to the next. Gain represents tree canopy that increased or was newly added, from one time period to the next. Loss represents the tree canopy that was removed from one time period to the next.

Mapping was carried out using an approach that integrated automated feature extraction with manual edits. Care was taken to ensure that changes to the tree canopy were due to actual change in the land cover as opposed to differences in the remotely sensed data stemming from lighting conditions or image parallax. Direct comparison was possible because land-cover maps from both time periods were created using object-based image analysis (OBIA) and included similar source datasets (LiDAR-derived surface models, multispectral imagery, and thematic GIS inputs). OBIA systems work by grouping pixels into meaningful objects based on their spectral and spatial properties, while taking into account boundaries imposed by existing vector datasets. Within the OBIA environment a rule-based expert system was designed to effectively mimic the process of manual image analysis by incorporating the elements of image interpretation (color/tone, texture, pattern, location, size, and shape) into the classification process. A series of morphological procedures were employed to ensure that the end product is both accurate and cartographically pleasing. No accuracy assessment was conducted, but the dataset was subjected to manual review and correction. (University of Vermont Spatial Analysis Laboratory)

This dataset consists of City of Seattle Public Schools areas, which cover the following tree canopy categories:
- Existing tree canopy percent
- Possible tree canopy - vegetation percent
- Relative percent change
- Absolute percent change

For more information, please see the 2021 Tree Canopy Assessment.
This data layer references data from a high-resolution tree canopy change-detection layer for Seattle, Washington. Tree canopy change was mapped by using remotely sensed data from two time periods (2016 and 2021). Tree canopy was assigned to three classes: 1) no change, 2) gain, and 3) loss. No change represents tree canopy that remained the same from one time period to the next. Gain represents tree canopy that increased or was newly added, from one time period to the next. Loss represents the tree canopy that was removed from one time period to the next.

Mapping was carried out using an approach that integrated automated feature extraction with manual edits. Care was taken to ensure that changes to the tree canopy were due to actual change in the land cover as opposed to differences in the remotely sensed data stemming from lighting conditions or image parallax. Direct comparison was possible because land-cover maps from both time periods were created using object-based image analysis (OBIA) and included similar source datasets (LiDAR-derived surface models, multispectral imagery, and thematic GIS inputs). OBIA systems work by grouping pixels into meaningful objects based on their spectral and spatial properties, while taking into account boundaries imposed by existing vector datasets. Within the OBIA environment a rule-based expert system was designed to effectively mimic the process of manual image analysis by incorporating the elements of image interpretation (color/tone, texture, pattern, location, size, and shape) into the classification process. A series of morphological procedures were employed to ensure that the end product is both accurate and cartographically pleasing. No accuracy assessment was conducted, but the dataset was subjected to manual review and correction. (University of Vermont Spatial Analysis Laboratory)

This dataset consists of City of Seattle Council District areas as they existed in the first comparison year (2016), which cover the following tree canopy categories:
- Existing tree canopy percent
- Possible tree canopy - vegetation percent
- Relative percent change
- Absolute percent change

For more information, please see the 2021 Tree Canopy Assessment.
CC0 1.0 Universal Public Domain Dedication: https://creativecommons.org/publicdomain/zero/1.0/
License information was derived automatically
Each point in Coastal Resiliency Assessment Shoreline Points represents a 250 meter segment of the Maryland coast, including Atlantic, Chesapeake Bay, and Coastal Bay shorelines. The Natural Capital Project's Coastal Vulnerability model was used to calculate a Shoreline Hazard Index, representing the relative exposure of each segment to storm-induced erosion and flooding. Inputs to the model included 6 physical variables (geomorphology, elevation, sea level rise, wave power, storm surge height, and erosion rates) and 5 habitat types (forest, marsh, dune, oyster reef, and underwater grass). Two scenarios of the model were run: one incorporating the protective role of all existing coastal habitats, and the other simulating the complete loss of habitats. The difference between the two scenarios indicates the potential magnitude of coastal hazard reduction by habitats at each location. Model results were integrated with MD DNR's Community Flood Risk Areas (March 2016) in order to highlight areas where hazard reduction by habitats is most likely to benefit at-risk coastal communities.

This dataset was produced under award number NA13NOS4190136 from the Office of Ocean and Coastal Resource Management (OCRM), National Oceanic and Atmospheric Administration (NOAA) through the Maryland Department of Natural Resources Chesapeake and Coastal Services (CCS). The statements, findings, and recommendations are those of the authors and do not necessarily reflect the views of NOAA or the U.S. Department of Commerce. The Natural Capital Project (NatCap), CCS, and The Nature Conservancy (TNC) all contributed to the production of this dataset.

This is a MD iMAP hosted service. Find more information on https://imap.maryland.gov.
Feature Service Link: https://mdgeodata.md.gov/imap/rest/services/Environment/MD_CoastalResiliencyAssessment/FeatureServer/2
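Conceptually, the habitat benefit at each shoreline point is simply the difference between the two model runs. A minimal sketch follows, with hypothetical column names and values (not the published schema).

```python
import pandas as pd

# Hypothetical per-segment hazard indices from the two scenarios.
pts = pd.DataFrame({
    "segment_id": [1, 2, 3],
    "hazard_index_with_habitat": [2.1, 3.4, 4.0],
    "hazard_index_no_habitat":   [2.9, 3.5, 4.8],
})

# Larger differences indicate habitats doing more protective work.
pts["habitat_hazard_reduction"] = (
    pts["hazard_index_no_habitat"] - pts["hazard_index_with_habitat"]
)
```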
The storm surge zones data used in this application were generated using the Sea, Lake, and Overland Surges from Hurricanes (SLOSH) model. SLOSH is a computerized model run by the National Weather Service to estimate storm surge heights resulting from historical, hypothetical, or predicted hurricanes. The model creates its estimates by assessing the pressure, size, forward speed, track, and wind data from a storm. Graphical output from the model displays color-coded storm surge heights for a particular area. The calculations are applied to a specific locale's shoreline, incorporating the unique bay and river configurations, water depths, bridges, roads, and other physical features. This file is generated as part of the Hazards Analysis within the Hurricane Evacuation Study for the Maryland Western Shore.

This is a MD iMAP hosted service layer. Find more information at https://imap.maryland.gov.
Feature Service Layer Link: https://mdgeodata.md.gov/imap/rest/services/Weather/MD_StormSurge/MapServer/0

**Please note, due to the size of this dataset, you may receive an error message when trying to download the dataset. You can download this dataset directly from MD iMAP Services at: https://mdgeodata.md.gov/imap/rest/services/Weather/MD_StormSurge/MapServer/exts/MDiMAPDataDownload/customLayers/0**
Visualization Overview
This visualization represents a "true color" band combination (Red = 1, Green = 4, Blue = 3) of data collected by the MODIS instrument on the NASA Aqua satellite. The imagery represents a natural-looking view of the Earth's surface without the presence of aerosols (e.g. clouds and dust). At its highest resolution, this visualization represents the underlying data scaled to a resolution of 500 m per pixel at the equator.

The MODIS Surface Reflectance product is created by an atmospheric correction algorithm that includes aerosol correction and is designed to derive land surface properties. By contrast, the MODIS Corrected Reflectance product, which is also available in the Living Atlas, provides more natural-looking images by only removing gross atmospheric effects such as Rayleigh scattering from the visible bands. In clear atmospheric conditions the Corrected Reflectance product is similar to the Surface Reflectance product, but they depart from each other in the presence of aerosols.

Multi-Spectral Bands
The following table lists the MODIS bands that are utilized to create this visualization. See here for a full description of all MODIS bands.
Band 1 - Visible (Red), 0.620 - 0.670 µm, 250 m
Band 3 - Visible (Blue), 0.459 - 0.479 µm, 500 m
Band 4 - Visible (Green), 0.545 - 0.565 µm, 500 m

Temporal Coverage
By default, this layer will display the imagery currently available for today's date. This imagery is a "daily composite" that is assembled from hundreds of individual data files. When viewing imagery for "today," you may notice that only a portion of the map has imagery. This is because the visualization is continually updated as the satellite collects more data. To view imagery over time, you can update the layer properties to enable time animation and configure time settings. Currently, this layer is available from present back to the start of the mission (July 3, 2002).

NASA Global Imagery Browse Services (GIBS), NASA Worldview, & NASA LANCE
This visualization is provided through the NASA Global Imagery Browse Services (GIBS), which are a set of standard services to deliver global, full-resolution satellite imagery for hundreds of NASA Earth science datasets and science parameters. Through its services, and the NASA Worldview client, GIBS enables interactive exploration of NASA's Earth imagery for a broad range of users. The data and imagery are generated within 3 hours of acquisition through the NASA LANCE capability.

Esri and NASA Collaborative Services
This visualization is made available through an ArcGIS image service hosted on Esri servers and facilitates access to a NASA GIBS service endpoint. For each image service request, the Esri server issues multiple requests to the GIBS service, processes and assembles the responses, and returns a proper mosaic image to the user. Processing occurs on the fly for each and every request to ensure that any update to the GIBS imagery is immediately available to the user. As such, availability of this visualization is dependent on both the Esri and the NASA GIBS services.
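Beyond the Esri image service, the underlying GIBS visualization can be fetched directly via WMTS. The sketch below uses GIBS's documented REST tile URL pattern; the layer identifier and "500m" tile matrix set shown are assumptions inferred from the band combination above, so verify them against the GIBS GetCapabilities document.

```python
import requests

# Assumed layer identifier for the Aqua bands 1-4-3 surface
# reflectance visualization; confirm via GIBS GetCapabilities.
layer = "MODIS_Aqua_SurfaceReflectance_Bands143"
date = "2024-06-01"          # daily composite date
z, row, col = 2, 1, 1        # tile matrix (zoom), tile row, tile column

url = (
    "https://gibs.earthdata.nasa.gov/wmts/epsg4326/best/"
    f"{layer}/default/{date}/500m/{z}/{row}/{col}.jpg"
)
tile = requests.get(url, timeout=30)
tile.raise_for_status()
with open("tile.jpg", "wb") as f:
    f.write(tile.content)
```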
This web application enables the exploration of Arctic elevation based on the 2 m resolution Arctic Digital Elevation Models (DEM) created by the Polar Geospatial Center. The app displays multiple different renderings as well as profiles of the data. In many areas coverage is available from multiple dates, and the app displays temporal profiles as well as computing the differences. The current datasets, consisting of 2 m DEMs, cover the Arctic from 60°N to the Pole and will be replaced gradually and incrementally with better 2 m versions as they are produced during 2018. The elevations are digital surface models photogrammetrically generated from stereo satellite imagery and have not been edited to create terrain heights. The current datasets are preliminary and are known to contain some errors and artifacts. As more control becomes available, the elevation values will be refined and adjusted. The original PGC datasets have been adjusted according to the PGC proposed correction parameters to give WGS84 ellipsoidal heights, but are also available in this service as orthometric heights computed using the EGM2008 geoid separation. Details on how the DEMs are generated and their use can be found in the ArcticDEM datasets. The DEMs were created from DigitalGlobe, Inc., imagery and funded under National Science Foundation awards 1043681, 1559691, and 1542736.
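The ellipsoidal-to-orthometric conversion mentioned above is the standard H = h - N relation. A trivial sketch follows; obtaining the EGM2008 undulation N for a given latitude/longitude requires a geoid grid lookup, which is not shown.

```python
def orthometric_height(ellipsoidal_h_m: float, geoid_undulation_m: float) -> float:
    """Orthometric height H = ellipsoidal height h - geoid undulation N.

    Here N would come from the EGM2008 model (grid lookup not shown).
    """
    return ellipsoidal_h_m - geoid_undulation_m
```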
The app also provides access to Arctic Landsat imagery, which is updated daily and served through ArcGIS Online.
Quick access is provided to server functions defined for the following elevation derivatives:
The Time tool enables access to a time slider and a temporal profile for a selected point; it is only accessible at larger zoom scales. The Identify tool enables access to elevation, slope, and aspect values for the specified point, as well as information on the source image and links to download the source data. From the app it is also possible to export defined areas of the DEMs; these can be exported in user-defined projections and resolutions. The Bookmark tool links to pre-selected interesting locations.
For more information on the underlying services, see the Arctic DEM layer.
The application is written using Web AppBuilder for ArcGIS, accessing imagery layers through the ArcGIS API for JavaScript.
Attribution 3.0 (CC BY 3.0): https://creativecommons.org/licenses/by/3.0/
License information was derived automatically
This dataset collection contains A0 maps of the Keppel Island region based on satellite imagery and fine-scale habitat mapping of the islands and marine environment. This collection provides the source satellite imagery used to produce these maps and the habitat mapping data.
The imagery used to produce these maps was developed by blending high-resolution imagery (1 m) from ArcGIS Online with a clear-sky composite derived from Sentinel 2 imagery (10 m). The Sentinel 2 imagery was used to achieve full coverage of the entire region, while the high-resolution imagery was used to provide detail around island areas.
The blended imagery is a derivative product of the Sentinel 2 imagery and ArcGIS Online imagery, created by using Photoshop to manually blend the best portions of each image into the final product. The imagery is provided for the sole purpose of reproducing the A0 maps.
Methods:
The high resolution satellite composite was developed by manual masking and blending of a Sentinel 2 composite image and high resolution imagery from ArcGIS Online World Imagery (2019).
The Sentinel 2 composite was produced by statistically combining the clearest 10 images from 2016 - 2019. These images were manually chosen based on their very low cloud cover, lack of sun glint, and clear water conditions. They were then combined to remove clouds and reduce noise in the image.
The processing of the images was performed using a script in Google Earth Engine. The script combines the manually chosen images to estimate the clearest composite. The dates of the images were chosen using the EOBrowser (https://www.sentinel-hub.com/explore/eobrowser) to preview all the Sentinel 2 imagery from 2015-2019. The images that were mostly free of clouds, with little or no sun glint, were recorded. Each of these dates was then viewed in Google Earth Engine with high contrast settings to identify images that had high water surface noise due to algal blooms, waves, or re-suspension; these were excluded from the list. All the remaining images were then combined by applying a histogram analysis of each pixel, with the final image using the 40th percentile of the time series of the brightness of each pixel. This approach helps exclude effects from clouds.
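The original script is not reproduced here, but the per-pixel percentile compositing can be sketched in the Earth Engine Python API as follows; the scene ID is a hypothetical placeholder for the manually selected clear-sky images, and authentication/project setup is not shown.

```python
import ee
ee.Initialize()  # assumes prior ee.Authenticate()/project setup

# Placeholder IDs standing in for the manually chosen clear images.
ids = ["20180602T002711_20180602T002708_T56KKC"]  # hypothetical
collection = ee.ImageCollection("COPERNICUS/S2").filter(
    ee.Filter.inList("system:index", ids)
)

# Per-pixel 40th percentile of brightness over the time series,
# which suppresses clouds (bright outliers) in the composite.
composite = collection.reduce(ee.Reducer.percentile([40]))
```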
The contrast of the image was stretched to highlight the marine features, whilst retaining detail in the land features. This was done by choosing a black point for each channel that would provide a dark setting for deep clear water. Gamma correction was then used to lighten the dark water features, whilst not over-exposing the brighter shallow areas.
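A minimal sketch of that stretch, assuming per-channel arrays and hand-picked parameters (the actual black points and gamma values used for these maps are not recorded here):

```python
import numpy as np

def stretch_channel(channel, black_point, gamma):
    """Black-point subtraction, rescale to [0, 1], then gamma correction.

    gamma > 1 lightens dark water features without over-exposing
    the brighter shallow areas, as described above.
    """
    scaled = np.clip(channel.astype(float) - black_point, 0, None)
    scaled /= scaled.max()  # assumes at least one pixel above black point
    return scaled ** (1.0 / gamma)
```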
The high resolution satellite imagery and the Sentinel 2 imagery were combined at 1 m pixel resolution. The Sentinel 2 tiles were upsampled to match the resolution of the high-resolution imagery. These two sets of imagery were then layered in Photoshop. The brightness of the high-resolution satellite imagery was then adjusted to match the Sentinel 2 imagery. A mask was then used to retain and blend the imagery that showed the best detail of each area. The blended tiles were then merged with the overall area imagery by performing a GDAL merge, resulting in an upscaling of the Sentinel 2 imagery to 1 m resolution.
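A sketch of that final merge step using gdal_merge.py (file names are placeholders; later inputs overwrite earlier ones where they overlap, so the blended tiles are listed after the region-wide backdrop):

```python
import subprocess

# Assumes GDAL is installed and gdal_merge.py is on the PATH.
subprocess.run(
    [
        "gdal_merge.py",
        "-o", "Keppels_blended_1m.tif",
        "-ps", "1", "1",            # output pixel size: 1 m x 1 m
        "sentinel2_region.tif",     # base: upsampled Sentinel 2 backdrop
        "blended_tile_01.tif",      # overlays: high-resolution blends
        "blended_tile_02.tif",
    ],
    check=True,
)
```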
Habitat Mapping:
A 5 m resolution habitat map was developed based on the satellite imagery, available aerial imagery, and monitoring site information. This habitat mapping was developed to help with monitoring site selection and for the mapping workshop with the Woppaburra Traditional Owners (TOs) on North Keppel Island in December 2019.
The habitat maps should be considered draft, as they do not incorporate all available in-water observations. They are primarily based on aerial and satellite images.
The habitat mapping includes: Asphalt, Buildings, Mangrove, Cabbage-tree palm, Sheoak, Other vegetation, Grass, Salt Flat, Rock, Beach Rock, Gravel, Coral, Sparse coral, Unknown not rock (macroalgae on rubble), Marine feature (rock).
An assumed layer ordering (features digitised as stacked layers, with the topmost taking precedence) allowed the digitisation of these features to be sped up. For example, if there was coral growing over a marine feature, then the boundary of the marine feature would need to be digitised, then the coral feature, but not the boundary between the marine feature and the coral. We knew that the coral was going to be cut out from the marine feature because the coral is on top of the marine feature, saving time in digitising this boundary. Digitisation was performed on an iPad using Procreate software and an Apple Pencil to draw the features as layers in a drawing. Due to memory limitations of the iPad, the region was digitised using 6000x6000 pixel tiles. The raster images were converted back to polygons and the tiles merged together.
A Python script was then used to clip the layer sandwich so that there is no overlap between feature types.
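That script is not included here, but the clipping logic can be sketched with GeoPandas: each layer is differenced against everything above it in an assumed top-to-bottom ordering (file names hypothetical).

```python
import geopandas as gpd
import pandas as pd

# Hypothetical top-to-bottom ordering: topmost layer listed first.
ordering = ["coral.shp", "marine_feature.shp", "rock.shp"]
layers = [gpd.read_file(path) for path in ordering]

clipped = [layers[0]]  # the topmost layer is kept as drawn
for i, layer in enumerate(layers[1:], start=1):
    # Cut this layer by every layer above it, so nothing overlaps.
    for upper in layers[:i]:
        layer = gpd.overlay(layer, upper, how="difference")
    clipped.append(layer)

# Recombine into a single non-overlapping habitat layer.
result = gpd.GeoDataFrame(pd.concat(clipped, ignore_index=True))
```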
Habitat Validation:
Only limited validation was performed on the habitat map. To assist in its development, nearly every YouTube video on the Keppel Islands available at the time of development (2019) was reviewed and, where possible, georeferenced to provide a better understanding of the local habitats at the scale of the mapping, prior to the mapping being conducted. Several validation points were observed during the workshop. The map should be considered largely unvalidated.
data/coastline/Keppels_AIMS_Coastline_2017.shp:
The coastline dataset was produced by starting with the Queensland coastline dataset by DNRME (Downloaded from http://qldspatial.information.qld.gov.au/catalogue/custom/detail.page?fid={369DF13C-1BF3-45EA-9B2B-0FA785397B34} on 31 Aug 2019). This was then edited to work at a scale of 1:5000, using the aerial imagery from Queensland Globe as a reference and a high-tide satellite image from 22 Feb 2015 from Google Earth Pro. The perimeter of each island was redrawn. This line feature was then converted to a polygon using the "Lines to Polygon" QGIS tool. The Keppel island features were then saved to a shapefile by exporting with a limited extent.
data/labels/Keppel-Is-Map-Labels.shp:
This contains 70 named places in the Keppel island region. These names were sourced from literature and existing maps. Unfortunately, no provenance of the names was recorded. These names are not official. This includes the following attributes:
- Name: Name of the location. Examples: Bald, Bluff
- NameSuffix: End of the name, which is often a description of the feature type. Examples: Rock, Point
- TradName: Traditional name of the location
- Scale: Map scale where the label should be displayed.
data/lat/Keppel-Is-Sentinel2-2016-19_B4-LAT_Poly3m_V3.shp:
This corresponds to a rough estimate of the LAT (Lowest Astronomical Tide) contours around the Keppel Islands. LAT was estimated from tidal differences in Sentinel-2 imagery and light penetration in the red channel. Note that this estimate is only roughly calibrated and should be used as a guide. Only one rough in-situ validation was performed, at low tide on Ko-no-mie at the edge of the reef near the education centre; this indicated that the LAT estimate was within a depth error range of about ±0.5 m.
data/habitat/Keppels_AIMS_Habitat-mapping_2019.shp:
This shapefile contains the mapped land and marine habitats. The classification type is recorded in the Type attribute.
Format:
GeoTIFF (internal JPEG compression, 538 MB)
PDF (A0 regional maps, ~30 MB each)
Shapefile (Habitat map, Coastline, Labels, LAT estimate)
Data Location:
This dataset is filed in the eAtlas enduring data repository at: data\custodian\2020-2029-AIMS\Keppels_AIMS_Regional-maps
This image service is part of a collection of maps for PGA, PGV, and spectral accelerations at 0.2 s (SA02), 1.0 s (SA10), and 2.0 s (SA20) that illustrate seismic hazards in California. For each ground motion parameter, maps at two different hazard levels are presented: one with a 2% probability of being exceeded in 50 years (equivalent to a 2,475-year recurrence interval) and the other with a 10% probability of being exceeded in 50 years (equivalent to a 475-year recurrence interval). The ArcGIS Online interface allows users to select any two ground motion hazard maps to compare side by side. Ground motion parameters were calculated using the 2023 update of the U.S. Geological Survey National Seismic Hazard Model. See the "Scientific Background" on the MS48 webpage for detailed information.

Due to software limitations, symbology cannot be added to this service. To match the symbology used in the MS48 Ground Motion application, use the following configuration:
Esri Color Ramp: Magma
Minimum: 0.032 g
Maximum: 4.41 g
Gamma: 1