This series of products from MODIS represents the only daily global composites available and is suitable for use at global and regional levels. This True Color band composition (Bands 1, 4, 3 | Red, Green, Blue) most accurately shows how we see the Earth's surface with our own eyes. It is a natural-looking image that is useful for land surface, oceanic, and atmospheric analysis. There are four True Color products in total: for each satellite (Aqua and Terra) there is a 250-meter corrected reflectance product and a 500-meter surface reflectance product. Although the resolution is coarser than that of other satellites, it allows global imagery to be collected daily and made available in near real time; by contrast, Landsat needs 16 days to collect a global composite. Besides the difference in maximum resolution, the surface and corrected reflectance products also differ in the algorithm used for atmospheric correction.

NASA Global Imagery Browse Services (GIBS)
This image layer provides access to a subset of the NASA Global Imagery Browse Services (GIBS), a set of standard services that deliver global, full-resolution satellite imagery. The GIBS goal is to enable interactive exploration of NASA's Earth imagery for a broad range of users. The purpose of this image layer, and of the other GIBS image services hosted by Esri, is to give ArcGIS users convenient access to this beautiful and useful satellite imagery. The source data used by this image layer is a finished image; it is not recommended for quantitative analysis.

Several full-resolution, global imagery products are built and served by GIBS in near real time (usually within 3.5 hours of observation). These products are built from NASA Earth Observing System satellite data courtesy of LANCE data providers and other sources. The MODIS instrument aboard the Terra and Aqua satellites, the AIRS instrument aboard Aqua, and the OMI instrument aboard Aura are used as sources. Several of the MODIS global products are made available on this Esri-hosted service.

This image layer hosted by Esri provides direct access to one of the GIBS image products. The Esri servers do not store any of this data themselves. Instead, for each received data request, multiple image tiles are retrieved from GIBS, then processed and assembled into the proper image for the response. This processing takes place on the fly, for each and every request, which ensures that any update to the GIBS data is immediately available in the Esri mosaic service.

Note on Time: The image service supporting this map is time enabled, but time has been disabled on this image layer so that the most recent imagery displays by default. If you would like to view imagery over time, you can update the layer properties to enable time animation and configure time settings. The results can be saved in a web map to use later or share with others.
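The on-the-fly GIBS access pattern described above can also be exercised directly. The sketch below requests a single true-color tile for a chosen date from the GIBS WMTS REST interface; the endpoint, layer identifier, and tile matrix set follow GIBS's published conventions but are assumptions here and should be checked against current GIBS documentation.

import requests

# Minimal sketch: fetch one MODIS Terra true-color tile from NASA GIBS for a
# given observation date. Endpoint, layer name, and tile matrix set are taken
# from GIBS's documented WMTS REST pattern; verify before relying on them.
GIBS_WMTS = ("https://gibs.earthdata.nasa.gov/wmts/epsg4326/best/"
             "{layer}/default/{date}/{tms}/{z}/{row}/{col}.jpg")

def fetch_gibs_tile(date, z=0, row=0, col=0,
                    layer="MODIS_Terra_CorrectedReflectance_TrueColor",
                    tms="250m"):
    """Return the JPEG bytes of a single WMTS tile for the requested date."""
    url = GIBS_WMTS.format(layer=layer, date=date, tms=tms, z=z, row=row, col=col)
    resp = requests.get(url, timeout=30)
    resp.raise_for_status()
    return resp.content

if __name__ == "__main__":
    with open("gibs_tile.jpg", "wb") as f:
        f.write(fetch_gibs_tile("2024-01-10"))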
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
GIS_data_and_jupyter_python_notebook.zip: Data for Modeling SDS via Random Forest Models. Contains an ArcGIS Pro project with example data collected at Marston Farm (Boone, IA) and cropped PlanetScope 4-band imagery of the area for 2016, 2017, and 2018.
Preview for Jupyter notebook: a preview of a Jupyter (Python 3) notebook that demonstrates the use of a random forest classifier with the GIS data.
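As a rough illustration of the modeling approach described above (not the notebook's exact workflow), the sketch below trains a scikit-learn random forest on pixel samples drawn from a 4-band PlanetScope raster; the file name, sample locations, and labels are hypothetical placeholders.

import numpy as np
import rasterio
from sklearn.ensemble import RandomForestClassifier

# Read a 4-band PlanetScope clip (blue, green, red, NIR) as (bands, rows, cols).
# The path is a hypothetical placeholder.
with rasterio.open("planetscope_2018_clip.tif") as src:
    img = src.read().astype("float32")

# Hypothetical training samples: pixel locations with 0 = healthy, 1 = SDS symptoms.
rows = np.array([120, 340, 560, 610, 75, 402])
cols = np.array([200, 410, 630, 88, 512, 330])
labels = np.array([0, 1, 1, 0, 0, 1])

X = img[:, rows, cols].T                     # one 4-band feature vector per sample
rf = RandomForestClassifier(n_estimators=200, random_state=42)
rf.fit(X, labels)

# Classify every pixel and reshape back into a prediction map.
flat = img.reshape(img.shape[0], -1).T
prediction_map = rf.predict(flat).reshape(img.shape[1], img.shape[2])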
https://www.marketreportanalytics.com/privacy-policy
The global satellite-based Earth observation market is experiencing robust growth, projected to reach $4.04 billion in 2025 and exhibiting a compound annual growth rate (CAGR) of 6.52% from 2025 to 2033. This expansion is driven by increasing demand across diverse sectors. Government initiatives focused on infrastructure development, disaster management, and climate change mitigation are significant contributors. The rising adoption of advanced technologies like Synthetic Aperture Radar (SAR) and optical imagery, offering high-resolution data for precise analysis, fuels market growth. Furthermore, the burgeoning need for efficient agriculture practices, urban planning, and environmental monitoring is driving the demand for timely and accurate Earth observation data. Value-added services, providing processed and analyzed information derived from raw satellite data, are also gaining traction, enhancing the market's value proposition. Key market segments include data services, value-added services, and various application areas like urban development (including public safety), agriculture, climate and environmental services, energy, infrastructure monitoring, and disaster and emergency management. Technological advancements, including miniaturization and cost reduction in satellite technology, are lowering the barrier to entry for new players, fostering innovation and increasing the availability of data. While data security and privacy concerns present potential restraints, the overall market outlook remains highly positive, with significant opportunities for growth in emerging economies and expanding application areas. The continuous development of advanced analytics and artificial intelligence capabilities further enhances the potential of satellite data analysis, contributing to the market's sustained expansion.

Recent developments include:
March 2024 - The US Navy awarded Planet Labs a contract for maritime surveillance in the Pacific. The company's satellite imagery will be used for vessel detection and monitoring by the Naval Information Warfare Center Pacific. For this initiative, Planet will collaborate with SynMax, a data analytics organization specializing in artificial intelligence (AI) and satellite imagery, to deliver actionable intelligence to its customers.
March 2024 - ICEYE launched ICEYE Ocean Vision, a synthetic aperture radar (SAR) product family, to deliver actionable intelligence, especially for maritime domain awareness. ICEYE Ocean Vision Detect mainly provides insights into the presence, location, and size of vessels at sea, enabling authorities to take decisive action in mitigating threats.

Key drivers for this market are: Increasing Requirement for Efficient Monitoring of Vast Land Areas, Rising Smart City Initiatives; Big Data and Imagery Analytics. Potential restraints include: Increasing Requirement for Efficient Monitoring of Vast Land Areas, Rising Smart City Initiatives; Big Data and Imagery Analytics. Notable trends are: Urban Development to be the Fastest Growing Application.
Date of Images: 12/19/2023
Date of Next Image: Unknown
Summary: This PlanetScope imagery captured by Planet Labs Inc. on December 19, 2023 shows the post-event conditions after flooding in New England. The True Color RGB provides a product of how the surface would look to the naked eye from space. The True Color RGB is produced using the 3 visible wavelength bands (red, green, and blue) from the respective sensor. Some minor atmospheric corrections have occurred.
Suggested Use: True Color RGB provides a product of how the surface would look to the naked eye from space. The True Color RGB is produced using the 3 visible wavelength bands (red, green, and blue) from the respective sensor. Some minor atmospheric corrections have occurred.
Satellite/Sensor: PlanetScope
Resolution: 3 meters
Credits: NASA Disasters Program; includes copyrighted material of Planet Labs PBC. All rights reserved.
Esri REST Endpoint: See URL section on right side of page
WMS Endpoint: https://maps.disasters.nasa.gov/ags03/services/newengland_flooding_202312/planet_tc_20231219/ImageServer/WMSServer
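For readers reproducing a comparable true-color view from the underlying bands, the sketch below stacks the red, green, and blue channels of a 4-band scene with a simple percentile stretch; the file name and band order are assumptions for illustration, not details of this product.

import numpy as np
import rasterio

def stretch(band, lo=2, hi=98):
    # Linear percentile stretch to the 0-1 display range.
    p_lo, p_hi = np.percentile(band, (lo, hi))
    return np.clip((band - p_lo) / (p_hi - p_lo + 1e-9), 0.0, 1.0)

# Assumed band order blue, green, red, NIR; the path is a placeholder.
with rasterio.open("scene_analytic_4band.tif") as src:
    blue, green, red = (src.read(i).astype("float32") for i in (1, 2, 3))

rgb = np.dstack([stretch(red), stretch(green), stretch(blue)])  # rows x cols x 3
# rgb can now be displayed, e.g. with matplotlib's plt.imshow(rgb).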
This app is part of Indicators of the Planet. Please see https://livingatlas.arcgis.com/indicators

This layer presents detectable thermal activity from VIIRS satellites for the last 7 days. VIIRS Thermal Hotspots and Fire Activity is a product of NASA's Land, Atmosphere Near real-time Capability for EOS (LANCE) Earth Observation Data, part of NASA's Earth Science Data.

Source: NASA LANCE - VNP14IMG_NRT active fire detection - World
Scale/Resolution: 375 meters
Update Frequency: Hourly, using the aggregated live feed methodology
Area Covered: World

What can I do with this layer?
This layer represents the most frequently updated and most detailed global remotely sensed wildfire information. Detection attributes include time, location, and intensity. It can be used to track the location of fires from the recent past, from a few hours up to seven days behind real time. This layer also shows the location of wildfires over the past 7 days as a time-enabled service so that the progress of fires over that timeframe can be reproduced as an animation.

The VIIRS thermal activity layer can be used to visualize and assess wildfires worldwide. However, it should be noted that this dataset contains many "false positives" (e.g., oil/natural gas wells or volcanoes), since the satellite will detect any large thermal signal.

Fire points in this service are generally available within 3 1/4 hours after detection by a VIIRS device. LANCE estimates availability at around 3 hours after detection, and Esri live feeds update this feature layer every 15 minutes from LANCE.

Even though these data display as point features, each point in fact represents a pixel that is >= 375 m high and wide. A point feature means that somewhere in this pixel at least one "hot" spot was detected, which may be a fire.

VIIRS is a scanning radiometer aboard the Suomi NPP and NOAA-20 satellites that collects imagery and radiometric measurements of the land, atmosphere, cryosphere, and oceans in several visible and infrared bands. The VIIRS Thermal Hotspots and Fire Activity layer is a live feed from a subset of the overall VIIRS imagery, in particular from NASA's VNP14IMG_NRT active fire detection product. The data are automatically downloaded from LANCE, NASA's near real-time data and imagery site, every 15 minutes.

The 375 m data complement the 1 km Moderate Resolution Imaging Spectroradiometer (MODIS) Thermal Hotspots and Fire Activity layer; the two show good agreement in hotspot detection, but the improved spatial resolution of the 375 m data provides a greater response over fires of relatively small areas and improved mapping of large fire perimeters.

Attribute information
Latitude and Longitude: The center point location of the (approximately) 375 m pixel flagged as containing one or more fires/hotspots.
Satellite: Whether the detection was picked up by the Suomi NPP satellite (N) or the NOAA-20 satellite (1). For best results, use the virtual field WhichSatellite, defined by an Arcade expression, which gives the complete satellite name.
Confidence: The detection confidence is a quality flag of the individual hotspot/active fire pixel, based on a collection of intermediate algorithm quantities used in the detection process. It is intended to help users gauge the quality of individual hotspot/fire pixels. Confidence values are set to low, nominal, and high. Low confidence daytime fire pixels are typically associated with areas of sun glint and lower relative temperature anomaly (<15 K) in the mid-infrared channel I4. Nominal confidence pixels are those free of potential sun glint contamination during the day and marked by strong (>15 K) temperature anomaly in either day or nighttime data. High confidence fire pixels are associated with day or nighttime saturated pixels.
Please note: Low confidence nighttime pixels occur only over the geographic area extending from 11 deg E to 110 deg W and 7 deg N to 55 deg S. This area describes the region of influence of the South Atlantic Magnetic Anomaly, which can cause spurious brightness temperatures in the mid-infrared channel I4 leading to potential false alarms. These have been removed from the NRT data distributed by FIRMS.
FRP: Fire Radiative Power. Depicts the pixel-integrated fire radiative power in MW (megawatts). FRP provides information on the measured radiant heat output of detected fires. The amount of radiant heat energy liberated per unit time (the fire radiative power) is thought to be related to the rate at which fuel is being consumed (Wooster et al., 2005).
DayNight: D = daytime fire, N = nighttime fire

Note about near real-time data: Near real-time data is not checked thoroughly before it is posted on LANCE or downloaded and posted to the Living Atlas. NASA's goal is to get vital fire information to its customers within three hours of observation time. However, the data is screened by a confidence algorithm that helps users gauge the quality of individual hotspot/fire points. Low confidence daytime fire pixels are typically associated with areas of sun glint and lower relative temperature anomaly (<15 K) in the mid-infrared channel I4. Nominal confidence pixels are those free of potential sun glint contamination during the day and marked by strong (>15 K) temperature anomaly in either day or nighttime data. High confidence fire pixels are associated with day or nighttime saturated pixels.

This layer is provided for informational purposes and is not monitored 24/7 for accuracy and currency.
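The attribute fields described above can be used to screen detections programmatically. The pandas sketch below assumes a hypothetical CSV export of the layer whose columns mirror the attribute names; the FRP threshold is illustrative only.

import pandas as pd

# Hypothetical export of the hotspot layer with the attribute columns above.
hotspots = pd.read_csv("viirs_hotspots_last7days.csv")

# Keep nominal- and high-confidence detections, dropping likely false positives.
reliable = hotspots[hotspots["Confidence"].isin(["nominal", "high"])]

# Strong daytime fires: illustrative 50 MW FRP threshold.
strong_day = reliable[(reliable["DayNight"] == "D") & (reliable["FRP"] > 50)]

print(f"{len(strong_day)} strong daytime detections, "
      f"total FRP {strong_day['FRP'].sum():.0f} MW")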
Date of Images: 2/2/2023, 2/10/2023, 2/17/2023
Summary: The NASA GSFC landslides team manually mapped landslide initiation points after the February 6, 2023 earthquakes in Türkiye. The landslide initiation points were derived from PlanetScope imagery. This is only a portion of the region where landslides occurred, and areas covered with clouds or snow were not included.
NOTE: This is a rapid response product. We have not done any form of manual corrections. These landslides were manually mapped using different dates of PlanetScope imagery, but all are presumed to have been triggered by the 2/6/2023 earthquakes. As clouds cleared or snow melted, more landslides were identified.
This PlanetScope imagery captured by Planet Labs Inc. in February 2023 shows a few landslides following the earthquakes in Türkiye. True Color RGB provides a product of how the surface would look to the naked eye from space. The True Color RGB is produced using the 3 visible wavelength bands (red, green, and blue) from the respective sensor. Some minor atmospheric corrections have occurred.
Suggested Use: True Color RGB provides a product of how the surface would look to the naked eye from space. The True Color RGB is produced using the 3 visible wavelength bands (red, green, and blue) from the respective sensor. Some minor atmospheric corrections have occurred.
Satellite/Sensor: PlanetScope
Resolution: 3 meters
Credits: NASA Disasters Program, NASA GSFC; includes copyrighted material of Planet Labs PBC. All rights reserved.
Esri REST Endpoint: See URL section on right side of page
WMS Endpoint: https://maps.disasters.nasa.gov/ags04/services/turkey_earthquake_2023/Landslides_after_Earthquakes_in_Turkiye/MapServer/WMSServer
Data Download: landslide points: https://maps.disasters.nasa.gov/download/gis_products/event_specific/2023/turkiye_earthquakes_202302/landslides/
Mature Support Notice: This item is in mature support as of February 2024. A new version of this item is available for your use.

This web application highlights some of the capabilities for accessing Landsat imagery layers, powered by ArcGIS for Server and accessing Landsat Public Datasets running on the Amazon Web Services Cloud. The layers are updated with new Landsat images on a daily basis. Created for you to visualize our planet and understand how the Earth has changed over time, the Esri Landsat Explorer app provides the power of Landsat satellites, which gather data beyond what the eye can see. Use this app to draw on Landsat's different bands to better explore the planet's geology, vegetation, agriculture, and cities. Additionally, access the entire Landsat archive to visualize how the Earth's surface has changed over the last forty years.

Quick access to the following band combinations and indices is provided (a short computation sketch follows this entry):
Agriculture: Highlights agriculture in bright green; Bands 6, 5, 2
Natural Color: Sharpened with 15m panchromatic band; Bands 4, 3, 2 + 8
Color Infrared: Healthy vegetation is bright red; Bands 5, 4, 3
SWIR (Short Wave Infrared): Highlights rock formations; Bands 7, 6, 4
Geology: Highlights geologic features; Bands 7, 6, 2
Bathymetric: Highlights underwater features; Bands 4, 3, 1
Panchromatic: Panchromatic images at 15m; Band 8
Vegetation Index: Normalized Difference Vegetation Index (NDVI); (Band 5 - Band 4)/(Band 5 + Band 4)
Moisture Index: Normalized Difference Moisture Index (NDMI); (Band 5 - Band 6)/(Band 5 + Band 6)
SAVI: Soil Adjusted Vegetation Index; Offset + Scale*(1.5*(Band 5 - Band 4)/(Band 5 + Band 4 + 0.5))
Water Index: Offset + Scale*(Band 3 - Band 6)/(Band 3 + Band 6)
Burn Index: Offset + Scale*(Band 5 - Band 7)/(Band 5 + Band 7)
Urban Index: Offset + Scale*(Band 5 - Band 6)/(Band 5 + Band 6)

Optionally, you can also choose the "Custom Bands" or "Custom Index" option to create your own band combinations.

The Time tool enables access to a temporal time slider and a temporal profile of different indices for a selected point. The Time tool is only accessible at larger zoom scales. It provides temporal profiles for NDVI (Normalized Difference Vegetation Index), NDMI (Normalized Difference Moisture Index), and the Urban Index. The Identify tool enables access to information on the images and can also provide a spectral profile for a selected point. The Stories tool will direct you to pre-selected interesting locations.

The application is written using Web AppBuilder for ArcGIS, accessing imagery layers using the ArcGIS API for JavaScript. The following imagery layers are being accessed:
Multispectral Landsat - Provides access to 30m 8-band multispectral imagery and a range of functions that provide different band combinations and indices.
Pansharpened Landsat - Provides access to 15m 4-band (Red, Green, Blue, and NIR) panchromatic-sharpened imagery.
Panchromatic Landsat - Provides access to 15m panchromatic imagery.
These imagery layers can be accessed through the public group Landsat Community on ArcGIS Online.
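The indices listed above reduce to simple band arithmetic. The sketch below computes several of them from Landsat band arrays, using the band numbers given in the list; the app's display Offset and Scale terms are taken as 0 and 1 here for simplicity.

import numpy as np

def normalized_difference(a, b):
    # (a - b) / (a + b), guarding against division by zero.
    denom = a + b
    return (a - b) / np.where(denom == 0, np.nan, denom)

def landsat_indices(b3, b4, b5, b6, b7):
    """Indices from the list above: B3 green, B4 red, B5 NIR, B6 SWIR1, B7 SWIR2."""
    return {
        "NDVI": normalized_difference(b5, b4),
        "NDMI": normalized_difference(b5, b6),
        "SAVI": 1.5 * (b5 - b4) / (b5 + b4 + 0.5),
        "Water": normalized_difference(b3, b6),
        "Burn": normalized_difference(b5, b7),
        "Urban": normalized_difference(b5, b6),   # same formula as NDMI in the app's list
    }

# Toy reflectance arrays standing in for real Landsat bands.
b3, b4, b5, b6, b7 = (np.full((2, 2), v) for v in (0.08, 0.06, 0.35, 0.20, 0.12))
print(landsat_indices(b3, b4, b5, b6, b7)["NDVI"])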
https://www.gnu.org/licenses/gpl-3.0.htmlhttps://www.gnu.org/licenses/gpl-3.0.html
Observations and Mapping of Sedimentary Features near Select Tesserae on Venus
Petersen, S. K.¹ and Carter, L. M.² (¹Department of Planetary Sciences, ²Lunar and Planetary Laboratory)

Abstract
From spring 2024 to spring 2025, I worked with Professor Lynn Carter at the Lunar and Planetary Laboratory on a project to determine how tesserae on Venus interact with the planet's sedimentary cycle and associated processes. We produced a map of sedimentary features that is on this ReDATA repository and associated with an abstract and poster presentation for the 56th Lunar and Planetary Science Conference (LPSC). We also adapted a dune crestline mapping algorithm from Telfer et al. (2015) for dune fields on Earth, attempting to use those dune fields as analogs for microdunes on Venus. We found that sedimentary features near tesserae were somewhat rare. Nonetheless, we mapped the location of wind streaks, possible mass wasting, low emissivity terrain, and newly classified "outlines" in the immediate vicinity of several tesserae. We also discovered two microdune fields in Tellus tessera and Husbishag tessera. While we ran out of time to use our crestline mapping algorithm to study analogs of Venus microdunes, we were able to qualitatively determine that dune morphology does affect their appearance in SAR imagery depending on the direction they are imaged from.

Note: The following is an abridged version of reports/ScottPetersen_SeniorThesis.pdf. Please see that document for important details.

1. Introduction
Most of the data in this project came from NASA's Magellan mission's SAR imagery, though we also used the mission's radiometry data. Magellan acquired S-band HH SAR images that were either westward looking or eastward looking (Ford et al., 1993). These image sets are called "left look" and "right look" images, respectively. We also used Sentinel-1 SAR imagery for the Earth analog sites. Sentinel-1 uses a C-band radar imager (European Space Agency, 2025a).

This work was done to increase understanding of tesserae by providing a map of sedimentary features in the vicinity of several tesserae. The map is intended primarily to aid in targeting the VenSAR instrument aboard the EnVision spacecraft. VenSAR is expected to have a maximum resolution an order of magnitude greater than Magellan's maximum resolution (European Space Agency, 2025b). As VenSAR will be mapping pre-selected targets on Venus's surface, helping to determine the most useful sediment/tessera-related sites will aid in maximizing the gain in understanding of tesserae's relationship to sediments.

2. Methods
2.1 Mapping
Mapping of Venus was done in ArcGIS Pro (Esri, 2025). Magellan imagery was acquired from the online USGS Astropedia tool (USGS…, n.d.). Section bounds were determined by using Ivanov and Head (2011)'s geologic map of Venus, which allowed us to acquire imagery of our target tesserae.

Mapping was conducted by manually scanning across an image section at 1:400,000 scale while adding encountered sedimentary features to corresponding ArcGIS feature classes. Features were identified manually based on comparison to past works, especially Greeley et al. (1995) and Malin (1992). As mapping progressed, a greater diversity of features was sampled and classified. As a result, not all mapped tesserae contain each class of feature; however, this does not mean that certain features are absent near specific tesserae.

2.2 Microdune Analysis
The morphology of microdunes is impossible to determine without higher resolution imagery.
We hoped to indirectly infer some of their properties by determining how the morphology of dune fields on Earth affects their appearance in SAR images, as Blom and Elachi (1981) have previously found SAR images of dune fields to be sensitive to look direction and incidence angle. We used Sentinel-1 SAR images acquired from ESA's online Copernicus system (European Space Agency, 2025c). Due to time constraints, we only investigated the dune field at White Sands National Park. We then adapted an automatic dune crestline mapping algorithm from Telfer et al. (2015) to attempt to quantitatively characterize the dunes using the USGS's DEMs.

3. Results
3.1 Description of Mapping Data
Our mapping campaign covered approximately 7% of Venus's surface. The following sections describe each class in detail, each named after the corresponding .shp file.

3.1.1 MassWasting_LowConfidence
Mass Wasting, Low Confidence features are suspected instances of mass wasting but are too small to be confidently interpreted. They were identified by the presence of lobe shapes, a pattern of dark-bright-dark on slopes facing away from the radar beam, or visual similarity to talus slopes as shown in Malin (1992) and Carter (2023). These features occur frequently in graben near tesserae but are rare within or on the margins of tesserae. We believe the dark-bright-dark pattern is the result of roughening on the scale of Magellan's radar wavelength occurring at the base of a slope due to the presence of talus deposits.

3.1.2 MassWasting_HighConfidence
These features are also suspected instances of mass wasting but are large enough to allow a highly confident interpretation as mass wasting. Most instances of this class of feature are more related to coronae or chasmata than tesserae, which is supported by Jesina et al. (2025)'s discussion of mass wasting as a tracer for seismic activity.

3.1.3 Pits
This class of feature refers to any chain of pits observed. There were often instances of Mass Wasting, Low Confidence on large pits' walls. We interpreted these mostly as volcanic dikes breaching the surface or as incipient graben. They occurred frequently within and outside of tesserae.

3.1.4 Dark_LowEpsilon
These features were characterized by the presence of radar-dark terrain and a low dielectric constant. We used Magellan's global emissivity map (Magellan Team, 1993) to derive a global map of the surface dielectric constant using the methodology in Ford et al. (1993) (a simplified inversion is sketched after this entry). The globally averaged surface dielectric constant of Venus is approximately 4 (Pettengill et al., 1992), while the emissivity-derived dielectric constant in Dark Low Epsilon features was generally between 2 and 2.5.

3.1.5 Wind Streaks
Wind streaks were abundant, though it was very rare for any to be emanating from a tessera. Wind streaks were subdivided into five classes based on Greeley et al. (1995)'s classification scheme for wind streaks. Several streaks had also been previously noted by Greeley et al. (1995), but we were unable to verify exactly which ones due to difficulty precisely georeferencing their maps.

Streaks_Linear_Sparse
Greeley et al. (1995) had a single category for "linear streaks"; however, we decided to split this category into "sparse" and "dense" subclasses.
The sparse linear streaks were characterized by having a length-to-width ratio significantly greater than that of the topographic feature they were attached to and by not being directly attached to other wind streaks.

Streaks_Linear_Dense
Dense linear streaks are streaks that occur in a very closely packed group but are still resolvable as individual streaks with high length-to-width ratios. As noted in Greeley et al. (1995), these streaks can be so densely packed that it becomes difficult to determine whether the streaks are radar bright and the underlying terrain is radar dark, or vice versa.

Streaks_Transverse
Streaks of this class were identified based on criteria identical to the transverse streaks in Greeley et al. (1995). They always occurred immediately adjacent to a linear topographic feature. While they could cover the same area as a dense linear streak, they were not composed of many individual streaks and instead were a continuous covering of radar-bright or radar-dark material.

Streaks_Wispy
Wispy streaks were also classified the same way as in Greeley et al. (1995). These streaks had meandering, linear forms and could approach lengths of 100 km. They often seemed to be approximately perpendicular to nearby streaks, though it seems their directionality was mainly controlled by the presence of topographic features, as was also the case in Greeley et al. (1995).

Streaks_Fan
Fan streaks were any streaks that occurred adjacent to shield volcanoes. While Greeley et al. (1995) had a broader characterization of fan streaks as any streak that occurred in isolation with a length-to-width ratio smaller than 20:1, we only qualitatively noted the length-to-width ratio of all streaks. The streaks we classified as fan streaks had distinctively small length-to-width ratios.

3.1.6 Outlines
Outlines are features we define as a diffuse radar-bright or radar-dark patch of terrain on the margins of a topographic feature. While they share some similarities with transverse streaks, such as their diffuse form and appearance next to topography, they do not seem to have any directionality. They could be intermittent or continuous. We divided them into two subclasses based on their radar brightness relative to the surrounding terrain.

Outlines_Bright
Bright outlines were often near instances of Mass Wasting, Low Confidence. They occasionally had nearly identical characteristics to Mass Wasting, Low Confidence, but were still classified as outlines due to the qualitatively larger distance the radar-bright region extended. There were also several instances where the outlines only featured the diffuse brightness around a topographic margin.

Outlines_Dark
Dark outlines sometimes occurred near wind streaks, so such instances of dark outlines may just be an unusual expression of streaks. Dark outlines were mainly found on tessera margins. Dark outlines also sometimes share Dark Low Epsilon features' low dielectric constant but were not classified as such because they were not as dark.

3.2 Microdunes
3.2.1 Two New Microdune Fields
Two new microdune fields were discovered during our mapping campaign. The first was in Tellus tessera. The difference of
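For section 3.1.4, the emissivity-to-dielectric-constant conversion can be approximated with a normal-incidence Fresnel relation. The sketch below is that simplified inversion, offered for illustration only; it is not necessarily the full Ford et al. (1993) methodology used in the mapping.

import numpy as np

def dielectric_from_emissivity(e):
    """Invert normal-incidence emissivity e = 1 - ((sqrt(eps)-1)/(sqrt(eps)+1))**2."""
    r = np.sqrt(1.0 - e)                  # amplitude reflection coefficient
    return ((1.0 + r) / (1.0 - r)) ** 2   # relative permittivity (epsilon)

# Emissivity near 0.88 gives epsilon near 4 (the Venus global average cited above);
# higher emissivities give the lower epsilon values seen in Dark Low Epsilon features.
for e in (0.88, 0.93, 0.96):
    print(f"emissivity {e:.2f} -> epsilon {dielectric_from_emissivity(e):.2f}")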
NOTE: This service contains a large amount of data and may be slow to load. For best results and faster loading, zoom into an area of interest.
Date of Images: 10/6/2022
Date of Next Image: None Expected
Summary: This PlanetScope imagery captured by Planet Labs Inc. on October 6, 2022 shows the impacts from Hurricane Ian across Florida. The True Color RGB provides a product of how the surface would look to the naked eye from space. The True Color RGB is produced using the 3 visible wavelength bands (red, green, and blue) from the respective sensor. Some minor atmospheric corrections have occurred. The color infrared image is created using the near-infrared, red, and green channels from the Planet instrument, allowing areas impacted by the hurricane to be seen. The near-infrared gives the ability to see through thin clouds. Healthy vegetation is shown as red, water in blue.
Suggested Use:
True Color: True Color RGB provides a product of how the surface would look to the naked eye from space. The True Color RGB is produced using the 3 visible wavelength bands (red, green, and blue) from the respective sensor. Some minor atmospheric corrections have occurred.
Color Infrared: A false color composite depicts healthy vegetation as red, water as blue. Some minor atmospheric corrections have occurred.
Satellite/Sensor: PlanetScope
Resolution: 3 meters
Credits: NASA Disasters Program; includes copyrighted material of Planet Labs PBC. All rights reserved.
Esri REST Endpoint: See URL section on right side of page
WMS Endpoint: https://maps.disasters.nasa.gov/ags04/services/hurricane_ian_2022/planet_20221006/MapServer/WMSServer
The 2021 Chesapeake Bay SAV Coverage was mapped from digital multispectral imagery with a 25 cm GSD to assess water quality in the Bay. WorldView-2 imagery from Maxar and PlanetScope satellite imagery from Planet Labs were used to augment the aerial imagery for the Anacostia River, the upper Potomac River, the middle Rappahannock River (including Menokin Bay and Cat Point Creek), the Mattaponi and Pamunkey rivers, the majority of the Chickahominy and James rivers, and Mobjack Bay. NAIP aerial imagery was used to augment the aerial imagery for the Chester, Choptank, Little Choptank, Magothy, and Miles rivers, Harris Creek, Eastern Bay, the western shore of Chesapeake Bay along the Calvert Cliffs, the Great Wicomico River, and the western shore of Chesapeake Bay from the mouth of the Rappahannock River up to Ingram Bay. Each area of SAV was interpreted from the rectified imagery and classified into one of four density classes by the percentage of cover. The SAV beds were entered into an SDE GIS feature class using the quality control procedures documented below. The dataset contains all SAV areas that were identified from the areas flown. Some areas that are presumed to contain no SAV were not flown. Some small beds, particularly along narrow tributaries, may not have been distinguishable on the aerial photography.
This is an MD iMAP hosted service. Find more information at https://imap.maryland.gov/.
Feature Service Link: https://mdgeodata.md.gov/imap/rest/services/Biota/MD_SubmergedAquaticVegetation/FeatureServer/0
Created for you to visualize our planet and understand how the Earth has changed over time, the Esri Landsat Explorer app provides the power of Landsat satellites, which gather data beyond what the eye can see. Use this app to draw on Landsat's different bands to better explore the planet's geology, vegetation, agriculture, and cities. Additionally, access the entire Landsat archive to visualize how the Earth's surface has changed over the last forty years.To get a guided look at the Landsat Explorer app tools, click for a tutorial.For more information about the Landsat missions, visit https://landsat.usgs.gov.For information on Esri's full imagery capabilities, visit http://www.esri.com/imagery.To build your own web app like Landsat Explorer, visit http://developers.arcgis.com.
NOTE: Due to the higher resolution of this data, it may be slow to load or require the user to zoom to a smaller area of interest.
Date of Images:
Pre-Event: 7/30/2024, 8/1/2024
Post-Event: 8/10/2024
Date of Next Image: Unknown
Summary: The True Color RGB composite provides a product of how the surface would look to the naked eye from space. The RGB is created using the red, green, and blue channels of the respective instrument. The dynamically generated Normalized Difference Vegetation Index (NDVI) layer is an index for quantifying green vegetation. It reflects the state of vegetation health based on how vegetation reflects light at certain wavelengths.
Suggested Use: The True Color RGB provides a product of how the surface would look to the naked eye from space. The True Color RGB is produced using the 3 visible wavelength bands (red, green, and blue) from the respective sensor. Some minor atmospheric corrections have occurred. For NDVI, dark green colors are areas with a lot of green leaf growth, which indicates the presence of chlorophyll. Chlorophyll reflects more infrared light and less visible light. Areas with some green leaf growth are shown in light greens, and areas with little to no vegetation growth are even lighter greens.
Satellite/Sensor: PlanetScope
Resolution: 3 meters
Esri REST Endpoint: See URL section on right side of page
WMS Endpoint: https://maps.disasters.nasa.gov/ags04/services/brasil_flood_2024/planet_true/MapServer/WMSServer?request=GetCapabilities&service=WMS
Date of Images: 1/10/2024
Date of Next Image: Unknown
Summary: This PlanetScope imagery captured by Planet Labs Inc. on January 10, 2024 shows the post-event conditions after the Southeast United States severe storms. The True Color RGB provides a product of how the surface would look to the naked eye from space. The True Color RGB is produced using the 3 visible wavelength bands (red, green, and blue) from the respective sensor. Some minor atmospheric corrections have occurred.
Suggested Use: True Color RGB provides a product of how the surface would look to the naked eye from space. The True Color RGB is produced using the 3 visible wavelength bands (red, green, and blue) from the respective sensor. Some minor atmospheric corrections have occurred.
Satellite/Sensor: PlanetScope
Resolution: 3 meters
Credits: NASA Disasters Program; includes copyrighted material of Planet Labs PBC. All rights reserved.
Esri REST Endpoint: See URL section on right side of page
WMS Endpoint: https://maps.disasters.nasa.gov/ags03/services/se_us_severestorms_202401/planet_truecolor_20240110/ImageServer/WMSServer
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
1) 'DBSM_Data_RioYeso' = Automatic weather station (AWS) data from the Yeso Embalse and Termas del Plomo meteorological stations (available from the Chilean Water Directorate, 'Dirección General de Aguas' or 'DGA', http://www.arcgis.com/apps/OnePane/basicviewer/index.html?appid=d508beb3a88f43d28c17a8ec9fac5ef0), used to force the distributed blowing snow model of Essery et al. (1999) to derive spatial snow depth of the Rio del Yeso catchment, Chile. The format is as follows:
{'Year','Month','Day','Hour','Incoming shortwave radiation (Wm2)','Incoming longwave radiation (Wm2)','SnowfallRate(mm/hr)','RainfallRate(mm/hr)','Air temperature (celsius)','Relative humidity (%)','Wind speed (m s-1)','Compass wind direction','Air pressure (hPa)'};
2) 'snowHeightPleiadesREG' = A snow depth map (horizontal resolution 4 m) derived from triplets of high-resolution stereo optical satellite images (Pléiades) following the methodology of Marti et al. (2016). The snow depth map is derived for a high mountain catchment (Rio del Yeso) of the central Chilean Andes (see Burger et al., 2018).
3) 'L2_LiDAR_4m' = A LiDAR (Light Detection and Ranging) spatial snow depth map at a horizontal resolution of 4 m. The data were captured by a Riegl VZ-6000 LiDAR scanner and generated from the difference of two constructed digital elevation models (DEMs) between the dates 13th September, 2017 (with snow) and 12th December, 2017 (without snow). A minimal differencing sketch follows this list.
4) 'L2_Pleiades_SDLidar_NEW' = The Pléiades snow depth map as described in 2), extracted to the areas of the LiDAR scan described in 3).
5) 'SnowDepthResults' = A folder containing a corrected and gap-filled Pléiades snow depth map ('SD_PleiadesCORR') and, for comparison: 'SD_TOPO', a statistical estimation of snow depth using topographic parameters and the regression equation of Grünewald et al. (2013); and the physically based estimates of snow depth using the DBSM model as in 1), without snow transport for the 4th September, 2017 ('SD_EXTP_Sep04') and 13th September, 2017 ('SD_EXTP_Sep13'), and with snow transport for those dates ('SD_Wind_Sep04', 'SD_Wind_Sep13').
6) 'rdyDEM' = An independent ASTER GDEM (https://asterweb.jpl.nasa.gov/gdem.asp) cut to the area of the study catchment (horizontal resolution = 30 m).
7) 'PlanetScope_20170907_TPK' = A stitched optical PlanetScope image of the catchment (horizontal resolution of 3.25 m), accessed under the Planet research and teaching initiative (planet.com).
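The DEM-differencing step behind item 3 (referenced there) can be expressed compactly; the sketch below assumes two co-registered DEMs on the same 4 m grid and uses hypothetical file names.

import numpy as np
import rasterio

# Snow depth as (snow-on DEM) minus (snow-off DEM); both rasters are assumed
# co-registered on the same grid. File names are hypothetical placeholders.
with rasterio.open("dem_20170913_snow_on.tif") as on_src, \
     rasterio.open("dem_20171212_snow_off.tif") as off_src:
    snow_on = on_src.read(1).astype("float32")
    snow_off = off_src.read(1).astype("float32")
    profile = on_src.profile

snow_depth = snow_on - snow_off
snow_depth[snow_depth < 0] = np.nan       # mask physically impossible values

profile.update(dtype="float32", count=1, nodata=np.nan)
with rasterio.open("snow_depth_4m.tif", "w", **profile) as dst:
    dst.write(snow_depth, 1)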
Cited work:
Burger, F. et al. (2018) ‘Interannual variability in glacier contribution to runoff from a high-elevation Andean catchment: understanding the role of debris cover in glacier hydrology’, Hydrological Processes, pp. 1–16. doi: 10.1002/hyp.13354.
Essery, R., Li, L. and Pomeroy, J. (1999) ‘A distributed model of blowing snow over complex terrain’, Hydrological Processes, 13(14–15), pp. 2423–2438. doi: 10.1002/(SICI)1099-1085(199910)13:14/15<2423::AID-HYP853>3.0.CO;2-U.
Grünewald, T. et al. (2013) ‘Statistical modelling of the snow depth distribution in open alpine terrain’, Hydrology and Earth System Sciences, 17(8), pp. 3005–3021. doi: 10.5194/hess-17-3005-2013.
Marti, R. et al. (2016) ‘Mapping snow depth in open alpine terrain from stereo satellite imagery’, The Cryosphere, pp. 1361–1380. doi: 10.5194/tc-10-1361-2016.
Date of Images: 8/31/2023
Date of Next Image: Unknown
Summary: This PlanetScope imagery captured by Planet Labs Inc. in August 2023 shows the post-event conditions after Hurricane Idalia made landfall in Florida. The color infrared image is created using the near-infrared, red, and green channels from the Planet instrument, allowing areas impacted by the flooding to be seen. The near-infrared gives the ability to see through thin clouds. Healthy vegetation is shown as red, water in blue.
Suggested Use: A false color composite depicts healthy vegetation as red, water as blue. Some minor atmospheric corrections have occurred.
Satellite/Sensor: PlanetScope
Resolution: 3 meters
Credits: NASA Disasters Program; includes copyrighted material of Planet Labs PBC. All rights reserved.
Esri REST Endpoint: See URL section on right side of page
WMS Endpoint: https://maps.disasters.nasa.gov/ags04/services/hurricane_idalia_2023/planet_colorinfrared_postevent/ImageServer/WMSServer
Planet Sample 2020
Date of Images: 7/11/2023
Date of Next Image: Unknown
Summary: This PlanetScope imagery captured by Planet Labs Inc. in July 2023 shows the impacts from flooding in Vermont and New England. The color infrared image is created using the near-infrared, red, and green channels from the Planet instrument, allowing areas impacted by the flooding to be seen. The near-infrared gives the ability to see through thin clouds. Healthy vegetation is shown as red, water in blue.
Suggested Use: A false color composite depicts healthy vegetation as red, water as blue. Some minor atmospheric corrections have occurred.
Satellite/Sensor: PlanetScope
Resolution: 3 meters
Credits: NASA Disasters Program; includes copyrighted material of Planet Labs PBC. All rights reserved.
Esri REST Endpoint: See URL section on right side of page
WMS Endpoint: https://maps.disasters.nasa.gov/ags04/services/vermont_flooding_202307/planet_colorinfrared_20230711/ImageServer/WMSServer
NOTE: This service contains a large amount of data and may be slow to load. For best results and faster loading, zoom into an area of interest.
Date of Images: 6/23/2024, 6/24/2024, 6/25/2024, 6/26/2024
Date of Next Image: Unknown
Summary: This PlanetScope imagery captured by Planet Labs Inc. in June 2024 shows the impacts from flooding in Iowa and surrounding areas. The color infrared image is created using the near-infrared, red, and green channels from the Planet instrument, allowing areas impacted by the flooding to be seen. The near-infrared gives the ability to see through thin clouds. Healthy vegetation is shown as red, water in blue.
Suggested Use: A false color composite depicts healthy vegetation as red, water as blue. Some minor atmospheric corrections have occurred.
Satellite/Sensor: PlanetScope
Resolution: 3 meters
Credits: NASA Disasters Program; includes copyrighted material of Planet Labs PBC. All rights reserved.
Esri REST Endpoint: See URL section on right side of page
WMS Endpoint: https://maps.disasters.nasa.gov/ags04/services/iowa_flood_202406/Iowa_Floods_Color_Infrared/MapServer/WMSServer
Date of Images: 8/19/2023, 8/21/2023, 8/22/2023, 8/23/2023, 8/24/2023, 8/25/2023, 8/26/2023
Date of Next Image: Unknown
Summary: This PlanetScope imagery captured by Planet Labs Inc. in August 2023 shows the pre-event conditions before Hurricane Idalia made landfall in Florida. The True Color RGB provides a product of how the surface would look to the naked eye from space. The True Color RGB is produced using the 3 visible wavelength bands (red, green, and blue) from the respective sensor. Some minor atmospheric corrections have occurred.
Suggested Use: True Color RGB provides a product of how the surface would look to the naked eye from space. The True Color RGB is produced using the 3 visible wavelength bands (red, green, and blue) from the respective sensor. Some minor atmospheric corrections have occurred.
Satellite/Sensor: PlanetScope
Resolution: 3 meters
Credits: NASA Disasters Program; includes copyrighted material of Planet Labs PBC. All rights reserved.
Esri REST Endpoint: See URL section on right side of page
WMS Endpoint: https://maps.disasters.nasa.gov/ags04/services/hurricane_idalia_2023/planet_truecolor_preevent/ImageServer/WMSServer
Date of Images: 1/10/2024
Date of Next Image: Unknown
Summary: This PlanetScope imagery captured by Planet Labs Inc. on January 10, 2024 shows the post-event conditions after the Southeast United States severe storms. The color infrared image is created using the near-infrared, red, and green channels from the Planet instrument, allowing areas impacted by the flooding to be seen. The near-infrared gives the ability to see through thin clouds. Healthy vegetation is shown as red, water in blue.
Suggested Use: A false color composite depicts healthy vegetation as red, water as blue. Some minor atmospheric corrections have occurred.
Satellite/Sensor: PlanetScope
Resolution: 3 meters
Credits: NASA Disasters Program; includes copyrighted material of Planet Labs PBC. All rights reserved.
Esri REST Endpoint: See URL section on right side of page
WMS Endpoint: https://maps.disasters.nasa.gov/ags03/services/se_us_severestorms_202401/planet_colorinfrared_20240110/ImageServer/WMSServer