81 datasets found
  1. Trees in Satellite Imagery

    • kaggle.com
    zip
    Updated Jul 13, 2022
    Cite
    Mehmet Cagri Aksoy (2022). Trees in Satellite Imagery [Dataset]. https://www.kaggle.com/datasets/mcagriaksoy/trees-in-satellite-imagery
    Explore at:
    Available download formats: zip (33359310 bytes)
    Dataset updated
    Jul 13, 2022
    Authors
    Mehmet Cagri Aksoy
    License

    Attribution-NonCommercial-ShareAlike 4.0 (CC BY-NC-SA 4.0): https://creativecommons.org/licenses/by-nc-sa/4.0/
    License information was derived automatically

    Description

    About Dataset

    This dataset is designed for binary classification tasks on geospatial imagery, specifically to distinguish between land areas with trees and those without. The images were captured by the Sentinel-2 satellite.

    The dataset structure is straightforward:
    - Each image is 64×64 pixels and encoded in JPG format.
    - Images are organized into two folders, "Trees" and "NoTrees", corresponding to the two classes.
    - Each folder contains 5,200 images, for a total of 10,400 images across the dataset.

    Note: The dataset does not include predefined training, validation, or test splits. Users should partition the data as needed for their specific machine learning or deep learning workflows.

    Please also cite the source of this data, EuroSAT: Helber, P., Bischke, B., Dengel, A., & Borth, D. (2019). Eurosat: A novel dataset and deep learning benchmark for land use and land cover classification. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 12(7), 2217-2226.
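    Since no predefined splits ship with the dataset, the sketch below shows one reasonable way to build a reproducible, stratified train/validation/test partition from the two class folders. The local directory name, file extension, and 70/15/15 ratios are assumptions, not part of the dataset.

    ```python
    # Minimal sketch: stratified train/val/test split over the "Trees"/"NoTrees" folders.
    # Directory name, *.jpg extension, and split ratios are assumptions.
    from pathlib import Path
    from sklearn.model_selection import train_test_split

    root = Path("trees_in_satellite_imagery")  # hypothetical local copy of the dataset
    paths, labels = [], []
    for label, folder in enumerate(["NoTrees", "Trees"]):  # 0 = no trees, 1 = trees
        for p in sorted((root / folder).glob("*.jpg")):
            paths.append(str(p))
            labels.append(label)

    # 70% train, then split the remaining 30% evenly into validation and test
    train_x, rest_x, train_y, rest_y = train_test_split(
        paths, labels, test_size=0.30, stratify=labels, random_state=42)
    val_x, test_x, val_y, test_y = train_test_split(
        rest_x, rest_y, test_size=0.50, stratify=rest_y, random_state=42)

    print(len(train_x), len(val_x), len(test_x))  # roughly 7280 / 1560 / 1560 if all 10,400 images are present
    ```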

  2. Declassified Satellite Imagery 2 (2002)

    • catalog.data.gov
    • gimi9.com
    • +3more
    Updated Apr 10, 2025
    Cite
    DOI/USGS/EROS (2025). Declassified Satellite Imagery 2 (2002) [Dataset]. https://catalog.data.gov/dataset/declassified-satellite-imagery-2-2002
    Explore at:
    Dataset updated
    Apr 10, 2025
    Dataset provided by
    DOI/USGS/EROS
    Description

    Declassified satellite images provide an important worldwide record of land-surface change. With the success of the first release of classified satellite photography in 1995, images from U.S. military intelligence satellites KH-7 and KH-9 were declassified in accordance with Executive Order 12951 in 2002. The data were originally used for cartographic information and reconnaissance for U.S. intelligence agencies. Since the images could be of historical value for global change research and were no longer critical to national security, the collection was made available to the public. Keyhole (KH) satellite systems KH-7 and KH-9 acquired photographs of the Earth’s surface with a telescopic camera system and transported the exposed film through the use of recovery capsules. The capsules, or buckets, were de-orbited and retrieved by aircraft as they parachuted to Earth. The exposed film was developed and the images were analyzed for a range of military applications.

    The KH-7 surveillance system was a high-resolution imaging system that was operational from July 1963 to June 1967. Approximately 18,000 black-and-white images and 230 color images are available from the 38 missions flown during this program. Key features of this program were a larger area of coverage and improved ground resolution. The cameras acquired imagery in continuous lengthwise sweeps of the terrain. KH-7 images are 9 inches wide, vary in length from 4 inches to 500 feet, and have a resolution of 2 to 4 feet.

    The KH-9 mapping program was operational from March 1973 to October 1980 and was designed to support mapping requirements and exact positioning of geographical points for the military. This was accomplished by using image overlap for stereo coverage and by using a camera system with a reseau grid to correct image distortion. The KH-9 framing cameras produced 9 x 18 inch imagery at a resolution of 20-30 feet. Approximately 29,000 mapping images were acquired from 12 missions. The original film sources are maintained by the National Archives and Records Administration (NARA). Duplicate film sources held in the USGS EROS Center archive are used to produce digital copies of the imagery.

  3. Satellite Image Caption Change Detection

    • kaggle.com
    zip
    Updated Jun 14, 2024
    Cite
    Kursat Komurcu (2024). Satellite Image Caption Change Detection [Dataset]. https://www.kaggle.com/datasets/kursatkomurcu/satellite-image-caption-change-detection
    Explore at:
    Available download formats: zip (14202001802 bytes)
    Dataset updated
    Jun 14, 2024
    Authors
    Kursat Komurcu
    License

    MIT License: https://opensource.org/licenses/MIT
    License information was derived automatically

    Description

    @inproceedings{komurcu2024change,
      title={Change detection in satellite imagery using transformer models and machine learning techniques: a comprehensive captioning dataset},
      author={Kürşat K{\"o}m{\"u}rc{\"u} and Linas Petkevi{\v{c}}ius},
      booktitle={DAMSS: 15th Conference on Data Analysis Methods for Software Systems, Druskininkai, Lithuania, November 28-30, 2024},
      pages={56--57},
      year={2024},
      publisher={Vilniaus universiteto leidykla}
    }

    https://scholar.google.com/citations?view_op=view_citation&hl=en&user=Pf0mz8UAAAAJ&citation_for_view=Pf0mz8UAAAAJ:9yKSN-GCB0IC

    This dataset contains image captions for 4 datasets. The captions folder contains the caption CSV files, and the other folders contain the image pairs; these folders also include augmented images.

    There are 3 columns in the CSV files: change (0 or 1, indicating whether there is a change between the two images), caption1 (description of the first image), and caption2 (description of the second image).

    These captions were created using the MiniCPM-V model.

    Links of original datasets:

    CLCD: https://github.com/liumency/CropLand-CD
    DSIFN: https://github.com/GeoZcx/A-deeply-supervised-image-fusion-network-for-change-detection-in-remote-sensing-images/tree/master/dataset
    LEVIR-CD: https://chenhao.in/LEVIR/
    S2Looking: https://github.com/S2Looking/Dataset
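    As a quick illustration of the three-column layout described above, the sketch below loads one caption file with pandas. The exact CSV file name under the captions folder is an assumption.

    ```python
    # Minimal sketch: inspect one caption CSV (columns: change, caption1, caption2).
    # The file name "captions/LEVIR-CD.csv" is an assumption; use an actual file from the captions folder.
    import pandas as pd

    df = pd.read_csv("captions/LEVIR-CD.csv")
    print(df[["change", "caption1", "caption2"]].head())
    print("fraction of pairs flagged as changed:", df["change"].mean())
    ```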

  4. WorldView-2 Level 2A Multispectral 8-Band Satellite Imagery

    • data.nasa.gov
    • s.cnmilf.com
    • +2more
    Updated Apr 1, 2025
    + more versions
    Cite
    nasa.gov (2025). WorldView-2 Level 2A Multispectral 8-Band Satellite Imagery [Dataset]. https://data.nasa.gov/dataset/worldview-2-level-2a-multispectral-8-band-satellite-imagery
    Explore at:
    Dataset updated
    Apr 1, 2025
    Dataset provided by
    NASA (http://nasa.gov/)
    Description

    The WorldView-2 Level 2A Multispectral 8-Band Imagery collection contains satellite imagery acquired from Maxar Technologies (formerly known as DigitalGlobe) by the Commercial Smallsat Data Acquisition (CSDA) Program. Imagery is collected by the DigitalGlobe WorldView-2 satellite using the WorldView-110 camera across the global land surface from October 2009 to the present. This satellite imagery is in the visible and near-infrared waveband range with data in the coastal, blue, green, yellow, red, red edge, and near-infrared (2 bands) wavelengths. It has a spatial resolution of 1.85m at nadir and a temporal resolution of approximately 1.1 days. The data are provided in National Imagery Transmission Format (NITF) and GeoTIFF formats. These Level 2A data have undergone radiometric correction and sensor correction, have been projected to a plane using a map projection and datum, and have had a coarse DEM applied. The data potentially serve a wide variety of applications that require high resolution imagery. Data access is restricted based on a National Geospatial-Intelligence Agency (NGA) license, and investigators must be approved by the CSDA Program.

  5. Images and 4-class labels for semantic segmentation of Sentinel-2 and...

    • zenodo.org
    • data.niaid.nih.gov
    • +1more
    txt, zip
    Updated Nov 24, 2022
    + more versions
    Cite
    Daniel Buscombe; Daniel Buscombe (2022). Images and 4-class labels for semantic segmentation of Sentinel-2 and Landsat RGB, NIR, and SWIR satellite images of coasts (water, whitewater, sediment, other) [Dataset]. http://doi.org/10.5281/zenodo.7344571
    Explore at:
    Available download formats: zip, txt
    Dataset updated
    Nov 24, 2022
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Daniel Buscombe; Daniel Buscombe
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Images and 4-class labels for semantic segmentation of Sentinel-2 and Landsat RGB, NIR, and SWIR satellite images of coasts (water, whitewater, sediment, other)

    Description

    579 images and 579 associated labels for semantic segmentation of Sentinel-2 and Landsat RGB satellite images of coasts. The 4 classes are 0=water, 1=whitewater, 2=sediment, 3=other

    These images and labels have been made using the Doodleverse software package, Doodler*. These images and labels could be used within numerous Machine Learning frameworks for image segmentation, but have specifically been made for use with the Doodleverse software package, Segmentation Gym**.

    Some (422) of these images and labels were originally included in the Coast Train*** data release, and have been modified from their original by reclassifying from the original classes to the present 4 classes.

    The label images are a subset of the following data release**** https://doi.org/10.5281/zenodo.7335647

    Imagery comes from the following 10 sand beach sites:

    1. Duck, NC, Hatteras NC, USA
    2. Santa Cruz CA, USA
    3. Galveston TX, USA
    4. Truc Vert, France
    5. Sunset State Beach CA, USA
    6. Torrey Pines CA, USA
    7. Narrabeen, NSW, Australia
    8. Elwha WA, USA
    9. Ventura region, CA, USA
    10. Klamath region, CA USA

    Imagery is a mixture of 10-m Sentinel-2 and 15-m pansharpened Landsat 7, 8, and 9 visible-band imagery of various sizes. Only the Red, Green, Blue, NIR, and SWIR bands are included.

    File descriptions

    1. classes.txt, a file containing the class names
    2. images.zip, a zipped folder containing the 3-band RGB images of varying sizes and extents
    3. nir.zip, a zipped folder containing the corresponding near-infrared (NIR) imagery
    4. swir.zip, a zipped folder containing the corresponding shortwave-infrared (SWIR) imagery
    5. labels.zip, a zipped folder containing the 1-band label images
    6. overlays.zip, a zipped folder containing a semi-transparent overlay of the color-coded label on the image (blue=0=water, red=1=whitewater, yellow=2=sediment, green=3=other)
    7. resized_images.zip, RGB images resized to 512x512x3 pixels
    8. resized_nir.zip, NIR images resized to 512x512x3 pixels
    9. resized_swir.zip, SWIR images resized to 512x512x3 pixels
    10. resized_labels.zip, label images resized to 512x512 pixels
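    For orientation, the sketch below pairs one resized RGB image with its resized label and checks that the label values fall in the 4-class encoding listed above. The extracted folder names, and the assumption that labels share the image file names, are illustrative only.

    ```python
    # Minimal sketch: pair a resized RGB image with its 1-band label and verify the
    # class encoding {0: water, 1: whitewater, 2: sediment, 3: other}.
    # Folder names and matching file names are assumptions about the unzipped archives.
    from pathlib import Path
    import numpy as np
    from PIL import Image

    img_dir, lbl_dir = Path("resized_images"), Path("resized_labels")
    img_path = sorted(img_dir.iterdir())[0]
    lbl_path = lbl_dir / img_path.name  # assumes labels reuse the image file name

    image = np.array(Image.open(img_path))  # expected shape (512, 512, 3)
    label = np.array(Image.open(lbl_path))  # expected shape (512, 512)
    print(image.shape, label.shape, np.unique(label))  # label values should be a subset of {0, 1, 2, 3}
    ```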

    References

    *Doodler: Buscombe, D., Goldstein, E.B., Sherwood, C.R., Bodine, C., Brown, J.A., Favela, J., Fitzpatrick, S., Kranenburg, C.J., Over, J.R., Ritchie, A.C. and Warrick, J.A., 2021. Human‐in‐the‐Loop Segmentation of Earth Surface Imagery. Earth and Space Science, p. e2021EA002085. https://doi.org/10.1029/2021EA002085. See https://github.com/Doodleverse/dash_doodler.

    **Segmentation Gym: Buscombe, D., & Goldstein, E. B. (2022). A reproducible and reusable pipeline for segmentation of geoscientific imagery. Earth and Space Science, 9, e2022EA002332. https://doi.org/10.1029/2022EA002332 See: https://github.com/Doodleverse/segmentation_gym

    ***Coast Train data release: Wernette, P.A., Buscombe, D.D., Favela, J., Fitzpatrick, S., and Goldstein E., 2022, Coast Train--Labeled imagery for training and evaluation of data-driven models for image segmentation: U.S. Geological Survey data release, https://doi.org/10.5066/P91NP87I. See https://coasttrain.github.io/CoastTrain/ for more information

    **** Buscombe, Daniel, Goldstein, Evan, Bernier, Julie, Bosse, Stephen, Colacicco, Rosa, Corak, Nick, Fitzpatrick, Sharon, del Jesús González Guillén, Anais, Ku, Venus, Paprocki, Julie, Platt, Lindsay, Steele, Bethel, Wright, Kyle, & Yasin, Brandon. (2022). Images and 4-class labels for semantic segmentation of Sentinel-2 and Landsat RGB satellite images of coasts (water, whitewater, sediment, other) (v1.0) [Data set]. Zenodo. https://doi.org/10.5281/zenodo.7335647

  6. High-Resolution QuickBird Imagery and Related GIS Layers for Barrow, Alaska,...

    • cmr.earthdata.nasa.gov
    • datasets.ai
    • +3more
    not provided
    Updated Oct 7, 2025
    + more versions
    Cite
    (2025). High-Resolution QuickBird Imagery and Related GIS Layers for Barrow, Alaska, USA, Version 1 [Dataset]. https://cmr.earthdata.nasa.gov/search/concepts/C1386246127-NSIDCV0.html
    Explore at:
    Available download formats: not provided
    Dataset updated
    Oct 7, 2025
    Time period covered
    Aug 1, 2002 - Aug 2, 2002
    Area covered
    Description

    This data set contains high-resolution QuickBird imagery and geospatial data for the entire Barrow QuickBird image area (156.15° W - 157.07° W, 71.15° N - 71.41° N) and Barrow B4 Quadrangle (156.29° W - 156.89° W, 71.25° N - 71.40° N), for use in Geographic Information Systems (GIS) and remote sensing software. The original QuickBird data sets were acquired by DigitalGlobe from 1 to 2 August 2002, and consist of orthorectified satellite imagery. Federal Geographic Data Committee (FGDC)-compliant metadata for all value-added data sets are provided in text, HTML, and XML formats.

    Accessory layers include: 1:250,000- and 1:63,360-scale USGS Digital Raster Graphic (DRG) mosaic images (GeoTIFF format); 1:250,000- and 1:63,360-scale USGS quadrangle index maps (ESRI Shapefile format); an index map for the 62 QuickBird tiles (ESRI Shapefile format); and a simple polygon layer of the extent of the Barrow QuickBird image area and the Barrow B4 quadrangle area (ESRI Shapefile format).

    Unmodified QuickBird data comprise 62 data tiles in Universal Transverse Mercator (UTM) Zone 4 in GeoTIFF format. Standard release files describing the QuickBird data are included, along with the DigitalGlobe license agreement and product handbooks.

    The baseline geospatial data support education, outreach, and multi-disciplinary research of environmental change in Barrow, which is an area of focused scientific interest. Data are provided on four DVDs. This product is available only to investigators funded specifically from the National Science Foundation (NSF), Office of Polar Programs (OPP), Arctic Sciences Section. An NSF OPP award number must be provided when ordering this data. Contact NSIDC User Services at nsidc@nsidc.org to order the data, and include an NSF OPP award number in the email.

  7. NOAA Colorized Satellite Imagery

    • gis-fema.hub.arcgis.com
    • disasterpartners.org
    • +17more
    Updated Jun 27, 2019
    + more versions
    Cite
    NOAA GeoPlatform (2019). NOAA Colorized Satellite Imagery [Dataset]. https://gis-fema.hub.arcgis.com/maps/8e93e0f942ae4d54a8d089e3cd5d2774
    Explore at:
    Dataset updated
    Jun 27, 2019
    Dataset provided by
    National Oceanic and Atmospheric Administration (http://www.noaa.gov/)
    Authors
    NOAA GeoPlatform
    License

    CC0 1.0 Universal Public Domain Dedication: https://creativecommons.org/publicdomain/zero/1.0/
    License information was derived automatically

    Area covered
    Description

    Metadata: NOAA GOES-R Series Advanced Baseline Imager (ABI) Level 1b Radiances. More information about this imagery can be found here.

    This satellite imagery combines data from the NOAA GOES East and West satellites and the JMA Himawari satellite, providing full coverage of weather events for most of the world, from the west coast of Africa west to the east coast of India. The tile service updates to the most recent image every 10 minutes at 1.5 km per pixel resolution.

    The infrared (IR) band detects radiation that is emitted by the Earth’s surface, atmosphere and clouds, in the “infrared window” portion of the spectrum. The radiation has a wavelength near 10.3 micrometers, and the term “window” means that it passes through the atmosphere with relatively little absorption by gases such as water vapor. It is useful for estimating the emitting temperature of the Earth’s surface and cloud tops. A major advantage of the IR band is that it can sense energy at night, so this imagery is available 24 hours a day.

    The Advanced Baseline Imager (ABI) instrument samples the radiance of the Earth in sixteen spectral bands using several arrays of detectors in the instrument’s focal plane. Single reflective band ABI Level 1b Radiance Products (channels 1 - 6 with approximate center wavelengths 0.47, 0.64, 0.865, 1.378, 1.61, 2.25 microns, respectively) are digital maps of outgoing radiance values at the top of the atmosphere for visible and near-infrared (IR) bands. Single emissive band ABI L1b Radiance Products (channels 7 - 16 with approximate center wavelengths 3.9, 6.185, 6.95, 7.34, 8.5, 9.61, 10.35, 11.2, 12.3, 13.3 microns, respectively) are digital maps of outgoing radiance values at the top of the atmosphere for IR bands. Detector samples are compressed, packetized and down-linked to the ground station as Level 0 data for conversion to calibrated, geo-located pixels (Level 1b Radiance data). The detector samples are decompressed, radiometrically corrected, navigated and resampled onto an invariant output grid, referred to as the ABI fixed grid.

    McIDAS merge technique and color mapping provided by the Cooperative Institute for Meteorological Satellite Studies (Space Science and Engineering Center, University of Wisconsin - Madison), using satellite data from SSEC Satellite Data Services and the McIDAS visualization software.

  8. Power Plant Satellite Imagery Dataset

    • datasetcatalog.nlm.nih.gov
    • figshare.com
    Updated Aug 16, 2017
    Cite
    Chandrasekar, Gouttham; Johnson, Timothy; Nagenalli, Trishul; Jeuland, Marc; Li, Boning; Hossain, Shamikh; Brigman, Benjamin; Bradbury, Kyle; Collins, Leslie (2017). Power Plant Satellite Imagery Dataset [Dataset]. https://datasetcatalog.nlm.nih.gov/dataset?q=0001777483
    Explore at:
    Dataset updated
    Aug 16, 2017
    Authors
    Chandrasekar, Gouttham; Johnson, Timothy; Nagenalli, Trishul; Jeuland, Marc; Li, Boning; Hossain, Shamikh; Brigman, Benjamin; Bradbury, Kyle; Collins, Leslie
    Description

    This dataset contains satellite imagery of 4,454 power plants within the United States. The imagery is provided at two resolutions: 1 m (4-band NAIP imagery with near-infrared) and 30 m (Landsat 8, pansharpened to 15 m). The NAIP imagery is available for the U.S. and Landsat 8 is available globally. This dataset may be of value for computer vision work and machine learning, as well as energy and environmental analyses.

    Additionally, annotations of the spatial extent of the power plants in each image are provided. These annotations were collected via the crowdsourcing platform Amazon Mechanical Turk, using multiple annotators for each image to ensure quality. Links to the sources of the imagery data, the annotation tool, and the team that created the dataset are included in the "References" section.

    To read more on these data, please refer to the "Power Plant Satellite Imagery Dataset Overview.pdf" file. To download a sample of the data without downloading the entire dataset, download "sample.zip", which includes two sample power plants and the NAIP, Landsat 8, and binary annotations for each.

    Note: the NAIP imagery may appear "washed out" when viewed in standard image viewing software because it includes a near-infrared band in addition to the standard RGB data.
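    Regarding the note above about 4-band NAIP imagery looking "washed out" in standard viewers, one workaround is to render only the first three (RGB) bands. A minimal sketch with rasterio follows; the GeoTIFF file name from the sample archive is an assumption.

    ```python
    # Minimal sketch: display only the RGB bands of a 4-band NAIP GeoTIFF so the
    # near-infrared band does not distort the rendering. The file name is an assumption.
    import numpy as np
    import rasterio
    import matplotlib.pyplot as plt

    with rasterio.open("sample/naip_example.tif") as src:
        rgb = src.read([1, 2, 3])           # NAIP band order: 1=R, 2=G, 3=B, 4=NIR
    rgb = np.transpose(rgb, (1, 2, 0))       # (bands, rows, cols) -> (rows, cols, bands)

    plt.imshow(rgb)
    plt.axis("off")
    plt.show()
    ```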

  9. Remote Sensing Software Report

    • datainsightsmarket.com
    doc, pdf, ppt
    Updated Jun 16, 2025
    Cite
    Data Insights Market (2025). Remote Sensing Software Report [Dataset]. https://www.datainsightsmarket.com/reports/remote-sensing-software-1937670
    Explore at:
    Available download formats: pdf, ppt, doc
    Dataset updated
    Jun 16, 2025
    Dataset authored and provided by
    Data Insights Market
    License

    https://www.datainsightsmarket.com/privacy-policy

    Time period covered
    2025 - 2033
    Area covered
    Global
    Variables measured
    Market Size
    Description

    The booming remote sensing software market is projected to reach $5 billion by 2025, growing at a CAGR of 8% until 2033. Driven by advancements in sensor technology and cloud computing, this market caters to various sectors, including environmental monitoring, urban planning, and defense. Learn about key market trends and leading players.

  10. Remote Sensing Image Processing Platform Report

    • datainsightsmarket.com
    doc, pdf, ppt
    Updated Jun 29, 2025
    Cite
    Data Insights Market (2025). Remote Sensing Image Processing Platform Report [Dataset]. https://www.datainsightsmarket.com/reports/remote-sensing-image-processing-platform-494488
    Explore at:
    Available download formats: ppt, doc, pdf
    Dataset updated
    Jun 29, 2025
    Dataset authored and provided by
    Data Insights Market
    License

    https://www.datainsightsmarket.com/privacy-policy

    Time period covered
    2025 - 2033
    Area covered
    Global
    Variables measured
    Market Size
    Description

    The Remote Sensing Image Processing Platform market is booming, projected to reach $2542 million by 2025, driven by AI, cloud computing, and high-resolution imagery. Explore market trends, key players (ESRI, Hexagon, etc.), and future growth projections in this comprehensive analysis.

  11. World Imagery

    • cacgeoportal.com
    • hurricane-tx-arcgisforem.hub.arcgis.com
    • +4more
    Updated Dec 13, 2009
    + more versions
    Cite
    Esri (2009). World Imagery [Dataset]. https://www.cacgeoportal.com/maps/10df2279f9684e4a9f6a7f08febac2a9
    Explore at:
    Dataset updated
    Dec 13, 2009
    Dataset authored and provided by
    Esri (http://esri.com/)
    Area covered
    World,
    Description

    World Imagery provides one meter or better satellite and aerial imagery for most of the world’s landmass, and lower-resolution satellite imagery worldwide. The map is currently composed of the following sources:

    - Worldwide 15-m resolution TerraColor imagery at small and medium map scales.
    - Vantor imagery basemap products around the world: Vivid Premium at 15-cm HD resolution for select metropolitan areas, Vivid Advanced 30-cm HD for more than 1,000 metropolitan areas, and Vivid Standard from 1.2-m to 0.6-m resolution for most of the world, with 30-cm HD across the United States and parts of Western Europe. More information on the Vantor products is included below.
    - High-resolution aerial photography contributed by the GIS User Community. This imagery ranges from 30-cm to 3-cm resolution. You can contribute your imagery to this map and have it served by Esri via the Community Maps Program.

    Vantor Basemap Products

    Vivid Premium: Provides committed image currency in a high-resolution, high-quality image layer over defined metropolitan and high-interest areas across the globe. The product provides 15-cm HD resolution imagery.

    Vivid Advanced: Provides committed image currency in a high-resolution, high-quality image layer over defined metropolitan and high-interest areas across the globe. The product includes a mix of native 30-cm and 30-cm HD resolution imagery.

    Vivid Standard: Provides a visually consistent and continuous image layer over large areas through advanced image mosaicking techniques, including tonal balancing and seamline blending across thousands of image strips. Available from 1.2-m down to 30-cm HD. More on Vantor HD.

    Imagery Updates

    You can use the Updates Mode in the World Imagery Wayback app to learn more about recent and pending updates. Accessing this information requires a user login with an ArcGIS organizational account.

    Citations

    This layer includes imagery provider, collection date, resolution, accuracy, and source of the imagery. With the Identify tool in ArcGIS Desktop or the ArcGIS Online Map Viewer you can see imagery citations. Citations returned apply only to the available imagery at that location and scale. You may need to zoom in to view the best available imagery. Citations can also be accessed in the World Imagery with Metadata web map.

    Use

    You can add this layer to the ArcGIS Online Map Viewer, ArcGIS Desktop, or ArcGIS Pro. To view this layer with a useful reference overlay, open the Imagery Hybrid web map.

    Feedback

    Have you ever seen a problem in the Esri World Imagery Map that you wanted to report? You can use the Imagery Map Feedback web map to provide comments on issues. The feedback will be reviewed by the ArcGIS Online team and considered for one of our updates.

  12. Recent GOES Weather Satellite Imagery

    • eo-for-disaster-management-amerigeoss.hub.arcgis.com
    • livingatlas-dcdev.opendata.arcgis.com
    • +2more
    Updated Jun 18, 2019
    Cite
    ArcGIS StoryMaps (2019). Recent GOES Weather Satellite Imagery [Dataset]. https://eo-for-disaster-management-amerigeoss.hub.arcgis.com/maps/5f25bbe4966a4205a785aaf046727c5e
    Explore at:
    Dataset updated
    Jun 18, 2019
    Dataset authored and provided by
    ArcGIS StoryMaps
    Area covered
    Description

    Map Information

    This nowCOAST updating map service provides maps depicting visible, infrared, and water vapor imagery composited from NOAA/NESDIS GOES-EAST and GOES-WEST. The horizontal resolutions of the IR, visible, and water vapor composite images are approximately 1km, 4km, and 4km, respectively. The visible and IR imagery depict the location of clouds. The water vapor imagery indicates the amount of water vapor contained in the mid to upper levels of the troposphere. The darker grays indicate drier air while the brighter grays/whites indicate more saturated air. The GOES composite images are updated in the nowCOAST map service every 30 minutes. For more detailed information about the update schedule, see: http://new.nowcoast.noaa.gov/help/#section=updateschedule

    Background Information

    The GOES map layer displays visible (VIS) and infrared (IR4) cloud, and water vapor (WV) imagery from the NOAA/National Environmental Satellite, Data, and Information Service (NESDIS) Geostationary Satellites (GOES-East and GOES-West). These satellites circle the Earth in a geosynchronous orbit (i.e. orbit the equatorial plane of the Earth at a speed matching the rotation of the Earth). This allows the satellites to hover continuously over one position on the surface. The geosynchronous plane is about 35,800 km (22,300 miles) above the Earth, which is high enough to allow the satellites a full-disc view of the Earth. GOES-East is positioned at 75 deg W longitude and the equator. GOES-West is located at 135 deg W and the equator. The two satellites cover an area from 20 deg W to 165 deg E. The images are derived from data from GOES' Imagers. An imager is a multichannel instrument that senses radiant energy and reflected solar energy from the Earth's surface and atmosphere. The VIS, IR4, and WV images are obtained from GOES Imager Channels 1, 4, and 3, respectively. The GOES raster images are obtained from NESDIS servers in geo-referenced Tagged-Image File Format (geoTIFF).

    Time Information

    This map is time-enabled, meaning that each individual layer contains time-varying data and can be utilized by clients capable of making map requests that include a time component. This particular service can be queried with or without the use of a time component. If the time parameter is specified in a request, the data or imagery most relevant to the provided time value, if any, will be returned. If the time parameter is not specified in a request, the latest data or imagery valid for the present system time will be returned to the client. If the time parameter is not specified and no data or imagery is available for the present time, no data will be returned. In addition to ArcGIS Server REST access, time-enabled OGC WMS 1.3.0 access is also provided by this service.

    Due to software limitations, the time extent of the service and map layers displayed below does not provide the most up-to-date start and end times of available data. Instead, users have three options for determining the latest time information about the service:

    1. Issue a returnUpdates=true request for an individual layer or for the service itself, which will return the current start and end times of available data, in epoch time format (milliseconds since 00:00 January 1, 1970). To see an example, click on the "Return Updates" link at the bottom of this page under "Supported Operations", and refer to the ArcGIS REST API Map Service Documentation for more information. (A minimal request sketch is included after this description.)

    2. Issue an Identify (ArcGIS REST) or GetFeatureInfo (WMS) request against the proper layer corresponding with the target dataset. For raster data, this would be the "Image Footprints with Time Attributes" layer in the same group as the target "Image" layer being displayed. For vector (point, line, or polygon) data, the target layer can be queried directly. In either case, the attributes returned for the matching raster(s) or vector feature(s) will include the following: validtime (valid timestamp), starttime (display start time), endtime (display end time), reftime (reference time, sometimes referred to as issuance time, cycle time, or initialization time), projmins (number of minutes from reference time to valid time), desigreftime (designated reference time, used as a common reference time for all items when individual reference times do not match), and desigprojmins (number of minutes from designated reference time to valid time).

    3. Query the nowCOAST LayerInfo web service, which has been created to provide additional information about each data layer in a service, including a list of all available "time stops" (i.e. "valid times"), individual timestamps, or the valid time of a layer's latest available data (i.e. "Product Time"). For more information about the LayerInfo web service, including examples of various types of requests, refer to the nowCOAST help documentation at: http://new.nowcoast.noaa.gov/help/#section=layerinfo

    References

    NOAA, 2013: Geostationary Operational Environmental Satellites (GOES). (Available at http://www.ospo.noaa.gov/Operations/GOES/index.html)
    A Basic Introduction to Water Vapor Imagery. (Available at http://cimss.ssec.wisc.edu/goes/misc/wv/wv_intro.html)
    CIMSS, 1996: Water Vapor Imagery Tutorial. (Available at http://cimss.ssec.wisc.edu/goes/misc/wv/)
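    A minimal sketch of option 1 above: asking a time-enabled ArcGIS REST map service for its current time extent via returnUpdates=true. The service URL is a placeholder (the actual nowCOAST endpoint should be taken from the service page), and the exact JSON layout may differ between servers.

    ```python
    # Minimal sketch: query a time-enabled ArcGIS REST map service with returnUpdates=true
    # and convert the returned epoch-millisecond time extent to readable timestamps.
    # The service URL is a placeholder, not the real nowCOAST endpoint.
    import datetime
    import requests

    SERVICE = "https://example.gov/arcgis/rest/services/goes_imagery/MapServer"  # placeholder

    resp = requests.get(SERVICE, params={"returnUpdates": "true", "f": "json"}, timeout=30)
    info = resp.json()

    # The time extent is reported in epoch milliseconds; its key can vary by server.
    extent = info.get("timeExtent") or info.get("timeInfo", {}).get("timeExtent")
    if extent:
        start, end = (datetime.datetime.utcfromtimestamp(ms / 1000) for ms in extent)
        print("data available from", start, "to", end)
    else:
        print(info)  # fall back to inspecting the raw response
    ```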

  13. WorldView-1 Level 1B Panchromatic Satellite Imagery

    • data.nasa.gov
    • gimi9.com
    • +3more
    Updated Apr 1, 2025
    Cite
    nasa.gov (2025). WorldView-1 Level 1B Panchromatic Satellite Imagery [Dataset]. https://data.nasa.gov/dataset/worldview-1-level-1b-panchromatic-satellite-imagery
    Explore at:
    Dataset updated
    Apr 1, 2025
    Dataset provided by
    NASA (http://nasa.gov/)
    Description

    The WorldView-1 Level 1B Panchromatic Imagery collection contains satellite imagery acquired from Maxar Technologies (formerly known as DigitalGlobe) by the Commercial Smallsat Data Acquisition (CSDA) Program. Panchromatic imagery is collected by the DigitalGlobe WorldView-1 satellite using the WorldView-60 camera across the global land surface from September 2007 to the present. Data have a spatial resolution of 0.5 meters at nadir and a temporal resolution of approximately 1.7 days. The data are provided in National Imagery Transmission Format (NITF) and GeoTIFF formats. This level 1B data is sensor corrected and is an un-projected (raw) product. The data potentially serve a wide variety of applications that require high resolution imagery. Data access is restricted based on a National Geospatial-Intelligence Agency (NGA) license, and investigators must be approved by the CSDA Program.

  14. Remote Sensing Interpretation Software Report

    • datainsightsmarket.com
    doc, pdf, ppt
    Updated Jun 9, 2025
    Cite
    Data Insights Market (2025). Remote Sensing Interpretation Software Report [Dataset]. https://www.datainsightsmarket.com/reports/remote-sensing-interpretation-software-532284
    Explore at:
    Available download formats: ppt, doc, pdf
    Dataset updated
    Jun 9, 2025
    Dataset authored and provided by
    Data Insights Market
    License

    https://www.datainsightsmarket.com/privacy-policy

    Time period covered
    2025 - 2033
    Area covered
    Global
    Variables measured
    Market Size
    Description

    The remote sensing interpretation software market is experiencing robust growth, driven by increasing demand for precise geospatial data across diverse sectors. The market's expansion is fueled by technological advancements in satellite imagery, drone technology, and artificial intelligence (AI), enabling more efficient and accurate data analysis. Applications span agriculture (precision farming), urban planning (infrastructure development and monitoring), environmental monitoring (deforestation tracking, pollution detection), defense & security (surveillance and intelligence), and natural resource management. The rising adoption of cloud-based solutions and the growing need for real-time data processing further contribute to market expansion. We estimate the market size in 2025 to be approximately $5 billion, considering the significant investments in R&D and the expanding applications across various sectors. A compound annual growth rate (CAGR) of 12% is projected from 2025 to 2033, indicating substantial future growth potential. However, the market also faces challenges. High initial investment costs for software and hardware, the need for specialized expertise in data interpretation, and data security and privacy concerns act as restraints on market growth. Furthermore, the market is characterized by intense competition among established players like Hexagon, Microsoft, and IBM, and emerging technology providers. The market is segmented by software type (cloud-based, on-premise), application (agriculture, urban planning, environmental monitoring), and region. North America and Europe currently hold significant market share, driven by early adoption and established infrastructure. However, the Asia-Pacific region is witnessing rapid growth due to increasing government initiatives and rising investments in infrastructure development. The competitive landscape is dynamic, with mergers and acquisitions, strategic partnerships, and technological innovations shaping the market’s future. The market's trajectory suggests a promising future, but continued innovation and addressal of challenges will be crucial to sustain this growth.

  15. Sentinel 2 satellite image

    • ckan.mobidatalab.eu
    wms
    Updated Apr 29, 2023
    Cite
    GeoDatiGovIt RNDT (2023). Sentinel 2 satellite image [Dataset]. https://ckan.mobidatalab.eu/dataset/satellite-image-sentinel-2
    Explore at:
    Available download formats: wms
    Dataset updated
    Apr 29, 2023
    Dataset provided by
    GeoDatiGovIt RNDT
    Description

    Mosaic of two images acquired by the Sentinel-2 satellite on 19 and 22 April 2016. The "Sentinels" are a fleet of satellites designed to deliver Earth-observation data and images to the European Commission's Copernicus programme; the Sentinel-2 mission, launched in June 2015, concerns land monitoring. The Sentinel-2 satellite image is also available as a background map from the Geoportal viewer. To use this service in other viewing software, copy the address from the "online resources" field. The version of the WMS is 1.3.0.
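    A minimal sketch of pulling the mosaic from a WMS 1.3.0 endpoint with OWSLib; the service URL, chosen layer, and bounding box are placeholders (the real address should be copied from the "online resources" field as described above).

    ```python
    # Minimal sketch: request a rendered image from a WMS 1.3.0 endpoint with OWSLib.
    # URL, layer choice, bounding box, and output size are placeholders/assumptions.
    from owslib.wms import WebMapService

    WMS_URL = "https://example.org/geoserver/ows"  # placeholder: copy the real address from "online resources"

    wms = WebMapService(WMS_URL, version="1.3.0")
    layer = list(wms.contents)[0]  # placeholder: pick the Sentinel-2 mosaic layer by name instead
    print(layer, wms[layer].boundingBoxWGS84)

    # Note: WMS 1.3.0 with EPSG:4326 expects lat/lon axis order, so the bbox is
    # (min lat, min lon, max lat, max lon). Values below roughly cover Italy.
    img = wms.getmap(
        layers=[layer],
        srs="EPSG:4326",
        bbox=(36.0, 6.0, 48.0, 19.0),
        size=(1024, 768),
        format="image/png",
    )
    with open("sentinel2_mosaic.png", "wb") as f:
        f.write(img.read())
    ```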

  16. MAP for website - Satellite Maps Western Hemisphere

    • noaa.hub.arcgis.com
    Updated Aug 4, 2023
    Cite
    NOAA GeoPlatform (2023). MAP for website - Satellite Maps Western Hemisphere [Dataset]. https://noaa.hub.arcgis.com/maps/4406a7daa7b94b5f8c8364f7f2dc9bf2
    Explore at:
    Dataset updated
    Aug 4, 2023
    Dataset authored and provided by
    NOAA GeoPlatform
    License

    CC0 1.0 Universal Public Domain Dedication: https://creativecommons.org/publicdomain/zero/1.0/
    License information was derived automatically

    Area covered
    Description

    This application is intended for informational purposes only and is not an operational product. The tool provides the capability to access, view and interact with satellite imagery, and shows the latest view of Earth as it appears from space.

    For additional imagery from NOAA's GOES East and GOES West satellites, please visit our Imagery and Data page or our cooperative institute partners at CIRA and CIMSS.

    This website should not be used to support operational observation, forecasting, emergency, or disaster mitigation operations, either public or private. In addition, we do not provide weather forecasts on this site — that is the mission of the National Weather Service. Please contact them for any forecast questions or issues.

    Using the Maps

    What does the Layering Options icon mean?
    The Layering Options widget provides a list of operational layers and their symbols, and allows you to turn individual layers on and off. The order in which layers appear in this widget corresponds to the layer order in the map. The top layer 'checked' will indicate what you are viewing in the map, and you may be unable to view the layers below. Layers with expansion arrows indicate that they contain sublayers or subtypes.

    What does the Time Slider icon do?
    The Time Slider widget enables you to view temporal layers in a map, and play the animation to see how the data changes over time. Using this widget, you can control the animation of the data with buttons to play and pause, go to the previous time period, and go to the next time period.

    Do these maps work on mobile devices and different browsers?
    Yes!

    Why are there black stripes / missing data on the map?
    NOAA Satellite Maps is for informational purposes only and is not an operational product; there are times when data is not available.

    Why does the imagery load slowly?
    This map viewer does not load pre-generated web-ready graphics and animations like many satellite imagery apps you may be used to seeing. Instead, it downloads geospatial data from our data servers through a Map Service, and the app in your browser renders the imagery in real-time. Each pixel needs to be rendered and geolocated on the web map for it to load.

    How can I get the raw data and download the GIS World File for the images I choose?
    The geospatial data Map Service for the NOAA Satellite Maps GOES satellite imagery is located on our Satellite Maps ArcGIS REST Web Service (available here). We support open information sharing and integration through this RESTful Service, which can be used by a multitude of GIS software packages and web map applications (both open and licensed). Data is for display purposes only, and should not be used operationally.

    Are there any restrictions on using this imagery?
    NOAA supports an open data policy and we encourage publication of imagery from NOAA Satellite Maps; when doing so, please cite it as "NOAA" and also consider including a permalink (such as this one) to allow others to explore the imagery. For acknowledgment in scientific journals, please use: "We acknowledge the use of imagery from the NOAA Satellite Maps application: LINK". This imagery is not copyrighted. You may use this material for educational or informational purposes, including photo collections, textbooks, public exhibits, computer graphical simulations and internet web pages. This general permission extends to personal web pages.

    About this satellite imagery

    What am I looking at in these maps?
    In this map you are seeing the past 24 hours (updated approximately every 10 minutes) of the Western Hemisphere and Pacific Ocean, as seen by the NOAA GOES East (GOES-16) and GOES West (GOES-18) satellites. In this map you can also view four different 'layers'. The views show 'GeoColor', 'infrared', and 'water vapor'. This map shows the coverage area of the GOES East and GOES West satellites. GOES East, which orbits the Earth from 75.2 degrees west longitude, provides a continuous view of the Western Hemisphere, from the West Coast of Africa to North and South America. GOES West, which orbits the Earth at 137.2 degrees west longitude, sees western North and South America and the central and eastern Pacific Ocean all the way to New Zealand.

    What does the GOES GeoColor imagery show?
    The 'Merged GeoColor' map shows the coverage area of the GOES East and GOES West satellites and includes the entire Western Hemisphere and most of the Pacific Ocean. This imagery uses a combination of visible and infrared channels and is updated approximately every 15 minutes in real time. GeoColor imagery approximates how the human eye would see Earth from space during daylight hours, and is created by combining several of the spectral channels from the Advanced Baseline Imager (ABI) – the primary instrument on the GOES satellites. The wavelengths of reflected sunlight from the red and blue portions of the spectrum are merged with a simulated green wavelength component, creating RGB (red-green-blue) imagery. At night, infrared imagery shows high clouds as white and low clouds and fog as light blue. The static city lights background basemap is derived from a single composite image from the Visible Infrared Imaging Radiometer Suite (VIIRS) Day Night Band. For example, temporary power outages will not be visible. Learn more.

    What does the GOES infrared map show?
    The 'GOES infrared' map displays heat radiating off of clouds and the surface of the Earth and is updated every 15 minutes in near real time. Higher clouds colorized in orange often correspond to more active weather systems. This infrared band is one of 16 channels on the Advanced Baseline Imager, the primary instrument on both the GOES East and West satellites. Infrared satellite imagery can be "colorized" or "color-enhanced" to bring out details in cloud patterns. These color enhancements are useful to meteorologists because they signify "brightness temperatures," which are approximately the temperature of the radiating body, whether it be a cloud or the Earth's surface. In this imagery, yellow and orange areas signify taller/colder clouds, which often correlate with more active weather systems. Blue areas are usually "clear sky," while pale white areas typically indicate low-level clouds. During a hurricane, cloud tops will be higher (and colder), and therefore appear dark red. This imagery is derived from band #13 on the GOES East and GOES West Advanced Baseline Imager.

    How does infrared satellite imagery work?
    The infrared (IR) band detects radiation that is emitted by the Earth's surface, atmosphere and clouds, in the "infrared window" portion of the spectrum. The radiation has a wavelength near 10.3 micrometers, and the term "window" means that it passes through the atmosphere with relatively little absorption by gases such as water vapor. It is useful for estimating the emitting temperature of the Earth's surface and cloud tops. A major advantage of the IR band is that it can sense energy at night, so this imagery is available 24 hours a day.

    What do the colors on the infrared map represent?
    In this imagery, yellow and orange areas signify taller/colder clouds, which often correlate with more active weather systems. Blue areas are clear sky, while pale white areas indicate low-level clouds, or potentially frozen surfaces. Learn more about this weather imagery.

    What does the GOES water vapor map layer show?
    The GOES 'water vapor' map displays the concentration and location of clouds and water vapor in the atmosphere and shows data from both the GOES East and GOES West satellites. Imagery is updated approximately every 15 minutes in real time. Water vapor imagery, which is useful for determining locations of moisture and atmospheric circulations, is created using a wavelength of energy sensitive to the content of water vapor in the atmosphere. In this imagery, green-blue and white areas indicate the presence of high water vapor or moisture content, whereas dark orange and brown areas indicate little or no moisture present. This imagery is derived from band #10 on the GOES East and GOES West Advanced Baseline Imager.

    What do the colors on the water vapor map represent?
    In this imagery, green-blue and white areas indicate the presence of high water vapor or moisture content, whereas dark orange and brown areas indicate less moisture present. Learn more about this water vapor imagery.

    About the satellites

    What are the GOES satellites?
    NOAA's most sophisticated Geostationary Operational Environmental Satellites (GOES), known as the GOES-R Series, provide advanced imagery and atmospheric measurements of Earth's Western Hemisphere, real-time mapping of lightning activity, and improved monitoring of solar activity and space weather. The first satellite in the series, GOES-R, now known as GOES-16, was launched in 2016 and is currently operational as NOAA's GOES East satellite. In 2022, NOAA launched another satellite in the series, GOES-T, which joined GOES-16 in orbit as GOES-18; GOES-18 became operational as GOES West in January 2023. Together, GOES East and GOES West provide coverage of the Western Hemisphere and most of the Pacific Ocean, from the west coast of Africa all the way to New Zealand. Each satellite orbits the Earth from about 22,200 miles away.

  17. Images and 2-class labels for semantic segmentation of Sentinel-2 and...

    • data.niaid.nih.gov
    Updated Dec 2, 2022
    Cite
    Buscombe, Daniel (2022). Images and 2-class labels for semantic segmentation of Sentinel-2 and Landsat RGB satellite images of coasts (water, other) [Dataset]. https://data.niaid.nih.gov/resources?id=zenodo_7384241
    Explore at:
    Dataset updated
    Dec 2, 2022
    Dataset provided by
    Marda Science LLC
    Authors
    Buscombe, Daniel
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Images and 2-class labels for semantic segmentation of Sentinel-2 and Landsat RGB satellite images of coasts (water, other)

    Description

    4088 images and 4088 associated labels for semantic segmentation of Sentinel-2 and Landsat RGB satellite images of coasts. The 2 classes are 1=water, 0=other. Imagery is a mixture of 10-m Sentinel-2 and 15-m pansharpened Landsat 7, 8, and 9 visible-band imagery of various sizes. Only the Red, Green, and Blue bands are included.

    These images and labels could be used within numerous Machine Learning frameworks for image segmentation, but have specifically been made for use with the Doodleverse software package, Segmentation Gym**.

    Two data sources have been combined

    Dataset 1

    1018 image-label pairs from the following data release**** https://doi.org/10.5281/zenodo.7335647

    Labels have been reclassified from 4 classes to 2 classes.

    Some (422) of these images and labels were originally included in the Coast Train*** data release, and have been modified from their original by reclassifying from the original classes to the present 2 classes.

    These images and labels have been made using the Doodleverse software package, Doodler*.

    Dataset 2

    3070 image-label pairs from the Sentinel-2 Water Edges Dataset (SWED)*****, https://openmldata.ukho.gov.uk/, described by Seale et al. (2022)******

    A subset of the original SWED imagery (256 x 256 x 12) and labels (256 x 256 x 1) has been chosen, based on the criterion that more than 2.5% of the pixels represent water.
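    The >2.5% water criterion above is easy to reproduce; the sketch below computes the water-pixel fraction of one 1-band label image and applies the same threshold. The label file path is an assumption.

    ```python
    # Minimal sketch: compute the water-pixel fraction of a 1-band label (1=water, 0=other)
    # and apply the >2.5% selection criterion described above. The file path is an assumption.
    import numpy as np
    from PIL import Image

    label = np.array(Image.open("labels/example_label.png"))  # hypothetical label file
    water_fraction = (label == 1).mean()

    print(f"water fraction: {water_fraction:.3%}")
    print("kept" if water_fraction > 0.025 else "discarded")
    ```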

    File descriptions

    classes.txt, a file containing the class names
    images.zip, a zipped folder containing the 3-band RGB images of varying sizes and extents
    labels.zip, a zipped folder containing the 1-band label images
    overlays.zip, a zipped folder containing a semi-transparent overlay of the color-coded label on the image (red=1=water, blue=0=other)
    resized_images.zip, RGB images resized to 512x512x3 pixels
    resized_labels.zip, label images resized to 512x512x1 pixels

    References

    *Doodler: Buscombe, D., Goldstein, E.B., Sherwood, C.R., Bodine, C., Brown, J.A., Favela, J., Fitzpatrick, S., Kranenburg, C.J., Over, J.R., Ritchie, A.C. and Warrick, J.A., 2021. Human‐in‐the‐Loop Segmentation of Earth Surface Imagery. Earth and Space Science, p. e2021EA002085. https://doi.org/10.1029/2021EA002085. See https://github.com/Doodleverse/dash_doodler.

    **Segmentation Gym: Buscombe, D., & Goldstein, E. B. (2022). A reproducible and reusable pipeline for segmentation of geoscientific imagery. Earth and Space Science, 9, e2022EA002332. https://doi.org/10.1029/2022EA002332 See: https://github.com/Doodleverse/segmentation_gym

    ***Coast Train data release: Wernette, P.A., Buscombe, D.D., Favela, J., Fitzpatrick, S., and Goldstein E., 2022, Coast Train--Labeled imagery for training and evaluation of data-driven models for image segmentation: U.S. Geological Survey data release, https://doi.org/10.5066/P91NP87I. See https://coasttrain.github.io/CoastTrain/ for more information

    ****Buscombe, Daniel, Goldstein, Evan, Bernier, Julie, Bosse, Stephen, Colacicco, Rosa, Corak, Nick, Fitzpatrick, Sharon, del Jesús González Guillén, Anais, Ku, Venus, Paprocki, Julie, Platt, Lindsay, Steele, Bethel, Wright, Kyle, & Yasin, Brandon. (2022). Images and 4-class labels for semantic segmentation of Sentinel-2 and Landsat RGB satellite images of coasts (water, whitewater, sediment, other) (v1.0) [Data set]. Zenodo. https://doi.org/10.5281/zenodo.7335647

    *****Seale, C., Redfern, T., Chatfield, P. 2022. Sentinel-2 Water Edges Dataset (SWED) https://openmldata.ukho.gov.uk/

    ******Seale, C., Redfern, T., Chatfield, P., Luo, C. and Dempsey, K., 2022. Coastline detection in satellite imagery: A deep learning approach on new benchmark data. Remote Sensing of Environment, 278, p.113044.

  18. Satellite Imagery Object Detection Market Research Report 2033

    • dataintelo.com
    csv, pdf, pptx
    Updated Sep 30, 2025
    Cite
    Dataintelo (2025). Satellite Imagery Object Detection Market Research Report 2033 [Dataset]. https://dataintelo.com/report/satellite-imagery-object-detection-market
    Explore at:
    Available download formats: pdf, pptx, csv
    Dataset updated
    Sep 30, 2025
    Dataset authored and provided by
    Dataintelo
    License

    https://dataintelo.com/privacy-and-policy

    Time period covered
    2024 - 2032
    Area covered
    Global
    Description

    Satellite Imagery Object Detection Market Outlook



    The global satellite imagery object detection market size reached USD 1.92 billion in 2024, according to the latest research, and is expected to grow at a CAGR of 13.4% during the forecast period, reaching USD 5.76 billion by 2033. This robust growth is primarily driven by the increasing adoption of advanced analytics technologies, such as artificial intelligence and deep learning, which have significantly enhanced the accuracy and efficiency of object detection from satellite images. The rising demand for real-time geospatial intelligence across defense, disaster management, urban planning, and agriculture further accelerates the expansion of the satellite imagery object detection market globally.



    One of the primary growth factors for the satellite imagery object detection market is the rapid advancement in remote sensing technologies and artificial intelligence. The integration of machine learning and deep learning algorithms into satellite imagery analysis has drastically improved the precision and speed of object detection, enabling more effective monitoring and assessment of large geographic areas. This technological evolution is particularly vital for sectors like defense and intelligence, where timely and accurate detection of objects, such as vehicles, ships, or infrastructure, can be critical for national security and strategic planning. Furthermore, the proliferation of high-resolution commercial satellites and the availability of open-source satellite data have democratized access to satellite imagery, fostering innovation and enabling a broader range of applications across industries.



    Another significant driver is the growing need for disaster management and environmental monitoring solutions. As climate change leads to more frequent and severe natural disasters, governments and organizations are increasingly leveraging satellite imagery object detection to assess damage, monitor environmental changes, and coordinate emergency response efforts. The ability to detect and analyze objects such as flooded areas, damaged infrastructure, or deforestation in near real-time is invaluable for minimizing human and economic losses. Additionally, the adoption of satellite imagery in agriculture for crop monitoring, yield estimation, and precision farming is gaining momentum, further contributing to the expansion of the satellite imagery object detection market.



    The market is also benefiting from the expansion of commercial applications, including urban planning, oil and gas exploration, and logistics. Urban planners and municipal authorities utilize satellite imagery object detection to monitor urban sprawl, plan infrastructure development, and manage land use more efficiently. In the oil and gas sector, companies employ these solutions for pipeline monitoring, site selection, and environmental compliance. The growing emphasis on smart cities and sustainable development is expected to create new opportunities for market participants, as satellite imagery becomes an essential tool for data-driven decision-making and resource optimization.



    Regionally, North America continues to dominate the satellite imagery object detection market, driven by significant investments in space technology, a strong presence of leading technology providers, and robust demand from defense and commercial sectors. Europe and Asia Pacific are also witnessing substantial growth, fueled by increasing government initiatives, technological advancements, and the rising adoption of satellite-based solutions across various industries. Emerging markets in Latin America and the Middle East & Africa are gradually catching up, supported by infrastructure modernization projects and growing awareness of the benefits of satellite imagery analytics.



    Component Analysis



    The satellite imagery object detection market by component is segmented into software, hardware, and services. Software forms the backbone of the market, accounting for the largest share, as it encompasses the algorithms and platforms that enable the detection, classification, and analysis of objects within satellite images. With advancements in AI and machine learning, software solutions are becoming increasingly sophisticated, offering higher accuracy, faster processing times, and seamless integration with existing geospatial information systems (GIS). Companies are investing heavily in developing proprietary software that can handle large volumes of data, automate feature ex

  19. GeoEye-1 Level 1B Panchromatic Satellite Imagery

    • catalog.data.gov
    • s.cnmilf.com
    • +3more
    Updated Sep 19, 2025
    + more versions
    Cite
    NASA/CSDA (2025). GeoEye-1 Level 1B Panchromatic Satellite Imagery [Dataset]. https://catalog.data.gov/dataset/geoeye-1-level-1b-panchromatic-satellite-imagery
    Explore at:
    Dataset updated
    Sep 19, 2025
    Dataset provided by
    NASA (http://nasa.gov/)
    Description

    The GeoEye-1 Level 1B Panchromatic Imagery collection contains satellite imagery acquired from Maxar Technologies (formerly known as DigitalGlobe) by the Commercial Smallsat Data Acquisition (CSDA) Program. Imagery is collected by the GeoEye-1 satellite using the GeoEye-1 Imaging System across the global land surface from September 2008 to the present. This data product includes panchromatic imagery with a spatial resolution of 0.46m at nadir (0.41m before summer 2013) and a temporal resolution of approximately 3 days. The data are provided in National Imagery Transmission Format (NITF) and GeoTIFF formats. This level 1B data is sensor corrected and is an un-projected (raw) product. The data potentially serve a wide variety of applications that require high resolution imagery. Data access is restricted based on a National Geospatial-Intelligence Agency (NGA) license, and investigators must be approved by the CSDA Program.

  20. WorldView-3 Level 2A Multispectral 8-Band Satellite Imagery

    • catalog.data.gov
    • gimi9.com
    • +1more
    Updated Sep 18, 2025
    + more versions
    Cite
    NASA/CSDA (2025). WorldView-3 Level 2A Multispectral 8-Band Satellite Imagery [Dataset]. https://catalog.data.gov/dataset/worldview-3-level-2a-multispectral-8-band-satellite-imagery
    Explore at:
    Dataset updated
    Sep 18, 2025
    Dataset provided by
    NASA (http://nasa.gov/)
    Description

    The WorldView-3 Level 2A Multispectral 8-Band Imagery collection contains satellite imagery acquired from Maxar Technologies (formerly known as DigitalGlobe) by the Commercial Smallsat Data Acquisition (CSDA) Program. Imagery is collected by the DigitalGlobe WorldView-3 satellite using the WorldView-110 camera across the global land surface from August 2014 to the present. This satellite imagery is in a range of wavebands with data in the coastal, blue, green, yellow, red, red edge, and near-infrared (2 bands) wavelengths. The imagery has a spatial resolution of 1.24m at nadir and a temporal resolution of less than one day. The data are provided in National Imagery Transmission Format (NITF). These Level 2A data have undergone radiometric correction and sensor correction, have been projected to a plane using a map projection and datum, and have had a coarse DEM applied. The data potentially serve a wide variety of applications that require high resolution imagery. Data access is restricted based on a National Geospatial-Intelligence Agency (NGA) license, and investigators must be approved by the CSDA Program.
