100+ datasets found
  1. Data from: GAZADeepDAV: A High Resolution Geotagged Satellite Imagery...

    • ieee-dataport.org
    Updated Oct 9, 2024
    Cite
    Marwen Bouabid (2024). GAZADeepDAV: A High Resolution Geotagged Satellite Imagery Dataset for Analyzing War-Induced Damage [Dataset]. https://ieee-dataport.org/documents/gazadeepdav-high-resolution-geotagged-satellite-imagery-dataset-analyzing-war-induced
    Explore at:
    Dataset updated
    Oct 9, 2024
    Authors
    Marwen Bouabid
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    validation

  2. High Resolution Satellite Imagery

    • ouvert.canada.ca
    • catalogue.arctic-sdi.org
    • +1more
    esri rest, html
    Updated Jan 9, 2025
    + more versions
    Cite
    Government of Yukon (2025). High Resolution Satellite Imagery [Dataset]. https://ouvert.canada.ca/data/dataset/0a14b357-8a89-6e98-720e-3a800022cb99
    Explore at:
    Available download formats: esri rest, html
    Dataset updated
    Jan 9, 2025
    Dataset provided by
    Government of Yukon
    License

    Open Government Licence - Canada 2.0, https://open.canada.ca/en/open-government-licence-canada
    License information was derived automatically

    Description

    This image service contains high resolution satellite imagery for selected regions throughout the Yukon. Imagery is 1m pixel resolution, or better. Imagery was supplied by the Government of Yukon and the Canadian Department of National Defence. All the imagery in this service is licensed. If you have any questions about Yukon government satellite imagery, please contact Geomatics.Help@gov.yk.ca. This service is managed by Geomatics Yukon.

  3. Global commercial satellite imagery data 2022, by spatial resolution

    • statista.com
    Updated Mar 2, 2022
    Cite
    Statista (2022). Global commercial satellite imagery data 2022, by spatial resolution [Dataset]. https://www.statista.com/statistics/1293723/commercial-satellite-imagery-resolution-worldwide/
    Explore at:
    Dataset updated
    Mar 2, 2022
    Dataset authored and provided by
    Statista (http://statista.com/)
    Time period covered
    2022
    Area covered
    World
    Description

    Satellite images are essentially eyes in the sky. Some recent satellites, such as WorldView-3, provide images with a spatial resolution of *** meters. With a revisit time of under ** hours, such a satellite can capture a new image of the exact same location on every pass.

    Spatial resolution explained

    Spatial resolution is the physical size on the ground represented by a single pixel of the image; in other words, it is a measure of the smallest object that the sensor can resolve, expressed in meters. Generally, spatial resolution can be divided into three categories:

    – Low resolution: over 60m/pixel. (Useful for regional perspectives such as monitoring larger forest areas)

    – Medium resolution: 10‒30m/pixel. (Useful for monitoring crop fields or smaller forest patches)

    – High to very high resolution: ****‒5m/pixel. (Useful for monitoring smaller objects like buildings, narrow streets, or vehicles)

    The choice of resolution depends on the intended application of the final product, since the labor required, from person-hours to computing power, increases with the resolution of the imagery.
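    As a quick illustration of these categories, a minimal helper (not part of any dataset above, and with cutoffs between the bands chosen as assumptions, since the 30-60 m and 5-10 m gaps are not defined by the source) might classify a ground sample distance as follows:

```python
def resolution_category(gsd_m_per_pixel: float) -> str:
    """Classify a ground sample distance (m/pixel) into the bands listed above.

    The exact thresholds between bands are illustrative assumptions.
    """
    if gsd_m_per_pixel > 60:
        return "low"              # regional perspectives, e.g. large forest areas
    if gsd_m_per_pixel >= 10:
        return "medium"           # crop fields, smaller forest patches
    return "high / very high"     # buildings, narrow streets, vehicles


print(resolution_category(30))    # medium
print(resolution_category(0.5))   # high / very high
```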

  4. The WorldStrat Dataset: Open High-Resolution Satellite Imagery With Paired...

    • zenodo.org
    application/gzip, csv +2
    Updated Jul 16, 2024
    Cite
    Julien Cornebise; Julien Cornebise; Ivan Oršolić; Ivan Oršolić; Freddie Kalaitzis; Freddie Kalaitzis (2024). The WorldStrat Dataset: Open High-Resolution Satellite Imagery With Paired Multi-Temporal Low-Resolution [Dataset]. http://doi.org/10.5281/zenodo.6810792
    Explore at:
    Available download formats: csv, application/gzip, txt, pdf
    Dataset updated
    Jul 16, 2024
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Julien Cornebise; Julien Cornebise; Ivan Oršolić; Ivan Oršolić; Freddie Kalaitzis; Freddie Kalaitzis
    Description

    What is this dataset?

    Nearly 10,000 km² of free high-resolution and matched low-resolution satellite imagery of unique locations which ensure stratified representation of all types of land-use across the world: from agriculture to ice caps, from forests to multiple urbanization densities.

    Those locations are also enriched with typically under-represented locations in ML datasets: sites of humanitarian interest, illegal mining sites, and settlements of persons at risk.

    Each high-resolution image (1.5 m/pixel) comes with multiple temporally-matched low-resolution images from the freely accessible lower-resolution Sentinel-2 satellites (10 m/pixel).

    We accompany this dataset with a paper, datasheet for datasets and an open-source Python package to: rebuild or extend the WorldStrat dataset, train and infer baseline algorithms, and learn with abundant tutorials, all compatible with the popular EO-learn toolbox.
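    A minimal sketch of how one such high-resolution/low-resolution pair could be loaded with rasterio for multi-frame super-resolution work; the file paths and naming are hypothetical, since the actual folder layout is documented in the dataset's datasheet and Python package:

```python
import numpy as np
import rasterio

# Hypothetical file layout; the real WorldStrat structure and names may differ.
hr_path = "worldstrat/hr/AOI-0001_spot.tif"                                   # ~1.5 m/pixel high-res scene
lr_paths = [f"worldstrat/lr/AOI-0001_s2_revisit{i}.tif" for i in range(8)]    # 10 m Sentinel-2 revisits

with rasterio.open(hr_path) as src:
    hr = src.read()                      # (bands, H, W)

lr_frames = []
for path in lr_paths:
    with rasterio.open(path) as src:
        lr_frames.append(src.read())
lr_stack = np.stack(lr_frames)           # (revisits, bands, h, w)

print(hr.shape, lr_stack.shape)          # target/input pair for multi-frame super-resolution
```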

    Why make this?

    We hope to foster broad-spectrum applications of ML to satellite imagery, and possibly to develop, from free public low-resolution Sentinel-2 imagery, the same power of analysis currently allowed only by costly private high-resolution imagery. We illustrate this specific point by training and releasing several highly compute-efficient baselines on the task of Multi-Frame Super-Resolution.

    Licences

    • The high-resolution Airbus imagery is distributed, with authorization from Airbus, under Creative Commons Attribution-NonCommercial 4.0 International (CC BY-NC 4.0).
    • The labels, Sentinel-2 imagery, and trained weights are released under Creative Commons Attribution 4.0 International (CC BY 4.0).
    • The source code (to be released shortly on GitHub) is under the 3-Clause BSD license.
  5. High-Resolution QuickBird Imagery and Related GIS Layers for Barrow, Alaska,...

    • catalog.data.gov
    • datasets.ai
    • +4more
    Updated Apr 11, 2025
    + more versions
    Cite
    NSIDC (2025). High-Resolution QuickBird Imagery and Related GIS Layers for Barrow, Alaska, USA, Version 1 [Dataset]. https://catalog.data.gov/dataset/high-resolution-quickbird-imagery-and-related-gis-layers-for-barrow-alaska-usa-version-1
    Explore at:
    Dataset updated
    Apr 11, 2025
    Dataset provided by
    NSIDC
    Area covered
    United States, Alaska, Utqiagvik
    Description

    This data set contains high-resolution QuickBird imagery and geospatial data for the entire Barrow QuickBird image area (156.15° W - 157.07° W, 71.15° N - 71.41° N) and Barrow B4 Quadrangle (156.29° W - 156.89° W, 71.25° N - 71.40° N), for use in Geographic Information Systems (GIS) and remote sensing software. The original QuickBird data sets were acquired by DigitalGlobe from 1 to 2 August 2002, and consist of orthorectified satellite imagery. Federal Geographic Data Committee (FGDC)-compliant metadata for all value-added data sets are provided in text, HTML, and XML formats. Accessory layers include: 1:250,000- and 1:63,360-scale USGS Digital Raster Graphic (DRG) mosaic images (GeoTIFF format); 1:250,000- and 1:63,360-scale USGS quadrangle index maps (ESRI Shapefile format); an index map for the 62 QuickBird tiles (ESRI Shapefile format); and a simple polygon layer of the extent of the Barrow QuickBird image area and the Barrow B4 quadrangle area (ESRI Shapefile format). Unmodified QuickBird data comprise 62 data tiles in Universal Transverse Mercator (UTM) Zone 4 in GeoTIFF format. Standard release files describing the QuickBird data are included, along with the DigitalGlobe license agreement and product handbooks. The baseline geospatial data support education, outreach, and multi-disciplinary research of environmental change in Barrow, which is an area of focused scientific interest. Data are provided on four DVDs. This product is available only to investigators funded specifically from the National Science Foundation (NSF), Office of Polar Programs (OPP), Arctic Sciences Section. An NSF OPP award number must be provided when ordering this data.

  6. Data from: Satellite Image

    • open.canada.ca
    • ouvert.canada.ca
    pdf
    Updated Mar 14, 2022
    + more versions
    Cite
    Natural Resources Canada (2022). Satellite Image [Dataset]. https://open.canada.ca/data/en/dataset/912a9d77-0a3f-5e0c-91f5-197ee5317e9f
    Explore at:
    Available download formats: pdf
    Dataset updated
    Mar 14, 2022
    Dataset provided by
    Ministry of Natural Resources of Canada (https://www.nrcan.gc.ca/)
    License

    Open Government Licence - Canada 2.0, https://open.canada.ca/en/open-government-licence-canada
    License information was derived automatically

    Description

    The satellite image of Canada is a composite of several individual satellite images from the Advanced Very High Resolution Radiometer (AVHRR) sensor on board various NOAA satellites. The colours reflect differences in the density of vegetation cover: bright green for dense vegetation in humid southern regions; yellow for semi-arid and for mountainous regions; brown for the north where vegetation cover is very sparse; and white for snow and ice. An inset map shows a satellite image mosaic of North America with 35 land cover classes, based on data from the SPOT satellite VGT (vegetation) sensor.

  7. Vertical artifacts in high-resolution WorldView-2 and WorldView-3 satellite...

    • catalog.data.gov
    • datasets.ai
    Updated Mar 14, 2022
    Cite
    U.S. EPA Office of Research and Development (ORD) (2022). Vertical artifacts in high-resolution WorldView-2 and WorldView-3 satellite imagery [Dataset]. https://catalog.data.gov/dataset/vertical-artifacts-in-high-resolution-worldview-2-and-worldview-3-satellite-imagery
    Explore at:
    Dataset updated
    Mar 14, 2022
    Dataset provided by
    United States Environmental Protection Agency (http://www.epa.gov/)
    Description

    Satellite sensor artifacts can negatively impact the interpretation of satellite data. One such artifact is linear features in imagery, which can be caused by a variety of sensor issues and can present as either wide, consistent features called banding, or as narrow, inconsistent features called striping. This study used high-resolution data from DigitalGlobe's WorldView-3 satellite collected at Lake Okeechobee, Florida, on 30 August 2017. Although the sensor is primarily designed for land observation, this study investigated the impact of vertical artifacts on both at-sensor radiance and a spectral index for an aquatic target. This dataset is not publicly accessible because NGA NextView license agreements prohibit the distribution of original data files from WorldView due to copyright, and National Geospatial-Intelligence Agency contract details prevent distribution of Maxar data. Questions regarding NextView can be sent to NGANextView_License@nga.mil. Questions regarding the NASA Commercial Data Buy can be sent to yvonne.ivey@nasa.gov. Format: high-resolution data from DigitalGlobe's WorldView-3 satellite. This dataset is associated with the following publication: Coffer, M., P. Whitman, B. Schaeffer, V. Hill, R. Zimmerman, W. Salls, M. Lebrasse, and D. Graybill. Vertical artifacts in high-resolution WorldView-2 and WorldView-3 satellite imagery of aquatic systems. INTERNATIONAL JOURNAL OF REMOTE SENSING. Taylor & Francis, Inc., Philadelphia, PA, USA, 43(4): 1199-1225, (2022).
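    Vertical striping of the kind described here is commonly screened for by comparing per-column statistics against the local scene trend. The sketch below is a generic illustration of that idea, not the authors' published method:

```python
import numpy as np

def flag_vertical_stripes(band: np.ndarray, z_thresh: float = 3.0) -> np.ndarray:
    """Flag image columns whose mean value deviates strongly from the local trend.

    `band` is a 2-D array of at-sensor radiance; returns a boolean mask of
    suspect columns (candidate stripes).
    """
    col_means = np.nanmean(band, axis=0)
    kernel = np.ones(31) / 31.0                     # smooth to estimate the large-scale trend
    trend = np.convolve(col_means, kernel, mode="same")
    residual = col_means - trend
    z_scores = (residual - residual.mean()) / residual.std()
    return np.abs(z_scores) > z_thresh
```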

  8. Declassified Satellite Imagery 2 (2002)

    • catalog.data.gov
    • gimi9.com
    • +4more
    Updated Apr 10, 2025
    Cite
    DOI/USGS/EROS (2025). Declassified Satellite Imagery 2 (2002) [Dataset]. https://catalog.data.gov/dataset/declassified-satellite-imagery-2-2002
    Explore at:
    Dataset updated
    Apr 10, 2025
    Dataset provided by
    DOI/USGS/EROS
    Description

    Declassified satellite images provide an important worldwide record of land-surface change. With the success of the first release of classified satellite photography in 1995, images from U.S. military intelligence satellites KH-7 and KH-9 were declassified in accordance with Executive Order 12951 in 2002. The data were originally used for cartographic information and reconnaissance for U.S. intelligence agencies. Since the images could be of historical value for global change research and were no longer critical to national security, the collection was made available to the public. Keyhole (KH) satellite systems KH-7 and KH-9 acquired photographs of the Earth’s surface with a telescopic camera system and transported the exposed film through the use of recovery capsules. The capsules, or buckets, were de-orbited and retrieved by aircraft as they parachuted to Earth. The exposed film was developed and the images were analyzed for a range of military applications. The KH-7 surveillance system was a high resolution imaging system that was operational from July 1963 to June 1967. Approximately 18,000 black-and-white images and 230 color images are available from the 38 missions flown during this program. Key features for this program were larger area of coverage and improved ground resolution. The cameras acquired imagery in continuous lengthwise sweeps of the terrain. KH-7 images are 9 inches wide, vary in length from 4 inches to 500 feet, and have a resolution of 2 to 4 feet. The KH-9 mapping program was operational from March 1973 to October 1980 and was designed to support mapping requirements and exact positioning of geographical points for the military. This was accomplished by using image overlap for stereo coverage and by using a camera system with a reseau grid to correct image distortion. The KH-9 framing cameras produced 9 x 18 inch imagery at a resolution of 20-30 feet. Approximately 29,000 mapping images were acquired from 12 missions. The original film sources are maintained by the National Archives and Records Administration (NARA). Duplicate film sources held in the USGS EROS Center archive are used to produce digital copies of the imagery.

  9. Data from: Site-specific management of cotton root rot using airborne and...

    • catalog.data.gov
    • agdatacommons.nal.usda.gov
    Updated Apr 21, 2025
    Cite
    Agricultural Research Service (2025). Data from: Site-specific management of cotton root rot using airborne and high resolution satellite imagery and variable rate technology [Dataset]. https://catalog.data.gov/dataset/data-from-site-specific-management-of-cotton-root-rot-using-airborne-and-high-resolution-s-9a191
    Explore at:
    Dataset updated
    Apr 21, 2025
    Dataset provided by
    Agricultural Research Service (https://www.ars.usda.gov/)
    Description

    Cotton root rot is a century-old cotton disease that now can be effectively controlled with Topguard Terra fungicide. Because this disease tends to occur in the same general areas within fields in recurring years, site-specific application of the fungicide only to infested areas can be as effective as and considerably more economical than uniform application. The overall objective of this research was to demonstrate how site-specific fungicide application could be implemented based on historical remote sensing imagery and using variable-rate technology. Procedures were developed for creating binary prescription maps from historical airborne and high-resolution satellite imagery. Two different variable-rate liquid control systems were adapted to two existing cotton planters, respectively, for site-specific fungicide application at planting. One system was used for site-specific application on multiple fields in 2015 and 2016 near Edroy, Texas, and the other system was used on multiple fields in both years near San Angelo, Texas. Airborne multispectral imagery taken during the two growing seasons was used to monitor the performance of the site-specific treatments. Results based on prescription maps derived from historical airborne and satellite imagery of two fields in 2015 and one field in 2016 are reported in this article. Two years of field experiments showed that the prescription maps and the variable-rate systems performed well and that site-specific fungicide treatments effectively controlled cotton root rot. Reduction in fungicide use was 41%, 43%, and 63% for the three fields, respectively. The methodologies and results of this research will provide cotton growers, crop consultants, and agricultural dealers with practical guidelines for implementing site-specific fungicide application using historical imagery and variable-rate technology for effective management of cotton root rot.

    Resources in this dataset:

    • Resource Title: A ground picture of cotton root rot. File Name: IMG_0124.JPG. Resource Description: A cotton root rot-infested area in a cotton field near Edroy, TX.
    • Resource Title: An aerial image of a cotton field. File Name: Color-infrared image of a field.jpg. Resource Description: Aerial color-infrared (CIR) image of a cotton field infested with cotton root rot.
    • Resource Title: As-applied fungicide application data. File Name: Jim Ermis-Farm 1-Field 11 Fungicide Application.csv. Resource Description: As-applied fungicide application rates for variable rate application of Topguard to a cotton field infested with cotton root rot.

  10. MSG: High resolution visible imagery over the UK

    • catalogue.ceda.ac.uk
    • data-search.nerc.ac.uk
    Updated Jul 18, 2025
    Cite
    NERC EDS Centre for Environmental Data Analysis (2025). MSG: High resolution visible imagery over the UK [Dataset]. https://catalogue.ceda.ac.uk/uuid/d9935bb3ebc54939bd3cc4ee05d88892
    Explore at:
    Dataset updated
    Jul 18, 2025
    Dataset provided by
    Centre for Environmental Data Analysis (http://www.ceda.ac.uk/)
    License

    https://artefacts.ceda.ac.uk/licences/specific_licences/msg.pdf

    Area covered
    Variables measured
    Visible Imagery, http://vocab.ndg.nerc.ac.uk/term/P141/4/GVAR0925
    Description

    The Meteosat Second Generation (MSG) satellites, operated by EUMETSAT (The European Organisation for the Exploitation of Meteorological Satellites), provide almost continuous imagery to meteorologists and researchers in Europe and around the world. These include visible, infra-red, water vapour, High Resolution Visible (HRV) images and derived cloud top height, cloud top temperature, fog, snow detection and volcanic ash products. These images are available for a range of geographical areas.

    This dataset contains high resolution visible images from MSG satellites over the UK area. Imagery is available from March 2005 onwards at a frequency of 15 minutes (some images are hourly) and is at least 24 hours old.

    The geographic extent for images within this dataset is available via the linked documentation 'MSG satellite imagery product geographic area details'. Each MSG imagery product area can be referenced from the third and fourth characters of the image product name given in the filename. E.g., for EEAO11 the corresponding geographic details can be found under the entry for area code 'AO' (i.e., West Africa).
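    A small sketch of that lookup, keeping only the area code given as an example here; the remaining codes would need to be filled in from the linked documentation:

```python
# The third and fourth characters of the product name encode the geographic area.
AREA_CODES = {
    "AO": "West Africa",  # the example given in the description above
    # Remaining codes are listed in the linked 'MSG satellite imagery
    # product geographic area details' documentation.
}

def area_of(product_name: str) -> str:
    code = product_name[2:4]
    return AREA_CODES.get(code, f"unknown area code '{code}'")

print(area_of("EEAO11"))  # West Africa
```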

  11. AK RGB High Resolution Imagery (50cm)

    • gis.data.alaska.gov
    • statewide-geoportal-1-soa-dnr.hub.arcgis.com
    Updated Jan 22, 2021
    Cite
    Alaska Department of Natural Resources ArcGIS Online (2021). AK RGB High Resolution Imagery (50cm) [Dataset]. https://gis.data.alaska.gov/maps/13dd1ccf165845eea5db36465e7d565c
    Explore at:
    Dataset updated
    Jan 22, 2021
    Dataset provided by
    https://arcgis.com/
    Authors
    Alaska Department of Natural Resources ArcGIS Online
    Area covered
    Description

    Suggested use: Use tiled Map Service for large scale mapping when high resolution color imagery is needed. A web app to view tile and block metadata such as year, sensor, and cloud cover can be found here.

    • Coverage: State of Alaska
    • Product Type: Tile Cache
    • Image Bands: RGB
    • Spatial Resolution: 50cm
    • Accuracy: 5m CE90 or better
    • Cloud Cover: <10% overall
    • Off Nadir Angle: <30 degrees
    • Sun Elevation: >30 degrees

    WMS version of this data: https://geoportal.alaska.gov/arcgis/services/ahri_2020_rgb_cache/MapServer/WMSServer?request=GetCapabilities&service=WMS

    WMTS version of this data: https://geoportal.alaska.gov/arcgis/rest/services/ahri_2020_rgb_cache/MapServer/WMTS/1.0.0/WMTSCapabilities.xml
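    The WMS endpoint above can be queried directly; a minimal sketch with requests fetching the capabilities document, using only the parameters shown in the URL:

```python
import requests

WMS_URL = "https://geoportal.alaska.gov/arcgis/services/ahri_2020_rgb_cache/MapServer/WMSServer"

response = requests.get(
    WMS_URL,
    params={"request": "GetCapabilities", "service": "WMS"},
    timeout=30,
)
response.raise_for_status()
print(response.text[:500])  # XML listing the layers, formats, and supported CRSs
```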

  12. A circa 2010 global land cover reference dataset from commercial high...

    • catalog.data.gov
    • data.usgs.gov
    Updated Jul 6, 2024
    + more versions
    Cite
    U.S. Geological Survey (2024). A circa 2010 global land cover reference dataset from commercial high resolution satellite data [Dataset]. https://catalog.data.gov/dataset/a-circa-2010-global-land-cover-reference-dataset-from-commercial-high-resolution-satellite
    Explore at:
    Dataset updated
    Jul 6, 2024
    Dataset provided by
    U.S. Geological Survey
    Description

    The data are 475 thematic land cover rasters at 2m resolution. Land cover was classified into the following classes: Tree (1), Water (2), Barren (3), Other Vegetation (4) and Ice & Snow (8). Cloud cover and shadow were sometimes coded as Cloud (5) and Shadow (6); however, for any land cover application these would be considered NoData. Some rasters may have Cloud and Shadow pixels coded or recoded to NoData already. Commercial high-resolution satellite data was used to create the classifications. Usable image data for the target year (2010) was acquired for 475 of the 500 primary sample locations, with 90% of images acquired within ±2 years of the 2010 target. The remaining 25 of the 500 sample blocks had no usable data and so could not be mapped. Tabular data is included with the raster classifications indicating the specific high-resolution sensor and date of acquisition for source imagery as well as the stratum to which that sample block belonged. Methods for this classification are described in Pengra et al. (2015). A 1-stage cluster sampling design was used where 500 (475 usable) 5 km x 5 km sample blocks were the primary sampling units (note: the nominal size was 5 km x 5 km, but some blocks deviate in dimensions due to only partial coverage of the sample block with usable imagery). Sample blocks were selected using stratified random sampling within a sample frame stratified by a modification of the Köppen Climate/Vegetation classification and population density (Olofsson et al., 2012). Secondary sampling units are each of the classified 2m pixels of the raster. This design satisfies the criteria that define a probability sampling design and thus serves as the basis to support rigorous design-based statistical inference (Stehman, 2000).
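    Before analysis, Cloud (5) and Shadow (6) pixels can be recoded to NoData as described above. A minimal sketch with rasterio and numpy, using a hypothetical file name and an illustrative NoData value:

```python
import numpy as np
import rasterio

NODATA = 255  # illustrative NoData value; check the raster's own metadata first

with rasterio.open("sample_block_2010.tif") as src:   # hypothetical file name
    classes = src.read(1)
    profile = src.profile

# Recode Cloud (5) and Shadow (6) to NoData; Tree (1), Water (2), Barren (3),
# Other Vegetation (4) and Ice & Snow (8) are left unchanged.
classes[np.isin(classes, [5, 6])] = NODATA

profile.update(nodata=NODATA)
with rasterio.open("sample_block_2010_clean.tif", "w", **profile) as dst:
    dst.write(classes, 1)
```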

  13. NZ 10m Satellite Imagery (2020-2021)

    • data.linz.govt.nz
    • geodata.nz
    dwg with geojpeg +8
    + more versions
    Cite
    Land Information New Zealand, NZ 10m Satellite Imagery (2020-2021) [Dataset]. https://data.linz.govt.nz/layer/106279-nz-10m-satellite-imagery-2020-2021/
    Explore at:
    Available download formats: pdf, erdas imagine, kea, jpeg2000 lossless, geotiff, jpeg2000, geojpeg, kml, dwg with geojpeg
    Dataset authored and provided by
    Land Information New Zealand (https://www.linz.govt.nz/)
    License

    https://data.linz.govt.nz/license/attribution-4-0-international/

    Area covered
    Description

    This dataset provides a seamless cloud-free 10m resolution satellite imagery layer of the New Zealand mainland and offshore islands.

    The imagery was captured by the European Space Agency Sentinel-2 satellites between September 2020 and April 2021.

    Technical specifications:

    • 450 x ortho-rectified RGB GeoTIFF images in NZTM projection, tiled into the LINZ Standard 1:50,000 tile layout
    • Satellite sensors: ESA Sentinel-2A and Sentinel-2B
    • Acquisition dates: September 2020 - April 2021
    • Spectral resolution: R, G, B
    • Spatial resolution: 10 meters
    • Radiometric resolution: 8-bits (downsampled from 12-bits)

    This is a visual product only. The data has been downsampled from 12-bits to 8-bits, and the original values of the images have been modified for visualisation purposes.
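    For reference, downsampling from 12-bit to 8-bit amounts to mapping the 0-4095 range onto 0-255; the exact stretch LINZ applied for visualisation is not documented here, so the linear rescale below is only an illustrative assumption:

```python
import numpy as np

def rescale_12bit_to_8bit(band: np.ndarray) -> np.ndarray:
    """Linearly map 12-bit values (0-4095) onto the 8-bit range (0-255)."""
    return np.clip(np.round(band * (255.0 / 4095.0)), 0, 255).astype(np.uint8)
```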

  14. SnowEx Colorado 3M Snow Depth Time Series and DEMs from High-Resolution...

    • catalog.data.gov
    • dataone.org
    • +4more
    Updated Apr 10, 2025
    + more versions
    Cite
    NASA NSIDC DAAC (2025). SnowEx Colorado 3M Snow Depth Time Series and DEMs from High-Resolution Satellite Image Pairs V001 [Dataset]. https://catalog.data.gov/dataset/snowex-colorado-3m-snow-depth-time-series-and-dems-from-high-resolution-satellite-image-pa
    Explore at:
    Dataset updated
    Apr 10, 2025
    Dataset provided by
    NASA (http://nasa.gov/)
    Area covered
    Colorado
    Description

    This data set contains a time series of snow depth maps and related intermediary snow-on and snow-off DEMs for Grand Mesa and the Banded Peak Ranch areas of Colorado derived from very-high-resolution (VHR) satellite stereo images and lidar point cloud data. Two of the snow depth maps coincide temporally with the 2017 NASA SnowEx Grand Mesa field campaign, providing a comparison between the satellite derived snow depth and in-situ snow depth measurements. The VHR stereo images were acquired each year between 2016 and 2022 during the approximate timing of peak snow depth by the Maxar WorldView-2, WorldView-3, and CNES/Airbus Pléiades-HR 1A and 1B satellites, while lidar data was sourced from the USGS 3D Elevation Program.

  15. World Imagery - ESRI

    • hub.arcgis.com
    • fesec-cesj.opendata.arcgis.com
    Updated Feb 14, 2019
    Cite
    Centre d'enseignement Saint-Joseph de Chimay (2019). World Imagery - ESRI [Dataset]. https://hub.arcgis.com/maps/CESJ::world-imagery-esri/about
    Explore at:
    Dataset updated
    Feb 14, 2019
    Dataset authored and provided by
    Centre d'enseignement Saint-Joseph de Chimay
    Area covered
    World
    Description

    World Imagery provides one meter or better satellite and aerial imagery in many parts of the world and lower resolution satellite imagery worldwide. The map includes 15m TerraColor imagery at small and mid-scales (~1:591M down to ~1:72k) and 2.5m SPOT Imagery (~1:288k to ~1:72k) for the world. The map features 0.5m resolution imagery in the continental United States and parts of Western Europe from DigitalGlobe. Additional DigitalGlobe sub-meter imagery is featured in many parts of the world. In the United States, 1 meter or better resolution NAIP imagery is available in some areas. In other parts of the world, imagery at different resolutions has been contributed by the GIS User Community. In select communities, very high resolution imagery (down to 0.03m) is available down to ~1:280 scale. You can contribute your imagery to this map and have it served by Esri via the Community Maps Program. View the list of Contributors for the World Imagery Map.

    Coverage: View the links below to learn more about recent updates and map coverage: What's new in World Imagery; World coverage map.

    Citations: This layer includes imagery provider, collection date, resolution, accuracy, and source of the imagery. With the Identify tool in ArcGIS Desktop or the ArcGIS Online Map Viewer you can see imagery citations. Citations returned apply only to the available imagery at that location and scale. You may need to zoom in to view the best available imagery. Citations can also be accessed in the World Imagery with Metadata web map.

    Use: You can add this layer to the ArcGIS Online Map Viewer, ArcGIS Desktop, or ArcGIS Pro. To view this layer with a useful reference overlay, open the Imagery Hybrid web map. A similar raster web map, Imagery with Labels, is also available.

    Feedback: Have you ever seen a problem in the Esri World Imagery Map that you wanted to report? You can use the Imagery Map Feedback web map to provide comments on issues. The feedback will be reviewed by the ArcGIS Online team and considered for one of our updates.

  16. QuickBird full archive

    • cmr.earthdata.nasa.gov
    • eocat.esa.int
    • +2more
    not provided
    Updated Apr 24, 2025
    Cite
    (2025). QuickBird full archive [Dataset]. https://cmr.earthdata.nasa.gov/search/concepts/C1965336934-ESA.html
    Explore at:
    Available download formats: not provided
    Dataset updated
    Apr 24, 2025
    Time period covered
    Nov 1, 2001 - Mar 31, 2015
    Area covered
    Earth
    Description

    QuickBird high resolution optical products are available as part of the Maxar Standard Satellite Imagery products from the QuickBird, WorldView-1/-2/-3/-4, and GeoEye-1 satellites. All details about the data provision, data access conditions and quota assignment procedure are described in the Terms of Applicability available in the Resources section.

    In particular, QuickBird offers archive panchromatic products up to 0.60 m GSD resolution and 4-Bands Multispectral products up to 2.4 m GSD resolution.

    Band Combination: Panchromatic and 4-bands

    Data Processing Level / Resolution:

    • Standard (2A) / View Ready Standard (OR2A): 15 cm HD, 30 cm HD, 30 cm, 40 cm, 50/60 cm
    • View Ready Stereo: 30 cm, 40 cm, 50/60 cm
    • Map-Ready (Ortho) 1:12,000 Orthorectified: 15 cm HD, 30 cm HD, 30 cm, 40 cm, 50/60 cm

    4-Bands being an option from:

    • 4-Band Multispectral (BLUE, GREEN, RED, NIR1)
    • 4-Band Pan-sharpened (BLUE, GREEN, RED, NIR1)
    • 4-Band Bundle (PAN, BLUE, GREEN, RED, NIR1)
    • 3-Band Natural Colour (pan-sharpened BLUE, GREEN, RED)
    • 3-Band Coloured Infrared (pan-sharpened GREEN, RED, NIR1)
    • Natural Colour / Coloured Infrared (3-Band pan-sharpened)

    Native 30 cm and 50/60 cm resolution products are processed with MAXAR HD Technology to generate the 15 cm HD and 30 cm HD products respectively: the initial spatial resolution (GSD) is unchanged, but the HD technique intelligently increases the number of pixels and improves the visual clarity, achieving aesthetically refined imagery with precise edges and well reconstructed details.

  17. USGS High Resolution Orthoimagery

    • cmr.earthdata.nasa.gov
    • catalog.data.gov
    Updated Jan 29, 2016
    + more versions
    Cite
    (2016). USGS High Resolution Orthoimagery [Dataset]. https://cmr.earthdata.nasa.gov/search/concepts/C1220567548-USGS_LTA.html
    Explore at:
    Dataset updated
    Jan 29, 2016
    Time period covered
    Jan 1, 1970 - Present
    Area covered
    Earth
    Description

    High resolution orthorectified images combine the image characteristics of an aerial photograph with the geometric qualities of a map. An orthoimage is a uniform-scale image where corrections have been made for feature displacement such as building tilt and for scale variations caused by terrain relief, sensor geometry, and camera tilt. A mathematical equation based on ground control points, sensor calibration information, and a digital elevation model is applied to each pixel to rectify the image to obtain the geometric qualities of a map.

    A digital orthoimage may be created from several photographs mosaicked to form the final image. The source imagery may be black-and-white, natural color, or color infrared with a pixel resolution of 1-meter or finer. With orthoimagery, the resolution refers to the distance on the ground represented by each pixel.

  18. Satellite images and road-reference data for AI-based road mapping in...

    • data.niaid.nih.gov
    • dataone.org
    • +1more
    zip
    Updated Apr 4, 2024
    Cite
    Sean Sloan; Raiyan Talkhani; Tao Huang; Jayden Engert; William Laurance (2024). Satellite images and road-reference data for AI-based road mapping in Equatorial Asia [Dataset]. http://doi.org/10.5061/dryad.bvq83bkg7
    Explore at:
    Available download formats: zip
    Dataset updated
    Apr 4, 2024
    Dataset provided by
    James Cook University
    Vancouver Island University
    Authors
    Sean Sloan; Raiyan Talkhani; Tao Huang; Jayden Engert; William Laurance
    License

    https://spdx.org/licenses/CC0-1.0.html

    Area covered
    Asia
    Description

    For the purposes of training AI-based models to identify (map) road features in rural/remote tropical regions on the basis of true-colour satellite imagery, and subsequently testing the accuracy of these AI-derived road maps, we produced a dataset of 8904 satellite image ‘tiles’ and their corresponding known road features across Equatorial Asia (Indonesia, Malaysia, Papua New Guinea).

    Methods

    1. INPUT 200 SATELLITE IMAGES

    The main dataset shared here was derived from a set of 200 input satellite images, also provided here. These 200 images are effectively ‘screenshots’ (i.e., reduced-resolution copies) of high-resolution true-colour satellite imagery (~0.5-1m pixel resolution) observed using the Elvis Elevation and Depth spatial data portal (https://elevation.fsdf.org.au/), which here is functionally equivalent to the more familiar Google Earth. Each of these original images was initially acquired at a resolution of 1920x886 pixels. Actual image resolution was coarser than the native high-resolution imagery. Visual inspection of these 200 images suggests a pixel resolution of ~5 meters, given the number of pixels required to span features of familiar scale, such as roads and roofs, as well as the ready discrimination of specific land uses, vegetation types, etc. These 200 images generally spanned either forest-agricultural mosaics or intact forest landscapes with limited human intervention. Sloan et al. (2023) present a map indicating the various areas of Equatorial Asia from which these images were sourced.
    IMAGE NAMING CONVENTION

    A common naming convention applies to satellite images’ file names: XX##.png, where:

    XX – denotes the geographical region / major island of Equatorial Asia of the image, as follows: ‘bo’ (Borneo), ‘su’ (Sumatra), ‘sl’ (Sulawesi), ‘pn’ (Papua New Guinea), ‘jv’ (java), ‘ng’ (New Guinea [i.e., Papua and West Papua provinces of Indonesia])

    ## – denotes the ith image for a given geographical region / major island amongst the original 200 images, e.g., bo1, bo2, bo3…

    2. INTERPRETING ROAD FEATURES IN THE IMAGES

    For each of the 200 input satellite images, road features were visually interpreted and manually digitized to create a reference image dataset by which to train, validate, and test AI road-mapping models, as detailed in Sloan et al. (2023). The reference dataset of road features was digitized using the ‘pen tool’ in Adobe Photoshop. The pen’s ‘width’ was held constant over varying scales of observation (i.e., image ‘zoom’) during digitization. Consequently, at relatively small scales at least, digitized road features likely incorporate vegetation immediately bordering roads. The resultant binary (Road / Not Road) reference images were saved as PNG images with the same image dimensions as the original 200 images.

    3. IMAGE TILES AND REFERENCE DATA FOR MODEL DEVELOPMENT

    The 200 satellite images and the corresponding 200 road-reference images were both subdivided (aka ‘sliced’) into thousands of smaller image ‘tiles’ of 256x256 pixels each. Subsequent to image subdivision, subdivided images were also rotated by 90, 180, or 270 degrees to create additional, complementary image tiles for model development. In total, 8904 image tiles resulted from image subdivision and rotation. These 8904 image tiles are the main data of interest disseminated here. Each image tile entails the true-colour satellite image (256x256 pixels) and a corresponding binary road reference image (Road / Not Road).
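    A minimal sketch of that subdivision and rotation step using Pillow. The paths, the ordering of the coordinate fields in the tile names, and the rotation of every tile are assumptions (in the published dataset only some tiles are rotated), and the same slicing would be applied to the corresponding road-reference PNGs:

```python
from pathlib import Path
from PIL import Image

TILE = 256

def make_tiles(image_path: Path, out_dir: Path) -> None:
    """Slice one 1920x886 source image into 256x256 tiles and add rotated copies."""
    img = Image.open(image_path)
    stem = image_path.stem                          # e.g. 'bo12'
    out_dir.mkdir(parents=True, exist_ok=True)
    for top in range(0, img.height - TILE + 1, TILE):
        for left in range(0, img.width - TILE + 1, TILE):
            tile = img.crop((left, top, left + TILE, top + TILE))
            name = f"{stem}_{left}_{top}_{left + TILE}_{top + TILE}"
            tile.save(out_dir / f"{name}.png")
            for deg in (90, 180, 270):              # complementary rotated tiles
                tile.rotate(deg).save(out_dir / f"{name}rot{deg}.png")
```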
    Of these 8904 image tiles, Sloan et al. (2023) randomly selected 80% for model training (during which a model ‘learns’ to recognize road features in the input imagery), 10% for model validation (during which model parameters are iteratively refined), and 10% for final model testing (during which the final accuracy of the output road map is assessed). Here we present these data in two folders accordingly:

    • ‘Training’ – contains 7124 image tiles used for model training in Sloan et al. (2023), i.e., 80% of the original pool of 8904 image tiles.
    • ‘Testing’ – contains 1780 image tiles used for model validation and model testing in Sloan et al. (2023), i.e., 20% of the original pool of 8904 image tiles, being the combined set of image tiles for model validation and testing in Sloan et al. (2023).

    IMAGE TILE NAMING CONVENTION

    A common naming convention applies to image tiles’ directories and file names, in both the ‘training’ and ‘testing’ folders: XX##_A_B_C_DrotDDD, where:

    XX – denotes the geographical region / major island of Equatorial Asia of the original input 1920x886 pixel image, as follows: ‘bo’ (Borneo), ‘su’ (Sumatra), ‘sl’ (Sulawesi), ‘pn’ (Papua New Guinea), ‘jv’ (java), ‘ng’ (New Guinea [i.e., Papua and West Papua provinces of Indonesia])

    ## – denotes the ith image for a given geographical region / major island amongst the original 200 images, e.g., bo1, bo2, bo3…

    A, B, C and D – can all be ignored. These values, which are one of 0, 256, 512, 768, 1024, 1280, 1536, and 1792, are effectively ‘pixel coordinates’ in the corresponding original 1920x886-pixel input image. They were recorded within the names of image tiles’ sub-directories and file names merely to ensure that sub-directories and file names were uniquely named.

    rot – implies an image rotation. Not all image tiles are rotated, so ‘rot’ will appear only occasionally.

    DDD – denotes the degree of image-tile rotation, e.g., 90, 180, 270. Not all image tiles are rotated, so ‘DDD’ will appear only occasionally.

    Note that the designator ‘XX##’ is directly equivalent to the filenames of the corresponding 1920x886-pixel input satellite images, detailed above. Therefore, each image tile can be ‘matched’ with its parent full-scale satellite image. For example, in the ‘training’ folder, the subdirectory ‘Bo12_0_0_256_256’ indicates that its image tile therein (also named ‘Bo12_0_0_256_256’) would have been sourced from the full-scale image ‘Bo12.png’.
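    A small parsing sketch following the naming convention above, mapping a tile name back to its parent image; it is an illustration of the convention, not a utility shipped with the dataset:

```python
import re

TILE_NAME = re.compile(
    r"^(?P<region>[a-z]{2})(?P<idx>\d+)"                # XX## -> parent image, e.g. 'bo12'
    r"_(?P<a>\d+)_(?P<b>\d+)_(?P<c>\d+)_(?P<d>\d+)"     # pixel coordinates in the parent image
    r"(?:rot(?P<rot>\d+))?$",                           # optional rotation suffix, e.g. rot90
    re.IGNORECASE,
)

def parent_image(tile_name: str) -> str:
    """Map an image-tile name back to its parent full-scale satellite image."""
    m = TILE_NAME.match(tile_name)
    if m is None:
        raise ValueError(f"unrecognised tile name: {tile_name}")
    return f"{m['region']}{m['idx']}.png"

print(parent_image("Bo12_0_0_256_256"))          # Bo12.png
print(parent_image("su3_256_512_512_768rot90"))  # su3.png
```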

  19. Bonn Roof Material + Satellite Imagery Dataset

    • figshare.com
    zip
    Updated Apr 18, 2025
    Cite
    Julian Huang; Yue Lin; Alex Nhancololo (2025). Bonn Roof Material + Satellite Imagery Dataset [Dataset]. http://doi.org/10.6084/m9.figshare.28713194.v2
    Explore at:
    Available download formats: zip
    Dataset updated
    Apr 18, 2025
    Dataset provided by
    figshare
    Authors
    Julian Huang; Yue Lin; Alex Nhancololo
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Area covered
    Bonn
    Description

    This dataset consists of annotated high-resolution aerial imagery of roof materials in Bonn, Germany, in the Ultralytics YOLO instance segmentation dataset format. Aerial imagery was sourced from OpenAerialMap, specifically from the Maxar Open Data Program. Roof material labels and building outlines were sourced from OpenStreetMap. Images and labels are split into training, validation, and test sets, intended for training future machine learning models for both building segmentation and roof type classification. The dataset is intended for applications such as informing studies on thermal efficiency, roof durability, heritage conservation, or socioeconomic analyses. There are six roof material types: roof tiles, tar paper, metal, concrete, gravel, and glass. Note: the data is in a .zip due to file upload limits. Please find a more detailed dataset description in the README.md.
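    Since the data is in the Ultralytics YOLO instance segmentation layout, a minimal training sketch could look like the following; the data.yaml path and folder names are assumptions pending the dataset's README, and the six class names come from the description above:

```python
from ultralytics import YOLO

# Hypothetical paths; the actual folder layout, data.yaml location, and class
# order are defined by the dataset's README. The six classes are: roof tiles,
# tar paper, metal, concrete, gravel, glass.
model = YOLO("yolov8n-seg.pt")                              # any YOLO segmentation checkpoint
model.train(data="bonn_roofs/data.yaml", epochs=100, imgsz=640)
metrics = model.val(split="test")                           # evaluate on the held-out test split
print(metrics)
```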

  20. Yukon High Resolution Satellite Imagery

    • hub.arcgis.com
    Updated May 22, 2015
    Cite
    Government of Yukon (2015). Yukon High Resolution Satellite Imagery [Dataset]. https://hub.arcgis.com/datasets/1b86ee0e279044939eb0045cf8a6dad1
    Explore at:
    Dataset updated
    May 22, 2015
    Dataset authored and provided by
    Government of Yukon
    Area covered
    Description

    Yukon high resolution satellite imagery is distributed from the Government of Yukon imagery repository. This is a dynamic service containing satellite imagery for locations in the Yukon, Canada.

    This data is hosted in Yukon Albers equal area projection. It can be viewed and queried in the GeoYukon application: https://mapservices.gov.yk.ca/GeoYukon.

    For more information contact geomatics.help@yukon.ca.
