The files linked to this reference are the geospatial data created as part of the completion of the baseline vegetation inventory project for the NPS park unit. The current format is an ArcGIS file geodatabase, but older formats may exist as shapefiles.

We converted the photointerpreted data into a format usable in a geographic information system (GIS) by employing three fundamental processes: (1) orthorectify, (2) digitize, and (3) develop the geodatabase. All digital map automation was projected in Universal Transverse Mercator (UTM), Zone 16, using the North American Datum of 1983 (NAD83).

Orthorectify: We orthorectified the interpreted overlays by using OrthoMapper, softcopy photogrammetry software for GIS. One function of OrthoMapper is to create orthorectified imagery from scanned and unrectified imagery (Image Processing Software, Inc., 2002). The software features a method of visual orientation involving a point-and-click operation that uses existing orthorectified horizontal and vertical base maps. Of primary importance to us, OrthoMapper also has the capability to orthorectify the photointerpreted overlays of each photograph based on the reference information provided.

Digitize: To produce a polygon vector layer for use in ArcGIS (Environmental Systems Research Institute [ESRI], Redlands, California), we converted each raster-based image mosaic of orthorectified overlays containing the photointerpreted data into a grid format by using ArcGIS. In ArcGIS, we used the ArcScan extension to trace the raster data and produce ESRI shapefiles. We digitally assigned map-attribute codes (both map-class codes and physiognomic modifier codes) to the polygons and checked the digital data against the photointerpreted overlays for line and attribute consistency. Ultimately, we merged the individual layers into a seamless layer.

Geodatabase: At this stage, the map layer has only map-attribute codes assigned to each polygon. To assign meaningful information to each polygon (e.g., map-class names, physiognomic definitions, links to NVCS types), we produced a feature-class table along with other supportive tables and subsequently related them together via an ArcGIS geodatabase. This geodatabase also links the map to other feature-class layers produced for this project, including vegetation sample plots, accuracy assessment (AA) sites, aerial photo locations, and the project boundary extent. A geodatabase provides access to a variety of interlocking data sets, is expandable, and equips resource managers and researchers with a powerful GIS tool.
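To make the geodatabase step concrete, here is a minimal sketch of relating polygon map-attribute codes to descriptive map-class information outside of ArcGIS, using geopandas. The file and field names (veg_polygons.shp, map_class_lut.csv, MapCode, MapClassName) are hypothetical stand-ins for the project's actual feature class and supportive tables, not part of the delivered data.

```python
import geopandas as gpd
import pandas as pd

# Hypothetical inputs: the digitized polygon layer and a lookup table of
# map-class names / physiognomic definitions keyed by map-attribute code.
polygons = gpd.read_file("veg_polygons.shp")   # assumed to carry a "MapCode" field
lookup = pd.read_csv("map_class_lut.csv")      # MapCode, MapClassName, NVCS type, ...

# A flat-file stand-in for the relationship between the feature class and its
# supportive tables inside the ArcGIS geodatabase.
joined = polygons.merge(lookup, on="MapCode", how="left")
joined = gpd.GeoDataFrame(joined, geometry="geometry", crs=polygons.crs)

# Flag polygons whose map-attribute code has no entry in the lookup table.
unmatched = joined[joined["MapClassName"].isna()]
print(f"{len(unmatched)} polygons have unmatched map-attribute codes")

joined.to_file("veg_map_attributed.gpkg", layer="veg_map", driver="GPKG")
```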
Image Visit is a configurable app template that allows users to quickly review the attributes of a predetermined sequence of locations in imagery. The app optimizes workflows by loading the next image while the user is still viewing the current one, reducing the delay caused by waiting for the next image to be returned from the server.

Image Visit users can do the following:
- Navigate through a predetermined sequence of locations in two ways: using features in a 'Visit' layer (an editable hosted feature layer), or using a web map's bookmarks.
- Use an optional 'Notes' layer (a second editable hosted feature layer) to add or edit features associated with the Visit locations.
- If the app uses a Visit layer for navigation, edit an optional 'Status' field to set the status of each Visit location as it is processed ('Complete' or 'Incomplete', for example).
- View metadata about the Imagery, Visit, and Notes layers in a dialog window (which displays information based on each layer's web map popup settings).
- Annotate imagery using editable feature layers.
- Perform image measurement on imagery layers that have mensuration capabilities.
- Export an imagery layer to the user's local machine or as a layer in the user's ArcGIS account.

Use Cases
An insurance company checking properties. An insurance company has a set of properties to review after an event such as a hurricane. The app drives the user to each property and allows the operator to record attributes (the extent of damage, for example).
Image analysts checking control points. Organizations that collect aerial photography often have a collection of marked or identifiable control points that they use to check their photographs. The app drives the user to each of the known points, at a suitable scale, and allows the user to validate the location of the control point in the image.
Checking automatically labeled features. In cases where AI is used for object identification, the app drives the user to the identified features to review or correct the classification.

Supported Devices
This application is responsively designed to support use in browsers on desktops, mobile phones, and tablets.

Data Requirements
Creating an app with this template requires a web map with at least one imagery layer.

Get Started
This application can be created in the following ways:
- Click the Create a Web App button on this page.
- Click the Download button to access the source code. Do this if you want to host the app on your own server and optionally customize it to add features or change styling.
This is a collection of all GPS- and computer-generated geospatial data specific to the Alpine Treeline Warming Experiment (ATWE), located on Niwot Ridge, Colorado, USA. The experiment ran between 2008 and 2016 and consisted of three sites spread across an elevation gradient. Geospatial data for all three experimental sites and cone/seed collection locations are included in this package.

Geospatial files include cone collection, experimental site, seed trap, and other GPS location/terrain data. File types include ESRI shapefiles, ESRI grid files (Arc/Info binary grids), TIFFs (.tif), and keyhole markup language (.kml) files. Trimble-imported data include plain text files (.txt), Trimble COR (differentially corrected) files, and Trimble SSF (Standard Storage Format) files. Microsoft Excel (.xlsx) and comma-separated values (.csv) files corresponding to the attribute tables of many files within this package are also included. A complete list of files can be found in the "Data File Organization" section of the included Data User's Guide.

Maps are also included in this data package for reference and use. These maps are separated into two categories: 2021 maps and legacy maps, which were made in 2010. Each 2021 map has one copy in portable network graphics (.png) format and another in .pdf format. All legacy maps are in .pdf format. The .png image files can be opened with any compatible program, such as Preview (Mac OS) or Photos (Windows).

All GIS files were imported into geopackages (.gpkg) using QGIS and double-checked for compatibility and data/attribute integrity using ESRI ArcGIS Pro. Note that files packaged within geopackages will open in ArcGIS Pro with "main." preceding each file name and with an extra column named "geom" defining the geometry type in the attribute table. The contents of each geospatial file remain intact unless otherwise stated in "niwot_geospatial_data_list_07012021.pdf/.xlsx". This list of files is available as an .xlsx and a .pdf in this archive. Because the geopackage is an open-source file format, files within .gpkg packages (TIFF, shapefiles, ESRI grid or "Arc/Info Binary") can be read using QGIS, ArcGIS Pro, and other geospatial software. Text and .csv files can be read using TextEdit, Notepad, or any simple text-editing software; .csv files can also be opened using Microsoft Excel and R. .kml files can be opened using Google Maps or Google Earth, and Trimble files are most compatible with Trimble's GPS Pathfinder Office software. .xlsx files can be opened using Microsoft Excel. PDFs can be opened using Adobe Acrobat Reader and other compatible programs.

A selection of original shapefiles within this archive were generated using ArcMap with associated FGDC-standardized metadata (XML file format). We are including these original files because they contain metadata only accessible using ESRI programs at this time, and so that the relationship between shapefiles and XML files is maintained. Individual XML files can be opened (without a GIS-specific program) using TextEdit or Notepad. Since ESRI's compatibility with FGDC metadata has changed since the generation of these files, many shapefiles will require upgrading to be compatible with ESRI's latest versions of geospatial software. These details are also noted in the "niwot_geospatial_data_list_07012021" file.
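A hedged sketch of inspecting one of the geopackages described above with open-source tools; the geopackage name below is a placeholder, not an actual file in the archive:

```python
import fiona
import geopandas as gpd

# Placeholder geopackage name; substitute an actual .gpkg file from this archive.
gpkg_path = "atwe_sites.gpkg"

# List the layers stored in the geopackage. In ArcGIS Pro the same layers
# appear with a "main." prefix; fiona and QGIS report the bare layer names.
layers = fiona.listlayers(gpkg_path)
print(layers)

# Read one layer and inspect its attribute table (the geometry column may be
# labelled "geom" or "geometry" depending on the software used to view it).
gdf = gpd.read_file(gpkg_path, layer=layers[0])
print(gdf.crs)
print(gdf.head())
```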
World Imagery provides one meter or better satellite and aerial imagery for most of the world's landmass and lower-resolution satellite imagery worldwide. The map currently comprises the following sources:
- Worldwide 15-m resolution TerraColor imagery at small and medium map scales.
- Vantor imagery basemap products around the world: Vivid Premium at 15-cm HD resolution for select metropolitan areas, Vivid Advanced 30-cm HD for more than 1,000 metropolitan areas, and Vivid Standard from 1.2-m to 0.6-m resolution for most of the world, with 30-cm HD across the United States and parts of Western Europe. More information on the Vantor products is included below.
- High-resolution aerial photography contributed by the GIS User Community. This imagery ranges from 30-cm to 3-cm resolution. You can contribute your imagery to this map and have it served by Esri via the Community Maps Program.

Vantor Basemap Products
Vivid Premium provides committed image currency in a high-resolution, high-quality image layer over defined metropolitan and high-interest areas across the globe. The product provides 15-cm HD resolution imagery.
Vivid Advanced provides committed image currency in a high-resolution, high-quality image layer over defined metropolitan and high-interest areas across the globe. The product includes a mix of native 30-cm and 30-cm HD resolution imagery.
Vivid Standard provides a visually consistent and continuous image layer over large areas through advanced image mosaicking techniques, including tonal balancing and seamline blending across thousands of image strips. It is available from 1.2-m down to 30-cm HD. More on Vantor HD.

Imagery Updates
You can use the Updates Mode in the World Imagery Wayback app to learn more about recent and pending updates. Accessing this information requires a login with an ArcGIS organizational account.

Citations
This layer includes the imagery provider, collection date, resolution, accuracy, and source of the imagery. With the Identify tool in ArcGIS Desktop or the ArcGIS Online Map Viewer you can see imagery citations. Citations returned apply only to the imagery available at that location and scale; you may need to zoom in to view the best available imagery. Citations can also be accessed in the World Imagery with Metadata web map.

Use
You can add this layer to the ArcGIS Online Map Viewer, ArcGIS Desktop, or ArcGIS Pro. To view this layer with a useful reference overlay, open the Imagery Hybrid web map.

Feedback
Have you ever seen a problem in the Esri World Imagery Map that you wanted to report? You can use the Imagery Map Feedback web map to provide comments on issues. The feedback will be reviewed by the ArcGIS Online team and considered for one of our updates.
The files linked to this reference are the geospatial data created as part of the completion of the baseline vegetation inventory project for the NPS park unit. The current format is an ArcGIS file geodatabase, but older formats may exist as shapefiles. The map units delineated on the orthophotos were derived from the NVC classification as constrained by the limitations of the photography. We combined the preliminary NVC classification with the aerial photo signatures to determine how many plant associations could be recognized on the photos. In most instances, one NVC association corresponded to one map unit. However, sometimes a plant association could not be recognized consistently on the photos, or we could see more detail than was recognized by the classification. These problems were overcome by using two separate but related classifications: 1) the NVC for the plot data and 2) map units for the GIS database. The two were related, or "crosswalked," by noting when plant associations were lumped into a single map unit or when associations were split into multiple map units.
This collection of geo-referenced photos varies in spatial accuracy and resolution. Use the hotlinks below to learn the details of each collection, or review MassGIS's new story map explaining all the vintages of aerial photos. Tip: reviewing that story map may be an easier way to digest the information than reviewing the more formal/standard metadata accessible via the hotlinks below.

Within the web map, certain layers will only be visible at particular zoom extents. If a layer is unavailable to turn on/off, zoom in or out as needed until the layer becomes active.

All photos except the 1938 collection were captured during leaf-off conditions (typically late winter/early spring). All photos are in true color except the 1938 and 1990s collections, which are black and white.

For Dukes County (which includes the islands of Martha's Vineyard and the Elizabeth Islands), these are the applicable years of acquisition for the statewide collections that span multiple years:
- "1990s collection" -- only year 1999 for Dukes County
- "2001-2003 collection" -- only year 2003 for Dukes County
- "2008-2009 collection" -- only year 2009 for Dukes County
- "2011-2012 collection" -- only year 2011 for Dukes County
- "2013-2014 collection" -- only year 2014 for Dukes County

Photo Details (Metadata)
1938 Black & White Aerials (georeferenced & hosted by Harvard Forest)
1990s Black & White Aerials
2001-2003 Color Aerials
2005 Color Aerials
2008-2009 Color Aerials
2011-2012 Color Aerials
2013-2014 Color Aerials
2015 Satellite Images - Extra Details
2019 Color Aerials
2021 Color Aerials
2023 Color Aerials

Parcel Lines -- These data are NOT survey grade and are intended for general reference only. The parcel data comply with the MassGIS Level 3 parcel data standard. Each town in Dukes County hires a GIS consultant to prepare its digital parcel lines and to link the properties to the respective records from the town's assessing database. The linkage is static and not updated in real time - it is only 'as current' as the day the data was exported from the assessing database. The Martha's Vineyard Commission does not edit or maintain any assessing data or parcel lines/property bounds. Each town within Dukes County updates its digital parcel data when it sees fit (most update annually). Click on a specific town in this map to see when its parcel data was updated and by whom. Similarly, clicking on a parcel in this "MA Aerial Photos Since 1990s" web map will show the applicable fiscal year in which the assessing info was exported.
The Minnesota Geospatial Image Service provides versatile access to Minnesota air photos, hillshades, and scanned topographic maps using a Web Map Service (WMS). Using this service means you don't need to download and store these very large files on your own computer.
For a list of imagery data sets available through this service, see https://www.mngeo.state.mn.us/chouse/wms/wms_image_server_layers.html.
For technical specifications for using this service, see https://www.mngeo.state.mn.us/chouse/wms/wms_image_server_specs.html.
For information on how to use a Web Map Service (WMS), see https://www.mngeo.state.mn.us/chouse/wms/how_to_use_wms.html.
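As a hedged illustration of consuming the WMS programmatically with OWSLib, here is a minimal sketch; the endpoint URL and layer name below are placeholders, so substitute the actual values documented on the specification and layer-list pages above.

```python
from owslib.wms import WebMapService

# Placeholder endpoint and layer name; take the real values from the MnGeo
# specification and layer-list pages linked above.
WMS_URL = "https://example.mngeo.state.mn.us/cgi-bin/wms?"
LAYER = "example_air_photo_layer"

wms = WebMapService(WMS_URL, version="1.1.1")
print(list(wms.contents))  # layer names advertised by the service

# Request a small GeoTIFF covering an area of interest (UTM zone 15N coordinates).
img = wms.getmap(
    layers=[LAYER],
    srs="EPSG:26915",
    bbox=(480000, 4970000, 485000, 4975000),
    size=(1024, 1024),
    format="image/tiff",
)
with open("mn_air_photo_subset.tif", "wb") as f:
    f.write(img.read())
```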
This series of products from MODIS represents the only daily global composites available and is suitable for use at global and regional levels. This True Color band composition (Bands 1, 4, 3 | Red, Green, Blue) most accurately shows how we see the Earth's surface with our own eyes. It is a natural-looking image that is useful for land surface, oceanic, and atmospheric analysis. There are four True Color products in total: for each satellite (Aqua and Terra) there is a 250-meter corrected reflectance product and a 500-meter surface reflectance product. Although the resolution is coarser than that of other satellites, it allows for a global collection of imagery on a daily basis, which is made available in near real-time. In contrast, Landsat needs 16 days to collect a global composite. Besides the difference in maximum resolution, the surface and corrected reflectance products also differ in the algorithm used for atmospheric correction.

NASA Global Imagery Browse Services (GIBS)
This image layer provides access to a subset of the NASA Global Imagery Browse Services (GIBS), which are a set of standard services to deliver global, full-resolution satellite imagery. The GIBS goal is to enable interactive exploration of NASA's Earth imagery for a broad range of users. The purpose of this image layer, and the other GIBS image services hosted by Esri, is to enable convenient access to this beautiful and useful satellite imagery for users of ArcGIS. The source data used by this image layer is a finished image; it is not recommended for quantitative analysis.

Several full-resolution, global imagery products are built and served by GIBS in near real-time (usually within 3.5 hours of observation). These products are built from NASA Earth Observing System satellite data courtesy of LANCE data providers and other sources. The MODIS instrument aboard the Terra and Aqua satellites, the AIRS instrument aboard Aqua, and the OMI instrument aboard Aura are used as sources. Several of the MODIS global products are made available on this Esri-hosted service.

This image layer hosted by Esri provides direct access to one of the GIBS image products. The Esri servers do not store any of this data themselves. Instead, for each received data request, multiple image tiles are retrieved from GIBS, then processed and assembled into the proper image for the response. This processing takes place on the fly, for each and every request, which ensures that any update to the GIBS data is immediately available in the Esri mosaic service.

Note on Time: The image service supporting this map is time enabled, but time has been disabled on this image layer so that the most recent imagery displays by default. If you would like to view imagery over time, you can update the layer properties to enable time animation and configure time settings. The results can be saved in a web map to use later or share with others.
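The on-the-fly assembly described above pulls standard GIBS WMTS tiles. Below is a minimal sketch of fetching one such tile directly with plain HTTP; the endpoint pattern, layer identifier, and tile matrix set name follow NASA's public GIBS documentation as I understand it and should be verified against the current GIBS capabilities document before use.

```python
import requests

# GIBS WMTS REST pattern (assumed from GIBS documentation; verify before relying on it).
GIBS = "https://gibs.earthdata.nasa.gov/wmts/epsg4326/best"
LAYER = "MODIS_Terra_CorrectedReflectance_TrueColor"
DATE = "2024-06-01"      # GIBS serves one global True Color composite per day
MATRIX_SET = "250m"      # tile matrix set for the 250 m product
z, y, x = 3, 2, 5        # zoom level / row / column of the desired tile

url = f"{GIBS}/{LAYER}/default/{DATE}/{MATRIX_SET}/{z}/{y}/{x}.jpg"
resp = requests.get(url, timeout=30)
resp.raise_for_status()
with open("modis_true_color_tile.jpg", "wb") as f:
    f.write(resp.content)
```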
CC0 1.0 (https://spdx.org/licenses/CC0-1.0.html)
A major objective of plant ecology research is to determine the underlying processes responsible for the observed spatial distribution patterns of plant species. Plants can be approximated as points in space for this purpose, and thus spatial point pattern analysis has become increasingly popular in ecological research. The basic piece of data for point pattern analysis is the point location of an ecological object in some study region; therefore, point pattern analysis can only be performed if such point location data can be collected. However, due to the lack of a convenient sampling method, few previous studies have used point pattern analysis to examine the spatial patterns of grassland species. This is unfortunate because being able to explore point patterns in grassland systems has widespread implications for population dynamics, community-level patterns, and ecological processes. In this study, we develop a new method to measure the individual coordinates of species in grassland communities. This method records plant growing positions via digital picture samples that have been sub-blocked within a geographical information system (GIS). Here, we tested the new method by measuring the individual coordinates of Stipa grandis in grazed and ungrazed S. grandis communities in a temperate steppe ecosystem in China. Furthermore, we analyzed the pattern of S. grandis by using the pair correlation function g(r) with both a homogeneous Poisson process and a heterogeneous Poisson process. Our results showed that individuals of S. grandis were overdispersed according to the homogeneous Poisson process at 0-0.16 m in the ungrazed community, while they were clustered at 0.19 m according to the homogeneous and heterogeneous Poisson processes in the grazed community. These results suggest that competitive interactions dominated the ungrazed community, while facilitative interactions dominated the grazed community. In sum, we successfully executed a new sampling method, using digital photography and a geographical information system, to collect experimental data on the spatial point patterns of populations in this grassland community.
Methods
1. Data collection using digital photographs and GIS
A flat 5 m x 5 m sampling block was chosen in a study grassland community and divided with bamboo chopsticks into 100 sub-blocks of 50 cm x 50 cm (Fig. 1). A digital camera was then mounted to a telescoping stake and positioned in the center of each sub-block to photograph vegetation within a 0.25 m2 area. Pictures were taken 1.75 m above the ground at an approximate downward angle of 90° (Fig. 2). Automatic camera settings were used for focus, lighting and shutter speed. After photographing the plot as a whole, photographs were taken of each individual plant in each sub-block. In order to identify each individual plant from the digital images, each plant was uniquely marked before the pictures were taken (Fig. 2 B).
Digital images were imported into a computer as JPEG files, and the position of each plant in the pictures was determined using GIS. This involved four steps: 1) A reference frame (Fig. 3) was established using R2V software to designate control points, i.e., the four vertices of each sub-block (Appendix S1), so that all plants in each sub-block were within the same reference frame. The parallax and optical distortion in the raster images were then geometrically corrected based on these selected control points; 2) Maps, or layers in GIS terminology, were set up for each species as PROJECT files (Appendix S2), and all individuals in each sub-block were digitized using R2V software (Appendix S3). For accuracy, the digitization of individual plant locations was performed manually; 3) Each plant species layer was exported from a PROJECT file to a SHAPE file in R2V software (Appendix S4); 4) Finally, each species layer was opened in ArcGIS software in the SHAPE file format, and attribute data from each species layer were exported into ArcGIS to obtain the precise coordinates for each species. This last phase involved four steps of its own, from adding the data (Appendix S5), to opening the attribute table (Appendix S6), to adding new x and y coordinate fields (Appendix S7), and to obtaining the x and y coordinates and filling in the new fields (Appendix S8).
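A hedged sketch of the final coordinate-extraction stage of step 4 above, using geopandas in place of the ArcGIS field calculations; the shapefile and output file names are hypothetical.

```python
import geopandas as gpd

# Hypothetical point shapefile exported from R2V for one species layer in one sub-block.
species = gpd.read_file("stipa_grandis_subblock_01.shp")

# Equivalent of adding x/y fields and calculating geometry in ArcGIS
# (valid for a point layer).
species["x"] = species.geometry.x
species["y"] = species.geometry.y

species[["x", "y"]].to_csv("stipa_grandis_subblock_01_coords.csv", index=False)
```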
To determine the accuracy of our new method, we measured the individual locations of Leymus chinensis, a perennial rhizomatous grass, in representative 5 m x 5 m community blocks in typical steppe habitat in the Inner Mongolia Autonomous Region of China in July 2010 (Fig. 4 A). As our standard for comparison, we used a ruler to measure the individual coordinates of L. chinensis. We then tested for significant differences between the two measurement approaches in (1) the coordinates of L. chinensis and (2) the pair correlation function g of L. chinensis (see section 3.2 Data Analysis). If neither differed significantly between the new method and the ruler measurements, we could conclude that our new method of measuring the coordinates of L. chinensis was reliable.
We compared the results using a t-test (Table 1). We found no significant differences in either (1) the coordinates of L. chinensis or (2) the pair correlation function g of L. chinensis. Further, we compared the pattern characteristics of L. chinensis measured by our new method against the ruler measurements using a null model, and found that the two did not differ significantly under the homogeneous Poisson process, i.e., complete spatial randomness (Fig. 4 B). Thus, we concluded that the data obtained using our new method were reliable enough to perform point pattern analysis with a null model in grassland communities.
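To make the subsequent pattern analysis concrete, here is a simplified NumPy sketch of estimating the pair correlation function g(r) and comparing it against a Monte Carlo envelope for a homogeneous Poisson (CSR) null model. The coordinates are simulated stand-ins for the exported S. grandis points, the estimator omits edge correction, and the actual analysis in the study may have used different estimators and corrections.

```python
import numpy as np

def pair_correlation(points, r_bins, window_area):
    """Naive estimate of g(r) for points in a rectangular window,
    ignoring edge correction (adequate only for a sketch)."""
    n = len(points)
    lam = n / window_area                       # intensity (points per unit area)
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    d = d[np.triu_indices(n, k=1)]              # unique pairwise distances
    counts, edges = np.histogram(d, bins=r_bins)
    r_mid = 0.5 * (edges[:-1] + edges[1:])
    # Expected number of unordered pairs per annulus under CSR:
    expected = lam * n * np.pi * (edges[1:] ** 2 - edges[:-1] ** 2) / 2.0
    return r_mid, counts / expected

# Simulated stand-in for the coordinates exported above (metres, 5 m x 5 m block).
rng = np.random.default_rng(0)
pts = rng.uniform(0, 5, size=(200, 2))
r_bins = np.linspace(0.01, 1.0, 25)

r, g_obs = pair_correlation(pts, r_bins, window_area=25.0)

# Monte Carlo envelope under a homogeneous Poisson process with the same n.
sims = np.array([
    pair_correlation(rng.uniform(0, 5, size=(len(pts), 2)), r_bins, 25.0)[1]
    for _ in range(99)
])
lo, hi = np.percentile(sims, [2.5, 97.5], axis=0)
for ri, gi, l, h in zip(r, g_obs, lo, hi):
    flag = "clustered" if gi > h else ("regular" if gi < l else "CSR")
    print(f"r={ri:.2f}  g={gi:.2f}  envelope=[{l:.2f}, {h:.2f}]  {flag}")
```

Values of g(r) above the envelope indicate clustering at that distance, values below it indicate regular (overdispersed) spacing, mirroring the interpretation used for S. grandis above.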
Minnesota's original public land survey plat maps were created between 1848 and 1907 during the first government land survey of the state by the U.S. Surveyor General's Office. This collection of more than 3,600 maps includes later General Land Office (GLO) and Bureau of Land Management maps up through 2001. Scanned images of the maps are available in several digital formats and most have been georeferenced.
The survey plat maps, and the accompanying survey field notes, serve as the fundamental legal records for real estate in Minnesota; all property titles and descriptions stem from them. They also are an essential resource for surveyors and provide a record of the state's physical geography prior to European settlement. Finally, they testify to many years of hard work by the surveying community, often under very challenging conditions.
The deteriorating physical condition of the older maps (drawn on paper, linen, and other similar materials), together with the need to provide wider public access to the maps, made handling the original records increasingly impractical. To meet this challenge, the Office of the Secretary of State (SOS), the State Archives of the Minnesota Historical Society (MHS), the Minnesota Department of Transportation (MnDOT), MnGeo and the Minnesota Association of County Surveyors collaborated in a digitization project which produced high-quality (800 dpi), 24-bit color images of the maps in standard TIFF, JPEG and PDF formats - nearly 1.5 terabytes of data. Funding was provided by MnDOT.
In 2010-11, most of the JPEG plat map images were georeferenced. The intent was to locate the plat images to coincide with statewide geographic data without appreciably altering (warping) the image. This increases the value of the images in mapping software where they can be used as a background layer.
Attribution-NonCommercial 4.0 (CC BY-NC 4.0): https://creativecommons.org/licenses/by-nc/4.0/
License information was derived automatically
Today, deep neural networks are widely used in many computer vision problems, also for geographic information systems (GIS) data. This type of data is commonly used for urban analyzes and spatial planning. We used orthophotographic images of two residential districts from Kielce, Poland for research including urban sprawl automatic analysis with Transformer-based neural network application.Orthophotomaps were obtained from Kielce GIS portal. Then, the map was manually masked into building and building surroundings classes. Finally, the ortophotomap and corresponding classification mask were simultaneously divided into small tiles. This approach is common in image data preprocessing for machine learning algorithms learning phase. Data contains two original orthophotomaps from Wietrznia and Pod Telegrafem residential districts with corresponding masks and also their tiled version, ready to provide as a training data for machine learning models.Transformed-based neural network has undergone a training process on the Wietrznia dataset, targeted for semantic segmentation of the tiles into buildings and surroundings classes. After that, inference of the models was used to test model's generalization ability on the Pod Telegrafem dataset. The efficiency of the model was satisfying, so it can be used in automatic semantic building segmentation. Then, the process of dividing the images can be reversed and complete classification mask retrieved. This mask can be used for area of the buildings calculations and urban sprawl monitoring, if the research would be repeated for GIS data from wider time horizon.Since the dataset was collected from Kielce GIS portal, as the part of the Polish Main Office of Geodesy and Cartography data resource, it may be used only for non-profit and non-commertial purposes, in private or scientific applications, under the law "Ustawa z dnia 4 lutego 1994 r. o prawie autorskim i prawach pokrewnych (Dz.U. z 2006 r. nr 90 poz 631 z późn. zm.)". There are no other legal or ethical considerations in reuse potential.Data information is presented below.wietrznia_2019.jpg - orthophotomap of Wietrznia districtmodel's - used for training, as an explanatory imagewietrznia_2019.png - classification mask of Wietrznia district - used for model's training, as a target imagewietrznia_2019_validation.jpg - one image from Wietrznia district - used for model's validation during training phasepod_telegrafem_2019.jpg - orthophotomap of Pod Telegrafem district - used for model's evaluation after training phasewietrznia_2019 - folder with wietrznia_2019.jpg (image) and wietrznia_2019.png (annotation) images, divided into 810 tiles (512 x 512 pixels each), tiles with no information were manually removed, so the training data would contain only informative tilestiles presented - used for the model during training (images and annotations for fitting the model to the data)wietrznia_2019_vaidation - folder with wietrznia_2019_validation.jpg image divided into 16 tiles (256 x 256 pixels each) - tiles were presented to the model during training (images for validation model's efficiency); it was not the part of the training datapod_telegrafem_2019 - folder with pod_telegrafem.jpg image divided into 196 tiles (256 x 265 pixels each) - tiles were presented to the model during inference (images for evaluation model's robustness)Dataset was created as described below.Firstly, the orthophotomaps were collected from Kielce Geoportal (https://gis.kielce.eu). 
Kielce Geoportal offers a .pst recent map from April 2019. It is an orthophotomap with a resolution of 5 x 5 pixels, constructed from a plane flight at 700 meters over ground height, taken with a camera for vertical photos. Downloading was done by WMS in open-source QGIS software (https://www.qgis.org), as a 1:500 scale map, then converted to a 1200 dpi PNG image.Secondly, the map from Wietrznia residential district was manually labelled, also in QGIS, in the same scope, as the orthophotomap. Annotation based on land cover map information was also obtained from Kielce Geoportal. There are two classes - residential building and surrounding. Second map, from Pod Telegrafem district was not annotated, since it was used in the testing phase and imitates situation, where there is no annotation for the new data presented to the model.Next, the images was converted to an RGB JPG images, and the annotation map was converted to 8-bit GRAY PNG image.Finally, Wietrznia data files were tiled to 512 x 512 pixels tiles, in Python PIL library. Tiles with no information or a relatively small amount of information (only white background or mostly white background) were manually removed. So, from the 29113 x 15938 pixels orthophotomap, only 810 tiles with corresponding annotations were left, ready to train the machine learning model for the semantic segmentation task. Pod Telegrafem orthophotomap was tiled with no manual removing, so from the 7168 x 7168 pixels ortophotomap were created 197 tiles with 256 x 256 pixels resolution. There was also image of one residential building, used for model's validation during training phase, it was not the part of the training data, but was a part of Wietrznia residential area. It was 2048 x 2048 pixel ortophotomap, tiled to 16 tiles 256 x 265 pixels each.
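A hedged sketch of the PIL tiling step described above; the source file names match the dataset listing, while the output folders and tile-naming scheme are illustrative choices.

```python
from pathlib import Path
from PIL import Image

# Disable PIL's decompression-bomb guard: the orthophotomaps are ~29k x 16k pixels.
Image.MAX_IMAGE_PIXELS = None

def tile_image(src_path, out_dir, tile_size=512):
    """Cut an orthophotomap (or its mask) into square tiles, as described above."""
    out_dir = Path(out_dir)
    out_dir.mkdir(parents=True, exist_ok=True)
    img = Image.open(src_path)
    w, h = img.size
    stem = Path(src_path).stem
    count = 0
    for top in range(0, h - tile_size + 1, tile_size):
        for left in range(0, w - tile_size + 1, tile_size):
            tile = img.crop((left, top, left + tile_size, top + tile_size))
            tile.save(out_dir / f"{stem}_{top}_{left}.png")
            count += 1
    return count

# File names as listed in the dataset; output folder names are illustrative.
n_img = tile_image("wietrznia_2019.jpg", "wietrznia_2019_tiles/images", 512)
n_msk = tile_image("wietrznia_2019.png", "wietrznia_2019_tiles/masks", 512)
print(n_img, n_msk)  # mostly-empty tiles were then removed by hand in the study
```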
Culminating more than four years of processing data, NASA and the National Geospatial-Intelligence Agency (NGA) have completed Earth's most extensive global topographic map. The mission is a collaboration among NASA, NGA, and the German and Italian space agencies. For 11 days in February 2000, the space shuttle Endeavour conducted the Shuttle Radar Topography Mission (SRTM) using C-band and X-band interferometric synthetic aperture radars to acquire topographic data over 80% of the Earth's land mass, creating the first-ever near-global data set of land elevations. These data were used to produce topographic maps (digital elevation maps) 30 times as precise as the best global maps used today. The SRTM system gathered data at the rate of 40,000 per minute over land. The data reveal for the first time large, detailed swaths of Earth's topography previously obscured by persistent cloudiness, and will benefit scientists, engineers, government agencies, and the public with an ever-growing array of uses. The SRTM radar system mapped Earth from 56 degrees south to 60 degrees north of the equator. The resolution of the publicly available data is three arc-seconds (1/1,200th of a degree of latitude and longitude, about 295 feet at Earth's equator). The final data release covers Australia and New Zealand in unprecedented uniform detail. It also covers more than 1,000 islands comprising much of Polynesia and Melanesia in the South Pacific, as well as islands in the South Indian and Atlantic oceans. SRTM data are being used for applications ranging from land use planning to "virtual" Earth exploration. Currently, the mission's homepage "http://www.jpl.nasa.gov/srtm" provides direct access to recently obtained Earth images. The Shuttle Radar Topography Mission C-band data for North America and South America are available to the public. A complete list of public data sets is available at "http://www2.jpl.nasa.gov/srtm/dataprod.htm". The data specifications are within the following parameters: 30-meter x 30-meter spatial sampling with 16-meter absolute vertical height accuracy, 10-meter relative vertical height accuracy, and 20-meter absolute horizontal circular accuracy. From the JPL Mission Products Summary, "http://www.jpl.nasa.gov/srtm/dataprelimdescriptions.html": the primary products of the SRTM mission are the digital elevation maps of most of the Earth's surface. Visualized images of these maps are available for viewing online. Below you will find descriptions of the types of images that are being generated:
The SRTM radar contained two types of antenna panels, C-band and X-band. The near-global topographic maps of Earth called Digital Elevation Models (DEMs) are made from the C-band radar data. These data were processed at the Jet Propulsion Laboratory and are being distributed through the United States Geological Survey's EROS Data Center. Data from the X-band radar are used to create slightly higher resolution DEMs but without the global coverage of the C-band radar. The SRTM X-band radar data are being processed and distributed by the German Aerospace Center, DLR.
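The publicly released C-band DEM tiles are distributed as simple binary .hgt rasters. Below is a minimal NumPy sketch of reading one 3-arc-second tile, assuming an illustrative tile name (N44W111.hgt) that has already been downloaded.

```python
import numpy as np

# SRTM3 .hgt tiles (3 arc-second) are 1201 x 1201 big-endian signed 16-bit
# integers; 1-arc-second tiles are 3601 x 3601. The file name is illustrative.
TILE = "N44W111.hgt"
SIZE = 1201

elev = np.fromfile(TILE, dtype=">i2").reshape(SIZE, SIZE)

# Voids (areas the radar could not map) are flagged with -32768.
void_mask = elev == -32768
print(f"min={elev[~void_mask].min()} m, max={elev.max()} m, "
      f"voids={void_mask.sum()} cells")

# Row 0 is the northern edge; the tile spans 1 degree, so for this tile
# cell (i, j) sits at lat = 45 - i/1200 and lon = -111 + j/1200.
```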
This database was prepared using a combination of materials that include aerial photographs, topographic maps (1:24,000 and 1:250,000), field notes, and a sample catalog. Our goal was to translate sample collection site locations at Yellowstone National Park and surrounding areas into a GIS database. This was achieved by transferring site locations from aerial photographs and topographic maps into layers in ArcMap. Each field site is located based on field notes describing where a sample was collected. Locations were marked on the photograph or topographic map by a pinhole or dot, respectively, with the corresponding station or site numbers. Station and site numbers were then referenced in the notes to determine the appropriate prefix for the station. Each point marked on an aerial photograph or topographic map was then relocated on screen in ArcMap, on a digital topographic map or an aerial photograph. Several samples are present in the field notes and in the catalog but do not correspond to an aerial photograph or could not be found on the topographic maps. These samples are marked with "No" under the LocationFound field and do not have a corresponding point in the SampleSites feature class. Each point represents a field station or collection site with information that was entered into an attribute table (explained in detail in the entity and attribute metadata sections). Tabular information on hand samples, thin sections, and mineral separates was entered by hand. The Samples table includes everything transferred from the paper records and relates to the other tables using the SampleID field and to the SampleSites feature class using the SampleSite field.
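A hedged sketch of how the relationships described above could be exercised outside of ArcGIS, assuming the Samples table and the SampleSites attribute table have been exported to CSV. The file names are hypothetical; the SampleSite and LocationFound field names come from the description, and the placement of LocationFound in the Samples table is an assumption.

```python
import pandas as pd

# Hypothetical CSV exports of the geodatabase tables described above.
samples = pd.read_csv("Samples.csv")      # one row per hand sample / thin section / separate
sites = pd.read_csv("SampleSites.csv")    # attribute table of the point feature class

# Relate the Samples table to the SampleSites feature class via SampleSite,
# mirroring the relationship described in the metadata.
joined = samples.merge(sites, on="SampleSite", how="left", suffixes=("", "_site"))

# Samples whose collection location could not be found are flagged with
# LocationFound = "No" and have no corresponding point in SampleSites.
unlocated = samples[samples["LocationFound"] == "No"]
print(len(joined), "sample records;", len(unlocated), "without a mapped location")
```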
This map provides a preview and information about the National Agriculture Imagery Program (NAIP) image service available on the USDA Farm Production and Conservation Business Center Geospatial Enterprise Office public image server. Under the NAIP folder you will find a cached layer of the contiguous United States which provides fast rendering and is scaled up to Level 17. NAIP image dates vector services showing when imagery was acquired are available on the NAIP Image Dates Data Hub. Click on the map tack pin to bring up a thumbnail view of the imagery for that area. Click the more info link to view the REST services directory for that image. This directory provides information about the image service and provides links to view the image in the ArcGIS Online map viewer, ArcMap, ArcGIS JavaScript, or Google Earth. If you have feedback about NAIP imagery you can provide it by accessing the NAIP Imagery Feedback map.
To view the status of the 2024 NAIP inspection, view the NAIP Inspection Status Dashboard.
For ordering information and other questions, please contact our Customer Service Section at geo.sales@usda.gov.
For questions and comments about this map, please contact Joan Biediger at joan.biediger@usda.gov.
GIS project files and imagery data required to complete the Introduction to Planetary Image Analysis and Geologic Mapping in ArcGIS Pro tutorial. These data cover the area in and around Jezero crater, Mars.
The files linked to this reference are the geospatial data created as part of the completion of the baseline vegetation inventory project for the NPS park unit. The current format is an ArcGIS file geodatabase, but older formats may exist as shapefiles. Large-scale final map products were created within ArcMap and designed to show both the orthophoto coverage and the vegetation maps. For the vegetation maps, colors were assigned and the polygons labeled with the dominant vegetation and modifier and, where present, the second vegetation and modifier. For the orthophoto maps, the photos were simply plotted at the same scale and area coverage as the vegetation maps. Additional planimetric map data included roads, trails, hydrology, boundaries, and a UTM coordinate grid. Legends are designed to provide full definitions of the vegetation and buffer classes and modifiers, as well as information about the park, map projection, data sources, and authorship (Figure 19). All maps are projected to the Universal Transverse Mercator coordinate system, North American Datum of 1983, in the local zone for the specific park.
Photo Date: 10/24/2000
Area (ac): 3,945
Area (ha): 1,597
Completion Date: Oct 2008
Veg Classes: 20
Polygons: 382
Avg Polygon Size (ha): 4.18
Map Scale: 1:9,000
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
This dataset encompasses the metadata drawn from preserving and visualizing the Rural Route Nomad Photo and Video Collection. The collection consists of 14,058 born-digital objects shot on over a dozen digital cameras in over 30 countries, on seven continents from the end of 2008 through 2009. Metadata was generated using ExifTool, along with manual means, utilizing OpenRefine and Excel to parse and clean.
The dataset was the result of an overarching project to preserve the digital content of the Rural Route Nomad Collection and then visualize photographic specs and geographic details with charts, graphs, and maps in Tableau. A description of the project as a whole is publicly forthcoming. Visualizations can be found at https://public.tableau.com/app/profile/alan.webber5364.
Mosaics are published as ArcGIS image services, which circumvent the need to download or order data. GEO-IDS image services are different from standard web services in that they provide access to the raw imagery data. This enhances the user experience by allowing user-driven, dynamic area-of-interest display enhancement, raw-data querying through tools such as the ArcGIS Pro information tool, full geospatial analysis, and automation through scripting tools such as ArcPy. Image services are best accessed through the ArcGIS REST API and REST endpoints (URLs). You can copy the OPS ArcGIS REST API link below into a web browser to gain access to a directory containing all OPS image services. Individual services can be added into ArcGIS Pro for display and analysis by using Add Data -> Add Data From Path and copying one of the image service ArcGIS REST endpoints below into the resultant text box. They can also be accessed by setting up an ArcGIS server connection in ESRI software using the ArcGIS Image Server REST endpoint/URL. Services can also be accessed in open-source software. For example, in QGIS you can right-click on the type of service you want to add in the Browser pane (e.g., ArcGIS REST Server, WCS, WMS/WMTS) and copy and paste the appropriate URL below into the resultant popup window. All services are in the Web Mercator projection. For more information on what functionality is available and how to work with the service, read the Ontario Web Raster Services User Guide. If you have questions about how to use the service, email Geospatial Ontario (GEO) at geospatial@ontario.ca

Available Products:

ArcGIS REST API
https://ws.geoservices.lrc.gov.on.ca/arcgis5/rest/services/AerialImagery/

Image Service ArcGIS REST endpoints / URLs
https://ws.geoservices.lrc.gov.on.ca/arcgis5/rest/services/AerialImagery/GEO_Imagery_Data_Service_2013to2017/ImageServer
https://ws.geoservices.lrc.gov.on.ca/arcgis5/rest/services/AerialImagery/GEO_Imagery_Data_Service_2018to2022/ImageServer
https://ws.geoservices.lrc.gov.on.ca/arcgis5/rest/services/AerialImagery/GEO_Imagery_Data_Service_2023to2027/ImageServer

Web Coverage Service (WCS) URLs
https://ws.geoservices.lrc.gov.on.ca/arcgis5/services/AerialImagery/GEO_Imagery_Data_Service_2013to2017/ImageServer/WCSServer/
https://ws.geoservices.lrc.gov.on.ca/arcgis5/services/AerialImagery/GEO_Imagery_Data_Service_2018to2022/ImageServer/WCSServer/
https://ws.geoservices.lrc.gov.on.ca/arcgis5/services/AerialImagery/GEO_Imagery_Data_Service_2023to2027/ImageServer/WCSServer/

Web Mapping Service (WMS) URLs
https://ws.geoservices.lrc.gov.on.ca/arcgis5/services/AerialImagery/GEO_Imagery_Data_Service_2013to2017/ImageServer/WMSServer/
https://ws.geoservices.lrc.gov.on.ca/arcgis5/services/AerialImagery/GEO_Imagery_Data_Service_2018to2022/ImageServer/WMSServer/
https://ws.geoservices.lrc.gov.on.ca/arcgis5/services/AerialImagery/GEO_Imagery_Data_Service_2023to2027/ImageServer/WMSServer/

Metadata for all imagery products available in GEO-IDS can be accessed at the links below:
South Central Ontario Orthophotography Project (SCOOP) 2023
North-Western Ontario Orthophotography Project (NWOOP) 2022
Central Ontario Orthophotography Project (COOP) 2021
South-Western Ontario Orthophotography Project (SWOOP) 2020
Digital Raster Acquisition Project Eastern Ontario (DRAPE) 2019-2020
South Central Ontario Orthophotography Project (SCOOP) 2018
North-Western Ontario Orthophotography Project (NWOOP) 2017
Central Ontario Orthophotography Project (COOP) 2016
South-Western Ontario Orthophotography Project (SWOOP) 2015
Algonquin Orthophotography Project (2015)

Additional Documentation:
Ontario Web Raster Services User Guide (Word)

Status: Completed: Production of the data has been completed
Maintenance and Update Frequency: Annually: Data is updated every year
Contact: Geospatial Ontario (GEO), geospatial@ontario.ca
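A hedged sketch of pulling a raw subset from one of the image-service REST endpoints listed above, using the standard ArcGIS ImageServer exportImage operation. The bounding box is an arbitrary Web Mercator extent chosen for illustration; check the service's published extent and export limits first.

```python
import requests

# One of the image-service REST endpoints listed above.
SERVICE = ("https://ws.geoservices.lrc.gov.on.ca/arcgis5/rest/services/"
           "AerialImagery/GEO_Imagery_Data_Service_2018to2022/ImageServer")

# Standard ArcGIS ImageServer exportImage request; services are published in
# Web Mercator (EPSG:3857), and the extent below is purely illustrative.
params = {
    "bbox": "-8850000,5410000,-8845000,5415000",
    "bboxSR": 3857,
    "size": "1024,1024",
    "format": "tiff",
    "f": "image",
}
resp = requests.get(f"{SERVICE}/exportImage", params=params, timeout=60)
resp.raise_for_status()
with open("ontario_imagery_subset.tif", "wb") as f:
    f.write(resp.content)
```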
The files linked to this reference are the geospatial data created as part of the completion of the baseline vegetation inventory project for the NPS park unit. The current format is an ArcGIS file geodatabase, but older formats may exist as shapefiles. For four of the map units – 3-SDF, 4-SDAF, 27-POHV, and 31-LBY – modeling using GIS principles was also employed. Modeling uses the environmental conditions of a map unit, such as elevation, slope, and aspect, which were determined from the field-collected ecological data. Data satisfying these conditions were obtained from ancillary data sources, such as USGS DEM data, and fed into a model that outputs the locations (pixels) where all the desired conditions exist. For example, if a certain map unit was a shrubland that predominantly occurs above 8000 feet, on slopes of 3-10%, and on west-facing aspects, the correctly constructed model outputs only the locations where this combination of conditions is found. The resulting areas were then examined manually with the traditional photo interpretation process to confirm that they could indeed be accepted as that map unit. If photo interpretation determined that the areas were not acceptable, they were changed to a more appropriate map unit.
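The shrubland example above maps directly onto a simple raster overlay. Below is a hedged sketch using rasterio and NumPy with hypothetical co-registered elevation, slope, and aspect rasters derived from a USGS DEM; the file names and the 225-315 degree window taken here as "west-facing" are illustrative choices, not values from the project.

```python
import numpy as np
import rasterio

# Hypothetical co-registered rasters: elevation (feet), slope (percent),
# and aspect (degrees clockwise from north).
with rasterio.open("elevation_ft.tif") as src:
    elev = src.read(1)
    profile = src.profile
with rasterio.open("slope_pct.tif") as src:
    slope = src.read(1)
with rasterio.open("aspect_deg.tif") as src:
    aspect = src.read(1)

# The shrubland example from the text: above 8000 ft, 3-10% slopes,
# west-facing aspects (here taken as 225-315 degrees).
candidate = (
    (elev > 8000)
    & (slope >= 3) & (slope <= 10)
    & (aspect >= 225) & (aspect <= 315)
)

profile.update(dtype="uint8", count=1, nodata=0)
with rasterio.open("candidate_map_unit.tif", "w", **profile) as dst:
    dst.write(candidate.astype("uint8"), 1)
# The flagged pixels would then be reviewed by photo interpretation, as above.
```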
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
In this work, we present a dataset containing a collection of pictures taken during the fieldwork of a farmland abandonment study. Data was taken in 2010 with a compact camera which incorporates GPS and a digital compass sensor. The photographs are taken as a part of a GIS database. Using their Exif metadata we created a layer of Geographic Fields Of View (GeoFOVs) that can be used to perform very specific spatial queries. The dataset contains 2,235 pictures and GIS layers of GeoFOVs contextualizing the agricultural plots being photographed.