44 datasets found
  1. Geospatial Data Extraction Tool

    • data-with-cpaws-nl.hub.arcgis.com
    Updated Mar 28, 2022
    Cite
    Canadian Parks and Wilderness Society (2022). Geospatial Data Extraction Tool [Dataset]. https://data-with-cpaws-nl.hub.arcgis.com/documents/a5b5f96b86ac434083f84434b12a65a3
    Dataset updated
    Mar 28, 2022
    Dataset authored and provided by
    Canadian Parks and Wilderness Society
    Area covered
    Description

    The Geospatial Data Extraction Guide can be found here. The Geospatial Data Extraction Tool allows for the dynamic extraction of data from the Government of Canada's Open Data Portal. There is a selection of base layers, including: Landsat mosaic, Canadian Digital Surface Model, Canadian Digital Elevation Model, National Forest Inventory, National Tiling System Grid Coverage, National Parks Boundaries, National Marine Conservation Areas, and Automatic Extraction Building Projects Limits. The user can select the data to be extracted, including CanVec, Elevation, and Automatic Extraction Data.

    CanVec: CanVec contains more than 60 topographic features organized into 8 themes: Transport Features, Administrative Features, Hydro Features, Land Features, Manmade Features, Elevation Features, Resource Management Features and Toponymic Features.

    This multiscale product originates from the best available geospatial data sources covering Canadian territory. It offers quality topographic information in vector format complying with international geomatics standards. The document CanVec_Code in the Data Resources section shows the list of entities and the scales at which they are available. The maximum extraction area is 150,000 km². Users are able to extract the following data:
    • Lakes and rivers - Hydrographic features
    • Transport networks - Transport features
    • Constructions and land use - Manmade features
    • Mines, energy and communication networks - Resources Management features
    • Wooded areas, saturated soils and landscape - Land features
    • Elevation features
    • Map Labels - Toponymic features (50K only)
    Output Options: OGC GeoPackage, ESRI File Geodatabase, ESRI Shapefile
    Coordinate System Options: NAD83 CSRS (EPSG:4617), WGS 84 / Pseudo-Mercator (EPSG:3857), NAD83 / Canada Atlas Lambert (EPSG:3979)
    Option to clip the data: Yes / No
    Scale Options: 1:50,000, 1:250,000

    Elevation: Elevation data consists of the Canadian Digital Elevation Model (CDEM) and the Canadian Digital Surface Model (CDSM). These products are available for extraction along with their derived products (Shaded Relief, Color Shaded Relief, Color Relief, Slope Map*, Aspect Map* and Point Data). *Only available for CDEM. The maximum extraction area is 50,000 km². Users are able to extract the following data:
    • Digital Elevation Model (DEM)
    • Shaded Relief
    • Color Relief
    • Color Shaded Relief
    • Slope map
    • Aspect map
    • Point Data
    Pick an azimuth between 0 and 360 degrees: direction of the light source, measured in degrees, clockwise from north.
    Pick an altitude between 0 and 90 degrees: vertical direction of the light source, from 0 (horizon) to 90 degrees (zenith).
    Enter a vertical exaggeration factor.
    Select the slope's measuring unit: choice of degrees or percent slope.
    Coordinate System Options: NAD83 CSRS (EPSG:4617), WGS 84 / Pseudo-Mercator (EPSG:3857), NAD83 / Canada Atlas Lambert (EPSG:3979). Data is stored in geographic coordinates (longitude and latitude); however, it can also be offered in a plane coordinate projection (X and Y) at the time of extraction. Definitions for the coordinate systems can be found in the metadata.
    Select the DEM output format: OGC GeoPackage, ESRI File Geodatabase, ESRI Shapefile. The source data (DEM or DSM) available formats are GeoTIFF and Esri ASCII Grid. The GeoTIFF format specification can be obtained from https://www.pubdoc.org/fileformat/rasterimage/tiff/geotiff.pdf and https://geotiff.maptools.org/spec/geotiffhome.html. The Esri ASCII Grid format specification can be obtained from https://desktop.arcgis.com/en/arcmap/10.3/manage-data/raster-and-images/esri-ascii-raster-format.htm
    Select the Point Data output format: ASCII Gridded XYZ (.xyz), ASCII Gridded CSV (.csv). The Point Data available formats are text CSV (.csv, comma separated values) and text XYZ (.xyz, space separated values). The format specification is the same for both (ASCII Gridded XYZ) and can be obtained from https://www.gdal.org/frmt_xyz.html
    Select the image resolution: 0.75 arc seconds, 1.5 arc seconds, 3 arc seconds, 6 arc seconds, 12 arc seconds
    Email address (yourname@domain.com): when processed, results will be deposited to the given email. The email information that you provide on this site is collected in accordance with the federal Privacy Act. You will be notified once your request has been processed and when it is ready for delivery. Information about your privacy rights. The job status is listed and can be refreshed to see updates.

    Automatic Extraction Data: The maximum extraction area is 50,000 km². Users are able to extract the following data:
    • Buildings
    Output Options: OGC GeoPackage, ESRI File Geodatabase, ESRI Shapefile
    Coordinate System Options: NAD83 CSRS (EPSG:4617), WGS 84 / Pseudo-Mercator (EPSG:3857), NAD83 / Canada Atlas Lambert (EPSG:3979)
    Email address (yourname@domain.com): when processed, results will be deposited to the given email. The email information that you provide on this site is collected in accordance with the federal Privacy Act. You will be notified once your request has been processed and when it is ready for delivery. Information about your privacy rights. The job status is listed and can be refreshed to see updates.
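    The azimuth, altitude, and vertical exaggeration inputs above are the standard hillshade parameters. As a rough illustration of what a shaded-relief derivation does with them, here is a minimal standalone sketch using the Horn-style formula; the function and parameter names are ours, not the extraction tool's:

```python
import numpy as np

def hillshade(dem, cellsize, azimuth_deg=315.0, altitude_deg=45.0, z_factor=1.0):
    """Shaded relief from a DEM array (row 0 = northern edge).

    azimuth_deg: light direction, degrees clockwise from north (0-360).
    altitude_deg: light elevation above the horizon (0-90).
    z_factor: vertical exaggeration applied to elevations.
    """
    az = np.radians(360.0 - azimuth_deg + 90.0)   # convert to math convention
    alt = np.radians(altitude_deg)
    dzdy, dzdx = np.gradient(dem * z_factor, cellsize)
    slope = np.arctan(np.hypot(dzdx, dzdy))
    aspect = np.arctan2(-dzdx, dzdy)
    shaded = (np.sin(alt) * np.cos(slope)
              + np.cos(alt) * np.sin(slope) * np.cos(az - aspect))
    return np.clip(shaded, 0.0, 1.0) * 255.0

# A flat surface shades uniformly to 255 * sin(altitude); with the light
# at the zenith (altitude 90) every cell is fully lit.
flat = hillshade(np.zeros((5, 5)), cellsize=30.0, altitude_deg=90.0)
```

    The azimuth is measured clockwise from north as in the tool's prompt, so it is rotated into the counterclockwise-from-east convention that the gradient math expects.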

  2. Individual Water System (IWS) Extract Points, Hawaii, 2022, Hawaii...

    • catalog.data.gov
    Updated Feb 25, 2025
    Cite
    Hawaii State Department of Health (Publisher) (2025). Individual Water System (IWS) Extract Points, Hawaii, 2022, Hawaii Department of Health [Dataset]. https://catalog.data.gov/dataset/individual-water-system-iws-extract-points-hawaii-2022-hawaii-department-of-health13
    Dataset updated
    Feb 25, 2025
    Dataset provided by
    Hawaii State Department of Health (Publisher)
    Area covered
    Hawaii
    Description

    This feature class contains On-Site Sewage Disposal System points for the state of Hawaii extracted from the state's tabular Individual Water System (IWS) database as of January 2022. The locations are plotted using the centroid of the TMK in which an IWS is found. This extract tends to be more up to date than the county-level OSDS (one-time spatial data creation by DOH) because the status of the IWS is updated, but the spatial locations may be less accurate. These features are incorporated in the U.S. EPA Region 9 Hawaii Wastewater Mapping application, a user interface mapping tool to help manage the Large Capacity Cesspool Program compliance and outreach efforts and assist with inspection targeting in Hawaii. The application can be found on the EPA GeoPlatform at: https://epa.maps.arcgis.com/apps/webappviewer/index.html?id=afd05fc3ab2347b2bcc63c5c20f59926
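    The points above are plotted at the centroid of the TMK parcel containing each IWS. A minimal sketch of that centroid calculation (the shoelace formula over a parcel ring; the coordinates here are hypothetical, not TMK data):

```python
def polygon_centroid(ring):
    """Area-weighted centroid of a simple polygon given as [(x, y), ...].

    Uses the shoelace formula; the ring may be given open or closed.
    """
    if ring[0] != ring[-1]:
        ring = ring + [ring[0]]          # close the ring
    a = cx = cy = 0.0
    for (x0, y0), (x1, y1) in zip(ring, ring[1:]):
        cross = x0 * y1 - x1 * y0
        a += cross
        cx += (x0 + x1) * cross
        cy += (y0 + y1) * cross
    a *= 0.5
    return cx / (6.0 * a), cy / (6.0 * a)

# Unit square -> centroid at (0.5, 0.5)
cx, cy = polygon_centroid([(0, 0), (1, 0), (1, 1), (0, 1)])
```

    Note the area-weighted centroid can fall outside a concave parcel, which is one reason such plotted locations can be less accurate than surveyed points.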

  3. Met sites

    • data.mfe.govt.nz
    Updated Oct 15, 2017
    Cite
    Ministry for the Environment (2017). Met sites [Dataset]. https://data.mfe.govt.nz/document/21219-met-sites/
    Dataset updated
    Oct 15, 2017
    Dataset authored and provided by
    Ministry for the Environment
    Description

    Coordinates of the points used to extract the values from which trends were calculated.

  4. Connecticut 3D Lidar Viewer

    • gemelo-digital-en-arcgis-gemelodigital.hub.arcgis.com
    • arc-gis-hub-home-arcgishub.hub.arcgis.com
    Updated Jan 8, 2020
    + more versions
    Cite
    UConn Center for Land use Education and Research (2020). Connecticut 3D Lidar Viewer [Dataset]. https://gemelo-digital-en-arcgis-gemelodigital.hub.arcgis.com/maps/788d121c4a1f4980b529f914c8df19f4
    Dataset updated
    Jan 8, 2020
    Dataset authored and provided by
    UConn Center for Land use Education and Research
    Description

    Statewide 2016 lidar points colorized with 2018 NAIP imagery, as a scene created by Esri using ArcGIS Pro for the entire State of Connecticut. This service provides the colorized lidar points in interactive 3D for visualization, interaction, and the ability to make measurements without downloading. Lidar is referenced at https://cteco.uconn.edu/data/lidar/ and can be downloaded at https://cteco.uconn.edu/data/download/flight2016/. Metadata: https://cteco.uconn.edu/data/flight2016/info.htm#metadata.

    The Connecticut 2016 lidar was captured between March 11, 2016 and April 16, 2016. It covers 5,240 sq miles and is divided into 23,381 tiles. It was acquired by the Capitol Region Council of Governments with funding from multiple state agencies, and was flown and processed by Sanborn. The delivery included classified point clouds and 1 meter QL2 DEMs. The 2016 lidar is published on the Connecticut Environmental Conditions Online (CT ECO) website. CT ECO is the collaborative work of the Connecticut Department of Energy and Environmental Protection (DEEP) and the University of Connecticut Center for Land Use Education and Research (CLEAR) to share environmental and natural resource information with the general public. CT ECO's mission is to encourage, support, and promote informed land use and development decisions in Connecticut by providing local, state and federal agencies, and the public with convenient access to the most up-to-date and complete natural resource information available statewide.

    Process used:

    Extract Building Footprints from Lidar
    1. Prepare lidar: download the 2016 lidar from CT ECO and create a LAS Dataset.
    2. Extract building footprints: use the LAS Dataset in the Classify LAS Building tool in ArcGIS Pro 2.4.

    Colorize Lidar
    Colorizing the lidar points means that each point in the point cloud is given a color based on the imagery color value at that exact location.
    1. Prepare imagery: acquire 2018 NAIP tif tiles from UConn (originally from USDA NRCS) and create a mosaic dataset of the NAIP imagery.
    2. Prepare and analyze lidar points:
    • Change the coordinate system of each of the lidar tiles to the Projected Coordinate System CT NAD 83 (2011) Feet (EPSG 6434), because the downloaded tiles come into ArcGIS with a custom projection which cannot be published as a Point Cloud Scene Layer Package.
    • Convert the lidar to zLAS format and rearrange.
    • Create LAS Datasets of the lidar tiles.
    • Colorize the lidar using the Colorize LAS tool in ArcGIS Pro.
    • Create a new LAS dataset divided into an eastern half and a western half, due to the size limitation of 500 GB per scene layer package.
    • Create scene layer packages (.slpk) using Create Point Cloud Scene Layer Package.
    • Load each package to ArcGIS Online using Share Package.
    • Publish on ArcGIS.com and delete the scene layer package to save storage cost.

    Additional layers added (visit https://cteco.uconn.edu/projects/lidar3D/layers.htm for a complete list and links):
    • 3D Buildings and Trees extracted by Esri from the lidar
    • Shaded Relief from CT ECO
    • Impervious Surface 2012 from CT ECO
    • NAIP Imagery 2018 from CT ECO
    • Contours (2016) from CT ECO
    • Lidar 2016 Download Link derived from https://www.cteco.uconn.edu/data/download/flight2016/index.htm
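    The colorize step described above boils down to sampling the imagery color at each point's XY location. A minimal numpy sketch of that idea (not the ArcGIS Colorize LAS tool itself; the helper names and the tiny test image are ours):

```python
import numpy as np

def colorize_points(xy, rgb_image, origin, cellsize):
    """Attach an RGB color to each point by sampling the image at its XY.

    xy: (N, 2) point coordinates in the image's coordinate system.
    rgb_image: (rows, cols, 3) array; row 0 is the image's top (max Y).
    origin: (x_min, y_max) map coordinates of the top-left corner.
    cellsize: pixel size in map units.
    """
    x0, y1 = origin
    cols = ((xy[:, 0] - x0) / cellsize).astype(int)
    rows = ((y1 - xy[:, 1]) / cellsize).astype(int)
    rows = np.clip(rows, 0, rgb_image.shape[0] - 1)
    cols = np.clip(cols, 0, rgb_image.shape[1] - 1)
    return rgb_image[rows, cols]          # (N, 3), one color per point

# 2x2 test image: left column red, right column green.
img = np.array([[[255, 0, 0], [0, 255, 0]],
                [[255, 0, 0], [0, 255, 0]]], dtype=np.uint8)
pts = np.array([[0.5, 1.5], [1.5, 0.5]])
colors = colorize_points(pts, img, origin=(0.0, 2.0), cellsize=1.0)
```

    This is also why the coordinate systems of the lidar tiles and the imagery mosaic must match before colorizing: the XY-to-pixel mapping assumes both are in the same projection.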

  5. Reference baselines used to extract shorelines for the West Coast of the...

    • catalog.data.gov
    • data.usgs.gov
    • +2more
    Updated Sep 18, 2024
    + more versions
    Cite
    U.S. Geological Survey (2024). Reference baselines used to extract shorelines for the West Coast of the United States (ver. 1.1, September 2024) [Dataset]. https://catalog.data.gov/dataset/reference-baselines-used-to-extract-shorelines-for-the-west-coast-of-the-united-states
    Dataset updated
    Sep 18, 2024
    Dataset provided by
    U.S. Geological Survey
    Area covered
    United States, West Coast of the United States
    Description

    This data release contains reference baselines for primarily open-ocean sandy beaches along the west coast of the United States (California, Oregon and Washington). The slopes were calculated while extracting shoreline position from lidar point cloud data collected between 2002 and 2011. The shoreline positions have been previously published, but the slopes have not. A reference baseline was defined and then evenly-spaced cross-shore beach transects were created. All data points within 1 meter of each transect were associated with that transect. Next, it was determined which points were on the foreshore, and a linear regression was fit through the foreshore points. Beach slope was defined as the slope of the regression. Finally, the regression was evaluated at the elevation of Mean High Water (MHW) to yield the location of the shoreline. In some areas more than one lidar survey was available; in these areas the slopes from each survey are provided. While most of the slopes are for sandy beaches, there is some slope data from rocky headlands and other steeper beaches. The data files referenceLine_WestCoast.csv and referenceLine_WestCoast.shp contain information about the reference baseline, the cross-shore transects, and the Mean High Water values used to estimate the shoreline. The accompanying data files slopeData_WestCoast.csv and slopeData_WestCoast.shp contain the slope data. The csv and shapefiles contain the same information; both file types are provided as a convenience to the user.
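    The per-transect procedure (fit a line through the foreshore points, take its slope, evaluate it at MHW) can be sketched in a few lines. The MHW value below is only illustrative, not from the data release:

```python
import numpy as np

def slope_and_shoreline(cross_shore, elevation, mhw=1.4):
    """Fit z = m*x + b through foreshore points near a transect.

    Returns the beach slope m (the regression slope) and the cross-shore
    position where the fitted line crosses the MHW elevation, i.e. the
    estimated shoreline location. mhw = 1.4 m is a placeholder value.
    """
    m, b = np.polyfit(cross_shore, elevation, 1)
    x_shoreline = (mhw - b) / m
    return m, x_shoreline

# Synthetic foreshore: elevation drops seaward at 0.1 m/m from 3 m.
x = np.linspace(0.0, 20.0, 11)
m, x_sl = slope_and_shoreline(x, 3.0 - 0.1 * x, mhw=1.4)
```

    On real lidar foreshore points the fit would be noisier, but the slope and the MHW crossing are extracted the same way.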

  6. Sawyer Mill Dam Removal Drone DSM Elevation vs. Conventional Survey Terrain...

    • figshare.com
    xlsx
    Updated Feb 18, 2022
    Cite
    Alexandra Evans; Kevin Gardner (2022). Sawyer Mill Dam Removal Drone DSM Elevation vs. Conventional Survey Terrain Analysis for Reservoir Response Paper (2019 & 2020) [Dataset]. http://doi.org/10.6084/m9.figshare.14668176.v2
    Available download formats: xlsx
    Dataset updated
    Feb 18, 2022
    Dataset provided by
    figshare
    Authors
    Alexandra Evans; Kevin Gardner
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    These are the calculations used for examining elevation differences between the drone DSMs and conventional survey elevations across terrain types in the Evans et al. Sawyer Mill dam removal reservoir response manuscript. The “Extract Values to Points” tool in ArcGIS Pro extracted the DSM raster values at the XY locations of the surveyed points. Using the surveyed elevations and extracted DSM values across the available areas and flight dates, trends in the drone DSMs’ Z-direction accuracy were examined across different terrain categories: vegetation, dry terrain (e.g. exposed ground or wood), and submerged terrain (e.g. substrate). Elevation values correspond to NAVD88 in meters. The DSMs' and surveyed points' XY were in WGS 84 when used in the “Extract Values to Points” tool. The "Terrain" columns designate the final terrain type categories used in the terrain analysis presented in the manuscript, while the "Terrain/Notes from Field" columns contain transcribed notes from survey field notebooks that were written in the field. Vegetation heights were also from survey field notebooks. Please see the manuscript and spreadsheet for additional information. These materials were made using resources from an NSF EPSCoR funded project “RII Track-2 FEC: Strengthening the scientific basis for decision-making about dams: Multi-scale, coupled-systems research on ecological, social, and economic trade-offs” (a.k.a. "Future of Dams"). Support for this project is provided by the National Science Foundation’s Research Infrastructure Improvement NSF #IIA-1539071. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

  7. Spot Elevations 2013

    • odgavaprod.ogopendata.com
    • data.virginia.gov
    • +3more
    Updated Jul 24, 2025
    + more versions
    Cite
    Chesapeake City (2025). Spot Elevations 2013 [Dataset]. https://odgavaprod.ogopendata.com/dataset/spot-elevations-2013
    Available download formats: geojson, html, arcgis geoservices rest api, kml, zip, csv
    Dataset updated
    Jul 24, 2025
    Dataset provided by
    City of Chesapeake, VA
    Authors
    Chesapeake City
    Description

    Spot elevations across the City of Chesapeake. The points were created using the ArcGIS "Create Random Points" tool over the extent of the USGS DEM mosaic. The "Extract Values to Points" tool was then used to extract the values of the DEM to the random points. The elevations of the result are given in NAVD88 (feet). Random points that fell within the hydropoly feature class (as of 3/16/15) were excluded from the final output. Elevations are based on a "bare-earth" DEM.
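    The "Extract Values to Points" step samples the DEM cell under each point. A minimal nearest-cell sketch of that operation in numpy (our helper names, not the ArcGIS tool; the tiny DEM is hypothetical):

```python
import numpy as np

def extract_values_to_points(dem, xy, origin, cellsize, nodata=np.nan):
    """Nearest-cell DEM sample at each point.

    dem: (rows, cols) array with row 0 at the north (max Y) edge.
    xy: (N, 2) point coordinates; origin: (x_min, y_max) of the
    top-left corner; points outside the grid get `nodata`.
    """
    x0, y1 = origin
    cols = np.floor((xy[:, 0] - x0) / cellsize).astype(int)
    rows = np.floor((y1 - xy[:, 1]) / cellsize).astype(int)
    inside = ((rows >= 0) & (rows < dem.shape[0])
              & (cols >= 0) & (cols < dem.shape[1]))
    out = np.full(len(xy), nodata, dtype=float)
    out[inside] = dem[rows[inside], cols[inside]]
    return out

dem = np.array([[10.0, 12.0], [14.0, 16.0]])          # elevations, feet
pts = np.array([[0.5, 0.5], [1.5, 1.5], [5.0, 5.0]])  # last is off-grid
vals = extract_values_to_points(dem, pts, origin=(0.0, 2.0), cellsize=1.0)
```

    Points falling in excluded areas (here, off-grid; in the dataset, the hydropoly features) simply carry no elevation value.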

  8. Data from: Automatic extraction of road intersection points from USGS...

    • figshare.com
    zip
    Updated Nov 11, 2019
    Cite
    Mahmoud Saeedimoghaddam; Tomasz Stepinski (2019). Automatic extraction of road intersection points from USGS historical map series using deep convolutional neural networks [Dataset]. http://doi.org/10.6084/m9.figshare.10282085.v1
    Available download formats: zip
    Dataset updated
    Nov 11, 2019
    Dataset provided by
    Figshare (http://figshare.com/)
    Authors
    Mahmoud Saeedimoghaddam; Tomasz Stepinski
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Tagged image tiles as well as the Faster-RCNN framework for automatic extraction of road intersection points from USGS historical maps of the United States of America. The data and code have been prepared for the paper entitled "Automatic extraction of road intersection points from USGS historical map series using deep convolutional neural networks" submitted to "International Journal of Geographic Information Science". The image tiles have been tagged manually. The Faster RCNN framework (see https://arxiv.org/abs/1611.10012) was captured from:https://github.com/tensorflow/models/blob/master/research/object_detection/g3doc/detection_model_zoo.md

  9. Extraction of Evapotranspiration Data from the OpenET Database

    • hydroshare.org
    • beta.hydroshare.org
    zip
    Updated Oct 31, 2022
    Cite
    Motasem Abualqumboz (2022). Extraction of Evapotranspiration Data from the OpenET Database [Dataset]. https://www.hydroshare.org/resource/9f04f8dd3e42416e91bb1380042bbdc6
    Available download formats: zip (177.5 KB)
    Dataset updated
    Oct 31, 2022
    Dataset provided by
    HydroShare
    Authors
    Motasem Abualqumboz
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Area covered
    Description

    The purpose of this HydroShare resource is to facilitate the extraction of monthly-averaged evapotranspiration (ET) data from the OpenET database (https://openetdata.org/). The resource can be used to extract ET data at a single point using its latitude and longitude, or to obtain an average ET value at the watershed scale using a shapefile of the watershed of interest. OpenET uses the best available science to provide easily accessible satellite-based estimates of evapotranspiration (https://openetdata.org/about/). The OpenET database provides ET data using the Ensemble method.

  10. Common Ownership Lots as Points

    • gimi9.com
    Updated Mar 1, 2003
    Cite
    (2003). Common Ownership Lots as Points | gimi9.com [Dataset]. https://gimi9.com/dataset/data-gov_common-ownership-lots-as-points/
    Dataset updated
    Mar 1, 2003
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This dataset represents parcels not mapped or sourced in Vector Property Map. Please refer to the common ownership lots layer in https://opendata.dc.gov for the most current data on ownership. Property Owner Points. This dataset contains points that represent the approximate location of real property lots within the District of Columbia. Each property point is generated based on a corresponding record maintained within the Office of Tax and Revenue (OTR) Real Property Tax Administration's (RPTA) real property database. Each point contains the full attribution of database fields derived from ITS public release extract. The initial data conversion effort was begun in 1997 as a means to provide RPTA with a digital mapping system which could be maintained to reflect ongoing changes to property lots and ownership. The initial step was to scan RPTA tax square maps from aperture cards at an effective paper resolution of 400 DPI. The resulting images were then georeferenced to DC's 0.2-meter resolution 1995 digital orthophotos. During the georeferencing process, the images were not warped; they were simply scaled and rotated to best fit the orthophotos. The DC tax assessor provided a database of active tax accounts which were placed interactively by an operator using the georeferenced square image and the orthophoto. Centroids were placed on the primary structure visible in the orthophoto within the raster property polygon. The placement was performed within ArcView 3.2 using a customized data production application. Accounts which could not be placed in the first pass were then reviewed by another operator to attempt to find their correct location. The placed points were QC'd through a spatial overlay with the square index to assure a match between the square field value within the property database and the actual square polygon into which the point was placed. Spot checking was then performed to confirm that the centroids fell within the correct raster lot. 
The centroids were delivered to OTR as a single citywide AutoCAD DWG file. Attribute features with square, suffix, and lot numbers (SSLs) were included as an AutoCAD block.

  11. Multilayer perceptron classifier and shoreline extraction model archive for...

    • catalog.data.gov
    • data.usgs.gov
    Updated Nov 20, 2024
    + more versions
    Cite
    U.S. Geological Survey (2024). Multilayer perceptron classifier and shoreline extraction model archive for Minnesota Point PlanetScope satellite imagery [Dataset]. https://catalog.data.gov/dataset/multilayer-perceptron-classifier-and-shoreline-extraction-model-archive-for-minnesota-poin
    Dataset updated
    Nov 20, 2024
    Dataset provided by
    United States Geological Survey (http://www.usgs.gov/)
    Area covered
    Minnesota Point, Minnesota
    Description

    A site-specific multilayer perceptron model was developed to classify PlanetScope satellite imagery of Minnesota Point, and the classifier was paired with a shoreline delineation routine to extract the position of the land-water interface. Satellite-derived shorelines were used to measure changes in beach width associated with beach nourishment projects and changing water levels. The model is based on methods documented in Doherty, et al., 2022 (https://doi.org/10.1016/j.envsoft.2022.105512). This model archive includes the source code, the site-specific classifier, the labeled imagery used to train that classifier, and a subset of the input data needed to run the model. The input satellite imagery required to run the model is not included because the End User License Agreement (EULA) for PlanetScope imagery provided through the Commercial Smallsat Data Acquisition program prohibits making the imagery publicly available. The satellite-derived shorelines produced using this model are provided in a separate data release (Roland & Groten, 2024; https://doi.org/10.5066/P1W7SBUX).

  12. GAL GW Quantile Interpolation 20161013

    • data.gov.au
    • cloud.csiss.gmu.edu
    • +1more
    zip
    Updated Jun 28, 2022
    Cite
    Bioregional Assessment Program (2022). GAL GW Quantile Interpolation 20161013 [Dataset]. https://www.data.gov.au/data/dataset/49f20390-3340-4b08-b1dc-370fb919d34c
    Available download formats: zip (10827021)
    Dataset updated
    Jun 28, 2022
    Dataset authored and provided by
    Bioregional Assessment Program
    License

    Attribution 2.5 (CC BY 2.5): https://creativecommons.org/licenses/by/2.5/
    License information was derived automatically

    Description

    Abstract

    This dataset was derived by the Bioregional Assessment Programme from multiple source datasets. The source datasets are identified in the Lineage field in this metadata statement.

    The processes undertaken to produce this derived dataset are described in the History field in this metadata statement.

    The Groundwater (GW) quantiles are extracted from the Groundwater modelling outputs. Dataset prepared for import into the Impact and Risk Analysis Database.

    Dataset History

    Drawdown percentile and exceedance probability values were extracted from groundwater model outputs. This was performed using a GIS routine that extracts groundwater model raster values at the assessment units (as points), attributed with the regional water table aquifer layer, and assigns the model value from the corresponding layer to each assessment unit.

    Dataset Citation

    XXXX XXX (2017) GAL GW Quantile Interpolation 20161013. Bioregional Assessment Derived Dataset. Viewed 12 December 2018, http://data.bioregionalassessments.gov.au/dataset/49f20390-3340-4b08-b1dc-370fb919d34c.

    Dataset Ancestors

  13. OpenStreetMap Data French Polynesia

    • americansamoa-data.sprep.org
    • solomonislands-data.sprep.org
    • +13more
    txt, zip
    Updated Jul 16, 2025
    Cite
    Secretariat of the Pacific Regional Environment Programme (2025). OpenStreetMap Data French Polynesia [Dataset]. https://americansamoa-data.sprep.org/dataset/openstreetmap-data-french-polynesia
    Available download formats: txt, zip
    Dataset updated
    Jul 16, 2025
    Dataset provided by
    Pacific Regional Environment Programme (https://www.sprep.org/)
    License

    Public Domain Mark 1.0: https://creativecommons.org/publicdomain/mark/1.0/
    License information was derived automatically

    Area covered
    French Polynesia, Polynesia, Pacific Region
    Description

    OpenStreetMap (OSM) is a free, editable map & spatial database of the whole world. This dataset is an extract of OpenStreetMap data for French Polynesia in a GIS-friendly format.

    The OSM data has been split into separate layers based on themes (buildings, roads, points of interest, etc), and it comes bundled with a QGIS project and styles, to help you get started with using the data in your maps. This OSM product will be updated weekly.

    The goal is to increase awareness among Pacific GIS users of the richness of OpenStreetMap data in Pacific countries, as well as the gaps, so that they can take advantage of this free resource, become interested in contributing to OSM, and perhaps join the global OSM community.

    OpenStreetMap data is open data, with a very permissive licence. You can download it and use it for any purpose you like, as long as you credit OpenStreetMap and its contributors. You don't have to pay anyone, or ask anyone's permission. When you download and use the data, you're granted permission to do that under the Open Database Licence (ODbL). The only conditions are that you Attribute, Share-Alike, and Keep open.

    The required credit is “© OpenStreetMap contributors”. If you make a map, you should display this credit somewhere. If you provide the data to someone else, you should make sure the license accompanies the data.

  14. NT Water Surface Water Licence Extraction Points - Dataset - NTG Open Data...

    • data.nt.gov.au
    Updated Dec 10, 2019
    + more versions
    Cite
    nt.gov.au (2019). NT Water Surface Water Licence Extraction Points - Dataset - NTG Open Data Portal [Dataset]. https://data.nt.gov.au/dataset/nt-water-surface-water-licence-extraction-points
    Dataset updated
    Dec 10, 2019
    Dataset provided by
    Northern Territory Government (http://nt.gov.au/)
    License

    Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Area covered
    Northern Territory
    Description

    Groundwater Extraction Points: This spatial dataset contains point features associated with a groundwater extraction licence granted by the Controller of Water Resources. The spatial dataset describes registered bore details, the groundwater extraction licence number and the groundwater resource details which the bore connects to and may draw water from. The data also include a web link to the Licence Decision documents.

    Surface Water Extraction Points: This spatial dataset contains point features associated with a surface water extraction licence granted by the Controller of Water Resources. The location of each extraction point is sourced from a data entry in the Water Act Licensing and Permit System (WALAPS). The spatial dataset describes the surface water extraction licence number and includes a web link to the Licence Decision documents.

    For details about water extraction licensing in the Northern Territory, refer to the water licensing portal: https://denr.nt.gov.au/land-resource-management/water/permits-and-licences/water-licensing-portal

  15. CSDCIOP Structure Points

    • maine.hub.arcgis.com
    Updated Feb 26, 2020
    Cite
    State of Maine (2020). CSDCIOP Structure Points [Dataset]. https://maine.hub.arcgis.com/maps/maine::csdciop-structure-points
    Dataset updated
    Feb 26, 2020
    Dataset authored and provided by
    State of Maine
    Area covered
    Description

Feature class that compares the elevations of seawall crests (extracted from available LiDAR datasets from 2010 and 2013) with published FEMA Base Flood Elevations (BFEs) from preliminary FEMA DFIRMs (panels issued in 2018 and 2019) in coastal York and Cumberland counties (up through Willard Beach in South Portland). The dataset included the development of an inventory of coastal armor structures from a range of different datasets. Steps to create the dataset included:

    1. Shoreline structures from the most recent NOAA EVI LANDWARD_SHORETYPE feature class were extracted using the boundaries of York and Cumberland counties. This included 1B: Exposed, Solid Man-Made Structures; 8B: Sheltered, Solid Man-Made Structures; 6B: Riprap; and 8C: Sheltered Riprap, and resulted in the creation of Cumberland_ESIL_Structures and York_ESIL_Structures. Note that ESIL uses the MHW line as the feature base.
    2. Shoreline structures from the work by Rice (2015) were extracted using the York and Cumberland county boundaries, resulting in the creation of Cumberland_Rice_Structures and York_Rice_Structures.
    3. Additional feature classes (Slovinsky_York_Structures and Slovinsky_Cumberland_Structures) were created for York and Cumberland county structures that were missed. Google Earth imagery was inspected while additional structures were added to the GIS. 2012 York and Cumberland County imagery was used as the basemap, and structures were classified as bulkheads, riprap, or dunes (if known). Whether or not each structure was in contact with the 2015 HAT was also noted.
    4. MEDEP was consulted to determine which permit data (both PBR and Individual Permit, IP, data) could be used to help determine where shoreline stabilization projects may have been conducted adjacent to or on coastal bluffs. A file was received for IP data and brought into GIS (DEP_Licensing_Points); this is a point file for shoreline stabilization permits under NRPA.
    5. Clip GISVIEW.MEDEP.Permit_By_Rule_Locations to the boundaries of the study area and output DEP_PBR_Points.
    6. Join GISVIEW.sde > GISVIEW.MEDEP.PBR_ACTIVITY to DEP_PBR_Points using the PBR_ID field, then export this file as DEP_PBR_Points2. Using the new ACTIVITY_DESC field, select only those activities likely to be related to shoreline stabilization (PBR_ACTIVITY 02, Act. Adjacent to a Protected Natural Resource; 04, Maint Repair & Replacement of Structure; 08, Shoreline Stabilization) with Select by Attributes > PBR_ACTIVITY IN ('02', '04', '08'), and export the selected data as DEP_PBR_Points3. Then delete versions 1 and 2, and rename this final product DEP_PBR_Points.
    7. Visually inspect the Licensing and PBR files using ArcMap 2012 and 2013 imagery, along with Google Earth imagery, to determine the extents of armoring along the shoreline.
    8. Using EVI and Rice data as indicators, manually inspect and digitize sections of the coastline that are armored. Classify the seaward shoreline type (beach, mudflat, channel, dune, etc.) and the armor type (wall or bulkhead). Bring in the HAT line and, using that and visual indicators, identify whether or not the armored sections are in contact with HAT, using Google Earth at the same time to help constrain areas. Merge the digitized armoring into Cumberland_York_Merged.
    9. Bring in the preliminary FEMA DFIRM data and use Intersect to assign the flood zones and elevations to the digitized armored sections, first for Cumberland and then for York County. Delete ancillary attributes as needed. The resulting layers are Cumberland_Structure_FloodZones and York_Structure_FloodZones.
    10. From the NOAA Digital Coast Data Layers, download the newest LiDAR data for York and Cumberland county beach, dune, and just-inland areas. This includes 2006 and newer topobathy data available from 2010 (entire coast) and selected areas from 2013 and 2014 (Wells, Scarborough, Kennebunk).
    11. Mosaic the 2006, 2010, 2013, and 2014 data (with the 2013 and 2014 data laid on top of the 2010 data), then mosaic this dataset into the sacobaydem_ftNAVD raster (from the MEGIS bare-earth model). This covers almost all of the study area except for armor along several areas in York, resulting in LidAR206_2010_2013_Mosaic.tif.
    12. Using the LiDAR data as a proxy, create a "seaward crest" line feature class, Dune_Crest, which follows the coast and extracts the approximate highest point (cliff, bank, dune) along the shoreline. This is used to extract LiDAR data for comparison with preliminary flood zone information.
    13. Using the added tool Points Along Line, create points at 5 m spacing along each of the armored shoreline feature lines and the dune crest lines. Call the outputs PointsonLines and PointsonDunes.
    14. Using Spatial Analyst, extract LiDAR elevations to the points using the 2006_2010_2013 mosaic first; call this LidarPointsonLines1. Select the points with NULL values and export them as LiDARPointsonLines2, then rerun Extract Values to Points using just the selected data and the state MEGIS DEM. Convert RASTERVALU to feet by multiplying by 3.2808 (renaming the field Elev_ft). Select by Attributes, find all remaining NULL values, and delete them from LiDARPointsonLines in an edit session. Then merge the two datasets and call the result LidarPointsonLines. Do the same with the dune lines to create LidarPointsonDunes.
    15. Use the Cumberland and York flood zone layers to intersect the points with the appropriate flood zone data, creating …CumbFIRM and …YorkFIRM files for the dunes and lines.
    16. Select the points from the Dunes feature class that fall within the X zone; these will NOT have an associated BFE for comparison with the LiDAR data. Export the dune points as Cumberland_York_Dunes_XZone. Run NEAR using the merged flood zone feature class (with only V, AE, and AO zones selected), then join the flood zone data to the feature class using FID (from the feature class) and OBJECTID (from the flood zone feature class). Export as Cumberland_York_Dunes_XZone_Flood. Delete ancillary columns, leaving the original FLD_ZONE (X), Elev_ft, NEAR_DIST (distance, in m, to the nearest flood zone), FLD_ZONE_1 (the near flood zone), and STATIC_BFE_1 (the nearest static BFE).
    17. Do the same as above with the Structures file (Cumberland_York_Structures_Lidar_DFIRM_Merged), but also select the features within the X zone and OPEN WATER. Export the points as Cumberland_York_Structures_XZone. Again run NEAR using the merged flood zone with only AE, VE, and AO zones selected, and export the file as Cumberland_York_Structures_XZone_Flood.
    18. Merge the above feature classes with the original feature classes and add a field BFE_ELEV_COMPARE. Select all features in a VE or AE flood zone and use Field Calculator to compute the difference between Elev_ft and the BFE (subtracting STATIC_BFE from Elev_ft); positive values mean the maximum wall elevation is higher than the BFE, negative values mean it is below the BFE. Then switch the selection and calculate the same value using NEAR_STATIC_BFE instead. Select by Attributes > FLD_ZONE = AO and enter the DEPTH value into the fields created above as negative values. Delete ancillary attribute fields, leaving those listed in the _FINAL feature classes described above the process steps section.
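The BFE_ELEV_COMPARE calculation described above (unit conversion, static versus nearest BFE, and the AO-zone depth rule) can be sketched outside ArcGIS. This is a minimal illustration, not the actual Field Calculator expression; the parameter names mirror the fields named in the description (Elev_ft, STATIC_BFE, NEAR_STATIC_BFE, FLD_ZONE, DEPTH):

```python
def bfe_elev_compare(rastervalu_m, static_bfe_ft=None, near_static_bfe_ft=None,
                     fld_zone=None, depth_ft=None):
    """Sketch of the BFE_ELEV_COMPARE field described above.

    rastervalu_m: LiDAR elevation extracted to the point, in metres.
    Positive results mean the crest is above the (nearest) BFE;
    negative results mean it is below.
    """
    elev_ft = rastervalu_m * 3.2808  # convert RASTERVALU (m) to feet
    if fld_zone == "AO":
        # AO zones carry a flood depth rather than a static BFE;
        # the description records DEPTH as a negative value.
        return -depth_ft
    if static_bfe_ft is not None:          # point falls inside a VE/AE zone
        return elev_ft - static_bfe_ft
    return elev_ft - near_static_bfe_ft    # X-zone point: use nearest zone's BFE

# A crest surveyed at 3.0 m against a 9 ft static BFE:
print(round(bfe_elev_compare(3.0, static_bfe_ft=9.0), 2))  # 0.84
```

The same function covers both the in-zone and X-zone cases by switching which BFE argument is supplied, matching the two selection passes in step 18.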

  16. Travelling Stock Route Conservation Values

    • data.wu.ac.at
    • researchdata.edu.au
    • +1more
    zip
    Updated Oct 9, 2018
    Cite
    Bioregional Assessment Programme (2018). Travelling Stock Route Conservation Values [Dataset]. https://data.wu.ac.at/schema/data_gov_au/OGQ1NWU3MzEtODcwMi00YjU2LWI3YjgtZTFmNjM1ZjQ2MzI5
    Explore at:
    zip (5846392.0) Available download formats
    Dataset updated
    Oct 9, 2018
    Dataset provided by
    Bioregional Assessment Programme
    License

    Attribution 3.0 (CC BY 3.0), https://creativecommons.org/licenses/by/3.0/
    License information was derived automatically

    Description

    Abstract

    This dataset and its metadata statement were supplied to the Bioregional Assessment Programme by a third party and are presented here as originally supplied.

    This shapefile was constructed by combining crown TSR spatial data, information gathered from Rural Lands Protection Board (RLPB) rangers, and surveyed Conservation and Biodiversity data to compile a layer within 30 RLPB districts in NSW. The layer attempts to spatially reflect current TSRs as accurately as possible with conservation attributes for each one.

    Dataset History

    The initial step in production involved using the most up-to-date extract of TSRs from the crown spatial layer as a base map, as this layer should reasonably accurately reflect the location, size, and attributes of TSRs in NSW. The crown spatial layer from which the TSRs were extracted is maintained by the NSW Department of Lands. The TSR extract comprises approximately 25,000 polygons in the study area. These polygons were then attributed with names, IDs and other attributes from the Long Paddock (LP) points layer produced by the RLPB State Council, which contains approximately 4,000 named reserves throughout the study area and reflects the names and ID numbers by which the reserves were, or are currently, managed by the RLPBs. This layer was spatially joined with the TSR polygon layer by proximity to produce a polygon layer attributed with RLPB reserve names and ID numbers. The process was repeated for other small datasets in order to link data with the polygon layer and LP reserve names.

    The next, and by far the most time-consuming, process in the project was transferring the data gathered from surveys undertaken with RLPB rangers about each reserve (location, spatial extent, name, currency, conservation value and biodiversity). This spatial information was annotated on hard-copy maps and referenced against the spatial join, with manual edits made where necessary; edits were conducted manually because the reference information existed only on hard-copy paper maps. Corrections were made to the merged layer to produce an accurate spatial reflection of the RLPB reserves by name and ID. This manual editing made up the bulk of the time for layer production, as all reserves in each RLPB district in the study area had to be checked manually, with any necessary changes made to correct the spatial location of each reserve and ensure the correct ID was assigned for attributing the conservation data. In approximately 80% of cases the spatial join was correct, although this figure would be lower where long chains of TSR polygons exist. The majority of time was devoted to making the numerous additions that needed to be incorporated.

    A spreadsheet based on the LP points layer was attributed with the LP point [OBJECTID] to produce a unique reference for each reserve, so that conservation and biodiversity value data could be attributed to each reserve in the spatial layer being produced. Any new reserves were allocated an [OBJECTID] number both in the GIS and in the spreadsheet to maintain this link. All relevant data was entered into the spreadsheet and then edited to a level suitable for attachment as an attribute table. Field names were chosen, with appropriate and interpretable data formats, for each field. The completed spreadsheet was then linked to the shapefile to produce a polygon TSR spatial layer containing all available conservation and biodiversity information. Any additional attributes were either entered manually or obtained by merging with other layers. Attributes for the final layer were selected for usability by those wishing to query Conservation Value (CV) data for each reserve, along with a number of administrative attributes for locating and querying certain aspects of each parcel. Error checking was conducted constantly throughout the process to minimise errors being transferred into the product; this was done manually, and also by running numerous spatial and attribute-based queries to identify potential errors in the layer being produced. Follow-up phone calls were made to the rangers to identify exact localities of reserves where polygons could not be allocated due to missing or ambiguous information. If precise location data was provided, polygons could be added in, either from other crown spatial layers or from cadastre.

    These added polygons were attributed with the lowest confindex rating, as their status as crown land is unknown or doubtful. In some cases existing GIS layers had been created for certain areas: Murray RLPB has data in which 400+ polygons do not exist in the current crown TSR extract, and according to the rangers interviewed the majority of these TSRs exist. This data was incorporated into the TSR polygon layer by merging the two layers and then assigning attributes in the normal way, i.e. by giving each polygon an LP Name and ID and then updating it from the marked-up hard-copy maps. In the confidence index these are given a rating of 1 (see the confindex matrix) due to the unknown source of the data and the lack of a match with any other crown spatial data.

    A confidence index matrix (confindex) was produced to give the end user of the GIS product an idea of how the data for each reserve was obtained, its purpose, and an indication of whether it is likely to be a current TSR; the higher the confindex, the more confident the user can be in the data. This was necessary due to conflicting information between a number of datasets, usually the RLPB ranger mark-up on a hard-copy map conflicting with the crown spatial data. If these conflicting reserves had been deleted, a large amount of information would have been lost; if additions had been made without sufficient data to determine crown status, currency, location, etc. (which was not available in all cases), the end user might rely on data with a low level of accuracy. The confindex was produced by scoring the value of each piece of information and compounding the score where data sources correlated.

    Where an RLPB LP Name and ID point was not assigned to a polygon because other points were in closer proximity, these names and IDs were effectively deleted from the polygon layer. In a number of cases this was correct, the land having been revoked, relinquished and/or now freehold. In other cases the TSR is thought to exist but a polygon could not be assigned because no information was available (no Lot/DP, no close proximity to a crown reserve, further ranger interviews provided no information, etc.). For these cases, to ensure no information loss, a points layer was compiled from the LP points layer, with further information from the marked-up hard-copy maps used to place each point in the most accurate approximate location where the reserve is thought to exist, and all CV data was then attached to the point. In many of these cases further investigation could provide an exact location and inclusion in the TSR polygon layer. The accuracy of each point is noted in the metadata, so that the location is not taken as absolute and is used only as a guide to the approximate location of the reserve.

    Topology checks were conducted to eliminate slivers in the layer and to remove duplicate polygons. Where two crown reserves existed on the same land parcel, the duplicate polygon was deleted and its unique attributes (Crown Reserve Number, Type, and Purpose) were transferred. Once the polygon layer was satisfactorily completed, a list of the LP points not allocated to polygons was compiled; any points (reserves) said to have been revoked or relinquished were removed from this list, leaving those said to be current. An extract of the LP points layer was then produced with only the aforementioned points, which were attributed with the same conservation and biodiversity data as the polygon layer to minimise information loss.
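The proximity join used to attach LP names and IDs to the crown TSR polygons can be sketched as a nearest-point search over polygon centroids. This is an illustrative stand-in (the reserve names, IDs, and coordinates below are invented for the example), not the actual GIS tool used:

```python
import math

def nearest_lp_point(polygon_centroid, lp_points):
    """Assign the nearest Long Paddock point to a TSR polygon centroid.

    lp_points: list of (name, object_id, (x, y)) tuples.
    Returns (name, object_id) of the closest point - the join that was
    correct in roughly 80% of cases and manually edited otherwise.
    """
    cx, cy = polygon_centroid
    best = min(lp_points, key=lambda p: math.hypot(p[2][0] - cx, p[2][1] - cy))
    return best[0], best[1]

# Hypothetical LP points (name, [OBJECTID], coordinates):
lp = [("Black Swamp TSR", 101, (10.0, 4.0)), ("River Bend TSR", 102, (2.0, 1.0))]
print(nearest_lp_point((1.0, 1.5), lp))  # ('River Bend TSR', 102)
```

As the description notes, long chains of TSR polygons defeat this heuristic, which is why the result had to be checked manually against the marked-up hard-copy maps.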

    Dataset Citation

    "NSW Department of Environment, Climate Change and Water" (2010) Travelling Stock Route Conservation Values. Bioregional Assessment Source Dataset. Viewed 09 October 2018, http://data.bioregionalassessments.gov.au/dataset/198900d5-0d06-4bd0-832b-e30a7c4e8873.

  17. CSDCIOP Dune Crest Points

    • maine.hub.arcgis.com
    Updated Feb 26, 2020
    Cite
    State of Maine (2020). CSDCIOP Dune Crest Points [Dataset]. https://maine.hub.arcgis.com/maps/csdciop-dune-crest-points
    Explore at:
    Dataset updated
    Feb 26, 2020
    Dataset authored and provided by
    State of Maine
    Area covered
    Description

    Feature class that compares the elevations of sand dune crests (extracted from available LiDAR datasets from 2010 and 2013) with published FEMA Base Flood Elevations (BFEs) from preliminary FEMA DFIRMs (panels issued in 2018 and 2019) in coastal York and Cumberland counties (up through Willard Beach in South Portland). Steps to create the dataset included:

    1. Shoreline structures from the most recent NOAA EVI LANDWARD_SHORETYPE feature class were extracted using the boundaries of York and Cumberland counties. This included 1B: Exposed, Solid Man-Made Structures; 8B: Sheltered, Solid Man-Made Structures; 6B: Riprap; and 8C: Sheltered Riprap, and resulted in the creation of Cumberland_ESIL_Structures and York_ESIL_Structures. Note that ESIL uses the MHW line as the feature base.
    2. Shoreline structures from the work by Rice (2015) were extracted using the York and Cumberland county boundaries, resulting in the creation of Cumberland_Rice_Structures and York_Rice_Structures.
    3. Additional feature classes (Slovinsky_York_Structures and Slovinsky_Cumberland_Structures) were created for York and Cumberland county structures that were missed. Google Earth imagery was inspected while additional structures were added to the GIS. 2012 York and Cumberland County imagery was used as the basemap, and structures were classified as bulkheads, riprap, or dunes (if known). Whether or not each structure was in contact with the 2015 HAT was also noted.
    4. MEDEP was consulted to determine which permit data (both PBR and Individual Permit, IP, data) could be used to help determine where shoreline stabilization projects may have been conducted adjacent to or on coastal bluffs. A file was received for IP data and brought into GIS (DEP_Licensing_Points); this is a point file for shoreline stabilization permits under NRPA.
    5. Clip GISVIEW.MEDEP.Permit_By_Rule_Locations to the boundaries of the study area and output DEP_PBR_Points.
    6. Join GISVIEW.sde > GISVIEW.MEDEP.PBR_ACTIVITY to DEP_PBR_Points using the PBR_ID field, then export this file as DEP_PBR_Points2. Using the new ACTIVITY_DESC field, select only those activities likely to be related to shoreline stabilization (PBR_ACTIVITY 02, Act. Adjacent to a Protected Natural Resource; 04, Maint Repair & Replacement of Structure; 08, Shoreline Stabilization) with Select by Attributes > PBR_ACTIVITY IN ('02', '04', '08'), and export the selected data as DEP_PBR_Points3. Then delete versions 1 and 2, and rename this final product DEP_PBR_Points.
    7. Visually inspect the Licensing and PBR files using ArcMap 2012 and 2013 imagery, along with Google Earth imagery, to determine the extents of armoring along the shoreline.
    8. Using EVI and Rice data as indicators, manually inspect and digitize sections of the coastline that are armored. Classify the seaward shoreline type (beach, mudflat, channel, dune, etc.) and the armor type (wall or bulkhead). Bring in the HAT line and, using that and visual indicators, identify whether or not the armored sections are in contact with HAT, using Google Earth at the same time to help constrain areas. Merge the digitized armoring into Cumberland_York_Merged.
    9. Bring in the preliminary FEMA DFIRM data and use Intersect to assign the flood zones and elevations to the digitized armored sections, first for Cumberland and then for York County. Delete ancillary attributes as needed. The resulting layers are Cumberland_Structure_FloodZones and York_Structure_FloodZones.
    10. From the NOAA Digital Coast Data Layers, download the newest LiDAR data for York and Cumberland county beach, dune, and just-inland areas. This includes 2006 and newer topobathy data available from 2010 (entire coast) and selected areas from 2013 and 2014 (Wells, Scarborough, Kennebunk).
    11. Mosaic the 2006, 2010, 2013, and 2014 data (with the 2013 and 2014 data laid on top of the 2010 data), then mosaic this dataset into the sacobaydem_ftNAVD raster (from the MEGIS bare-earth model). This covers almost all of the study area except for armor along several areas in York, resulting in LidAR206_2010_2013_Mosaic.tif.
    12. Using the LiDAR data as a proxy, create a "seaward crest" line feature class, Dune_Crest, which follows the coast and extracts the approximate highest point (cliff, bank, dune) along the shoreline. This is used to extract LiDAR data for comparison with preliminary flood zone information.
    13. Using the added tool Points Along Line, create points at 5 m spacing along each of the armored shoreline feature lines and the dune crest lines. Call the outputs PointsonLines and PointsonDunes.
    14. Using Spatial Analyst, extract LiDAR elevations to the points using the 2006_2010_2013 mosaic first; call this LidarPointsonLines1. Select the points with NULL values and export them as LiDARPointsonLines2, then rerun Extract Values to Points using just the selected data and the state MEGIS DEM. Convert RASTERVALU to feet by multiplying by 3.2808 (renaming the field Elev_ft). Select by Attributes, find all remaining NULL values, and delete them from LiDARPointsonLines in an edit session. Then merge the two datasets and call the result LidarPointsonLines. Do the same with the dune lines to create LidarPointsonDunes.
    15. Use the Cumberland and York flood zone layers to intersect the points with the appropriate flood zone data, creating …CumbFIRM and …YorkFIRM files for the dunes and lines.
    16. Select the points from the Dunes feature class that fall within the X zone; these will NOT have an associated BFE for comparison with the LiDAR data. Export the dune points as Cumberland_York_Dunes_XZone. Run NEAR using the merged flood zone feature class (with only V, AE, and AO zones selected), then join the flood zone data to the feature class using FID (from the feature class) and OBJECTID (from the flood zone feature class). Export as Cumberland_York_Dunes_XZone_Flood. Delete ancillary columns, leaving the original FLD_ZONE (X), Elev_ft, NEAR_DIST (distance, in m, to the nearest flood zone), FLD_ZONE_1 (the near flood zone), and STATIC_BFE_1 (the nearest static BFE).
    17. Do the same as above with the Structures file (Cumberland_York_Structures_Lidar_DFIRM_Merged), but also select the features within the X zone and OPEN WATER. Export the points as Cumberland_York_Structures_XZone. Again run NEAR using the merged flood zone with only AE, VE, and AO zones selected, and export the file as Cumberland_York_Structures_XZone_Flood.
    18. Merge the above feature classes with the original feature classes and add a field BFE_ELEV_COMPARE. Select all features in a VE or AE flood zone and use Field Calculator to compute the difference between Elev_ft and the BFE (subtracting STATIC_BFE from Elev_ft); positive values mean the maximum crest elevation is higher than the BFE, negative values mean it is below the BFE. Then switch the selection and calculate the same value using NEAR_STATIC_BFE instead. Select by Attributes > FLD_ZONE = AO and enter the DEPTH value into the fields created above as negative values. Delete ancillary attribute fields, leaving those listed in the _FINAL feature classes described above the process steps section.
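The Points Along Line step above (sampling at 5 m spacing along each armor and dune crest line) can be approximated in plain Python for a polyline given as coordinate pairs in metres. This is a minimal sketch of the sampling logic, not the ArcGIS tool itself:

```python
import math

def points_along_line(vertices, spacing=5.0):
    """Return points every `spacing` metres along a polyline, including the
    start vertex (mirrors the PointsonLines/PointsonDunes generation step)."""
    pts = [vertices[0]]
    dist_to_next = spacing  # distance remaining until the next sample point
    for (x0, y0), (x1, y1) in zip(vertices, vertices[1:]):
        seg = math.hypot(x1 - x0, y1 - y0)
        pos = 0.0  # position of the last emitted sample within this segment
        while pos + dist_to_next <= seg:
            pos += dist_to_next
            t = pos / seg
            pts.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
            dist_to_next = spacing
        dist_to_next -= seg - pos  # carry the remainder into the next segment
    return pts

# A straight 12 m segment sampled every 5 m:
print(points_along_line([(0.0, 0.0), (12.0, 0.0)]))  # points at 0, 5 and 10 m
```

Carrying the remainder between segments keeps the 5 m spacing continuous across polyline vertices, which is what makes the sampled points comparable along the whole crest line.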

  18. Comparison of experimental data for extracting feature points.

    • figshare.com
    xls
    Updated Jun 3, 2023
    Cite
    Zhong Qu; Xue-Ming Wei; Si-Qi Chen (2023). Comparison of experimental data for extracting feature points. [Dataset]. http://doi.org/10.1371/journal.pone.0210354.t001
    Explore at:
    xls Available download formats
    Dataset updated
    Jun 3, 2023
    Dataset provided by
    PLOS ONE
    Authors
    Zhong Qu; Xue-Ming Wei; Si-Qi Chen
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Comparison of experimental data for extracting feature points.

  19. SRG Badenoch-Strathspey Inbye Survey 2006-2007 (extract) - Points

    • dtechtive.com
    • find.data.gov.scot
    csv, geojson, kml +1
    Updated Aug 2, 2023
    + more versions
    Cite
    NatureScot (2023). SRG Badenoch-Strathspey Inbye Survey 2006-2007 (extract) - Points [Dataset]. https://dtechtive.com/datasets/38452
    Explore at:
    csv(0.079 MB), kml(0.1226 MB), geojson(0.1062 MB), shp(0.0996 MB) Available download formats
    Dataset updated
    Aug 2, 2023
    Dataset provided by
    NatureScot
    Area covered
    United Kingdom
    Description

    Extract from the Badenoch and Strathspey Inbye Survey 2006/2007, providing central point locations for 815 fields found to contain species-rich grassland (SRG) habitats. These were subsequently reassessed in 2020/2021 using the most recent available aerial photography to establish the likelihood that SRG was still present and identify potential causes of loss or change.

  20. ALS LiDAR Point Clouds for Shoreline Extraction (Mrzeżyno) - demo data to S-LiNE Toolbox

    • zenodo.org
    zip
    Updated Apr 27, 2025
    + more versions
    Cite
    Jakub Śledziowski; Andrzej Giza; Paweł Terefenko (2025). ALS LiDAR Point Clouds for Shoreline Extraction (Mrzeżyno) - demo data to S-LiNE Toolbox [Dataset]. http://doi.org/10.5281/zenodo.15288488
    Explore at:
    zip Available download formats
    Dataset updated
    Apr 27, 2025
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Jakub Śledziowski; Andrzej Giza; Paweł Terefenko
    Area covered
    Mrzeżyno
    Description

    This dataset contains two ALS LiDAR point cloud files collected along the coastline of Mrzeżyno, Poland.
    Each file is provided in LAS format, corrected for geoid undulations and ready for shoreline extraction and coastal monitoring analysis.

    Usage notes:
    These files were selected from an aerial LiDAR survey as test data for shoreline detection using the S-LiNE Toolbox. They contain original data and are suitable for tasks such as shoreline detection, dune and beach morphology analysis, and shoreline change studies.

    Dataset contents:
    - 76021_1183277_N-33-67-D-a-1-3-2.las
    - 76021_1183276_N-33-67-D-a-1-3-1.las

    Data acquisition:
    - Coordinate Reference System: EPSG:2180 (ETRS89 / Poland CS92)
    - Acquisition mode: Airborne survey

    Data details:
    Emblem: N-33-67-D-a-1-3-1
    Date of update: 2022-04-16
    Format: LAZ (converted to LAS)
    Spatial characteristics: 4 p/m
    Average height error: 0.15
    Average position error: 0.30
    Horizontal coordinate system: PL-1992
    Vertical coordinate system: PL-EVRF2007-NH
    Archiving module: 1:2500
    Work report number: GI-FOTO.6202.10.2022
    Date of inclusion in PZGiK: 2022-10-16
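The geoid correction mentioned in the description (LAS heights referenced to PL-EVRF2007-NH rather than the ellipsoid) amounts to subtracting the local geoid undulation from each ellipsoidal height. A schematic sketch; the undulation value below is invented purely for illustration, not taken from this dataset:

```python
def orthometric_height(ellipsoidal_h, geoid_undulation):
    """H = h - N: convert an ellipsoidal height h (m) to an orthometric,
    geoid-referenced height H (m) given the geoid undulation N (m)."""
    return ellipsoidal_h - geoid_undulation

# With an illustrative undulation of ~34 m:
print(round(orthometric_height(40.2, 34.0), 1))  # 6.2
```

In practice the undulation N varies spatially and is interpolated from a geoid model over the point cloud's extent before being applied per point.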

Cite
Canadian Parks and Wilderness Society (2022). Geospatial Data Extraction Tool [Dataset]. https://data-with-cpaws-nl.hub.arcgis.com/documents/a5b5f96b86ac434083f84434b12a65a3

Geospatial Data Extraction Tool

Explore at:
16 scholarly articles cite this dataset.
Dataset updated
Mar 28, 2022
Dataset authored and provided by
Canadian Parks and Wilderness Society
Area covered
Description

Users are able to extract the following data: Digital Elevation Model (DEM), Shaded Relief, Color Relief, Color Shaded Relief, Slope map, Aspect map, and Point Data.

Pick an azimuth between 0 and 360 degrees: direction of the light source, measured in degrees clockwise from north. Pick an altitude between 0 and 90 degrees: vertical direction of the light source, from 0 (horizon) to 90 degrees (zenith). Enter a vertical exaggeration factor. Select the slope's measuring unit: degrees or percent slope.

Coordinate System Options: NAD83 CSRS (EPSG:4617), WGS 84 / Pseudo-Mercator (EPSG:3857), NAD83 / Canada Atlas Lambert (EPSG:3979). Data is stored in geographic coordinates (longitude and latitude); however, it can also be offered in a plane coordinate projection (X and Y) at the time of extraction. Definitions for the coordinate systems can be found in the metadata.

Select the DEM output format: OGC GeoPackage, ESRI File Geodatabase, or ESRI Shapefile. The source data (DEM or DSM) are available in GeoTIFF and Esri ASCII Grid formats. The GeoTIFF format specification can be obtained from https://www.pubdoc.org/fileformat/rasterimage/tiff/geotiff.pdf and https://geotiff.maptools.org/spec/geotiffhome.html; the Esri ASCII Grid format specification can be obtained from https://desktop.arcgis.com/en/arcmap/10.3/manage-data/raster-and-images/esri-ascii-raster-format.htm.

Select the Point Data output format: ASCII Gridded XYZ (.xyz) or ASCII Gridded CSV (.csv). The available Point Data formats are text CSV (.csv, comma-separated values) and text XYZ (.xyz, space-separated values); the format specification is the same for both (ASCII Gridded XYZ) and can be obtained from https://www.gdal.org/frmt_xyz.html.

Select the image resolution: 0.75, 1.5, 3, 6, or 12 arc seconds.

Email address (yourname@domain.com): when processing is complete, results will be delivered to the given email address. The email information that you provide on this site is collected in accordance with the federal Privacy Act; you will be notified once your request has been processed and is ready for delivery. Information about your privacy rights. The job status is listed and can be refreshed to see updates.

Automatic Extraction Data

The maximum extraction area is 50,000 km². Users are able to extract the following data: Buildings. Output Options: OGC GeoPackage, ESRI File Geodatabase, ESRI Shapefile. Coordinate System Options: NAD83 CSRS (EPSG:4617), WGS 84 / Pseudo-Mercator (EPSG:3857), NAD83 / Canada Atlas Lambert (EPSG:3979). Email address (yourname@domain.com): when processing is complete, results will be delivered to the given email address; the email information that you provide on this site is collected in accordance with the federal Privacy Act, and you will be notified once your request has been processed and is ready for delivery. The job status is listed and can be refreshed to see updates.
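The shaded-relief parameters described above (azimuth 0-360 degrees, altitude 0-90 degrees, vertical exaggeration) map onto the standard hillshade formula used by common GIS tools. A compact sketch for a single cell, assuming precomputed surface gradients; this illustrates the role of the parameters, not the extraction tool's actual implementation:

```python
import math

def hillshade(dz_dx, dz_dy, azimuth_deg=315.0, altitude_deg=45.0, z_factor=1.0):
    """Illumination (0-255) of a cell with surface gradients dz_dx, dz_dy,
    lit from azimuth_deg (clockwise from north) at altitude_deg above the
    horizon; z_factor is the vertical exaggeration."""
    zenith = math.radians(90.0 - altitude_deg)
    azimuth = math.radians(360.0 - azimuth_deg + 90.0)  # to math convention
    slope = math.atan(z_factor * math.hypot(dz_dx, dz_dy))
    aspect = math.atan2(dz_dy, -dz_dx)
    shade = (math.cos(zenith) * math.cos(slope)
             + math.sin(zenith) * math.sin(slope) * math.cos(azimuth - aspect))
    return max(0.0, 255.0 * shade)

# Flat ground under a 45-degree sun receives cos(45 deg) of full illumination:
print(round(hillshade(0.0, 0.0)))  # 180
```

Raising the altitude toward 90 degrees (zenith) drives flat cells toward the full value of 255, while the vertical exaggeration factor steepens apparent slopes and deepens the shading contrast.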
