This dashboard depicts the status of NG9-1-1 data submitted to Datmark VEP, a data validation software solution.
Data producers or those who maintain park and trail data can use this tool to validate their data against the MetroGIS Park and Trail data standard. The validations within the tool were created to support the Metro Park and Trail datasets (see associated datasets below).
MetroGIS Park and Trail Data Page
https://www.metrogis.org/projects/park-and-trail.aspx
Specific validation information and tool requirements can be found in the following documents included within this resource.
Readme_HowTo.pdf
Readme_Validations.pdf
Our location data powers the most advanced address validation solutions for enterprise backend and frontend systems.
A global, standardized, self-hosted location dataset containing all administrative divisions, cities, and zip codes for 247 countries.
All geospatial data for address data validation is updated weekly to maintain the highest data quality, including challenging countries such as China, Brazil, Russia, and the United Kingdom.
Use cases for the Address Validation at Zip Code Level Database (Geospatial data)
Address capture and address validation
Address autocomplete
Address verification
Reporting and Business Intelligence (BI)
Master Data Management
Logistics and Supply Chain Management
Sales and Marketing
Product Features
Dedicated features to deliver best-in-class user experience
Multi-language support including address names in local and foreign languages
Comprehensive city definitions across countries
Data export methodology
Our location data packages are offered in various formats, including .csv. All geospatial data for address validation are optimized for seamless integration with popular systems like Esri ArcGIS, Snowflake, QGIS, and more.
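As an illustration only, a delivered .csv can be loaded directly into an analysis environment for simple validation lookups. The sketch below assumes a hypothetical file name and column layout (zip_code, city_name, country_code), not the actual package schema.

```python
# Minimal sketch: loading a zip-code-level location CSV into pandas for a
# simple validation lookup. File name and column names are hypothetical;
# replace them with the columns delivered in your package.
import pandas as pd

# One row per zip code / locality combination
locations = pd.read_csv("zip_code_locations.csv", dtype={"zip_code": str})

def validate_zip(zip_code: str, city: str, country_code: str) -> bool:
    """Return True if the zip code exists and matches the given city and country."""
    match = locations[
        (locations["zip_code"] == zip_code)
        & (locations["country_code"] == country_code.upper())
    ]
    if match.empty:
        return False
    # Case-insensitive comparison against the listed city names
    return city.strip().lower() in match["city_name"].str.lower().values

# Example usage
print(validate_zip("10115", "Berlin", "de"))
```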
Why do companies choose our location databases
Enterprise-grade service
Full control over security, speed, and latency
Reduce integration time and cost by 30%
Weekly updates for the highest quality
Seamlessly integrated into your software
Note: Custom address validation packages are available. Please submit a request via the above contact button for more details.
Data producers or those who maintain parcel data can use this tool to validate their data against the state Geospatial Advisory Committee (GAC) Parcel Data Standard. The validations within the tool were originally created as part of a MetroGIS Regional Parcel Dataset workflow.
Counties using this tool can obtain a schema geodatabase from the Parcel Data Standard page hosted by MnGeo (link below). All counties, cities, or those maintaining authoritative data on a local jurisdiction's behalf are encouraged to use and modify the tool as needed to support local workflows.
Parcel Data Standard Page
http://www.mngeo.state.mn.us/committee/standards/parcel_attrib/parcel_attrib.html
Specific validation information and tool requirements can be found in the following documents included within this resource.
Readme_HowTo.pdf
Readme_Validations.pdf
This document explains the purpose, logic, and results of Minnesota’s Next Generation 9-1-1 (NG9-1-1) Master Street Address Guide (MSAG) Validation found on the Minnesota NG9-1-1 GIS Data Validation and Aggregation Portal.
Attribution 4.0 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
BIEN data validation and standardization tools.
Data producers or those who maintain address points can use this tool to validate their data against the state Geospatial Advisory Council (GAC) Address Point Standard. The validations within the tool were originally created as part of a workflow for counties under the Metropolitan Emergency Services Board (MESB) jurisdiction to pre-check their address points before submitting data to the aggregated Metro Address Point Dataset. The primary driver behind developing the validations was to support 911 dispatch workflows administered by the MESB.
Counties using this tool can obtain a schema geodatabase from MnGeo (link below). All counties, cities, or those maintaining authoritative data on a local jurisdiction's behalf are encouraged to use and modify the tool as needed to support local workflows.
Datasets Supported:
Address Points
Road Centerlines
Standards Page
https://www.mngeo.state.mn.us/committee/standards/standards_adopted_devel.html
Specific validation information and tool requirements can be found in the following documents included within this resource.
Readme_HowTo.pdf
Readme_Validations.pdf
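To illustrate the kind of attribute checks a validation tool like this performs, here is a minimal sketch. The field names, allowed values, and CSV input format are hypothetical placeholders, not the fields defined in the GAC Address Point Standard or the logic documented in the readmes above.

```python
# Minimal sketch of attribute validation against a hypothetical address point schema.
import csv

REQUIRED_FIELDS = ["ADD_NUM", "ST_NAME", "ST_POSTYP", "POSTCOMM", "ZIP"]
ALLOWED_STREET_TYPES = {"AVE", "BLVD", "CT", "DR", "LN", "RD", "ST", "WAY"}

def validate_address_points(path):
    """Yield (row_number, message) tuples for every rule violation found."""
    with open(path, newline="") as f:
        for i, row in enumerate(csv.DictReader(f), start=1):
            for field in REQUIRED_FIELDS:
                if not row.get(field, "").strip():
                    yield i, f"missing required value in {field}"
            if row.get("ADD_NUM") and not row["ADD_NUM"].isdigit():
                yield i, "ADD_NUM is not numeric"
            if row.get("ST_POSTYP") and row["ST_POSTYP"] not in ALLOWED_STREET_TYPES:
                yield i, f"unrecognized street type {row['ST_POSTYP']!r}"
            if row.get("ZIP") and len(row["ZIP"]) != 5:
                yield i, "ZIP is not 5 digits"

for rownum, message in validate_address_points("address_points.csv"):
    print(f"row {rownum}: {message}")
```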
The files linked to this reference are the geospatial data created as part of the completion of the baseline vegetation inventory project for the NPS park unit. The current format is an ArcGIS file geodatabase, but older formats may exist as shapefiles. Our final map product is a geographic information system (GIS) database of vegetation structure and composition across the Crater Lake National Park terrestrial landscape, including wetlands. The database includes photos we took at all relevé, validation, and accuracy assessment plots, as well as the plots that were done in the previous wetlands inventory. We conducted an accuracy assessment of the map by evaluating 698 stratified random accuracy assessment plots throughout the project area. We intersected these field data with the vegetation map, resulting in an overall thematic accuracy of 86.2%. The accuracy of the Cliff, Scree & Rock Vegetation map unit was difficult to assess, as only 9% of this vegetation type was available for sampling due to lack of access. In addition, fires that occurred during the 2017 accuracy assessment field season affected our sample design and may have had a small influence on the accuracy. Our geodatabase contains the locations where particular associations are found at 600 relevé plots, 698 accuracy assessment plots, and 803 validation plots.
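As a minimal sketch of how an overall thematic accuracy figure like the 86.2% above can be computed, the example below intersects accuracy assessment plots with a vegetation map and compares observed versus mapped classes. The layer and column names are hypothetical, not the actual geodatabase schema.

```python
# Minimal sketch, assuming geopandas: join AA plots to the vegetation map and
# compute the fraction of plots whose observed class matches the mapped class.
import geopandas as gpd

plots = gpd.read_file("aa_plots.shp")          # field-observed class in "obs_class"
vegmap = gpd.read_file("vegetation_map.shp")   # mapped class in "map_class"

# Attach the mapped class to each plot by spatial intersection
joined = gpd.sjoin(plots, vegmap[["map_class", "geometry"]],
                   how="inner", predicate="intersects")

overall_accuracy = (joined["obs_class"] == joined["map_class"]).mean()
print(f"Overall thematic accuracy: {overall_accuracy:.1%}")
```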
Links to recordings of the Integrated Services Program and 9-1-1 & Geospatial Services Bureau webinar series, including NG9-1-1 GIS topics such as: data preparation; data provisioning and maintenance; boundary best practices; and extract, transform, and load (ETL). Offerings include:
Topic: Virginia Next Generation 9-1-1 Dashboard and Resources Update
Description: Virginia recently updated the NG9-1-1 Dashboard with some new tabs and information sources and continues to develop new resources to assist the GIS data work. This webinar provides an overview of changes, a demonstration of new functionality, and a guide to finding and using new resources that will benefit Virginia public safety and GIS personnel with roles in their NG9-1-1 projects. Wednesday, 16 June 2021. Recording available at: https://vimeo.com/566133775
Topic: Emergency Service Boundary GIS Data Layers and Functions in your NG9-1-1 PSAP
Description: Law, Fire, and Emergency Medical Service (EMS) Emergency Service Boundary (ESB) polygons are required elements of the NENA NG9-1-1 GIS data model stack that indicate which agency is responsible for primary response. While this requirement must be met in your Virginia NG9-1-1 deployment with AT&T and Intrado, there are quite a few ways you could choose to implement these polygons. PSAPs and their GIS support must work together to understand how this information will come into an NG9-1-1 i3 PSAP and how it will replace traditional ESN information in order to make good choices while implementing these layers. This webinar discusses: the function of ESNs in your legacy 9-1-1 environment, the role of ESBs in NG9-1-1, and how ESB information appears in your NG9-1-1 PSAP. Wednesday, 22 July 2020. Recording available at: https://vimeo.com/441073056#t=360s
Topic: "The GIS Folks Handle That": What PSAP Professionals Need to Know about the GIS Project Phase of Next Generation 9-1-1 Deployment
Description: Next Generation 9-1-1 (NG9-1-1) brings together the worlds of emergency communication and spatial data and mapping. While it may be tempting for PSAPs to outsource cares and concerns about road centerlines and GIS data provisioning to 'the GIS folks', GIS staff are crucial to the future of emergency call routing and location validation. Data required by NG9-1-1 usually builds on data that GIS staff already know and use for other purposes, so the transition requires them to learn more about PSAP operations and uses of core data. The goal of this webinar is to help the PSAP and GIS worlds come together by explaining the role of the GIS Project in the Virginia NG9-1-1 Deployment Steps, exploring how GIS professionals view NG9-1-1 deployment as a project, and fostering a mutual understanding of how GIS will drive NG9-1-1. 29 January 2020. Recording available at: https://vimeo.com/showcase/9791882/video/761225474
Topic: Getting Your GIS Data from Here to There: Processes and Best Practices for Extract, Transform and Load (ETL)
Description: During the fall of 2019, VITA-ISP staff delivered workshops on "Tools and Techniques for Managing the Growing Role of GIS in Enterprise Software." This session presents information from the workshops as a webinar, covering the process of extracting, transforming, and loading data (ETL), best practices for ETL, and methods for data schema comparison and field mapping. These techniques and skills assist GIS staff with their growing role in Next Generation 9-1-1 but also apply to many other projects involving the integration and maintenance of GIS data. 19 February 2020. Recording available at: https://vimeo.com/showcase/9791882/video/761225007
Topic: NG9-1-1 GIS Data Provisioning and Maintenance
Description: VITA ISP is pleased to announce an upcoming webinar about the NG9-1-1 GIS Data Provisioning and Maintenance document provided by Judy Doldorf, GISP, with the Fairfax County Department of Information Technology and RAC member. This document was developed by members of the NG9-1-1 GIS workgroup within the VITA Regional Advisory Council (RAC) and is intended to provide guidance to local GIS and PSAP authorities on the GIS datasets and associated GIS to MSAG/ALI validation and synchronization required for NG9-1-1 services. The document also provides guidance on geospatial call routing readiness and the short- and long-term GIS data maintenance workflow procedures. In addition, some perspective and insight from the Fairfax County experience in GIS data preparation for the AT&T and West solution will be discussed in this webinar. 31 July 2019. Recording available at: https://vimeo.com/showcase/9791882/video/761224774
Topic: NG9-1-1 Deployment Dashboard
Description: I invite you to join us for a webinar that will provide an overview of our NG9-1-1 Deployment Dashboard and information about other online ISP resources. The ISP website has long been criticized for being difficult to use and to find information on. The addition of the Dashboard and other changes to the website are our attempt to address some of these concerns and provide an easier way to find information, especially as we undertake NG9-1-1 deployment. The Dashboard includes a status map of all Virginia PSAPs as it relates to the deployment of NG9-1-1, including the total amount of funding requested by the localities and awards approved by the 9-1-1 Services Board. During this webinar, Lyle Hornbaker, Regional Coordinator for Region 5, will navigate through the dashboard and provide tips on how to more effectively utilize the ISP website. 12 June 2019. Recording not currently available. Please see the Virginia Next Generation 9-1-1 Dashboard and Resources Update webinar recording from 16 June 2021.
Topic: PSAP Boundary Development Tools and Process Recommendation
Description: This webinar will be presented by Geospatial Program Manager Matt Gerike and VGIN Coordinator Joe Sewash. With the release of the PSAP boundary development tools and PSAP boundary segment compilation guidelines on the VGIN Clearinghouse in March, this webinar demonstrates the development tools, explains the process model, and discusses methods, tools, and resources available for you as you work to complete PSAP boundary segments with your neighbors. 15 May 2019. Recording available at: https://www.youtube.com/watch?v=kI-1DkUQF9Q&feature=youtu.be
Topic: NG9-1-1 Data Preparation - Utilizing VITA's GIS Data Report Card Tool
Description: This webinar, presented by VGIN Coordinator Joe Sewash, Geospatial Program Manager Matt Gerike, and Geospatial Analyst Kenny Brevard, will provide an overview of the first version of the tools that were released on March 25, 2019. These tools will allow localities to validate their GIS data against the report card rules, the MSAG and ALI checks used in previous report cards, and the analysis listed in the NG9-1-1 migration proposal document. We will also discuss the purpose of the tools, input requirements, initial configuration, how to run them, and how to make sense of your results. 10 April 2019. Recording available at: https://vimeo.com/showcase/9791882/video/761224495
Topic: NG9-1-1 PSAP Boundary Best Practice Webinar
Description: During the months of November and December, VITA ISP staff hosted regional training sessions about best practices for PSAP boundaries as they relate to NG9-1-1. These sessions were well attended and very interactive, therefore we feel the need to do a recap and allow those who may have missed the training to attend a makeup session. 30 January 2019. Recording not currently available. Please see the PSAP Boundary Development Tools and Process Recommendation webinar recording from 15 May 2019.
Topic: NG9-1-1 GIS Overview for Contractors
Description: The Commonwealth of Virginia has started its migration to Next Generation 9-1-1 (NG9-1-1). This migration means that there will be a much greater reliance on geographic information systems (GIS) to locate and route 9-1-1 calls. VITA ISP has conducted an assessment of current local GIS data and provided each locality with a report. Some of the data from this report has also been included in the localities' migration proposals, which identify what data issues need to be resolved before a locality can migrate to NG9-1-1. Several localities in Virginia utilize a contractor to maintain their GIS data. This webinar is intended for those contractors to review the data in the report, what is included in the migration proposal, and how they may be called on to assist the localities they serve. It will still ultimately be up to each locality to determine whether they engage a contractor for assistance, but it is important for the contractor community to understand what is happening and have an opportunity to ask questions about the intent and goals. This webinar will provide such an opportunity. 22 August 2018. Recording not currently available. Please contact us at NG911GIS@vdem.virginia.gov if you are interested in this content.
This document explains the purpose, logic, and results of Minnesota’s Next Generation 9-1-1 (NG9-1-1) Community Name Validation found on the Minnesota NG9-1-1 GIS Data Validation and Aggregation Portal.
This document provides an overview of the provisioning of GIS data to support NG9-1-1 services. It is intended to provide guidance to local GIS and PSAP authorities on the following:
The required GIS datasets to support the i3 Emergency Call Routing Function (ECRF) and Location Validation Function (LVF)
The validation processes to synchronize the GIS datasets to the Master Street Address Guide (MSAG) and Automatic Location Information (ALI) datasets
Geospatial call routing readiness
The short-term and long-term NG9-1-1 GIS data maintenance workflow procedures
Additional resources and recommendations on GIS-related topics are available on the VGIN 9-1-1 & GIS page.
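As a minimal sketch of one GIS-to-MSAG synchronization check of the kind described above, the example below confirms that each MSAG record's street name, community, and address range is covered by a road centerline segment. Field names and flat-file inputs are hypothetical; production tools work against the authoritative GIS layers and MSAG/ALI extracts.

```python
# Minimal sketch of a GIS-to-MSAG coverage check with hypothetical field names.
import csv

def load_rows(path):
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

centerlines = load_rows("road_centerlines.csv")  # ST_NAME, FROM_ADDR, TO_ADDR, MSAG_COMM
msag = load_rows("msag_records.csv")             # STREET, LOW, HIGH, COMMUNITY

def covered(record):
    """True if some centerline segment covers the MSAG street, community, and range."""
    lo, hi = int(record["LOW"]), int(record["HIGH"])
    for seg in centerlines:
        if (seg["ST_NAME"].upper() == record["STREET"].upper()
                and seg["MSAG_COMM"].upper() == record["COMMUNITY"].upper()
                and int(seg["FROM_ADDR"]) <= lo and int(seg["TO_ADDR"]) >= hi):
            return True
    return False

for rec in msag:
    if not covered(rec):
        print(f"No centerline coverage for {rec['STREET']} {rec['LOW']}-{rec['HIGH']} ({rec['COMMUNITY']})")
```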
Attribution 4.0 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Mapping of environmental variables often relies on map accuracy assessment through cross-validation with the data used for calibrating the underlying mapping model. When the data points are spatially clustered, conventional cross-validation leads to optimistically biased estimates of map accuracy. Several papers have promoted spatial cross-validation as a means to tackle this over-optimism. Many of these papers blame spatial autocorrelation as the cause of the bias and propagate the widespread misconception that spatial proximity of calibration points to validation points invalidates classical statistical validation of maps. In the paper related to these data, we present and evaluate alternative cross-validation approaches for assessing map accuracy from clustered sample data.
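To make the contrast concrete, the sketch below compares conventional K-fold cross-validation with a spatial (cluster-held-out) cross-validation on the same clustered sample. The data generation and variable names are hypothetical, and the related paper evaluates further alternatives beyond this simple comparison.

```python
# Minimal sketch: conventional K-fold CV vs. cluster-held-out CV on clustered data.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import KFold, GroupKFold, cross_val_score

rng = np.random.default_rng(42)
n_clusters, per_cluster = 20, 25
cluster_id = np.repeat(np.arange(n_clusters), per_cluster)
# Clustered sample: covariate values are similar within a cluster
x = rng.normal(size=n_clusters)[cluster_id] + rng.normal(scale=0.2, size=n_clusters * per_cluster)
y = 2.0 * x + rng.normal(scale=0.5, size=x.size)
X = x.reshape(-1, 1)

model = RandomForestRegressor(n_estimators=200, random_state=0)
conventional = cross_val_score(model, X, y, cv=KFold(n_splits=5, shuffle=True, random_state=0), scoring="r2")
spatial = cross_val_score(model, X, y, cv=GroupKFold(n_splits=5), groups=cluster_id, scoring="r2")

print(f"Conventional K-fold R2: {conventional.mean():.2f}")
print(f"Cluster-held-out R2:    {spatial.mean():.2f}")
```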
The study area is western Europe, constrained in the north at 52° latitude and between -10° and 24° longitude. The projection is IGNF:ETRS89LAEA (Lambert azimuthal equal area projection).
Files:
agb.tif = above ground biomass (AGB) map from version 3 of the 2017 CCI-Biomass product (https://catalogue.ceda.ac.uk/uuid/5f331c418e9f4935b8eb1b836f8a91b8)
AGBstack.tif = covariates used for predicting AGB
aggArea.tif = coarse grid used for simulation in the model-based methods
ocs.tif = soil organic carbon stock (OCS) map (0-30 cm) from Soilgrids (https://www.isric.org/explore/soilgrids)
OCSstack.tif = covariates used for predicting OCS
strata.xxx = 100 compact geo-strata (ESRI shape) created with the spcosa package; used for generating clustered samples
TOTmask.tif = mask of the area covered by the covariates
Details and data sources of the covariates in AGBstack.tif and OCSstack.tif:
Name | Description | Source | Note
ai | Aridity Index | https://chelsa-climate.org/downloads/ | Version 2.1
bio1 | Mean annual air temperature [°C] | https://chelsa-climate.org/downloads/ | Version 2.1
bio5 | Mean daily maximum air temperature of the warmest month [°C] | https://chelsa-climate.org/downloads/ | Version 2.1
bio7 | Annual range of air temperature [°C] | https://chelsa-climate.org/downloads/ | Version 2.1
bio12 | Annual precipitation [kg/m2] | https://chelsa-climate.org/downloads/ | Version 2.1
bio15 | Precipitation seasonality [kg/m2] | https://chelsa-climate.org/downloads/ | Version 2.1
gdd10 | Growing degree days heat sum above 10°C | https://chelsa-climate.org/downloads/ | Version 2.1
clay | Clay content [g/kg] of the 0-5cm layer | | Only used for AGB
sand | Sand content [g/kg] of the 0-5cm layer | https://soilgrids.org/ | as above
pH | Acidity (pH(water)) of the 0-5cm layer | https://soilgrids.org/ | as above
glc2017 | Landcover 2017 | https://land.copernicus.eu/global/products/lc, reclassified to: closed forest, open forest, natural non-forest veg., bare & sparse veg., cropland, built-up, water | Categorical variable
dem | Elevation | https://www.eea.europa.eu/data-and-maps/data/copernicus-land-monitoring-service-eu-dem |
cosasp | Cosine of slope aspect | Computed with the terra package from elevation | Computed at 25m resolution; next aggregated to 0.5km
sinasp | Sine of slope aspect | Computed with the terra package from elevation | as above
slope | Slope | Computed with the terra package from elevation | as above
TPI | Topographic position index | Computed with the terra package from elevation | as above
TRI | Terrain ruggedness index | Computed with the terra package from elevation | as above
TWI | Topographic wetness index | Computed with SAGA from 500m resolution (aggregated) dem |
gedi | Forest height | https://glad.umd.edu/dataset/gedi | Zone: NAFR
xcoord | X coordinate | Using a mask created from the other covariates |
ycoord | Y coordinate | Using a mask created from the other covariates |
Dcoast | Distance from coast | Using a land mask created from the other covariates |
GIS quality control checks are intended to identify issues in the source data that may impact a variety of 9-1-1 end use systems. The primary goal of the initial CalOES NG9-1-1 implementation is to facilitate 9-1-1 call routing. The secondary goal is to use the data for telephone record validation through the LVF and the GIS-derived MSAG. With these goals in mind, the GIS QC checks, and the impact of errors found by them, are categorized as follows in this document:
Provisioning Failure Errors: GIS data issues resulting in ingest failures (results in no provisioning of one or more layers)
Tier 1 Critical errors: Impact on initial 9-1-1 call routing and discrepancy reporting
Tier 2 Critical errors: Transition to GIS-derived MSAG
Tier 3 Warning-level errors: Impact on routing of call transfers
Tier 4 Other errors: Impact on PSAP mapping and CAD systems
GeoComm's GIS Data Hub is configurable to stop GIS data that exceeds certain quality control check error thresholds from provisioning to the SI (Spatial Interface) and ultimately to the ECRFs, LVFs, and the GIS-derived MSAG.
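The sketch below illustrates how such a tiered categorization could be expressed in a QC configuration: each check is assigned a tier, and data exceeding an error threshold at a blocking tier is held back from provisioning. The check names, tiers treated as blocking, and thresholds are hypothetical, not actual GeoComm GIS Data Hub configuration.

```python
# Minimal sketch of tiered QC checks gating provisioning, with hypothetical values.
from dataclasses import dataclass

@dataclass
class QCCheck:
    name: str
    tier: str          # "provisioning_failure", "tier1", "tier2", "tier3", "tier4"
    error_rate: float  # fraction of features failing the check
    threshold: float   # maximum error rate allowed before provisioning is stopped

checks = [
    QCCheck("schema_field_missing", "provisioning_failure", 0.00, 0.00),
    QCCheck("road_centerline_range_overlap", "tier1", 0.003, 0.01),
    QCCheck("msag_community_mismatch", "tier2", 0.02, 0.05),
    QCCheck("esb_gap_or_overlap", "tier3", 0.00, 0.02),
    QCCheck("label_formatting", "tier4", 0.10, 1.00),
]

BLOCKING_TIERS = {"provisioning_failure", "tier1"}

blockers = [c for c in checks if c.tier in BLOCKING_TIERS and c.error_rate > c.threshold]
if blockers:
    print("Provisioning stopped by:", ", ".join(c.name for c in blockers))
else:
    print("Data provisioned to the SI and on to the ECRFs, LVFs, and GIS-derived MSAG.")
```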
Attribution 4.0 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
This project consists of 11 files: 1) a zipped folder with a geodatabase containing seven raster files and two shapefiles, 2) a zipped folder containing the same layers found in the geodatabase, but as standalone files, 3) 9 .xml files containing the metadata for the spatial datasets in the zipped folders. These datasets were generated in ArcPro 3.0.3 (ESRI). Six raster files (drainaged, geology, nlcd, precipitation, slope, solitexture) present spatially distributed information, ranked according to the relative importance of each class for groundwater recharge. The scale used for these datasets is 1-9, where low scale values are assigned to datasets with low relative importance for groundwater recharge, while high scale values are assigned to datasets with high relative importance for groundwater recharge. The seventh raster file contains the groundwater recharge potential map for the Anchor River Watershed. This map was calculated using the six raster datasets mentioned previously. Here, the values assigned represent Very Low to Very High groundwater recharge potential (scale 1-5, 1 being Very Low and 5 being Very High). Finally, the two shapefiles represent the groundwater wells and the polygons used for model validation. This data is part of the manuscript titled: Mapping Groundwater Recharge Potential in High Latitude Landscapes using Public Data, Remote Sensing, and Analytic Hierarchy Process, published in the journal Remote Sensing.
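As a minimal sketch of the weighted-overlay step behind a recharge potential map like this one, the example below combines ranked input rasters (1-9) with weights and reclassifies the result to a 1-5 potential scale. The file names, weights, and class breaks are hypothetical placeholders; the published map uses AHP-derived weights and the project's own layers.

```python
# Minimal sketch, assuming rasterio and numpy, of a weighted raster overlay.
import numpy as np
import rasterio

layers = {"slope.tif": 0.25, "geology.tif": 0.25, "soiltexture.tif": 0.20,
          "drainage.tif": 0.15, "nlcd.tif": 0.10, "precipitation.tif": 0.05}

weighted_sum = None
profile = None
for path, weight in layers.items():
    with rasterio.open(path) as src:
        data = src.read(1).astype("float32")
        profile = profile or src.profile
    weighted_sum = data * weight if weighted_sum is None else weighted_sum + data * weight

# Reclassify the weighted 1-9 score into five classes (1 = Very Low ... 5 = Very High)
bins = np.array([2.6, 4.2, 5.8, 7.4])  # hypothetical class breaks
potential = np.digitize(weighted_sum, bins) + 1

profile.update(dtype="int32", count=1)
with rasterio.open("recharge_potential.tif", "w", **profile) as dst:
    dst.write(potential.astype("int32"), 1)
```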
Racial identification is a critical factor in understanding a multitude of important outcomes in many fields. However, inferring an individual’s race from ecological data is prone to bias and error. This process was only recently improved via Bayesian Improved Surname Geocoding (BISG). With surname and geographic-based demographic data, it is possible to more accurately estimate individual racial identification than ever before. However, the level of geography used in this process varies widely. Whereas some existing work makes use of geocoding to place individuals in precise census blocks, a substantial portion either skips geocoding altogether or relies on estimation using surname or county-level analyses. Presently, the tradeoffs of such variation are unknown. In this letter we quantify those tradeoffs through a validation of BISG on Georgia’s voter file using both geocoded and non-geocoded processes and introduce a new level of geography--ZIP codes--to this method. We find that when estimating the racial identification of White and Black voters, non-geocoded ZIP code-based estimates are acceptable alternatives. However, census blocks provide the most accurate estimations when imputing racial identification for Asian and Hispanic voters. Our results document the most efficient means to sequentially conduct BISG analysis to maximize racial identification estimation while simultaneously minimizing data missingness and bias.
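The sketch below shows the Bayesian updating step at the core of BISG: a surname-based prior P(race | surname) is combined with the racial composition of the voter's geographic unit (block, ZIP code, or county) via Bayes' rule. The probabilities are illustrative placeholders, not Census surname or geography tables.

```python
# Minimal sketch of the BISG posterior: P(race | surname, geo) ∝ P(race | surname) * P(geo | race).
RACES = ["white", "black", "hispanic", "asian", "other"]

def bisg_posterior(p_race_given_surname, p_geo_given_race):
    """Combine the surname prior with the geographic likelihood and normalize."""
    unnormalized = {r: p_race_given_surname[r] * p_geo_given_race[r] for r in RACES}
    total = sum(unnormalized.values())
    return {r: v / total for r, v in unnormalized.items()}

# Illustrative inputs: surname prior and the share of each race's population
# living in the voter's geographic unit.
surname_prior = {"white": 0.60, "black": 0.25, "hispanic": 0.08, "asian": 0.04, "other": 0.03}
geo_likelihood = {"white": 0.0005, "black": 0.0030, "hispanic": 0.0010, "asian": 0.0008, "other": 0.0009}

posterior = bisg_posterior(surname_prior, geo_likelihood)
print(max(posterior, key=posterior.get), posterior)
```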
MIT License https://opensource.org/licenses/MIT
License information was derived automatically
This layer contains the most current locations and up-to-date non-spatial attributes for Kentucky's Public, K-12 Schools. The Front Door project implemented by the Kentucky Department of Education allowed school teams to capture the geographic coordinates digitally using The Commonwealth Map. Following data validation, the Kentucky Division of Geographic Information utilizes this layer in the mapping services it provides to the communities. Download: Ky Open GIS Data
Attribution 4.0 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
This is the static test data from the study "Global Geolocated Realtime Data of Interfleet Urban Transit Bus Idling" collected by GRD-TRT-BUF-4I. Updated versions are available here.
test-data-a.csv was collected from December 31, 2023 00:01:30 UTC to January 1, 2024 00:01:30 UTC.
test-data-b.csv was collected from January 4, 2024 01:30:30 UTC to January 5, 2024 01:30:30 UTC.
test-data-c.csv was collected from January 10, 2024 16:05:30 UTC to January 11, 2024 16:05:30 UTC.
test-data-d.csv was collected from January 15, 2024 22:30:21 UTC to January 16, 2024 22:30:17 UTC.
test-data-e.csv was collected from February 16, 2024 22:30:21 UTC to February 17, 2024 22:30:20 UTC.
test-data-f.csv was collected from February 21, 2024 22:30:21 UTC to February 22, 2024 22:30:20 UTC.
GIS (Geographic Information System) data, which includes spatial data such as maps, satellite imagery, and other geospatial data, is typically created using various techniques and methods to ensure its accuracy, completeness, and reliability. The process of creating GIS data for use in metadata involves several key steps, which may include:
Data Collection: The first step in creating GIS data for metadata is data collection. This may involve gathering data from various sources, such as field surveys, remote sensing, aerial photography, or existing datasets. Data can be collected using GPS (Global Positioning System) receivers, satellite imagery, LiDAR (Light Detection and Ranging) technology, or other data acquisition methods.
Data Validation and Quality Control: Once data is collected, it goes through validation and quality control processes to ensure its accuracy and reliability. This may involve comparing data against known standards or specifications, checking for data errors or inconsistencies, and validating data attributes to ensure they meet the desired accuracy requirements.
Data Processing and Analysis: After validation and quality control, data may be processed and analyzed to create meaningful information. This may involve data integration, data transformation, spatial analysis, and other geoprocessing techniques to derive new datasets or generate metadata.
Metadata Creation: Metadata, which is descriptive information about the GIS data, is created based on established standards or guidelines. This may include information such as data source, data quality, data format, spatial extent, projection information, and other relevant details that provide context and documentation about the GIS data.
Metadata Documentation: Once metadata is created, it needs to be documented in a standardized format. This may involve using metadata standards such as ISO 19115, FGDC (Federal Geographic Data Committee), or other industry-specific standards. Metadata documentation typically includes information about the data source, data lineage, data quality, spatial reference system, attributes, and other relevant information that describes the GIS data and its characteristics.
Data Publishing: Finally, GIS data and its associated metadata may be published or made accessible to users through various means, such as online data portals, web services, or other data dissemination methods. Metadata is often used to facilitate data discovery, evaluation, and use, providing users with the necessary information to understand and utilize the GIS data effectively.
Overall, the process of creating GIS data for use in metadata involves data collection, validation, processing, analysis, metadata creation, documentation, and data publishing, following established standards or guidelines to ensure accuracy, reliability, and interoperability of the GIS data.
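To illustrate the metadata creation and documentation steps described above, the sketch below assembles the core descriptive fields for a GIS dataset as a structured record and writes it out. The element names loosely follow FGDC/ISO 19115 concepts but are simplified placeholders, not a complete standard-compliant schema.

```python
# Minimal sketch: assembling a simplified metadata record and writing it to JSON.
import json
from datetime import date

metadata = {
    "title": "Example Road Centerlines",
    "abstract": "Road centerlines collected by field survey and GPS, validated against the state standard.",
    "data_source": "County GIS field survey",
    "lineage": ["GPS collection 2023", "attribute validation 2024", "topology cleanup 2024"],
    "spatial_reference": "EPSG:26915",
    "spatial_extent": {"west": -94.0, "east": -93.0, "south": 44.5, "north": 45.5},
    "data_quality": {"positional_accuracy_m": 1.0, "completeness_pct": 98.5},
    "format": "Esri file geodatabase",
    "publication_date": date.today().isoformat(),
}

with open("example_metadata.json", "w") as f:
    json.dump(metadata, f, indent=2)
```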
This document explains the purpose, logic, and results of Minnesota’s Next Generation 9-1-1 (NG9-1-1) ESN Validation found on the Minnesota NG9-1-1 GIS Data Validation and Aggregation Portal.
Attribution 4.0 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
The GRASS GIS database containing the input raster layers needed to reproduce the results from the manuscript entitled:
"Mapping forests with different levels of naturalness using machine learning and landscape data mining" (under review)
Abstract:
To conserve biodiversity, it is imperative to maintain and restore sufficient amounts of functional habitat networks. Hence, locating remaining forests with natural structures and processes over landscapes and large regions is a key task. We integrated machine learning (Random Forest) and wall-to-wall open landscape data to scan all forest landscapes in Sweden with a 1 ha spatial resolution with respect to the relative likelihood of hosting High Conservation Value Forests (HCVF). Using independent spatial stand- and plot-level validation data we confirmed that our predictions (ROC AUC in the range of 0.89 - 0.90) correctly represent forests with different levels of naturalness, from deteriorated to those with high and associated biodiversity conservation values. Given ambitious national and international conservation objectives, and increasingly intensive forestry, our model and the resulting wall-to-wall mapping fills an urgent gap for assessing fulfilment of evidence-based conservation targets, spatial planning, and designing forest landscape restoration.
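As a minimal sketch of the modelling step summarized above, the example below trains a Random Forest classifier to predict the relative likelihood that a 1 ha cell hosts High Conservation Value Forest and evaluates it with ROC AUC on held-out data. The feature matrix and labels are synthetic stand-ins, not the landscape covariates or validation data used in the study.

```python
# Minimal sketch: Random Forest HCVF-likelihood model evaluated with ROC AUC.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 8))                                            # landscape covariates per 1 ha cell
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=1.0, size=5000)) > 1.0    # HCVF yes/no

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

model = RandomForestClassifier(n_estimators=500, random_state=0, n_jobs=-1)
model.fit(X_train, y_train)

hcvf_likelihood = model.predict_proba(X_test)[:, 1]   # relative likelihood of HCVF
print(f"ROC AUC: {roc_auc_score(y_test, hcvf_likelihood):.2f}")
```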
This database was compiled from the following sources:
source: https://geodata.naturvardsverket.se/nedladdning/skogliga_vardekarnor_2016.zip
source: https://www.lantmateriet.se/en/geodata/geodata-products/product-list/terrain-model-download-grid-50/
source: https://glad.earthengine.app
source: https://doi.org/10.6084/m9.figshare.9828827.v2
source: https://www.scb.se/en/services/open-data-api/open-geodata/grid-statistics/
To learn more about the GRASS GIS database structure, see: