51 datasets found
  1. Delaware County GIS Data Extract Web Map

    • gisdata-delco.hub.arcgis.com
    • hub.arcgis.com
    Updated Jun 9, 2020
    Cite
    Delaware County, Ohio (2020). Delaware County GIS Data Extract Web Map [Dataset]. https://gisdata-delco.hub.arcgis.com/maps/506aa1f8a7a6457097bca43691436674
    Explore at:
    Dataset updated
    Jun 9, 2020
    Dataset authored and provided by
    Delaware County, Ohio
    Area covered
    Description

    Web map used in Delaware County GIS Data Extract application that allows users to extract Delaware County, Ohio GIS data in various formats.

  2. Outscraper Google Maps Scraper

    • datarade.ai
    .json, .csv, .xls
    Updated Dec 9, 2021
    Cite
    (2021). Outscraper Google Maps Scraper [Dataset]. https://datarade.ai/data-products/outscraper-google-maps-scraper-outscraper
    Explore at:
    .json, .csv, .xls (available download formats)
    Dataset updated
    Dec 9, 2021
    Area covered
    United States
    Description

    Are you looking to identify B2B leads to promote your business, product, or service? Outscraper Google Maps Scraper might just be the tool you've been searching for. This powerful software enables you to extract business data directly from Google's extensive database, which spans millions of businesses across countless industries worldwide.

    Outscraper Google Maps Scraper is a tool built with advanced technology that lets you scrape a myriad of valuable information about businesses from Google's database. This information includes, but is not limited to, business names, addresses, contact information, website URLs, reviews, ratings, and operational hours.

    Whether you are a small business trying to make a mark or a large enterprise exploring new territories, the data obtained from the Outscraper Google Maps Scraper can be a treasure trove. This tool provides a cost-effective, efficient, and accurate method to generate leads and gather market insights.

    By using Outscraper, you'll gain a significant competitive edge as it allows you to analyze your market and find potential B2B leads with precision. You can use this data to understand your competitors' landscape, discover new markets, or enhance your customer database. The tool offers the flexibility to extract data based on specific parameters like business category or geographic location, helping you to target the most relevant leads for your business.

    In a world that's growing increasingly data-driven, utilizing a tool like Outscraper Google Maps Scraper could be instrumental to your business's success. If you're looking to get ahead in your market and find B2B leads more efficiently and precisely, Outscraper is worth considering. It streamlines the data collection process, allowing you to focus on what truly matters: using the data to grow your business.

    https://outscraper.com/google-maps-scraper/

    As a result of the Google Maps scraping, your data file will contain the following details:

    Query, Name, Site, Type, Subtypes, Category, Phone, Full Address, Borough, Street, City, Postal Code, State, US State, Country, Country Code, Latitude, Longitude, Time Zone, Plus Code, Rating, Reviews, Reviews Link, Reviews Per Scores, Photos Count, Photo, Street View, Working Hours, Working Hours Old Format, Popular Times, Business Status, About, Range, Posts, Verified, Owner ID, Owner Title, Owner Link, Reservation Links, Booking Appointment Link, Menu Link, Order Links, Location Link, Place ID, Google ID, Reviews ID
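    The export above is a flat table, so post-processing it needs nothing beyond a CSV reader. A minimal sketch, using a hypothetical one-row sample (the column names and values here are illustrative, not taken from a real export, which carries many more of the fields listed above):

```python
import csv
import io

# Hypothetical sample in the shape of a CSV export; real exports
# contain many more columns from the field list above.
sample = io.StringIO(
    "name,full_address,phone,rating,reviews\n"
    'Example Cafe,"1 Main St, Springfield",+1 555-0100,4.5,132\n'
)

# Keep only a few columns to build a simple lead list.
leads = [
    {"name": row["name"], "phone": row["phone"], "rating": float(row["rating"])}
    for row in csv.DictReader(sample)
]
print(leads)
```

    The same pattern scales to the full column set: `csv.DictReader` exposes every exported field by its header name, so selecting or filtering columns is a dictionary lookup per row.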

    If you want to enrich your datasets with social media accounts and many more details, you can combine Google Maps Scraper with Domain Contact Scraper.

    Domain Contact Scraper can scrape these details:

    Email, Facebook, Github, Instagram, Linkedin, Phone, Twitter, Youtube

  3. ScrapeHero Data Cloud - Free and Easy to use

    • datarade.ai
    .json, .csv
    Updated Feb 8, 2022
    Cite
    Scrapehero (2022). ScrapeHero Data Cloud - Free and Easy to use [Dataset]. https://datarade.ai/data-products/scrapehero-data-cloud-free-and-easy-to-use-scrapehero
    Explore at:
    .json, .csv (available download formats)
    Dataset updated
    Feb 8, 2022
    Dataset provided by
    ScrapeHero
    Authors
    Scrapehero
    Area covered
    Bhutan, Portugal, Bahamas, Ghana, Dominica, Slovakia, Anguilla, Niue, Bahrain, Chad
    Description

    The Easiest Way to Collect Data from the Internet

    Download anything you see on the internet into spreadsheets within a few clicks using our ready-made web crawlers, or with a few lines of code using our APIs. We have made it as simple as possible to collect data from websites.

    Easy to Use Crawlers:

    Amazon Product Details and Pricing Scraper: Get product information, pricing, FBA, best seller rank, and much more from Amazon.

    Google Maps Search Results: Get details like place name, phone number, address, website, ratings, and open hours from Google Maps or Google Places search results.

    Twitter Scraper: Get tweets, Twitter handle, content, number of replies, number of retweets, and more. All you need to provide is a URL to a profile, hashtag, or an advanced search URL from Twitter.

    Amazon Product Reviews and Ratings: Get customer reviews for any product on Amazon and get details like product name, brand, reviews and ratings, and more from Amazon.

    Google Reviews Scraper: Scrape Google reviews and get details like business or location name, address, review, ratings, and more for businesses and places.

    Walmart Product Details & Pricing: Get the product name, pricing, number of ratings, reviews, product images, URL, and other product-related data from Walmart.

    Amazon Search Results Scraper: Get product search rank, pricing, availability, best seller rank, and much more from Amazon.

    Amazon Best Sellers: Get the bestseller rank, product name, pricing, number of ratings, rating, product images, and more from any Amazon Bestseller List.

    Google Search Scraper: Scrape Google search results and get details like search rank, paid and organic results, knowledge graph, related search results, and more.

    Walmart Product Reviews & Ratings: Get customer reviews for any product on Walmart.com and get details like product name, brand, reviews, and ratings.

    Scrape Emails and Contact Details: Get emails, addresses, contact numbers, and social media links from any website.

    Walmart Search Results Scraper: Get product details such as pricing, availability, reviews, ratings, and more from Walmart search results and categories.

    Glassdoor Job Listings: Scrape job details such as job title, salary, job description, location, company name, number of reviews, and ratings from Glassdoor.

    Indeed Job Listings: Scrape job details such as job title, salary, job description, location, company name, number of reviews, and ratings from Indeed.

    LinkedIn Jobs Scraper (Premium): Scrape job listings on LinkedIn and extract job details such as job title, job description, location, company name, number of reviews, and more.

    Redfin Scraper (Premium): Scrape real estate listings from Redfin. Extract property details such as address, price, mortgage, Redfin estimate, broker name, and more.

    Yelp Business Details Scraper: Scrape business details from Yelp such as phone number, address, website, and more from Yelp search and business details pages.

    Zillow Scraper (Premium): Scrape real estate listings from Zillow. Extract property details such as address, price, broker name, and more.

    Amazon Product Offers and Third Party Sellers: Get product pricing, delivery details, FBA, seller details, and much more from the Amazon offer listing page.

    Realtor Scraper (Premium): Scrape real estate listings from Realtor.com. Extract property details such as address, price, area, broker, and more.

    Target Product Details & Pricing: Get product details from search results and category pages such as pricing, availability, rating, reviews, and 20+ data points from Target.

    Trulia Scraper (Premium): Scrape real estate listings from Trulia. Extract property details such as address, price, area, mortgage, and more.

    Amazon Customer FAQs: Get FAQs for any product on Amazon and get details like the question, answer, answering user name, and more.

    Yellow Pages Scraper: Get details like business name, phone number, address, website, ratings, and more from Yellow Pages search results.

  4. Data Bundle for PyPSA-Eur: An Open Optimisation Model of the European...

    • zenodo.org
    • data.niaid.nih.gov
    xz, zip
    Updated Jul 17, 2024
    + more versions
    Cite
    Jonas Hörsch; Fabian Hofmann; David Schlachtberger; Philipp Glaum; Fabian Neumann; Tom Brown; Iegor Riepin; Bobby Xiong (2024). Data Bundle for PyPSA-Eur: An Open Optimisation Model of the European Transmission System [Dataset]. http://doi.org/10.5281/zenodo.12760663
    Explore at:
    zip, xz (available download formats)
    Dataset updated
    Jul 17, 2024
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Jonas Hörsch; Fabian Hofmann; David Schlachtberger; Philipp Glaum; Fabian Neumann; Tom Brown; Iegor Riepin; Bobby Xiong
    Description

    PyPSA-Eur is an open model dataset of the European power system at the transmission network level that covers the full ENTSO-E area. It can be built using the code provided at https://github.com/PyPSA/PyPSA-eur.

    It contains alternating current lines at and above 220 kV voltage level and all high voltage direct current lines, substations, an open database of conventional power plants, time series for electrical demand and variable renewable generator availability, and geographic potentials for the expansion of wind and solar power.

    Not all data dependencies are shipped with the code repository, since git is not suited for handling large changing files. Instead we provide separate data bundles to be downloaded and extracted as noted in the documentation.

    This is the full data bundle to be used for rigorous research. It includes large bathymetry and natural protection area datasets.

    While the code in PyPSA-Eur is released as free software under the MIT License, different licences and terms of use apply to the various input data, which are summarised below:

    corine/*

    Access to data is based on a principle of full, open and free access as established by the Copernicus data and information policy Regulation (EU) No 1159/2013 of 12 July 2013. This regulation establishes registration and licensing conditions for GMES/Copernicus users and can be found here. Free, full and open access to this data set is made on the conditions that:

    • When distributing or communicating Copernicus dedicated data and Copernicus service information to the public, users shall inform the public of the source of that data and information.

    • Users shall make sure not to convey the impression to the public that the user's activities are officially endorsed by the Union.

    • Where that data or information has been adapted or modified, the user shall clearly state this.

    • The data remain the sole property of the European Union. Any information and data produced in the framework of the action shall be the sole property of the European Union. Any communication and publication by the beneficiary shall acknowledge that the data were produced “with funding by the European Union”.

    eez/*

    Marine Regions’ products are licensed under CC-BY-NC-SA. Please contact us for other uses of the Licensed Material beyond license terms. We kindly request our users not to make our products available for download elsewhere and to always refer to marineregions.org for the most up-to-date products and services.

    natura/*

    EEA standard re-use policy: unless otherwise indicated, re-use of content on the EEA website for commercial or non-commercial purposes is permitted free of charge, provided that the source is acknowledged (https://www.eea.europa.eu/legal/copyright). Copyright holder: Directorate-General for Environment (DG ENV).

    naturalearth/*

    All versions of Natural Earth raster + vector map data found on this website are in the public domain. You may use the maps in any manner, including modifying the content and design, electronic dissemination, and offset printing. The primary authors, Tom Patterson and Nathaniel Vaughn Kelso, and all other contributors renounce all financial claim to the maps and invite you to use them for personal, educational, and commercial purposes.

    No permission is needed to use Natural Earth. Crediting the authors is unnecessary.

    NUTS_2013_60M_SH/*

    In addition to the general copyright and licence policy applicable to the whole Eurostat website, the following specific provisions apply to the datasets you are downloading. The download and usage of these data is subject to the acceptance of the following clauses:

    1. The Commission agrees to grant the non-exclusive and not transferable right to use and process the Eurostat/GISCO geographical data downloaded from this page (the "data").

    2. The permission to use the data is granted on condition that: the data will not be used for commercial purposes; the source will be acknowledged. A copyright notice, as specified below, will have to be visible on any printed or electronic publication using the data downloaded from this page.

    gebco/GEBCO_2014_2D.nc

    The GEBCO Grid is placed in the public domain and may be used free of charge. Use of the GEBCO Grid indicates that the user accepts the conditions of use and disclaimer information given below.

    Users are free to:

    • Copy, publish, distribute and transmit The GEBCO Grid

    • Adapt The GEBCO Grid

    • Commercially exploit The GEBCO Grid, by, for example, combining it with other information, or by including it in their own product or application

    Users must:

    • Acknowledge the source of The GEBCO Grid. A suitable form of attribution is given in the documentation that accompanies The GEBCO Grid.

    • Not use The GEBCO Grid in a way that suggests any official status or that GEBCO, or the IHO or IOC, endorses any particular application of The GEBCO Grid.

    • Not mislead others or misrepresent The GEBCO Grid or its source.

    je-e-21.03.02.xls

    Information on the websites of the Federal Authorities is accessible to the public. Downloading, copying or integrating content (texts, tables, graphics, maps, photos or any other data) does not entail any transfer of rights to the content.

    Copyright and any other rights relating to content available on the websites of the Federal Authorities are the exclusive property of the Federal Authorities or of any other expressly mentioned owners.

    Any reproduction requires the prior written consent of the copyright holder. The source of the content (statistical results) should always be given.

  5. New York State Cooling Tower Registry Weekly Extract Map

    • data.ny.gov
    csv, xlsx, xml
    Updated Nov 25, 2025
    + more versions
    Cite
    Center for Environmental Health (2025). New York State Cooling Tower Registry Weekly Extract Map [Dataset]. https://data.ny.gov/widgets/unmf-baqa
    Explore at:
    xlsx, csv, xml (available download formats)
    Dataset updated
    Nov 25, 2025
    Dataset authored and provided by
    Center for Environmental Health
    Area covered
    New York
    Description

    The dataset contains the location, legionella sampling results, and operational status of cooling towers operating in New York State. The data is self-reported by the cooling tower operator according to Public Health Law, Section 225(5)(a) Subpart 4-1 (Cooling Towers). This dataset is a weekly snapshot of the data collected from the NYS Cooling Tower Registry website; it does not contain historical data.

  6. Shaded relief WebMercator 'slippy map' tiles based on NASA Shuttle Radar...

    • darus.uni-stuttgart.de
    Updated Jan 29, 2024
    Cite
    Max Franke (2024). Shaded relief WebMercator 'slippy map' tiles based on NASA Shuttle Radar Topography Mission Global 1 arc second V003 topographic height data [Dataset]. http://doi.org/10.18419/DARUS-3837
    Explore at:
    Croissant (a format for machine-learning datasets; learn more at mlcommons.org/croissant)
    Dataset updated
    Jan 29, 2024
    Dataset provided by
    DaRUS
    Authors
    Max Franke
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This dataset contains WebMercator tiles which contain gray-scale shaded relief (hill shades), and nothing else. The tiles have a resolution of 256×256 px, suitable for web mapping libraries such as Leaflet. The hill shades are generated from SRTM altitude data, which cover the land area between 60° northern and 58° southern latitude and which lie in the public domain. Map material without political or infrastructural features can be desirable, for example, in use cases where historical data is visualized on a map. The concrete motivation for generating this map material was the Dhimmis & Muslims project (project page, home page, GitHub, DaRUS dataset), which analyzed peaceful coexistence of religious groups in the medieval Middle East. A particular goal with creating the dataset was to have map material available under a permissive license for screenshots and publications, instead of relying on proprietary mapping services such as Mapbox. Teaser image: the hillshades of Cyprus on zoom level 9. This image is hosted externally by GitHub, but is also present in the repository as teaser.png.

    Coverage. The dataset covers zoom level 0 (entire world in one tile) to 12 (entire world in 4096×4096 tiles). The total size of the dataset is 22,369,621 tiles. However, of those, 19,753,304 tiles (88.3%) are empty, either because the landscape there is fully flat (i.e., on water), or because they lie fully outside the latitude range covered by the SRTM altitude data. The empty tiles are not stored. Instead, a singular placeholder file is stored in the repository, alongside a list of the empty tiles. During extraction, the placeholder empty tile can be symbolically linked in the file system to all the places where it is needed. The total size of the non-empty tiles is about 103 GB.

    Files. Besides the placeholder file and the list of empty tiles, the repository also contains a manifest file. This file lists all non-empty tiles by the ZIP file they are contained in. The tiles themselves are grouped into ZIP files by the following schema: all tiles from levels 0 to 5 are contained in one ZIP file. All tiles of level N, N≥6, are contained in a ZIP file which is named after the tile of level N-6 (block level) that contains the tile in question, named tiles_.zip. Hence, all tiles of level 6 are contained in a singular ZIP file named tiles_6_0_0_0.zip. The tiles of level 7 are split up into four group ZIP files named tiles_7_1_{0,1}_{0,1}.zip, the tiles of level 8 into 16 group ZIP files named tiles_8_2_{0..3}_{0..3}.zip, and so on. Both the manifest file and the commands to generate the distribution of tiles on ZIP files can be generated using the linked software repository.

    Usage. The tile ZIP files can be downloaded and extracted. By serving the extracted directory structure in a web server, a slippy map tile server can be created. The linked software repository also contains a command-line utility that generates the required shell commands to download the ZIP files, extract them, and softlink (ln -s) the empty tiles to the appropriate places. This command-line utility can also optionally read in a GeoJSON file of an area of interest. In this case, only tiles within that area are downloaded at a higher zoom level, whereas tiles completely outside the area are only downloaded at a lower zoom level; both zoom levels are also configurable. See the documentation in the repository and the command-line utility's help (-h) output for more details.
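    The grouping schema described above can be sketched as a small helper that maps a tile coordinate to its containing ZIP archive. This is an illustration inferred from the examples given (tiles_6_0_0_0.zip, tiles_7_1_{0,1}_{0,1}.zip, ...), not code shipped with the dataset; levels 0-5 live together in one combined archive whose exact filename is not restated here, so the helper only covers levels 6 and up:

```python
def tile_zip_name(z: int, x: int, y: int) -> str:
    """Name of the ZIP archive holding slippy-map tile (z, x, y).

    Tiles of level z (z >= 6) are grouped by their ancestor tile at the
    "block level" z - 6; six zoom levels up means dividing the tile
    coordinates by 2**6 = 64.
    """
    if z < 6:
        # Levels 0-5 are shipped together in one combined archive.
        raise ValueError("levels 0-5 share a single combined archive")
    block_level = z - 6
    bx, by = x >> 6, y >> 6  # ancestor tile coordinates at level z - 6
    return f"tiles_{z}_{block_level}_{bx}_{by}.zip"

print(tile_zip_name(6, 0, 0))     # tiles_6_0_0_0.zip
print(tile_zip_name(8, 200, 70))  # tiles_8_2_3_1.zip
```

    With a helper like this, a downloader only needs the (z, x, y) tiles it wants in order to work out which group archives to fetch.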

  7. Map data from landslides triggered by Hurricane Maria in two study areas in...

    • catalog.data.gov
    • data.usgs.gov
    • +1more
    Updated Nov 27, 2025
    + more versions
    Cite
    U.S. Geological Survey (2025). Map data from landslides triggered by Hurricane Maria in two study areas in the Las Marías Municipality, Puerto Rico, All [Dataset]. https://catalog.data.gov/dataset/map-data-from-landslides-triggered-by-hurricane-maria-in-two-study-areas-in-the-las-marias
    Explore at:
    Dataset updated
    Nov 27, 2025
    Dataset provided by
    United States Geological Survey (http://www.usgs.gov/)
    Area covered
    Puerto Rico
    Description

    In late September 2017, intense precipitation associated with Hurricane Maria caused extensive landsliding across Puerto Rico. Much of the Las Marías municipality in central-western Puerto Rico was severely impacted by landslides. Landslide density in this region was mapped as greater than 25 landslides/km2 (Bessette-Kirton et al., 2019). In order to better understand the controlling variables of landslide occurrence and runout in this region, two 2.5-km2 study areas were selected and all landslides within were manually mapped in detail using remote-sensing data. Included in the data release are five separate shapefiles: geographic areas representing the mapping extent of the four distinct areas (map areas, filename: map_areas), initiation location polygons (source areas, filename: SourceArea), polygons of the entire impacted area consisting of source, transport, and deposition (affected areas, filename: AffectArea), points on the furthest upslope extent of the landslide source areas (headscarp points, filename: HSPoint), and lines reflecting the approximate travel paths from the furthest upslope extent to the furthest downslope extent of the landslides (runout lines, filename: RunoutLine). These shapefiles contain qualitative attributes interpreted from the aerial imagery (such as geomorphic setting and impact of human activity), quantitative attributes extracted from the geospatial data (such as source area length, width, and depth), as well as attributes extracted from other sources (such as geology and soil properties). A table detailing each attribute, attribute abbreviations, the possible choices for each attribute, and a short description of each attribute is provided in the file labeled AttributeDescription.docx. The headscarp point shapefile attribute table contains the closest distance between the headscarp and a paved road (road_d_m; road data from U.S. Census Bureau, 2015).
    The runout line shapefile attribute table reflects whether the landslide was considered independently unmappable past a road or river (term_drain), the horizontal length of the runout (length_m), the fall height from the headscarp to termination (h_m), the ratio of fall height to runout length (hlratio), the distance to the nearest paved road (road_d_m), and the watershed area upslope from the upper end of the runout line (wtrshd_m2). All quantitative metrics were calculated using tools available in ESRI ArcMap v. 10.6.

    The source area shapefile attribute table reflects general source area vegetation (vegetat) and land use (land_use), whether the slide significantly disaggregated during movement (flow), the failure mode (failmode), whether the slide was a reactivation of a previous one (reactivate), whether the landslide directly impacted the occurrence of another slide (ls_complex), the proportion of source material that left the source area (sourc_evac), the state of the remaining material (remaining), the curvature of the source area (sourc_curv), potential human impact on landslide occurrence (human_caus), potential landslide impact on human society (human_effc), whether a building exists within 10 meters of the source area (buildng10m), whether a road exists within 50 meters of the source area (road50m), the planimetric area of the source area (area_m2), the dimension of the source area perpendicular to the direction of motion (width_m), the dimension of the source area parallel to the direction of motion (length_m), the geologic formation of the source area (FMATN; from Bawiec, W.J., 1998), the soil type of the source area (MUNAME; from Acevido, G., 2020), the root-zone (0-100 cm deep) soil moisture estimated by the NASA SMAP mission for 9:30 am Atlantic Standard Time on 21 September 2017, the day after Hurricane María (smap; NASA, 2017), the average precipitation amount in the source area for the duration of the hurricane (pptn_mm; from Ramos-Scharrón, C.E., and Arima, E., 2019), the source area mean slope (mn_slp_d), the source area median slope (mdn_slp_d), the average depth change of material from the source area after the landslide (mn_dpth_m), the median depth change of material from the source area after the landslide (mdn_dpt_m), the sum of the volumetric change of material in the source area after the landslide (ldr_sm_m3), the major geomorphic landform of the source (maj_ldfrm), and the landcover of the source area (PRGAP_CL; from Homer et al., 2004).

    The affected area shapefile attribute table reflects the general affected area vegetation type (vegetat), the major geomorphic landform on which the landslide occurred (maj_ldfrm), whether the slide disaggregated during movement (flow), the general land use (land_use), the planimetric area of the affected area (area_m2), the dominant geologic formation of the affected area (FMATN; from Bawiec, W.J., 1998), the dominant soil type of the affected area (MUNAME; from Acevido, G., 2020), the sum of the volumetric change of material in all the contributing source areas for the affected area (Sum_ldr_sm), the average volumetric change of material in all the contributing source areas for the affected area (Avg_ldr_sm), whether the landslide was considered independently unmappable past a road or river (term_drain), the number of contributing source areas for the affected area (num_srce), and the dominant landcover of the affected area (PRGAP_CL; from Homer et al., 2004).

    Mapping was conducted using aerial imagery collected between 9-15 October 2017 at 25-cm resolution (Quantum Spatial, Inc., 2017), a 1-m-resolution pre-event lidar digital elevation model (DEM) (U.S. Geological Survey, 2018), and a 1-m-resolution post-event lidar DEM (U.S. Geological Survey, 2020).
    In order to accurately determine the extent of the mapped landslides and to verify the georeferencing of the aerial imagery, aerial photographs were overlain with each DEM as well as a pre- and post-event lidar difference (2016-2018), and corrections were made as needed. Additional data sources described in the AttributeDescription document and metadata were used to extract spatial data once mapping was complete, and results were appended to the shapefile attribute tables. Data in this release are provided as ArcGIS point (HSPoint), line (RunoutLine), and polygon (AffectArea and SourceArea) feature class files.

    References:

    Bessette-Kirton, E.K., Cerovski-Darriau, C., Schulz, W.H., Coe, J.A., Kean, J.W., Godt, J.W., Thomas, M.A., and Hughes, K. Stephen, 2019, Landslides Triggered by Hurricane Maria: Assessment of an Extreme Event in Puerto Rico: GSA Today, v. 29, doi:10.1130/GSATG383A.1.

    U.S. Census Bureau, 2015, 2015 TIGER/Line Shapefiles, State, Puerto Rico, primary and secondary roads State-based Shapefile: United States Census Bureau, accessed September 12, 2019, at http://www2.census.gov/geo/tiger/TIGER2015/PRISECROADS/tl_2015_72_prisecroads.zip.

    Bawiec, W.J., 1998, Geology, geochemistry, geophysics, mineral occurrences and mineral resource assessment for the Commonwealth of Puerto Rico: U.S. Geological Survey Open-File Report 98-38, https://pubs.usgs.gov/of/1998/of98-038/ (accessed May 2020).

    Acevido, G., 2020, Soil Survey of Arecibo Area of Northern Puerto Rico: United States Department of Agriculture, Soil Conservation Service.

    National Aeronautics and Space Administration [NASA], 2017, SMAP L4 Global 3-hourly 9 km EASE-Grid Surface and Root Zone Soil Moisture Analysis Update, Version 4: National Snow & Ice Data Center web page, accessed September 12, 2019, at https://nsidc.org/data/SPL4SMAU/versions/4.

    Ramos-Scharrón, C.E., and Arima, E., 2019, Hurricane María's precipitation signature in Puerto Rico—A conceivable presage of rains to come: Scientific Reports, v. 9, no. 1, article no. 15612, accessed February 28, 2020, at https://doi.org/10.1038/s41598-019-52198-2.

    Homer, C., Huang, C., Yang, L., Wylie, B., and Coan, M., 2004, Development of a 2001 National Landcover Database for the United States: Photogrammetric Engineering and Remote Sensing, v. 70, no. 7, July 2004, p. 829-840.

    Quantum Spatial, Inc., 2017, FEMA PR Imagery: https://s3.amazonaws.com/fema-cap-imagery/Others/Maria (accessed October 2017).

    U.S. Geological Survey, 2018, USGS NED Original Product Resolution PR Puerto Rico 2015: http://nationalmap.gov/elevation.html (accessed October 2018).

    U.S. Geological Survey, 2020, USGS NED Original Product Resolution PR Puerto Rico 2018: http://nationalmap.gov/elevation.html (accessed June 2020).

    Any use of trade, firm, or product names is for descriptive purposes only and does not imply endorsement by the U.S. Government.
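    The runout attributes above include a derived quantity, the ratio of fall height to runout length (hlratio = h_m / length_m), a standard mobility index for landslides. A minimal sketch of that computation, with illustrative numbers that are not taken from the data release:

```python
def hl_ratio(h_m: float, length_m: float) -> float:
    """Ratio of fall height (h_m) to horizontal runout length (length_m),
    matching the hlratio attribute described above. Lower values indicate
    more mobile (longer-travelling) landslides for a given fall height."""
    if length_m <= 0:
        raise ValueError("runout length must be positive")
    return h_m / length_m

# Illustrative values only: a 40 m fall over a 100 m runout.
print(hl_ratio(h_m=40.0, length_m=100.0))  # 0.4
```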

  8. Flood Hazard Areas (Only FEMA - digitized data)

    • geodata.vermont.gov
    • anrgeodata.vermont.gov
    • +6more
    Updated Dec 15, 2015
    + more versions
    Cite
    Vermont Agency of Natural Resources (2015). Flood Hazard Areas (Only FEMA - digitized data) [Dataset]. https://geodata.vermont.gov/datasets/VTANR::flood-hazard-areas-only-fema-digitized-data
    Explore at:
    Dataset updated
    Dec 15, 2015
    Dataset provided by
    Vermont Agency of Natural Resources (http://www.anr.state.vt.us/)
    Authors
    Vermont Agency of Natural Resources
    License

    Attribution-ShareAlike 4.0 (CC BY-SA 4.0), https://creativecommons.org/licenses/by-sa/4.0/
    License information was derived automatically

    Area covered
    Description

    The entire Vermont extent of the National Flood Hazard Layer (NFHL), as acquired 12/15/15 from the FEMA Map Service Center (msc.fema.gov) upon publication 12/2/2015 and converted to VSP. The FEMA DFIRM NFHL database compiles all available officially digitized Digital Flood Insurance Rate Maps. This extract from the FEMA Map Service Center includes all such data in Vermont, including counties and a few municipalities. This data includes the most recent map update for Bennington County, effective 12/2/2015. Layers included: DFIRM - Letter of Map Revision (LOMR), DFIRM X-Sections, DFIRM Floodways, Special Flood Hazard Areas (All Available).

  9. OpenStreetMap

    • data.wu.ac.at
    • data.europa.eu
    Updated Sep 26, 2015
    + more versions
    Cite
    London Datastore Archive (2015). OpenStreetMap [Dataset]. https://data.wu.ac.at/odso/datahub_io/NzA2Y2FjYWMtNTFlZS00YjU3LTlkNTQtOGU3ZTA1YTBkZDlk
    Explore at:
    text/html; charset=iso-8859-1 (0.0). Available download formats
    Dataset updated
    Sep 26, 2015
    Dataset provided by
    London Datastore Archive
    License

    Open Government Licence: http://reference.data.gov.uk/id/open-government-licence

    Description

    OpenStreetMap (openstreetmap.org) is a global collaborative mapping project, which offers maps and map data released with an open license, encouraging free re-use and re-distribution. The data is created by a large community of volunteers who use a variety of simple on-the-ground surveying techniques and wiki-style editing tools to collaborate as they create the maps, in a process which is open to everyone. The project originated in London, and an active community of mappers and developers are based here. Mapping work in London is ongoing (and you can help!) but the coverage is already good enough for many uses.

    Browse the map of London on OpenStreetMap.org

    Downloads:

    The whole of England updated daily:

    For more details of downloads available from OpenStreetMap, including downloading the whole planet, see 'planet.osm' on the wiki.

    Data access APIs:

    Download small areas of the map by bounding-box. For example this URL requests the data around Trafalgar Square:
    http://api.openstreetmap.org/api/0.6/map?bbox=-0.13062,51.5065,-0.12557,51.50969

    Data filtered by "tag". For example this URL returns all elements in London tagged shop=supermarket:
    http://www.informationfreeway.org/api/0.6/*[shop=supermarket][bbox=-0.48,51.30,0.21,51.70]
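    The bounding-box request above can be assembled programmatically. This is a minimal sketch that only builds the request URL (no network call is made), following the API 0.6 pattern quoted in the text; the coordinates are the Trafalgar Square example given above:

    ```python
    # Build an OpenStreetMap API 0.6 "map" request URL for a bounding box.
    # Bbox order is min_lon, min_lat, max_lon, max_lat, as in the example above.

    def osm_map_url(min_lon, min_lat, max_lon, max_lat):
        """Return the API 0.6 URL that downloads all map data in the bbox."""
        return ("http://api.openstreetmap.org/api/0.6/map"
                f"?bbox={min_lon},{min_lat},{max_lon},{max_lat}")

    url = osm_map_url(-0.13062, 51.5065, -0.12557, 51.50969)
    print(url)
    ```

    The resulting string matches the Trafalgar Square URL shown above, so the same helper can be pointed at any other small area of interest.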

    The .osm format

    The format of the data is a raw XML representation of all the elements making up the map. OpenStreetMap is composed of interconnected "nodes" and "ways" (and sometimes "relations"), each with a set of name=value pairs called "tags". These classify and describe properties of the elements, and ultimately influence how they get drawn on the map. To understand more about tags, and different ways of working with this data format, refer to the following pages on the OpenStreetMap wiki.
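    The node/way/tag structure described above can be explored with Python's standard-library XML parser. The fragment below is a hand-written toy document, not real OSM data, kept deliberately small:

    ```python
    import xml.etree.ElementTree as ET

    # A tiny hand-written .osm-style fragment illustrating nodes, ways and tags.
    OSM_XML = """<osm version="0.6">
      <node id="1" lat="51.5065" lon="-0.13062"/>
      <node id="2" lat="51.5097" lon="-0.12557">
        <tag k="shop" v="supermarket"/>
        <tag k="name" v="Example Store"/>
      </node>
      <way id="10">
        <nd ref="1"/>
        <nd ref="2"/>
        <tag k="highway" v="residential"/>
      </way>
    </osm>"""

    root = ET.fromstring(OSM_XML)

    # Collect each node's tags as a name=value dict, and each way's node refs.
    nodes = {n.get("id"): {t.get("k"): t.get("v") for t in n.findall("tag")}
             for n in root.findall("node")}
    ways = {w.get("id"): [nd.get("ref") for nd in w.findall("nd")]
            for w in root.findall("way")}

    print(nodes["2"]["shop"])  # supermarket
    print(ways["10"])          # ['1', '2']
    ```

    Real extracts are much larger, so for whole-city files an incremental parser (e.g. `ET.iterparse`) or a dedicated OSM library is the usual choice.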

    Simple embedded maps

    Rather than working with raw map data, you may prefer to embed maps from OpenStreetMap on your website with a simple bit of JavaScript. You can also present overlays of other data, in a manner very similar to working with Google Maps. In fact you can even use the Google Maps API to do this. See OSM on your own website for details and links to various JavaScript map libraries.

    Help build the map!

    The OpenStreetMap project aims to attract large numbers of contributors who all chip in a little bit to help build the map. Although the map editing tools take a little while to learn, they are designed to be as simple as possible, so that everyone can get involved. This project offers an exciting means of allowing local London communities to take ownership of their part of the map.

    Read about how to Get Involved and see the London page for details of OpenStreetMap community events.

  10. Site 37 Mississippi River Bathymetry and Velocimetry Data at Structure A5076...

    • catalog.data.gov
    • data.usgs.gov
    • +1more
    Updated Nov 27, 2025
    + more versions
    Cite
    U.S. Geological Survey (2025). Site 37 Mississippi River Bathymetry and Velocimetry Data at Structure A5076 on Missouri State Highway 34 at Cape Girardeau, Missouri, June 2014 and July 2018 [Dataset]. https://catalog.data.gov/dataset/site-37-mississippi-river-bathymetry-and-velocimetry-data-at-structure-a5076-on-missouri-s
    Explore at:
    Dataset updated
    Nov 27, 2025
    Dataset provided by
    United States Geological Survey (http://www.usgs.gov/)
    Area covered
    Cape Girardeau, Mississippi River, Missouri
    Description

    These data are high-resolution bathymetry (riverbed elevation) and depth-averaged velocities in ASCII format, generated from hydrographic and velocimetric surveys of the Mississippi River near structure A5076 on Missouri State Highway 34 at Cape Girardeau, Missouri, in 2014 and 2018. Hydrographic data were collected using a high-resolution multibeam echosounder mapping system (MBMS), which consists of a multibeam echosounder (MBES) and an inertial navigation system (INS) mounted on a marine survey vessel. Data were collected as the vessel traversed the river along planned survey lines distributed throughout the reach. Data collection software integrated and stored the depth data from the MBES and the horizontal and vertical position and attitude data of the vessel from the INS in real time. Data processing required computer software to extract bathymetry data from the raw data files and to summarize and map the information. Velocity data were collected using an acoustic Doppler current profiler (ADCP) mounted on a survey vessel equipped with a differential global positioning system (DGPS). Data were collected as the vessel traversed the river along planned transect lines distributed throughout the reach. Velocity data were processed using the Velocity Mapping Toolbox (Parsons and others, 2013), and smoothed using neighboring nodes.
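    The Velocity Mapping Toolbox itself is a MATLAB package, but the final "smoothed using neighboring nodes" step can be sketched as a simple neighbor average. The grid values below are made up for illustration only; this is not the toolbox's actual algorithm, just the general idea:

    ```python
    # Illustrative sketch: smooth a depth-averaged velocity grid by replacing
    # each node with the mean of itself and its in-bounds 4-connected neighbors.

    def smooth_grid(grid):
        """Return a smoothed copy of a 2-D list-of-lists grid."""
        rows, cols = len(grid), len(grid[0])
        out = [[0.0] * cols for _ in range(rows)]
        for r in range(rows):
            for c in range(cols):
                vals = [grid[r][c]]
                for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < rows and 0 <= cc < cols:
                        vals.append(grid[rr][cc])
                out[r][c] = sum(vals) / len(vals)
        return out

    velocity = [[1.0, 2.0, 1.0],
                [2.0, 8.0, 2.0],   # spurious spike at the center node
                [1.0, 2.0, 1.0]]
    smoothed = smooth_grid(velocity)  # center becomes (8+2+2+2+2)/5 = 3.2
    ```

    A single pass already pulls the center spike from 8.0 down to 3.2, which is why neighbor smoothing is a common cleanup step for noisy ADCP-derived velocity fields.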

  11. Black Footed Ferret Gap Distribution Biodiversity map

    • opendata.rcmrd.org
    Updated Aug 27, 2024
    Cite
    kpc5864_pennstate (2024). Black Footed Ferret Gap Distribution Biodiversity map [Dataset]. https://opendata.rcmrd.org/maps/7f5703ed1c014d76b216b179d1e04ddf
    Explore at:
    Dataset updated
    Aug 27, 2024
    Dataset authored and provided by
    kpc5864_pennstate
    Area covered
    Description

    This web map shows the latest collection of research-grade species observations contributed by iNaturalist users through the iNaturalist social network app and website. These Open Data observations can be used by the GIS community to better understand biodiversity, sustainability, migration patterns, invasive and threatened species distributions, and climate change adaptations, among many other use cases.

    Consumption Best Practices: Due to the high volume of observations, the service limits individual point visibility to only draw at the largest scales, using multi-scale H3 hexbins to summarize predominant observations at smaller scales. Small subsets of iNaturalist observations (128,000) can be copied from the service for use in analysis, data enrichment, or other visualizations. For larger iNaturalist archive requests or for access to iNaturalist Project datasets, use the iNaturalist website, or the iNaturalist AWS S3 Open Data extract, from which this service was derived.

    Source: iNaturalist AWS S3 Open Data
    Update Frequency: Monthly, end of the month
    Spatial Reference: WGS 1984 (WKID 4326)
    Area Covered: World

    Attribute Information:
    - Taxonomy: Each observation contains its taxonomic hierarchy (Kingdom, Phylum, Class, Order, Family, Genus, Species), as well as its Scientific Name and Common Name (where available).
    - iNaturalist Taxon Category: 12 logical taxonomic groups used by the iNaturalist community are used to symbolize like observations.
    - User Information: All observations are credited to the iNaturalist User ID, User Login, and User Name (where provided).
    - Media and Licenses: Direct URL links are provided to one original-resolution image from the iNaturalist observation. Creative Commons licensing also indicates the sharing and attribution of any photographic media associated with a user observation.
    - Dates: Observations include an Observed on Date and a Modified on Date provided by iNaturalist. In addition, these date fields were added to simplify the filtering and visualization of observations by year or month: Observed on Month (integer), Observed on Year (integer).

    Note about Research Grade Observations: Only Verifiable and Research Grade observations are included in this service. An observation is Verifiable if it has a date, is georeferenced (has lat/lon coordinates), has photographs or sounds, and isn't of a captive or cultivated organism. In addition, a Verifiable observation moves from "Needs ID" to "Research Grade" in iNaturalist when at least 2 species-level identifications (and 2/3 of all suggested identifications) are in agreement. See here for more information on how iNaturalist assesses data quality.

    Note about Location Privacy: To protect the livelihood of endangered or threatened species, the X/Y locations of some iNaturalist observations are automatically obscured to a random location in a 400-square-kilometer grid cell. Similarly, users can choose to obscure the location of their observations in the iNaturalist app settings for personal privacy reasons. The result is that you may see dense, blocky aggregations of observations as you navigate around the map, or observations that appear in unusual places (e.g., an endangered coastal plant that has been relocated out in the ocean).

    Additional iNaturalist Resources: iNaturalist Guides; iNaturalist statistics and observations; iNaturalist Forum; iNaturalist within the press.

    Revisions and Layer details: The layer used in this map is provided for informational purposes and is not monitored 24/7 for accuracy and currency. Any changes or deletions made to user observations through the iNaturalist app or website will not be reflected in this service until the next monthly update. If you would like to be alerted to potential issues or simply see when this service will update next, please visit our Live Feed Status page!
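    The integer "Observed on Year"/"Observed on Month" fields described above reduce date filtering to a plain integer comparison. A sketch with hypothetical records and field names (the exact attribute names in the service may differ):

    ```python
    # Hypothetical observation records mimicking the service's integer date
    # fields; the field names here are illustrative, not the verified schema.
    observations = [
        {"common_name": "Black-footed Ferret", "observed_on_year": 2024, "observed_on_month": 6},
        {"common_name": "Black-footed Ferret", "observed_on_year": 2023, "observed_on_month": 8},
        {"common_name": "Swift Fox",           "observed_on_year": 2024, "observed_on_month": 6},
    ]

    # Filtering by year and month needs no date parsing at all.
    june_2024 = [o for o in observations
                 if o["observed_on_year"] == 2024 and o["observed_on_month"] == 6]
    print(len(june_2024))  # 2
    ```

    The same comparison works as a definition query or filter expression in a GIS client, which is the point of precomputing the integer fields.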

  12. Digital Elevation Models and GIS in Hydrology (M2)

    • hydroshare.org
    • beta.hydroshare.org
    • +1more
    zip
    Updated Jun 7, 2021
    Cite
    Irene Garousi-Nejad; Belize Lane (2021). Digital Elevation Models and GIS in Hydrology (M2) [Dataset]. http://doi.org/10.4211/hs.9c4a6e2090924d97955a197fea67fd72
    Explore at:
    zip (88.2 MB). Available download formats
    Dataset updated
    Jun 7, 2021
    Dataset provided by
    HydroShare
    Authors
    Irene Garousi-Nejad; Belize Lane
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Area covered
    Description

    This resource contains data inputs and a Jupyter Notebook that is used to introduce Hydrologic Analysis using Terrain Analysis Using Digital Elevation Models (TauDEM) and Python. TauDEM is a free and open-source set of Digital Elevation Model (DEM) tools developed at Utah State University for the extraction and analysis of hydrologic information from topography. This resource is part of a HydroLearn Physical Hydrology learning module available at https://edx.hydrolearn.org/courses/course-v1:Utah_State_University+CEE6400+2019_Fall/about

    In this activity, the student learns how to (1) derive hydrologically useful information from Digital Elevation Models (DEMs); (2) describe the sequence of steps involved in mapping stream networks, catchments, and watersheds; and (3) compute an approximate water balance for a watershed based on publicly available data.

    Please note that this exercise is designed for the Logan River watershed, which drains to USGS streamflow gauge 10109000 located just east of Logan, Utah. However, this Jupyter Notebook and the analysis can readily be applied to other locations of interest. If running the terrain analysis for other study sites, you need to prepare a DEM TIF file, an outlet shapefile for the area of interest, and the average annual streamflow and precipitation data.
    - There are several sources of DEM data. In the U.S., DEM data (at different spatial resolutions) can be obtained from the National Elevation Dataset available from the national map (http://viewer.nationalmap.gov/viewer/). Another DEM source is the Shuttle Radar Topography Mission (https://www2.jpl.nasa.gov/srtm/), an international research effort that obtained digital elevation models on a near-global scale (search for Digital Elevation at https://www.usgs.gov/centers/eros/science/usgs-eros-archive-products-overview?qt-science_center_objects=0#qt-science_center_objects).
    - If not already available, you can generate the outlet shapefile by applying basic terrain analysis steps in geospatial information system software such as ArcGIS or QGIS.
    - You also need average annual streamflow and precipitation data for the watershed of interest to assess the annual water balance and calculate the runoff ratio in this exercise. In the U.S., streamflow data can be obtained from the USGS NWIS website (https://waterdata.usgs.gov/nwis) and precipitation data from PRISM (https://prism.oregonstate.edu/normals/). Note that using other datasets may require preprocessing steps to make the data ready for this exercise.
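    The water-balance step in objective (3) above is essentially a unit conversion: convert mean annual streamflow to a runoff depth over the watershed area, then divide by the precipitation depth. A sketch with placeholder numbers (not actual Logan River values):

    ```python
    # Approximate annual water balance / runoff ratio for a watershed.
    # The inputs below are hypothetical placeholders for illustration.

    def runoff_ratio(mean_q_cms, area_km2, precip_mm):
        """Runoff ratio = annual runoff depth / annual precipitation depth.

        mean_q_cms : mean annual streamflow, m^3/s
        area_km2   : watershed drainage area, km^2
        precip_mm  : mean annual precipitation depth, mm
        """
        seconds_per_year = 365.25 * 24 * 3600
        runoff_volume_m3 = mean_q_cms * seconds_per_year
        # Spread the volume over the watershed area, convert m to mm.
        runoff_depth_mm = runoff_volume_m3 / (area_km2 * 1e6) * 1000
        return runoff_depth_mm / precip_mm

    ratio = runoff_ratio(mean_q_cms=5.0, area_km2=550.0, precip_mm=800.0)
    ```

    With these placeholder inputs the ratio comes out near 0.36, i.e. roughly a third of the precipitation leaves the watershed as streamflow; the remainder is attributed to evapotranspiration and storage changes.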

  13. Data from: ICDAR 2021 Competition on Historical Map Segmentation — Dataset

    • data.niaid.nih.gov
    • zenodo.org
    Updated May 30, 2021
    Cite
    Chazalon, Joseph; Carlinet, Edwin; Chen, Yizi; Perret, Julien; Duménieu, Bertrand; Mallet, Clément; Géraud, Thierry (2021). ICDAR 2021 Competition on Historical Map Segmentation — Dataset [Dataset]. https://data.niaid.nih.gov/resources?id=ZENODO_4817661
    Explore at:
    Dataset updated
    May 30, 2021
    Dataset provided by
    EPITA Research and Development Laboratory
    Univ. Gustave Eiffel, IGN-ENSG, LaSTIG
    LaDéHiS, CRH, EHESS
    Authors
    Chazalon, Joseph; Carlinet, Edwin; Chen, Yizi; Perret, Julien; Duménieu, Bertrand; Mallet, Clément; Géraud, Thierry
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    ICDAR 2021 Competition on Historical Map Segmentation — Dataset

    This is the dataset of the ICDAR 2021 Competition on Historical Map Segmentation (“MapSeg”). This competition ran from November 2020 to April 2021. Evaluation tools are freely available but distributed separately.

    Official competition website: https://icdar21-mapseg.github.io/

    The competition report can be cited as:

    Joseph Chazalon, Edwin Carlinet, Yizi Chen, Julien Perret, Bertrand Duménieu, Clément Mallet, Thierry Géraud, Vincent Nguyen, Nam Nguyen, Josef Baloun, Ladislav Lenc, and Pavel Král, "ICDAR 2021 Competition on Historical Map Segmentation", in Proceedings of the 16th International Conference on Document Analysis and Recognition (ICDAR'21), September 5-10, 2021, Lausanne, Switzerland.

    BibTeX entry:

    @InProceedings{chazalon.21.icdar.mapseg,
      author    = {Joseph Chazalon and Edwin Carlinet and Yizi Chen and Julien Perret and Bertrand Duménieu and Clément Mallet and Thierry Géraud and Vincent Nguyen and Nam Nguyen and Josef Baloun and Ladislav Lenc and Pavel Král},
      title     = {ICDAR 2021 Competition on Historical Map Segmentation},
      booktitle = {Proceedings of the 16th International Conference on Document Analysis and Recognition (ICDAR'21)},
      year      = {2021},
      address   = {Lausanne, Switzerland},
    }

    We thank the City of Paris for granting us permission to use and reproduce the atlases used in this work.

    The images of this dataset are extracted from a series of 9 atlases of the City of Paris produced between 1894 and 1937 by the Map Service (“Service du plan”) of the City of Paris, France, for the purpose of urban management and planning. For each year, a set of approximately 20 sheets forms a tiled view of the city, drawn at 1/5000 scale using trigonometric triangulation.

    Sample citation of original documents:

    Atlas municipal des vingt arrondissements de Paris. 1894, 1895, 1898, 1905, 1909, 1912, 1925, 1929, and 1937. Bibliothèque de l’Hôtel de Ville. City of Paris. France.

    Motivation

    This competition aims at encouraging research in the digitization of historical maps. In order to be usable in historical studies, the information contained in such images needs to be extracted. The general pipeline involves multiple stages; we list some essential ones here:

    segment map content: locate the area of the image which contains map content;

    extract map objects from different layers: detect objects like roads, buildings, building blocks, rivers, etc. to create geometric data;

    georeference the map: by detecting objects at known geographic coordinates, compute the transformation to turn geometric objects into geographic ones (which can be overlaid on current maps).

    Task overview

    Task 1: Detection of building blocks

    Task 2: Segmentation of map content within map sheets

    Task 3: Localization of graticule lines intersections

    Please refer to the enclosed README.md file or to the official website for the description of tasks and file formats.

    Evaluation metrics and tools

    Evaluation metrics are described in the competition report and tools are available at https://github.com/icdar21-mapseg/icdar21-mapseg-eval and should also be archived using Zenodo.

  14. Site 25 Missouri River Bathymetry and Velocimetry Data at Structures...

    • catalog.data.gov
    • data.usgs.gov
    • +1more
    Updated Nov 26, 2025
    + more versions
    Cite
    U.S. Geological Survey (2025). Site 25 Missouri River Bathymetry and Velocimetry Data at Structures A3292/L0561 on Interstate 70 near St. Louis, Missouri, October 2010 through May 2016 [Dataset]. https://catalog.data.gov/dataset/site-25-missouri-river-bathymetry-and-velocimetry-data-at-structures-a3292-l0561-on-inters
    Explore at:
    Dataset updated
    Nov 26, 2025
    Dataset provided by
    United States Geological Survey (http://www.usgs.gov/)
    Area covered
    Interstate 70, Missouri River, St. Louis, Missouri
    Description

    These data are high-resolution bathymetry (riverbed elevation) and depth-averaged velocities in ASCII format, generated from hydrographic and velocimetric surveys of the Missouri River near structures A3292/L0561 on Interstate 70 near St. Louis, Missouri, in 2010, 2011 and 2016. Hydrographic data were collected using a high-resolution multibeam echosounder mapping system (MBMS), which consists of a multibeam echosounder (MBES) and an inertial navigation system (INS) mounted on a marine survey vessel. Data were collected as the vessel traversed the river along planned survey lines distributed throughout the reach. Data collection software integrated and stored the depth data from the MBES and the horizontal and vertical position and attitude data of the vessel from the INS in real time. Data processing required computer software to extract bathymetry data from the raw data files and to summarize and map the information. Velocity data were collected using an acoustic Doppler current profiler (ADCP) mounted on a survey vessel equipped with a differential global positioning system (DGPS). Data were collected as the vessel traversed the river along planned transect lines distributed throughout the reach. Velocity data were processed using the Velocity Mapping Toolbox (Parsons and others, 2013), and smoothed using neighboring nodes.

  15. Site 35 Mississippi River Bathymetry and Velocimetry Data at Structures...

    • catalog.data.gov
    • data.usgs.gov
    • +1more
    Updated Nov 25, 2025
    + more versions
    Cite
    U.S. Geological Survey (2025). Site 35 Mississippi River Bathymetry and Velocimetry Data at Structures A4936/A1850 on Interstate 255 near St. Louis, Missouri, October 2008 through May 2016 [Dataset]. https://catalog.data.gov/dataset/site-35-mississippi-river-bathymetry-and-velocimetry-data-at-structures-a4936-a1850-on-int
    Explore at:
    Dataset updated
    Nov 25, 2025
    Dataset provided by
    United States Geological Survey (http://www.usgs.gov/)
    Area covered
    St. Louis, Interstate 255, Mississippi River, Missouri
    Description

    These data are high-resolution bathymetry (riverbed elevation) and depth-averaged velocities in ASCII format, generated from hydrographic and velocimetric surveys of the Mississippi River near structures A4936/A1850 on Interstate 255 near St. Louis, Missouri, in 2008, 2009, 2010 and 2016. Hydrographic data were collected using a high-resolution multibeam echosounder mapping system (MBMS), which consists of a multibeam echosounder (MBES) and an inertial navigation system (INS) mounted on a marine survey vessel. Data were collected as the vessel traversed the river along planned survey lines distributed throughout the reach. Data collection software integrated and stored the depth data from the MBES and the horizontal and vertical position and attitude data of the vessel from the INS in real time. Data processing required computer software to extract bathymetry data from the raw data files and to summarize and map the information. Velocity data were collected using an acoustic Doppler current profiler (ADCP) mounted on a survey vessel equipped with a differential global positioning system (DGPS). Data were collected as the vessel traversed the river along planned transect lines distributed throughout the reach. Velocity data were processed using the Velocity Mapping Toolbox (Parsons and others, 2013), and smoothed using neighboring nodes.

  16. Site 32 Mississippi River Bathymetry Data at Structure K0932 on U.S. Highway...

    • catalog.data.gov
    • data.usgs.gov
    • +1more
    Updated Nov 26, 2025
    + more versions
    Cite
    U.S. Geological Survey (2025). Site 32 Mississippi River Bathymetry Data at Structure K0932 on U.S. Highway 54 at Louisiana, Missouri, June 2014 [Dataset]. https://catalog.data.gov/dataset/site-32-mississippi-river-bathymetry-data-at-structure-k0932-on-u-s-highway-54-at-louisian
    Explore at:
    Dataset updated
    Nov 26, 2025
    Dataset provided by
    United States Geological Survey (http://www.usgs.gov/)
    Area covered
    Louisiana, U.S. 54, Mississippi River, Missouri
    Description

    These data are high-resolution bathymetry (riverbed elevation) in ASCII XYZ format, generated from the June 6, 2014, hydrographic and velocimetric survey of the Mississippi River near structure K0932 on U.S. Highway 54 at Louisiana, Missouri. Hydrographic data were collected using a high-resolution multibeam echosounder mapping system (MBMS), which consists of a multibeam echosounder (MBES) and an inertial navigation system (INS) mounted on a marine survey vessel. Data were collected as the vessel traversed the river along planned survey lines distributed throughout the reach. Data collection software integrated and stored the depth data from the MBES and the horizontal and vertical position and attitude data of the vessel from the INS in real time. Data processing required computer software to extract bathymetry data from the raw data files and to summarize and map the information.

  17. U.S. Great Lakes Collaborative Benthic Habitat Mapping Project Map: GLRI...

    • noaa.hub.arcgis.com
    Updated Feb 21, 2025
    + more versions
    Cite
    NOAA GeoPlatform (2025). U.S. Great Lakes Collaborative Benthic Habitat Mapping Project Map: GLRI Characterization [Dataset]. https://noaa.hub.arcgis.com/maps/7d8c21f1e7164175bf1189943be761b5
    Explore at:
    Dataset updated
    Feb 21, 2025
    Dataset provided by
    National Oceanic and Atmospheric Administration (http://www.noaa.gov/)
    Authors
    NOAA GeoPlatform
    License

    MIT License: https://opensource.org/licenses/MIT
    License information was derived automatically

    Area covered
    Description

    THIS MAP IS NOT AUTHORITATIVE. SEE TERMS OF USE BELOW.

    This web map was developed by the National Oceanic and Atmospheric Administration's (NOAA) Office for Coastal Management and is featured in the U.S. Great Lakes Collaborative Benthic Habitat Mapping Common Operating Dashboard in support of the Collaborative Benthic Habitat Mapping in the Nearshore Waters of the Great Lakes Basin Project. This multi-year, multi-agency project is funded through the Great Lakes Restoration Initiative (GLRI) and focuses on new bathymetric data (airborne lidar and vessel-based sonar) acquisition, validation, and benthic habitat characterization mapping of the nearshore waters (0-80 meters) in the U.S. Great Lakes. This project also contributes to the regional Lakebed 2030 campaign, which aims to have high-density bathymetric data available for the entirety of the Great Lakes by 2030.

    This web map contains data layers reflecting the current status of bathy data coverage in the nearshore (0-80 meters) of the U.S. Great Lakes, including acquisition (lidar and multibeam sonar), ground-truthing/validation, and benthic habitat mapping and characterization. Acquisition layers include coverage areas that have been acquired and are available for public use (green), as well as those that have been acquired but are not yet available or are still in progress (orange). The nearshore water depth layers (0-25 and 25-80 meters) were created using the National Centers for Environmental Information (NCEI) Great Lakes Bathymetry (3-second resolution) grid extracts. The 0 to 25 meter nearshore water depth layer represents areas where bathymetric lidar data acquisition could ideally be conducted, depending on water condition and turbidity. The 25 to 80 meter layer shows locations where acoustic data acquisition can occur. See below for information on additional data layers.

    All data originally projected in the following coordinate system: EPSG:3175, NAD 1983 Great Lakes and St Lawrence Albers. This map will continue to be updated as new information is made available.

    Source Data for Bathy Coverage Layers - Acquired/Available:
    - Topobathy and Bathy Lidar (NOAA's Data Access Viewer: https://coast.noaa.gov/dataviewer/#/; U.S. Interagency Elevation Inventory (USIEI): https://coast.noaa.gov/inventory/)
    - Multibeam Sonar (National Centers for Environmental Information (NCEI) Bathymetric Data Viewer: https://www.ncei.noaa.gov/maps/bathymetry/; NOAA's Data Access Viewer: https://coast.noaa.gov/dataviewer/#/; U.S. Interagency Elevation Inventory (USIEI): https://coast.noaa.gov/inventory/; USGS ScienceBase Catalog: https://www.sciencebase.gov/catalog/item/656e229bd34e7ca10833f950)

    Source Data for Bathy Coverage Layers - GLRI AOIs (2020-2024):
    - Acquisition: NOAA Office for Coastal Management
    - Validation/CMECS Characterizations: NOAA National Centers for Coastal Ocean Science (NCCOS)

    Source Data for Bathy Coverage Layers - In Progress and Planned:
    - NOAA Office of Coast Survey Plans: https://gis.charttools.noaa.gov/arcgis/rest/services/Hydrographic_Services/Planned_Survey_Areas/MapServer/0
    - NOAA Office for Coastal Management

    Source Data for Nearshore Water Depths:
    - NOAA's National Centers for Environmental Information (NCEI) Great Lakes Bathymetry (3-second resolution) grid extracts: https://www.ncei.noaa.gov/maps/grid-extract/

    Source Data for Spatial Prioritization Layers:
    - Great Lakes Spatial Priorities Study Results Jun 2021: https://gis.charttools.noaa.gov/arcgis/rest/services/IOCM/GreatLakes_SPS_Results_Jun_2021/MapServer
    - Mapping priorities within the proposed Wisconsin Lake Michigan National Marine Sanctuary (2018): https://gis.ngdc.noaa.gov/arcgis/rest/services/nccos/BiogeographicAssessments_WILMPrioritizationResults/MapServer
    - Thunder Bay National Marine Sanctuary Spatial Prioritization Results (2020): https://gis.ngdc.noaa.gov/arcgis/rest/services/nccos/BiogeographicAssessments_TBNMSPrioritizationResults/MapServer

    Source Data for Supplemental Data Layers:
    - International Boundary Commission U.S./Canada Boundary (version 1.3 from 2018): https://www.internationalboundarycommission.org/en/maps-coordinates/coordinates.php
    - National Oceanic and Atmospheric Administration (NOAA) HydroHealth 2018 Survey: https://wrecks.nauticalcharts.noaa.gov/arcgis/rest/services/Hydrographic_Services/HydroHealth_2018/ImageServer
    - National Oceanic and Atmospheric Administration (NOAA) Marine Protected Areas (MPA) Inventory 2023-2024: https://www.fisheries.noaa.gov/inport/item/69506
    - National Oceanic and Atmospheric Administration (NOAA) National Marine Sanctuary Program Boundaries (2021): https://services2.arcgis.com/C8EMgrsFcRFL6LrL/arcgis/rest/services/ONMS_2021_Boundaries/FeatureServer
    - National Oceanic and Atmospheric Administration (NOAA) U.S. Bathymetry Gap Analysis: https://noaa.maps.arcgis.com/home/item.html?id=4d7d925fc96d47d9ace970dd5040df0a
    - U.S. Environmental Protection Agency (EPA) Areas of Concern: https://services.arcgis.com/cJ9YHowT8TU7DUyn/arcgis/rest/services/epa_areas_of_concern_glahf_viewlayer/FeatureServer
    - U.S. Geological Survey (USGS) Great Lakes Subbasins: https://www.sciencebase.gov/catalog/item/530f8a0ee4b0e7e46bd300dd

    Latest update: February 20, 2025

  18. U.S. Wind Turbine Database, Minnesota and National

    • gisdata.mn.gov
    ags_mapserver, csv +6
    Updated Jun 24, 2025
    Cite
    Geospatial Information Office (2025). U.S. Wind Turbine Database, Minnesota and National [Dataset]. https://gisdata.mn.gov/dataset/util-uswtdb
    Explore at:
    webapp, jpeg, html, csv, gpkg, ags_mapserver, fgdb, shp. Available download formats
    Dataset updated
    Jun 24, 2025
    Dataset provided by
    Geospatial Information Office
    Area covered
    Minnesota
    Description

    This dataset provides locations and technical specifications of wind turbines in the United States, almost all of which are utility-scale. Utility-scale turbines are ones that generate power and feed it into the grid, supplying a utility with energy. They are usually much larger than turbines that would feed a homeowner or business.

    The data formats downloadable from the Minnesota Geospatial Commons contain just the Minnesota turbines. Data, maps and services accessed from the USWTDB website provide nationwide turbines.

    The regularly updated database has wind turbine records that have been collected, digitized, and locationally verified. Turbine data were gathered from the Federal Aviation Administration's (FAA) Digital Obstacle File (DOF) and Obstruction Evaluation Airport Airspace Analysis (OE-AAA), the American Wind Energy Association (AWEA), Lawrence Berkeley National Laboratory (LBNL), and the United States Geological Survey (USGS), and were merged and collapsed into a single data set.

    Verification of the turbine positions was done by visual interpretation using high-resolution aerial imagery in Esri ArcGIS Desktop. A locational error of plus or minus 10 meters for turbine locations was tolerated. Technical specifications for turbines were assigned based on the wind turbine make and models as provided by manufacturers and project developers directly, and via FAA datasets, information on the wind project developer or turbine manufacturer websites, or other online sources. Some facility and turbine information on make and model did not exist or was difficult to obtain, so uncertainty may exist for certain turbine specifications. Similarly, some turbines were not yet built, were never built, or for other reasons cannot be verified visually. The quality of both the location and the turbine specification data is rated, and a confidence level is recorded for each. None of the data are field verified.

    The U.S. Wind Turbine Database website provides the national data in many different formats: shapefile, CSV, GeoJSON, web services (cached and dynamic), API, and web viewer. See: https://eerscmap.usgs.gov/uswtdb/

    The web viewer provides many options to search; filter by attribute, date and location; and customize the map display. For details and screenshots of these options, see: https://eerscmap.usgs.gov/uswtdb/help/
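    The GeoJSON format mentioned above lends itself to simple scripted filtering, for example to reproduce a state extract like the Minnesota one. The sketch below assumes a USWTDB-style FeatureCollection; the property names `t_state` and `t_cap` follow the USWTDB data dictionary, but verify them against the file you actually download.

```python
import json

def turbines_in_state(feature_collection, state_abbrev):
    """Return features whose t_state property matches state_abbrev."""
    return [
        f for f in feature_collection["features"]
        if f.get("properties", {}).get("t_state") == state_abbrev
    ]

# Tiny in-memory example standing in for a downloaded uswtdb.geojson file.
sample = {
    "type": "FeatureCollection",
    "features": [
        {"type": "Feature",
         "properties": {"t_state": "MN", "t_cap": 2500},
         "geometry": {"type": "Point", "coordinates": [-93.1, 44.9]}},
        {"type": "Feature",
         "properties": {"t_state": "IA", "t_cap": 1500},
         "geometry": {"type": "Point", "coordinates": [-93.6, 41.6]}},
    ],
}

mn = turbines_in_state(sample, "MN")
print(len(mn))  # number of Minnesota turbines in the sample
```

    The same filter works unchanged on a file loaded with `json.load(open("uswtdb.geojson"))`, since a GeoJSON FeatureCollection is plain JSON.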

    ------------
    This metadata record was adapted by the Minnesota Geospatial Information Office (MnGeo) from the national version of the metadata. It describes the Minnesota extract of the shapefile data that has been projected from geographic to UTM coordinates and converted to Esri file geodatabase (fgdb) format. There may be more recent updates available on the national website. Accessing the data via the national web services or API will always provide the most recent data.

  19. Site 34 Mississippi River Bathymetry and Velocimetry Data at Structure A1500...

    • data.usgs.gov
    • catalog.data.gov
    • +1more
    Updated May 25, 2016
    + more versions
    Cite
    Richard Huizinga (2016). Site 34 Mississippi River Bathymetry and Velocimetry Data at Structure A1500 on Interstate 55 in St. Louis, Missouri, October 2010 and May 2016 [Dataset]. http://doi.org/10.5066/F71C1VCC
    Explore at:
    Dataset updated
    May 25, 2016
    Dataset provided by
    United States Geological Survey (http://www.usgs.gov/)
    Authors
    Richard Huizinga
    License

    U.S. Government Works (https://www.usa.gov/government-works)
    License information was derived automatically

    Time period covered
    Oct 20, 2010 - May 25, 2016
    Area covered
    Interstate 55, St. Louis, Mississippi River, Missouri
    Description

    These data are high-resolution bathymetry (riverbed elevation) and depth-averaged velocities in ASCII format, generated from hydrographic and velocimetric surveys of the Mississippi River near structure A1500 on Interstate 55 in St. Louis, Missouri, in 2010 and 2016. Hydrographic data were collected using a high-resolution multibeam echosounder mapping system (MBMS), which consists of a multibeam echosounder (MBES) and an inertial navigation system (INS) mounted on a marine survey vessel. Data were collected as the vessel traversed the river along planned survey lines distributed throughout the reach. Data collection software integrated and stored the depth data from the MBES and the horizontal and vertical position and attitude data of the vessel from the INS in real time. Data processing required computer software to extract bathymetry data from the raw data files and to summarize and map the information. Velocity data were collected using an acoustic Doppler current profil ...
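    ASCII point data of this kind is typically a few whitespace-delimited columns per line. A minimal sketch of parsing such a file into (x, y, z) tuples; the actual USGS files may carry header lines and additional columns, so the parsing should be adapted to the file you download.

```python
def parse_xyz(lines):
    """Parse 'x y z' text lines into (x, y, z) float tuples, skipping comments."""
    points = []
    for line in lines:
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blank lines and comment headers
        x, y, z = map(float, line.split()[:3])
        points.append((x, y, z))
    return points

# Illustrative sample standing in for a downloaded ASCII bathymetry file.
sample = """\
# easting northing elevation_m
744100.5 4278200.1 96.3
744101.0 4278201.6 95.8
"""
pts = parse_xyz(sample.splitlines())
min_elev = min(z for _, _, z in pts)
print(len(pts), min_elev)  # 2 95.8
```

    From here the points can be gridded or plotted with whatever geospatial tooling you prefer.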

  20. RTEM Hackaton API and Data Science Tutorials

    • kaggle.com
    zip
    Updated Apr 14, 2022
    Cite
    Pony Biam (2022). RTEM Hackaton API and Data Science Tutorials [Dataset]. https://www.kaggle.com/datasets/ponybiam/onboard-api-intro
    Explore at:
    zip (14011904 bytes). Available download formats
    Dataset updated
    Apr 14, 2022
    Authors
    Pony Biam
    License

    Attribution-NonCommercial-ShareAlike 4.0 (CC BY-NC-SA 4.0): https://creativecommons.org/licenses/by-nc-sa/4.0/
    License information was derived automatically

    Description

    RTEM Hackathon Tutorials

    This data set and its associated notebooks are meant to give you a head start in the RTEM Hackathon by showing examples of data extraction, processing, cleaning, and visualisation. The data available on this Kaggle page are only a selected part of the full data set used in the tutorials. A series of video tutorials is associated with this dataset and notebooks and can be found on the Onboard YouTube channel.

    Part 1 - Onboard API and Onboard API Wrapper Introduction

    An introduction to the API usage and how to retrieve data from it. This notebook is outlined in several YouTube videos that discuss:

    - how to get started with your account and get oriented to the Kaggle environment,
    - how to get acquainted with the Onboard API,
    - how to start using the Onboard API wrapper to extract and explore data.

    Part 2 - Meta-data and Point Exploration Demo

    How to query data-point metadata, process it, and visually explore it. This notebook is outlined in several YouTube videos that discuss:

    - how to get started exploring building metadata/points,
    - how to select/merge point lists and export them as CSV,
    - how to visualize and explore the point lists.
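    Merging point lists and exporting them as CSV is a standard pandas workflow. A minimal sketch, with illustrative column names rather than the actual Onboard API fields:

```python
import io
import pandas as pd

# Two point lists sharing an id column, standing in for API query results.
points = pd.DataFrame({
    "point_id": [101, 102, 103],
    "equipment": ["AHU-1", "AHU-1", "VAV-2"],
})
types = pd.DataFrame({
    "point_id": [101, 102],
    "point_type": ["Supply Air Temp", "Fan Status"],
})

# Left merge keeps every point, leaving NaN where no type is known.
merged = points.merge(types, on="point_id", how="left")

buf = io.StringIO()
merged.to_csv(buf, index=False)  # in practice: merged.to_csv("points.csv")
print(merged.shape)  # (3, 3)
```

    A left merge is the safer default here: it preserves points with missing metadata so gaps are visible in the exported CSV instead of silently dropped.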

    Part 3 - Time-series Data Extraction and Exploration Demo

    How to query time-series data from points, process it, and visually explore it. This notebook is outlined in several YouTube videos that discuss:

    - how to load and filter time-series data from sensors,
    - how to resample and transform time-series data,
    - how to create heat maps and box plots of the data for exploration.
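    The resample-then-pivot step behind such heat maps can be sketched in a few lines of pandas. Column names here are illustrative, not the actual RTEM point names:

```python
import pandas as pd

# Synthetic 30-minute sensor readings standing in for extracted time series.
idx = pd.date_range("2022-04-01", periods=48, freq="30min")
raw = pd.DataFrame({"zone_temp": range(48)}, index=idx)

hourly = raw.resample("1h").mean()  # 30-min samples -> hourly means
hourly["hour"] = hourly.index.hour
hourly["day"] = hourly.index.date

# Hour-by-day table of the kind you would feed to a heat map.
heatmap_table = hourly.pivot(index="hour", columns="day", values="zone_temp")
print(heatmap_table.shape)  # (24, 1)
```

    Passing `heatmap_table` to a plotting library (e.g. seaborn's `heatmap`) then gives the hour-vs-day view used for spotting schedule patterns.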

    Part 4 - Example of starting point for analysis for RTEM and possible directions of analysis

    A quick example of a starting point for analysing the data toward a solution, plus a reference to a paper that gives an overview of the directions your team could take. This notebook is outlined in several YouTube videos that discuss:

    - an overview of use cases and judging criteria,
    - an example of a real-world hypothesis,
    - further development of that simple example.

    More information about the data and competition can be found on the RTEM Hackathon website.
