100+ datasets found
  1. Mobile Bay National Estuary Program (MBNEP) Habitat Mapping in Mobile Bay...

    • catalog.data.gov
    • datasets.ai
    Updated Jul 1, 2025
    Cite
    (Point of Contact) (2025). Mobile Bay National Estuary Program (MBNEP) Habitat Mapping in Mobile Bay from 2016-01-17 to 2016-02-10 (NCEI Accession 0183634) [Dataset]. https://catalog.data.gov/dataset/mobile-bay-national-estuary-program-mbnep-habitat-mapping-in-mobile-bay-from-2016-01-17-to-2016
    Explore at:
    Dataset updated
    Jul 1, 2025
    Dataset provided by
    (Point of Contact)
    Area covered
    Mobile Bay
    Description

    This dataset includes geospatial files providing an updated habitat classification map covering wetland and upland coastal habitats throughout Mobile and Baldwin counties in Alabama (approximately 3,671 square miles).

  2. Parking lot locations and utilization samples in the Hannover Linden-Nord...

    • data.uni-hannover.de
    geojson, png
    Updated Apr 17, 2024
    + more versions
    Cite
    Institut für Kartographie und Geoinformatik (2024). Parking lot locations and utilization samples in the Hannover Linden-Nord area from LiDAR mobile mapping surveys [Dataset]. https://data.uni-hannover.de/dataset/parking-locations-and-utilization-from-lidar-mobile-mapping-surveys
    Explore at:
    Available download formats: png (445868), geojson (233948), geojson (4361255), png (10065), png (1370680), geojson (1348252), png (1288581)
    Dataset updated
    Apr 17, 2024
    Dataset authored and provided by
    Institut für Kartographie und Geoinformatik
    License

    Attribution-NonCommercial 3.0 (CC BY-NC 3.0): https://creativecommons.org/licenses/by-nc/3.0/
    License information was derived automatically

    Area covered
    Linden - Nord, Hanover
    Description

    Work in progress: data may change.

    The data set contains the locations of public roadside parking spaces in the northeastern part of Hanover Linden-Nord. As a sample data set, it explicitly does not provide a complete, accurate, or correct representation of actual conditions. It was collected and processed as part of the 5GAPS research project on September 22nd and October 6th, 2022, as a basis for further analysis and in particular as input for simulation studies.

    Vehicle Detections

    Based on the mapping methodology of Bock et al. (2015) and the processing of Leichter et al. (2021), utilization was determined from vehicle detections in segmented 3D point clouds. The point clouds were collected by driving through the area on two half-days with a LiDAR mobile mapping system, resulting in several hours between observations; accordingly, these are only a few sample observations. The trips were scheduled so that, combined, they cover a synthetic day from about 8:00 to 20:00.

    The collected point clouds were georeferenced, processed, and automatically segmented semantically (see Leichter et al., 2021). To extract cars automatically, points with car labels were clustered by observation epoch, and bounding boxes were estimated for the clusters as a representation of car instances. The boxes serve both to filter out unrealistically small or large objects and to rudimentarily complete vehicle footprints that may not have been fully captured from all sides.
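The cluster-to-box filtering step described above can be sketched as follows. This is an illustrative reimplementation, not the project's actual code; the area thresholds and axis-aligned boxes are assumptions.

```python
def bounding_box_2d(points):
    """Axis-aligned 2D bounding box (xmin, ymin, xmax, ymax) of one
    car-labeled point cluster (simplification: real boxes may be oriented)."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (min(xs), min(ys), max(xs), max(ys))

def plausible_car_boxes(clusters, min_area_m2=3.0, max_area_m2=15.0):
    """Keep only clusters whose footprint area is plausible for a car.
    The thresholds are illustrative assumptions, not dataset values."""
    boxes = []
    for pts in clusters:
        xmin, ymin, xmax, ymax = bounding_box_2d(pts)
        area = (xmax - xmin) * (ymax - ymin)
        if min_area_m2 <= area <= max_area_m2:
            boxes.append((xmin, ymin, xmax, ymax))
    return boxes
```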

    Figure 1: Overview map of detected vehicles (https://data.uni-hannover.de/dataset/0945cd36-6797-44ac-a6bd-b7311f0f96bc/resource/807618b6-5c38-4456-88a1-cb47500081ff/download/detection_map.png)

    Parking Areas

    The public parking areas were digitized manually using aerial images and the detected vehicles, in order to exclude irregular parking spaces as far as possible. They were also tagged as to whether they are aligned parallel to the road, and assigned a use at the time of recording, since some are used for construction sites or outdoor catering, for example. Depending on the intended use, they can be filtered individually.

    Figure 2: Visualization of example parking areas on top of an aerial image [by LGLN] (https://data.uni-hannover.de/dataset/0945cd36-6797-44ac-a6bd-b7311f0f96bc/resource/16b14c61-d1d6-4eda-891d-176bdd787bf5/download/parking_area_example.png)

    Parking Occupancy

    For modelling the parking occupancy, single slots are sampled as center points every 5 m along the parking areas. In this way they can be integrated into a street/routing graph, for example as prepared in Wage et al. (2023); custom representations can be generated from the parking areas and vehicle detections. These parking points were intersected with the vehicle boxes to identify occupancy at the respective epochs.
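The intersection of slot center points with vehicle boxes might look like the sketch below. It is illustrative only: axis-aligned boxes are a simplifying assumption, since the dataset's estimated boxes may be oriented.

```python
def slot_occupancy(slot_points, vehicle_boxes):
    """For each slot center point (sampled every 5 m), report whether any
    detected vehicle's bounding box covers it at a given epoch.
    Boxes are assumed axis-aligned tuples (xmin, ymin, xmax, ymax)."""
    def covered(x, y, box):
        xmin, ymin, xmax, ymax = box
        return xmin <= x <= xmax and ymin <= y <= ymax
    return [any(covered(x, y, b) for b in vehicle_boxes)
            for (x, y) in slot_points]
```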

    Figure 3: Overview map of average parking slot load (https://data.uni-hannover.de/dataset/0945cd36-6797-44ac-a6bd-b7311f0f96bc/resource/ca0b97c8-2542-479e-83d7-74adb2fc47c0/download/datenpub-bays.png)

    However, unoccupied spaces cannot be determined quite as trivially the other way around, since the absence of a detected vehicle can equally result from the absence of a measurement/observation. Therefore, a parking space is only recorded as unoccupied if a vehicle was detected at the same time in its neighborhood on the same parking lane, so that a measurement can be assumed to exist.

    To close temporal gaps, values were interpolated by hour for each parking slot, assuming that a space observed as occupied at two consecutive observations was also occupied in between, and likewise for unoccupied spaces. If the state changed between observations, this is indicated by a proportional value. To close spatial gaps, unobserved spaces in the area are drawn randomly from the ten closest occupation patterns around them.
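The hourly interpolation rule (constant between equal observations, a proportional value on a change) can be sketched as below. The 8:00-20:00 range follows the synthetic-day description; the function name and linear interpolation are assumptions of this sketch.

```python
def fill_hourly(obs):
    """obs: dict mapping hour -> 0/1 occupancy observation for one slot.
    Returns occupancy for hours 8..20: constant between equal consecutive
    observations, linearly interpolated (proportional) when the state changed,
    clamped to the nearest observation outside the observed span."""
    known = sorted(obs.items())
    out = {}
    for h in range(8, 21):
        if h <= known[0][0]:
            out[h] = float(known[0][1])
        elif h >= known[-1][0]:
            out[h] = float(known[-1][1])
        else:
            # find the bracketing observations and interpolate between them
            for (h0, v0), (h1, v1) in zip(known, known[1:]):
                if h0 <= h <= h1:
                    t = (h - h0) / (h1 - h0)
                    out[h] = v0 + t * (v1 - v0)
                    break
    return out
```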

    This results in an exemplary occupancy pattern for a synthetic day. Depending on the application, the value can be interpreted as an occupancy probability or an occupancy share.

    Figure 4: Example parking area occupation pattern (https://data.uni-hannover.de/dataset/0945cd36-6797-44ac-a6bd-b7311f0f96bc/resource/184a1f75-79ab-4d0e-bb1b-8ed170678280/download/occupation_example.png)

    References

    • F. Bock, D. Eggert and M. Sester (2015): On-street Parking Statistics Using LiDAR Mobile Mapping, 2015 IEEE 18th International Conference on Intelligent Transportation Systems, Gran Canaria, Spain, 2015, pp. 2812-2818. https://doi.org/10.1109/ITSC.2015.452
    • A. Leichter, U. Feuerhake, and M. Sester (2021): Determination of Parking Space and its Concurrent Usage Over Time Using Semantically Segmented Mobile Mapping Data, Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci., XLIII-B2-2021, 185–192. https://doi.org/10.5194/isprs-archives-XLIII-B2-2021-185-2021
    • O. Wage, M. Heumann, and L. Bienzeisler (2023): Modeling and Calibration of Last-Mile Logistics to Study Smart-City Dynamic Space Management Scenarios. In 1st ACM SIGSPATIAL International Workshop on Sustainable Mobility (SuMob ’23), November 13, 2023, Hamburg, Germany. ACM, New York, NY, USA, 4 pages. https://doi.org/10.1145/3615899.3627930
  3. Shoreline Mapping Program of PORT OF MOBILE, AL, AL1101

    • gimi9.com
    • datasets.ai
    • +2more
    + more versions
    Cite
    Shoreline Mapping Program of PORT OF MOBILE, AL, AL1101 [Dataset]. https://gimi9.com/dataset/data-gov_shoreline-mapping-program-of-port-of-mobile-al-al11011
    Explore at:
    License

    CC0 1.0 Universal Public Domain Dedication: https://creativecommons.org/publicdomain/zero/1.0/
    License information was derived automatically

    Area covered
    Mobile, Alabama
    Description

    These data provide an accurate high-resolution shoreline compiled from imagery of PORT OF MOBILE, AL. This vector shoreline data is based on an office interpretation of imagery that may be suitable as a geographic information system (GIS) data layer. This metadata describes information for both the line and point shapefiles. The NGS attribution scheme 'Coastal Cartographic Object Attribute Source Table (C-COAST)' was developed to conform the attribution of various sources of shoreline data into one attribution catalog. C-COAST is not a recognized standard, but was influenced by the International Hydrographic Organization's S-57 Object-Attribute standard so the data would be more accurately translated into S-57. This resource is a member of https://www.fisheries.noaa.gov/inport/item/39808

  4. Google Maps Dataset

    • brightdata.com
    .json, .csv, .xlsx
    Updated Jan 8, 2023
    Cite
    Bright Data (2023). Google Maps Dataset [Dataset]. https://brightdata.com/products/datasets/google-maps
    Explore at:
    Available download formats: .json, .csv, .xlsx
    Dataset updated
    Jan 8, 2023
    Dataset authored and provided by
    Bright Data (https://brightdata.com/)
    License

    https://brightdata.com/license

    Area covered
    Worldwide
    Description

    The Google Maps dataset is ideal for getting extensive information on businesses anywhere in the world. Easily filter by location, business type, and other factors to get the exact data you need. The Google Maps dataset includes all major data points: timestamp, name, category, address, description, open website, phone number, open_hours, open_hours_updated, reviews_count, rating, main_image, reviews, url, lat, lon, place_id, country, and more.

  5. Pharos Repo (2023-2024)

    • figshare.com
    Updated Jul 22, 2024
    Cite
    Carson Moore; Thomas Scherr (2024). Pharos Repo (2023-2024) [Dataset]. http://doi.org/10.6084/m9.figshare.26351764.v1
    Explore at:
    Available download formats: text/x-script.python
    Dataset updated
    Jul 22, 2024
    Dataset provided by
    Figshare (http://figshare.com/)
    Authors
    Carson Moore; Thomas Scherr
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Cleaned dataset for the Pharos application 2023-2024 data collection period (May 2023-March 2024). This dataset includes the full recurring network measurement (RNM) and landmark (LM) datasets, as well as the county geographies used for the study catchment area. Also included are a text document listing the necessary requirements and a Python script to clean and visualize the collected data, replicating the methods used in our published analysis.

  6. Dataset for Millimeter-wave Mobile Sensing and Environment Mapping: Models,...

    • data.niaid.nih.gov
    • zenodo.org
    Updated Jul 19, 2024
    Cite
    Valkama, Mikko (2024). Dataset for Millimeter-wave Mobile Sensing and Environment Mapping: Models, Algorithms and Validation [Dataset]. https://data.niaid.nih.gov/resources?id=zenodo_4475160
    Explore at:
    Dataset updated
    Jul 19, 2024
    Dataset provided by
    Rastorgueva-Foi, Elizaveta
    Valkama, Mikko
    Turunen, Matias
    Baquero Barneto, Carlos
    Keskin, Musa Furkan
    Riihonen, Taneli
    Talvitie, Jukka
    Wymeersch, Henk
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Dataset of paper "Millimeter-wave Mobile Sensing and Environment Mapping: Models, Algorithms and Validation".

    The measurement data contains indoor mapping results using millimeter-wave 5G NR signals at 28 GHz. The measurement campaign was conducted in an indoor office environment in Hervanta Campus of Tampere University. Six different sets of measurements contain the range profiles after the proposed radar processing. The shared data contains the IQ data of both transmit and receive signals used during the measurement campaign.

    The file "main.m" shows how to process and plot the shared data.

  7. Robot@Home2, a robotic dataset of home environments

    • data.niaid.nih.gov
    Updated Apr 4, 2024
    + more versions
    Cite
    Ambrosio-Cestero, Gregorio (2024). Robot@Home2, a robotic dataset of home environments [Dataset]. https://data.niaid.nih.gov/resources?id=zenodo_3901563
    Explore at:
    Dataset updated
    Apr 4, 2024
    Dataset provided by
    Ambrosio-Cestero, Gregorio
    Ruiz-Sarmiento, José Raul
    González-Jiménez, Javier
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    The Robot-at-Home dataset (Robot@Home) is a collection of raw and processed data from five domestic settings, compiled by a mobile robot equipped with 4 RGB-D cameras and a 2D laser scanner. Its main purpose is to serve as a testbed for semantic mapping algorithms through the categorization of objects and/or rooms.

    This dataset is unique in three aspects:

    • The provided data were captured with a rig of 4 RGB-D sensors with an overall field of view of 180° horizontal and 58° vertical, and with a 2D laser scanner.

    • It comprises diverse and numerous data: sequences of RGB-D images and laser scans from the rooms of five apartments (87,000+ observations), topological information about the connectivity of these rooms, and 3D reconstructions and 2D geometric maps of the visited rooms.

    • The provided ground truth is dense, including per-point annotations of the categories of the objects and rooms appearing in the reconstructed scenarios, and per-pixel annotations of each RGB-D image within the recorded sequences.

    During data collection, a total of 36 rooms were completely inspected, so the dataset is rich in contextual information about objects and rooms. This is a valuable feature, missing in most state-of-the-art datasets, which can be exploited by, for instance, semantic mapping systems that leverage relationships like 'pillows are usually on beds' or 'ovens are not in bathrooms'.

    Robot@Home2

    Robot@Home2 is an enhanced version aimed at improving usability and functionality for developing and testing mobile robotics and computer vision algorithms. It consists of three main components. First, a relational database that states the contextual information and data links, compatible with the Structured Query Language (SQL). Second, a Python package for managing the database, including downloading, querying, and interfacing functions. Finally, learning resources in the form of Jupyter notebooks, runnable locally or on the Google Colab platform, enabling users to explore the dataset without local installation. These freely available tools are expected to ease exploitation of the Robot@Home dataset and accelerate research in computer vision and robotics.

    If you use Robot@Home2, please cite the following paper:

    Gregorio Ambrosio-Cestero, Jose-Raul Ruiz-Sarmiento, Javier Gonzalez-Jimenez, The Robot@Home2 dataset: A new release with improved usability tools, in SoftwareX, Volume 23, 2023, 101490, ISSN 2352-7110, https://doi.org/10.1016/j.softx.2023.101490.

    @article{ambrosio2023robotathome2,
      title = {The Robot@Home2 dataset: A new release with improved usability tools},
      author = {Gregorio Ambrosio-Cestero and Jose-Raul Ruiz-Sarmiento and Javier Gonzalez-Jimenez},
      journal = {SoftwareX},
      volume = {23},
      pages = {101490},
      year = {2023},
      issn = {2352-7110},
      doi = {https://doi.org/10.1016/j.softx.2023.101490},
      url = {https://www.sciencedirect.com/science/article/pii/S2352711023001863},
      keywords = {Dataset, Mobile robotics, Relational database, Python, Jupyter, Google Colab}
    }

    Version history
    • v1.0.1 Fixed minor bugs.
    • v1.0.2 Fixed some inconsistencies in directory names; these fixes were necessary to automate the generation of the next version.
    • v2.0.0 SQL-based dataset. Robot@Home v1.0.2 has been packed into a SQLite database along with RGB-D and scene files, which have been assembled into a hierarchically structured directory free of redundancies. Path tables are also provided to reference files in both the v1.0.2 and v2.0.0 directory hierarchies. This version was generated automatically from version 1.0.2 through the toolbox.
    • v2.0.1 A forgotten foreign key pair has been added.
    • v2.0.2 The views have been consolidated as tables, which considerably improves access time.
    • v2.0.3 The previous version did not include the database; in this version the database has been uploaded.
    • v2.1.0 Depth images have been updated to 16-bit. Additionally, both the RGB images and the depth images are oriented in the original camera format, i.e. landscape.

  8. Indoor3Dmapping dataset

    • zenodo.org
    • data.niaid.nih.gov
    zip
    Updated Mar 20, 2022
    Cite
    Armando Arturo Sánchez Alcázar; Giovanni Pintore; Matteo Sgrenzaroli (2022). Indoor3Dmapping dataset [Dataset]. http://doi.org/10.5281/zenodo.6367381
    Explore at:
    Available download formats: zip
    Dataset updated
    Mar 20, 2022
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Armando Arturo Sánchez Alcázar; Giovanni Pintore; Matteo Sgrenzaroli
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Data Organization
    Under the root directory for the whole acquisition there is a positions.csv file and 3 subdirectories: img, dense, and sparse. The mobile mapping 3D dataset was generated by walking around an indoor space; each record corresponds to a unique pose along the trajectory of this motion. This version of the dataset contains a total of 99 unique poses, with a separation of 1 meter between adjacent poses.

    root
    ├── img
    │ ├── 

    positions.csv

    • File format: one ASCII file.
    • File structure, rows: each image is one record.
    • File structure, columns: comma-separated with headers, in the exact order described below.
      • Filename, column 0: Panorama file name as on disk, without file extension.
      • Timestamps, column 1: Absolute time at which the panorama was captured, Decimal notation, without thousands separator (microseconds).
      • X,Y,Z, columns 2 through 4: Position of the panoramic camera in decimal notation, without thousands separator (meters).
      • w,x,y,z, columns 5 through 8: Rotation of the camera, quaternion.
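A minimal reader for positions.csv, following the column layout above. The helper names are hypothetical, and a single header row is assumed (the description mentions comma-separated headers).

```python
import csv
from typing import List, NamedTuple

class Pose(NamedTuple):
    filename: str        # panorama file name as on disk, without extension
    timestamp_us: float  # absolute capture time in microseconds
    xyz: tuple           # camera position (X, Y, Z) in meters
    quat_wxyz: tuple     # camera rotation as a quaternion (w, x, y, z)

def read_positions(path: str) -> List[Pose]:
    """Parse positions.csv: one record per image, columns in the order
    Filename, Timestamp, X, Y, Z, w, x, y, z."""
    poses = []
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader, None)  # skip the header row
        for row in reader:
            if not row:
                continue  # tolerate blank lines
            poses.append(Pose(
                filename=row[0],
                timestamp_us=float(row[1]),
                xyz=tuple(float(v) for v in row[2:5]),
                quat_wxyz=tuple(float(v) for v in row[5:9]),
            ))
    return poses
```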

    sparse

    • Set of equirectangular rendered depth images.
    • 1920x960 resolution
    • 16-bit grayscale PNG
    • White → 0 m
    • Black → ≥ 16 m or absent geometry
    • Occlusions: If a pixel was hit by several rays, only the value of the closest one is represented.

    dense

    • Set of equirectangular rendered depth images.
    • 1920x960 resolution
    • 16-bit grayscale PNG
    • White → 0 m
    • Black → ≥ 16 m or absent geometry
    • Occlusions: If a pixel was hit by several rays, only the value of the closest one is represented.
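Both the sparse and dense depth PNGs use the same encoding (white → 0 m, black → ≥ 16 m or absent geometry). Assuming a linear mapping over the 16-bit range, which is an assumption of this sketch rather than a documented fact, pixel values can be converted to meters like this:

```python
def depth_from_png_value(v: int, max_depth_m: float = 16.0) -> float:
    """Convert a 16-bit grayscale pixel value to meters, assuming a linear
    mapping: white (65535) -> 0 m, black (0) -> max_depth_m (>= 16 m or
    absent geometry, which cannot be distinguished here)."""
    return max_depth_m * (1.0 - v / 65535.0)
```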

    img
    A set of equirectangular panoramic images was taken with a 360° color camera at 1920x960 resolution. They follow the same trajectory as the depth renderings.

  9. Google Map Data, Google Map Data Scraper, Business location Data- Scrape All...

    • datarade.ai
    Updated May 23, 2022
    + more versions
    Cite
    APISCRAPY (2022). Google Map Data, Google Map Data Scraper, Business location Data- Scrape All Publicly Available Data From Google Map & Other Platforms [Dataset]. https://datarade.ai/data-products/google-map-data-google-map-data-scraper-business-location-d-apiscrapy
    Explore at:
    Available download formats: .bin, .json, .xml, .csv, .xls, .sql, .txt
    Dataset updated
    May 23, 2022
    Dataset authored and provided by
    APISCRAPY
    Area covered
    Switzerland, Serbia, Albania, Gibraltar, Svalbard and Jan Mayen, Japan, Bulgaria, Macedonia (the former Yugoslav Republic of), United States of America, Denmark
    Description

    APISCRAPY, your premier provider of Map Data solutions. Map Data encompasses various information related to geographic locations, including Google Map Data, Location Data, Address Data, and Business Location Data. Our advanced Google Map Data Scraper sets us apart by extracting comprehensive and accurate data from Google Maps and other platforms.

    What sets APISCRAPY's Map Data apart are its key benefits:

    1. Accuracy: Our scraping technology ensures the highest level of accuracy, providing reliable data for informed decision-making. We employ advanced algorithms to filter out irrelevant or outdated information, ensuring that you receive only the most relevant and up-to-date data.

    2. Accessibility: With our data readily available through APIs, integration into existing systems is seamless, saving time and resources. Our APIs are easy to use and well-documented, allowing for quick implementation into your workflows. Whether you're a developer building a custom application or a business analyst conducting market research, our APIs provide the flexibility and accessibility you need.

    3. Customization: We understand that every business has unique needs and requirements. That's why we offer tailored solutions to meet specific business needs. Whether you need data for a one-time project or ongoing monitoring, we can customize our services to suit your needs. Our team of experts is always available to provide support and guidance, ensuring that you get the most out of our Map Data solutions.

    Our Map Data solutions cater to various use cases:

    1. B2B Marketing: Gain insights into customer demographics and behavior for targeted advertising and personalized messaging. Identify potential customers based on their geographic location, interests, and purchasing behavior.

    2. Logistics Optimization: Utilize Location Data to optimize delivery routes and improve operational efficiency. Identify the most efficient routes based on factors such as traffic patterns, weather conditions, and delivery deadlines.

    3. Real Estate Development: Identify prime locations for new ventures using Business Location Data for market analysis. Analyze factors such as population density, income levels, and competition to identify opportunities for growth and expansion.

    4. Geospatial Analysis: Leverage Map Data for spatial analysis, urban planning, and environmental monitoring. Identify trends and patterns in geographic data to inform decision-making in areas such as land use planning, resource management, and disaster response.

    5. Retail Expansion: Determine optimal locations for new stores or franchises using Location Data and Address Data. Analyze factors such as foot traffic, proximity to competitors, and demographic characteristics to identify locations with the highest potential for success.

    6. Competitive Analysis: Analyze competitors' business locations and market presence for strategic planning. Identify areas of opportunity and potential threats to your business by analyzing competitors' geographic footprint, market share, and customer demographics.

    Experience the power of APISCRAPY's Map Data solutions today and unlock new opportunities for your business. With our accurate and accessible data, you can make informed decisions, drive growth, and stay ahead of the competition.

    [ Related tags: Map Data, Google Map Data, Google Map Data Scraper, B2B Marketing, Location Data, Map Data, Google Data, Location Data, Address Data, Business location data, map scraping data, Google map data extraction, Transport and Logistic Data, Mobile Location Data, Mobility Data, and IP Address Data, business listings APIs, map data, map datasets, map APIs, poi dataset, GPS, Location Intelligence, Retail Site Selection, Sentiment Analysis, Marketing Data Enrichment, Point of Interest (POI) Mapping]

  10. Detroit Street View Terrestrial LiDAR (2020-2022)

    • detroitdata.org
    • data.detroitmi.gov
    • +1more
    Updated Apr 18, 2023
    Cite
    City of Detroit (2023). Detroit Street View Terrestrial LiDAR (2020-2022) [Dataset]. https://detroitdata.org/dataset/detroit-street-view-terrestrial-lidar-2020-2022
    Explore at:
    Available download formats: arcgis geoservices rest api, zip, csv, gdb, gpkg, txt, html, geojson, kml, xlsx
    Dataset updated
    Apr 18, 2023
    Dataset provided by
    City of Detroit
    Area covered
    Detroit
    Description

    Detroit Street View (DSV) is an urban remote sensing program run by the Enterprise Geographic Information Systems (EGIS) Team within the Department of Innovation and Technology at the City of Detroit. The mission of Detroit Street View is ‘To continuously observe and document Detroit’s changing physical environment through remote sensing, resulting in freely available foundational data that empowers effective city operations, informed decision making, awareness, and innovation.’ LiDAR (as well as panoramic imagery) is collected using a vehicle-mounted mobile mapping system.

    Due to variations in processing, index lines are not currently available for all existing LiDAR datasets, including all data collected before September 2020. Index lines represent the approximate path of the vehicle within the time extent of the given LiDAR file. The actual geographic extent of the LiDAR point cloud varies dependent on line-of-sight.

    Compressed (LAZ format) point cloud files may be requested by emailing gis@detroitmi.gov with a description of the desired geographic area, any specific dates/file names, and an explanation of interest and/or intended use. Requests will be filled at the discretion and availability of the Enterprise GIS Team. Deliverable file size limitations may apply and requestors may be asked to provide their own online location or physical media for transfer.

    LiDAR was collected using an uncalibrated Trimble MX2 mobile mapping system. The data is not quality controlled, and no accuracy assessment is provided or implied. Results are known to vary significantly. Users should exercise caution and conduct their own comprehensive suitability assessments before requesting and applying this data.

    Sample Dataset: https://detroitmi.maps.arcgis.com/home/item.html?id=69853441d944442f9e79199b57f26fe3


  11. Shoreline Data Rescue Project of Mobile Bay, Alabama, AL26C01

    • catalog.data.gov
    • fisheries.noaa.gov
    Updated Oct 31, 2024
    + more versions
    Cite
    NGS Communications and Outreach Branch (Point of Contact, Custodian) (2024). Shoreline Data Rescue Project of Mobile Bay, Alabama, AL26C01 [Dataset]. https://catalog.data.gov/dataset/shoreline-data-rescue-project-of-mobile-bay-alabama-al26c011
    Explore at:
    Dataset updated
    Oct 31, 2024
    Dataset provided by
    NGS Communications and Outreach Branch (Point of Contact, Custodian)
    Area covered
    Mobile Bay, Alabama
    Description

    These data were automated to provide an accurate high-resolution historical shoreline of Mobile Bay, Alabama suitable as a geographic information system (GIS) data layer. These data are derived from shoreline maps that were produced by the NOAA National Ocean Service including its predecessor agencies which were based on an office interpretation of imagery and/or field survey. The NGS attribution scheme 'Coastal Cartographic Object Attribute Source Table (C-COAST)' was developed to conform the attribution of various sources of shoreline data into one attribution catalog. C-COAST is not a recognized standard, but was influenced by the International Hydrographic Organization's S-57 Object-Attribute standard so the data would be more accurately translated into S-57. This resource is a member of https://www.fisheries.noaa.gov/inport/item/39808

  12. Camera-LiDAR Datasets

    • figshare.com
    zip
    Updated Aug 14, 2024
    Cite
    Jennifer Leahy (2024). Camera-LiDAR Datasets [Dataset]. http://doi.org/10.6084/m9.figshare.26660863.v1
    Explore at:
    Available download formats: zip
    Dataset updated
    Aug 14, 2024
    Dataset provided by
    figshare
    Authors
    Jennifer Leahy
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    The datasets are original and specifically collected for research aimed at reducing registration errors between camera and LiDAR datasets. Traditional methods often struggle to align 2D-3D data from sources with different coordinate systems and resolutions. Our collection comprises six datasets from two distinct setups, designed to enhance the versatility of our approach and improve matching accuracy across both high-feature and low-feature environments.

    Survey-Grade Terrestrial Dataset:
    • Collection Details: Data was gathered across various scenes on the University of New Brunswick campus, including low-feature walls, high-feature laboratory rooms, and outdoor tree environments.
    • Equipment: LiDAR data was captured using a Trimble TX5 3D Laser Scanner, while optical images were taken with a Canon EOS 5D Mark III DSLR camera.

    Mobile Mapping System Dataset:
    • Collection Details: This dataset was collected using our custom-built Simultaneous Localization and Multi-Sensor Mapping Robot (SLAMM-BOT) in several indoor mobile scenes to validate our methods.
    • Equipment: Data was acquired using a Velodyne VLP-16 LiDAR scanner and an Arducam IMX477 Mini camera, controlled via a Raspberry Pi board.

  13. Shoreline Mapping Program of GRAND BAY TO PENSACOLA MOBILE BAY, AL, AL9701

    • catalog.data.gov
    • fisheries.noaa.gov
    Updated Oct 31, 2024
    + more versions
    Cite
    NGS Communications and Outreach Branch (Point of Contact, Custodian) (2024). Shoreline Mapping Program of GRAND BAY TO PENSACOLA MOBILE BAY, AL, AL9701 [Dataset]. https://catalog.data.gov/dataset/shoreline-mapping-program-of-grand-bay-to-pensacola-mobile-bay-al-al9701
    Explore at:
    Dataset updated
    Oct 31, 2024
    Dataset provided by
    NGS Communications and Outreach Branch (Point of Contact, Custodian)
    Area covered
    Pensacola, Mobile Bay
    Description

    These data were automated to provide an accurate high-resolution composite shoreline of GRAND BAY TO PENSACOLA MOBILE BAY, AL suitable as a geographic information system (GIS) data layer. These data are derived from shoreline maps that were produced by the NOAA National Ocean Service including its predecessor agencies. This metadata describes information for both the line and point shapefiles. The NGS attribution scheme 'Coastal Cartographic Object Attribute Source Table (C-COAST)' was developed to conform the attribution of various sources of shoreline data into one attribution catalog. C-COAST is not a recognized standard, but was influenced by the International Hydrographic Organization's S-57 Object-Attribute standard so that the data would be more accurately translated into S-57.

  14. 2014 Mobile County, AL DMC 4-Band 8 Bit Imagery

    • catalog.data.gov
    • fisheries.noaa.gov
    Updated Oct 31, 2024
    + more versions
    Cite
    NOAA Office for Coastal Management (Point of Contact, Custodian) (2024). 2014 Mobile County, AL DMC 4-Band 8 Bit Imagery [Dataset]. https://catalog.data.gov/dataset/2014-mobile-county-al-dmc-4-band-8-bit-imagery2
    Explore at:
    Dataset updated
    Oct 31, 2024
    Dataset provided by
    National Oceanic and Atmospheric Administration (http://www.noaa.gov/)
    Area covered
    Mobile County, Alabama
    Description

    This dataset was developed to support Mobile County. The project area is 1,402 square miles of Mobile County land. The scope of work involved data acquisition and processing, and the development of digital color orthophotography at 0.5-foot pixel resolution, meeting ASPRS Class II standards for 1"=100' scale mapping for the entire project area. Original contact information: Contact Name: Scott Kearney; Contact Org: City of Mobile; Title: GIS Manager; Phone: (251) 208-7942; Email: kearney@cityofmobile.org

  15. Performance mobile tiles

    • redivis.com
    Updated Jun 21, 2022
    Cite
    Environmental Impact Data Collaborative (2022). Performance mobile tiles [Dataset]. https://redivis.com/datasets/est2-5wy4yk56m
    Explore at:
    Dataset updated
    Jun 21, 2022
    Dataset authored and provided by
    Environmental Impact Data Collaborative
    Description

    The table Performance mobile tiles is part of the dataset Speedtest by Ookla Global Fixed and Mobile Network Performance Maps, available at https://redivis.com/datasets/est2-5wy4yk56m. It contains 3,893,022 rows across 7 variables.
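    Ookla's open performance tiles are keyed by Web Mercator quadkeys, which can be decoded into tile x/y coordinates with a few lines of pure Python. The function below is a generic sketch of the Bing-Maps-style quadkey scheme, independent of this particular Redivis table:

```python
def quadkey_to_tile(quadkey: str):
    """Decode a Bing-Maps-style quadkey into (tile_x, tile_y, zoom).

    Each quadkey digit (0-3) encodes one x bit (low bit) and one y bit
    (high bit), from the most significant bit down.
    """
    x = y = 0
    zoom = len(quadkey)
    for i, digit in enumerate(quadkey):
        d = int(digit)
        if d > 3:
            raise ValueError("quadkey digits must be 0-3")
        mask = 1 << (zoom - i - 1)
        if d & 1:          # digits 1 and 3 set the x bit
            x |= mask
        if d & 2:          # digits 2 and 3 set the y bit
            y |= mask
    return x, y, zoom

print(quadkey_to_tile("0231"))  # → (3, 6, 4)
```

    Tile coordinates can then be converted to latitude/longitude bounds with the standard Web Mercator formulas if spatial joins are needed.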

  16. Data from: Developing a SLAM-based backpack mobile mapping system for indoor mapping

    • phys-techsciences.datastations.nl
    bin, exe, zip
    Updated Feb 22, 2022
    Cite
    S. Karam; S. Karam (2022). Developing a SLAM-based backpack mobile mapping system for indoor mapping [Dataset]. http://doi.org/10.17026/DANS-XME-KEPM
    Explore at:
    bin(11456605), zip(21733), exe(17469035), exe(18190303), exe(447), bin(20142672), bin(62579), exe(17513963), bin(45862), exe(17284627), bin(6856377), bin(9279586), exe(17548337), exe(199), exe(17969103), bin(235037), exe(18250973), bin(192189), bin(14741220), bin(3471971), bin(127397), bin(338998), exe(23702808)Available download formats
    Dataset updated
    Feb 22, 2022
    Dataset provided by
    DANS Data Station Physical and Technical Sciences
    Authors
    S. Karam; S. Karam
    License

    https://doi.org/10.17026/fp39-0x58

    Description

    These files support the published journal article and thesis on IMU- and LiDAR-based SLAM for indoor mapping. They include the datasets and functions used for point cloud generation. Date Submitted: 2022-02-21

  17. Newer College Dataset

    • paperswithcode.com
    + more versions
    Cite
    Newer College Dataset [Dataset]. https://paperswithcode.com/dataset/newer-college
    Explore at:
    Description

    The Newer College Dataset is a large dataset with a variety of mobile mapping sensors collected using a handheld device carried at typical walking speeds for nearly 2.2 km through New College, Oxford. The dataset includes data from two commercially available devices - a stereoscopic-inertial camera and a multi-beam 3D LiDAR, which also provides inertial measurements. Additionally, the authors used a tripod-mounted survey grade LiDAR scanner to capture a detailed millimeter-accurate 3D map of the test location (containing ∼290 million points).

    Using the map, the authors inferred centimeter-accurate 6 Degree of Freedom (DoF) ground truth for the position of the device at each LiDAR scan, enabling better evaluation of LiDAR and vision localisation, mapping and reconstruction systems. The dataset combines built environments, open spaces and vegetated areas so as to test localization and mapping systems such as vision-based navigation, visual and LiDAR SLAM, 3D LiDAR reconstruction and appearance-based place recognition.
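    Ground truth of this kind is typically used to score an estimated trajectory against the reference, for example via absolute trajectory error (ATE). A minimal numpy sketch (illustrative only, not the dataset authors' evaluation code), assuming both trajectories are already time-aligned and expressed in the same frame:

```python
import numpy as np

def absolute_trajectory_error(gt_xyz, est_xyz):
    """RMSE of translational error between two time-aligned trajectories.

    gt_xyz, est_xyz : (N, 3) arrays of positions in a common frame.
    """
    diffs = gt_xyz - est_xyz
    return float(np.sqrt(np.mean(np.sum(diffs ** 2, axis=1))))

# Toy trajectories: the estimate is offset laterally by a constant 10 cm.
gt = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [2.0, 0.0, 0.0]])
est = gt + np.array([0.0, 0.1, 0.0])
print(absolute_trajectory_error(gt, est))  # → 0.1 (approximately)
```

    Full evaluations usually also align the estimate to the ground truth with a rigid (or similarity) transform before computing the error; the sketch omits that step.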

  18. World Transportation

    • wifire-data.sdsc.edu
    csv, esri rest +4
    Updated Jun 9, 2021
    Cite
    Esri (2021). World Transportation [Dataset]. https://wifire-data.sdsc.edu/dataset/world-transportation
    Explore at:
    geojson, kml, esri rest, csv, zip, htmlAvailable download formats
    Dataset updated
    Jun 9, 2021
    Dataset provided by
    Esri (http://esri.com/)
    Area covered
    World
    Description

    This map presents transportation data, including highways, roads, railroads, and airports for the world.

    The map was developed by Esri using Esri highway data; Garmin basemap layers; and HERE street data for North America, Europe, Australia, New Zealand, South America and Central America, India, most of the Middle East and Asia, and select countries in Africa. Data for Pacific Island nations and the remaining countries of Africa was sourced from OpenStreetMap contributors. A specific country list and documentation of Esri's process for including OSM data are available to view.

    You can add this layer on top of any imagery, such as the Esri World Imagery map service, to provide a useful reference overlay that also includes street labels at the largest scales. (At the largest scales, the line symbols representing streets and roads are automatically hidden and only the labels showing their names are shown.) The Imagery With Labels basemap in the basemap dropdown of the ArcGIS web and mobile clients does not include this World Transportation map. If you use the Imagery With Labels basemap in your map and want road and street names, simply add this World Transportation layer to your map. It is designed to be drawn underneath the labels in the Imagery With Labels basemap, and that is how it will be drawn if you manually add it to your web map.

  19. i.c.sens Visual-Inertial-LiDAR Dataset

    • data.uni-hannover.de
    bag, jpeg, pdf, png +2
    Updated Dec 12, 2024
    + more versions
    Cite
    i.c.sens (2024). i.c.sens Visual-Inertial-LiDAR Dataset [Dataset]. https://data.uni-hannover.de/dataset/i-c-sens-visual-inertial-lidar-dataset
    Explore at:
    txt(285), png(650007), jpeg(153522), txt(1049), jpeg(129333), rviz(6412), bag(7419679751), bag(9980268682), bag(9982003259), bag(9960305979), pdf(21788288), jpeg(556618), bag(9971699339), bag(9896857478), bag(9939783847), bag(9969171093)Available download formats
    Dataset updated
    Dec 12, 2024
    Dataset authored and provided by
    i.c.sens
    License

    Attribution-NonCommercial 3.0 (CC BY-NC 3.0): https://creativecommons.org/licenses/by-nc/3.0/
    License information was derived automatically

    Description

    The i.c.sens Visual-Inertial-LiDAR Dataset is a data set for the evaluation of dead reckoning or SLAM approaches in the context of mobile robotics. It consists of street-level monocular RGB camera images, a front-facing 180° point cloud, angular velocities, accelerations, and an accurate ground-truth trajectory. In total, we provide around 77 GB of data resulting from a 15-minute drive, which is split into 8 rosbags of 2 minutes (10 GB) each. In addition, the intrinsic camera parameters and the extrinsic transformations between all sensor coordinate systems are given. Details on the data and its usage can be found in the provided documentation file.
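    Extrinsic transformations between sensor frames like these are typically applied by chaining homogeneous 4x4 matrices. A minimal sketch with made-up values (the dataset ships its real calibration in the documentation file; the translations below are placeholders):

```python
import numpy as np

def make_T(R, t):
    """Build a 4x4 homogeneous transform from rotation R (3x3) and translation t (3,)."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical extrinsics (NOT the dataset's actual calibration values):
T_imu_lidar = make_T(np.eye(3), np.array([0.5, 0.0, 0.2]))  # LiDAR frame -> IMU frame
T_cam_imu = make_T(np.eye(3), np.array([0.0, -0.1, 0.0]))   # IMU frame -> camera frame

# Chaining: express a point from the LiDAR frame in the camera frame.
T_cam_lidar = T_cam_imu @ T_imu_lidar
p_lidar = np.array([1.0, 0.0, 0.0, 1.0])                    # homogeneous point
print(T_cam_lidar @ p_lidar)  # → [1.5, -0.1, 0.2, 1.0]
```

    The same chaining pattern maps any sensor measurement into the common frame of the ground-truth trajectory for evaluation.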

    https://data.uni-hannover.de/dataset/0bcea595-0786-44f6-a9e2-c26a779a004b/resource/0ff90ef9-fa61-4ee3-b69e-eb6461abc57b/download/sensor_platform_small.jpg

    Image credit: Sören Vogel

    The data set was acquired in the context of the measurement campaign described in Schoen2018. Here, a vehicle, which can be seen below, was equipped with a self-developed sensor platform and a commercially available Riegl VMX-250 Mobile Mapping System. This Mobile Mapping System consists of two laser scanners, a camera system and a localization unit containing a highly accurate GNSS/IMU system.

    https://data.uni-hannover.de/dataset/0bcea595-0786-44f6-a9e2-c26a779a004b/resource/2a1226b8-8821-4c46-b411-7d63491963ed/download/vehicle_small.jpg

    Image credit: Sören Vogel

    The data acquisition took place on a sunny day in May 2019 in the Nordstadt of Hannover (coordinates: 52.388598, 9.716389). The route we took can be seen below. This route was completed three times in total, which amounts to a total driving time of 15 minutes.

    https://data.uni-hannover.de/dataset/0bcea595-0786-44f6-a9e2-c26a779a004b/resource/8a570408-c392-4bd7-9c1e-26964f552d6c/download/google_earth_overview_small.png

    The self-developed sensor platform consists of several sensors. This dataset provides data from the following sensors:

    • Velodyne HDL-64 LiDAR
    • LORD MicroStrain 3DM-GQ4-45 GNSS aided IMU
    • Pointgrey GS3-U3-23S6C-C RGB camera

    To inspect the data, first start a rosmaster and launch rviz using the provided configuration file:

    roscore & rosrun rviz rviz -d icsens_data.rviz
    

    Afterwards, start playing a rosbag with

    rosbag play icsens-visual-inertial-lidar-dataset-{number}.bag --clock
    

    Below we provide some exemplary images and their corresponding point clouds.

    https://data.uni-hannover.de/dataset/0bcea595-0786-44f6-a9e2-c26a779a004b/resource/dc1563c0-9b5f-4c84-b432-711916cb204c/download/combined_examples_small.jpg

    Related publications:

    • R. Voges, C. S. Wieghardt, and B. Wagner, “Finding Timestamp Offsets for a Multi-Sensor System Using Sensor Observations,” Photogrammetric Engineering & Remote Sensing, vol. 84, no. 6, pp. 357–366, 2018.

    • R. Voges and B. Wagner, “RGB-Laser Odometry Under Interval Uncertainty for Guaranteed Localization,” in Book of Abstracts of the 11th Summer Workshop on Interval Methods (SWIM 2018), Rostock, Germany, Jul. 2018.

    • R. Voges and B. Wagner, “Timestamp Offset Calibration for an IMU-Camera System Under Interval Uncertainty,” in IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, Oct. 2018.

    • R. Voges and B. Wagner, “Extrinsic Calibration Between a 3D Laser Scanner and a Camera Under Interval Uncertainty,” in Book of Abstracts of the 12th Summer Workshop on Interval Methods (SWIM 2019), Palaiseau, France, Jul. 2019.

    • R. Voges, B. Wagner, and V. Kreinovich, “Efficient Algorithms for Synchronizing Localization Sensors Under Interval Uncertainty,” Reliable Computing (Interval Computations), vol. 27, no. 1, pp. 1–11, 2020.

    • R. Voges, B. Wagner, and V. Kreinovich, “Odometry under Interval Uncertainty: Towards Optimal Algorithms, with Potential Application to Self-Driving Cars and Mobile Robots,” Reliable Computing (Interval Computations), vol. 27, no. 1, pp. 12–20, 2020.

    • R. Voges and B. Wagner, “Set-Membership Extrinsic Calibration of a 3D LiDAR and a Camera,” in IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Las Vegas, NV, USA, Oct. 2020, accepted.

    • R. Voges, “Bounded-Error Visual-LiDAR Odometry on Mobile Robots Under Consideration of Spatiotemporal Uncertainties,” PhD thesis, Gottfried Wilhelm Leibniz Universität, 2020.

  20. Data from: Robot@Home, a robotic dataset for semantic mapping of home environments

    • zenodo.org
    application/gzip
    Updated Sep 28, 2023
    + more versions
    Cite
    José Raul Ruiz-Sarmiento; Cipriano Galindo; Javier González-Jiménez; Gregorio Ambrosio-Cestero; José Raul Ruiz-Sarmiento; Cipriano Galindo; Javier González-Jiménez; Gregorio Ambrosio-Cestero (2023). Robot@Home, a robotic dataset for semantic mapping of home environments [Dataset]. http://doi.org/10.5281/zenodo.3901564
    Explore at:
    application/gzipAvailable download formats
    Dataset updated
    Sep 28, 2023
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    José Raul Ruiz-Sarmiento; Cipriano Galindo; Javier González-Jiménez; Gregorio Ambrosio-Cestero; José Raul Ruiz-Sarmiento; Cipriano Galindo; Javier González-Jiménez; Gregorio Ambrosio-Cestero
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    The Robot@Home dataset is a collection of raw and processed data from five domestic settings compiled by a mobile robot equipped with 4 RGB-D cameras and a 2D laser scanner. Its main purpose is to serve as a testbed for semantic mapping algorithms through the categorization of objects and/or rooms.

    This dataset is unique in three aspects:

    • The provided data were captured with a rig of 4 RGB-D sensors with an overall field of view of 180°H. and 58°V., and with a 2D laser scanner.
    • It comprises diverse and numerous data: sequences of RGB-D images and laser scans from the rooms of five apartments (87,000+ observations were collected), topological information about the connectivity of these rooms, and 3D reconstructions and 2D geometric maps of the visited rooms.
    • The provided ground truth is dense, including per-point annotations of the categories of the objects and rooms appearing in the reconstructed scenarios, and per-pixel annotations of each RGB-D image within the recorded sequences.

    During the data collection, a total of 36 rooms were completely inspected, so the dataset is rich in contextual information about objects and rooms. This is a valuable feature, missing in most state-of-the-art datasets, which can be exploited by, for instance, semantic mapping systems that leverage relationships such as "pillows are usually on beds" or "ovens are not in bathrooms".
