4 datasets found
  1. Data from: 3DHD CityScenes: High-Definition Maps in High-Density Point Clouds

    • zenodo.org
    • data.niaid.nih.gov
    • +1 more
    bin, pdf
    Updated Jul 16, 2024
    Cite
    Christopher Plachetka; Benjamin Sertolli; Jenny Fricke; Marvin Klingner; Tim Fingscheidt (2024). 3DHD CityScenes: High-Definition Maps in High-Density Point Clouds [Dataset]. http://doi.org/10.5281/zenodo.7085090
    Explore at:
    Available download formats: bin, pdf
    Dataset updated
    Jul 16, 2024
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Christopher Plachetka; Benjamin Sertolli; Jenny Fricke; Marvin Klingner; Tim Fingscheidt
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Overview

    3DHD CityScenes is the most comprehensive, large-scale high-definition (HD) map dataset to date, annotated in the three spatial dimensions of globally referenced, high-density LiDAR point clouds collected in urban domains. Our HD map covers 127 km of road sections in the inner city of Hamburg, Germany, including 467 km of individual lanes. In total, our map comprises 266,762 individual items.

    Our corresponding paper (published at ITSC 2022) is available here.
    Further, we have applied 3DHD CityScenes to map deviation detection here.

    Moreover, we release code to facilitate the application of our dataset and the reproducibility of our research. Specifically, our 3DHD_DevKit comprises:

    • Python tools to read, generate, and visualize the dataset,
    • 3DHDNet deep learning pipeline (training, inference, evaluation) for
      map deviation detection and 3D object detection.

    The DevKit is available here:

    https://github.com/volkswagen/3DHD_devkit.

    The dataset and DevKit have been created by Christopher Plachetka as project lead during his PhD period at Volkswagen Group, Germany.

    When using our dataset, you are welcome to cite:

    @INPROCEEDINGS{9921866,
      author={Plachetka, Christopher and Sertolli, Benjamin and Fricke, Jenny and Klingner, Marvin and 
      Fingscheidt, Tim},
      booktitle={2022 IEEE 25th International Conference on Intelligent Transportation Systems (ITSC)}, 
      title={3DHD CityScenes: High-Definition Maps in High-Density Point Clouds}, 
      year={2022},
      pages={627-634}}

    Acknowledgements

    We thank the following interns for their exceptional contributions to our work.

    • Benjamin Sertolli: Major contributions to our DevKit during his master thesis
    • Niels Maier: Measurement campaign for data collection and data preparation

    The European large-scale project Hi-Drive (www.Hi-Drive.eu) supports the publication of 3DHD CityScenes and encourages the general publication of information and databases facilitating the development of automated driving technologies.

    The Dataset

    After downloading, the 3DHD_CityScenes folder provides five subdirectories, which are explained briefly in the following.

    1. Dataset

    This directory contains the training, validation, and test set definitions (train.json, val.json, test.json) used in our publications. These files contain samples that define a geolocation and the orientation of the ego vehicle in global coordinates on the map.

    During dataset generation (done by our DevKit), samples are used to take crops from the larger point cloud. Also, map elements within reach of a sample are collected. Both modalities can then be used, e.g., as input to a neural network such as our 3DHDNet.
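
    The cropping step can be sketched with NumPy boolean masking on toy coordinates. Note that the DevKit's actual crop shape, size, and API may differ; the 10 m radius and the variable names below are purely illustrative.

    ```python
    import numpy as np

    # Toy point cloud (N x 3, UTM-like coordinates) and a sample pose.
    points = np.array([[0.0, 0.0, 0.0],
                       [5.0, 1.0, 0.2],
                       [40.0, 3.0, 0.1],
                       [2.0, -4.0, 0.5]])
    sample_xy = np.array([0.0, 0.0])   # ego geolocation (hypothetical)
    radius = 10.0                      # crop radius in metres (assumed)

    # Keep points whose horizontal distance to the sample is within the radius.
    mask = np.linalg.norm(points[:, :2] - sample_xy, axis=1) <= radius
    crop = points[mask]
    print(len(crop))  # 3 of the 4 toy points fall inside the 10 m crop
    ```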

    To read any JSON-encoded data provided by 3DHD CityScenes in Python, you can use the following code snippet as an example.

    import json
    
    json_path = r"E:\3DHD_CityScenes\Dataset\train.json"
    with open(json_path) as jf:
      data = json.load(jf)
    print(data)

    2. HD_Map

    Map items are stored as lists of items in JSON format. In particular, we provide:

    • traffic signs,
    • traffic lights,
    • pole-like objects,
    • construction site locations,
    • construction site obstacles (point-like such as cones, and line-like such as fences),
    • line-shaped markings (solid, dashed, etc.),
    • polygon-shaped markings (arrows, stop lines, symbols, etc.),
    • lanes (ordinary and temporary),
    • relations between elements (only for construction sites, e.g., sign to lane association).
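
    Since each of these layers is a JSON list of items, filtering by attribute is straightforward. The snippet below sketches this on an illustrative in-memory list; the attribute names (`type`, `position_utm`) are hypothetical and may differ from the actual 3DHD CityScenes schema.

    ```python
    import json

    # Illustrative items mimicking one HD_Map layer; the real attribute
    # names and values in 3DHD CityScenes may differ (hypothetical keys).
    items_json = '''[
      {"type": "traffic_sign",  "position_utm": [565400.1, 5934750.2, 12.3]},
      {"type": "traffic_light", "position_utm": [565410.7, 5934760.9, 15.0]},
      {"type": "traffic_sign",  "position_utm": [565420.3, 5934771.5, 12.1]}
    ]'''

    items = json.loads(items_json)
    signs = [it for it in items if it["type"] == "traffic_sign"]
    print(len(signs))  # number of traffic signs in this toy layer
    ```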

    3. HD_Map_MetaData

    Our high-density point cloud used as the basis for annotating the HD map is split into 648 tiles. This directory contains the geolocation of each tile as a polygon on the map. You can view the respective tile definitions using QGIS. Alternatively, we also provide the polygons as lists of UTM coordinates in JSON.

    Files with the extensions .dbf, .prj, .qpj, .shp, and .shx belong to the tile definition as a “shapefile” (commonly used in geodesy) that can be viewed using QGIS. The JSON file contains the same information in a different format, used by our Python API.

    4. HD_PointCloud_Tiles

    The high-density point cloud tiles are provided in global UTM32N coordinates and are encoded in a proprietary binary format. The first 4 bytes (integer) encode the number of points contained in that file. Subsequently, all point cloud values are provided as arrays. First all x-values, then all y-values, and so on. Specifically, the arrays are encoded as follows.

    • x-coordinates: 4 byte integer
    • y-coordinates: 4 byte integer
    • z-coordinates: 4 byte integer
    • intensity of reflected beams: 2 byte unsigned integer
    • ground classification flag: 1 byte unsigned integer

    After reading, the respective values have to be unnormalized. As an example, you can use the following code snippet to read the point cloud data. For visualization, you can use the pptk package, for instance.

    import numpy as np
    import pptk

    file_path = r"E:\3DHD_CityScenes\HD_PointCloud_Tiles\HH_001.bin"
    pc_dict = {}
    key_list = ['x', 'y', 'z', 'intensity', 'is_ground']
    type_list = [np.int32, np.int32, np.int32, np.uint16, np.uint8]
    with open(file_path, 'rb') as f:
        # First 4 bytes: number of points; then one array per field,
        # in the order and with the dtypes documented above.
        num_points = int(np.fromfile(f, dtype=np.int32, count=1)[0])
        for key, dtype in zip(key_list, type_list):
            pc_dict[key] = np.fromfile(f, dtype=dtype, count=num_points)
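
    The documented layout can also be sanity-checked by round-tripping a tiny synthetic tile through it in memory (little-endian byte order is assumed here):

    ```python
    import io
    import numpy as np

    # Write a tiny synthetic tile in the documented layout:
    # point count (int32), then x, y, z (int32 arrays),
    # intensity (uint16 array), ground flag (uint8 array).
    # Little-endian byte order is assumed.
    buf = io.BytesIO()
    n = 3
    buf.write(np.int32(n).tobytes())
    for arr in (np.array([1, 2, 3], np.int32),      # x
                np.array([4, 5, 6], np.int32),      # y
                np.array([7, 8, 9], np.int32),      # z
                np.array([10, 11, 12], np.uint16),  # intensity
                np.array([0, 1, 0], np.uint8)):     # is_ground
        buf.write(arr.tobytes())

    # Read it back field by field.
    buf.seek(0)
    num = int(np.frombuffer(buf.read(4), np.int32)[0])
    x = np.frombuffer(buf.read(4 * num), np.int32)
    print(num, x.tolist())  # 3 [1, 2, 3]
    ```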

    5. Trajectories

    We provide 15 real-world trajectories recorded during a measurement campaign covering the whole HD map. Trajectory samples are provided at approximately 30 Hz and are encoded in JSON.

    These trajectories were used to provide the samples in train.json, val.json, and test.json with realistic geolocations and orientations of the ego vehicle.

    • OP1 – OP5 cover the majority of the map with 5 trajectories.
    • RH1 – RH10 cover the majority of the map with 10 trajectories.

    Note that OP5 is split into three separate parts, a-c. RH9 is split into two parts, a-b. Moreover, OP4 mostly equals OP1 (thus, we speak of 14 trajectories in our paper). For completeness, however, we provide all recorded trajectories here.

  2. Habitat Sampling Initiative - Datasets - data.wa.gov.au

    • catalogue.data.wa.gov.au
    Cite
    Habitat Sampling Initiative - Datasets - data.wa.gov.au [Dataset]. https://catalogue.data.wa.gov.au/dataset/habitat-sampling-initiative
    Explore at:
    Area covered
    Western Australia
    Description

    An overview of benthic habitat surveys in Western Australia, combining surveys from multiple State Government agencies, research institutions, and universities. Disclaimer: The map is in development and does not show real or comprehensive survey data until this message disappears.

    Contributing data

    Attendees of the Managing Coastal Vulnerability workshop can:

    • Register your account and contact us. We will give you write permission by making you admin or editor of your organisation, and member of the Habitat Sampling Initiative Group.
    • Add metadata for your data by creating a dataset, attach a GeoJSON (QGIS video tutorial, save as CRS EPSG 4326/WGS84) or KML file of your surveyed transects (including survey date or period in site attributes if possible) as resources, and add the dataset to the group Habitat Sampling Initiative.
    • Add the link to your access-restricted data on Pawsey as another resource to the dataset.
    • Add any other public data resource here if and when appropriate.
    • Use the CKAN API to upload metadata from your existing catalogues following these examples.

    Discovering data

    The following resources give an interactive overview of all Habitat Sampling Initiative datasets:

  3. Habitat Sampling Initiative

    • data.gov.au
    html
    Updated May 25, 2022
    Cite
    Department of Biodiversity, Conservation and Attractions (2022). Habitat Sampling Initiative [Dataset]. https://data.gov.au/dataset/ds-wa-f299f8c6-5913-4628-9dfb-514a46725204
    Explore at:
    Available download formats: html
    Dataset updated
    May 25, 2022
    Dataset provided by
    Department of Biodiversity, Conservation and Attractions
    Description

    An overview of benthic habitat surveys in Western Australia, combining surveys from multiple State Government agencies, research institutions, and universities. Disclaimer: The map is in development and does not show real or comprehensive survey data until this message disappears.

    Contributing data

    Attendees of the Managing Coastal Vulnerability workshop can:

    • Register your account and contact us. We will give you write permission by making you admin or editor of your organisation, and member of the Habitat Sampling Initiative Group.
    • Add metadata for your data by creating a dataset, attach a GeoJSON (QGIS video tutorial, save as CRS EPSG 4326/WGS84) or KML file of your surveyed transects (including survey date or period in site attributes if possible) as resources, and add the dataset to the group Habitat Sampling Initiative.
    • Add the link to your access-restricted data on Pawsey as another resource to the dataset.
    • Add any other public data resource here if and when appropriate.
    • Use the CKAN API to upload metadata from your existing catalogues following these examples.

    Discovering data

    The following resources give an interactive overview of all Habitat Sampling Initiative datasets:

  4. 500 Cities: City Boundaries

    • catalog.data.gov
    • healthdata.gov
    • +5 more
    Updated Feb 3, 2025
    Cite
    Centers for Disease Control and Prevention (2025). 500 Cities: City Boundaries [Dataset]. https://catalog.data.gov/dataset/500-cities-city-boundaries
    Explore at:
    Dataset updated
    Feb 3, 2025
    Dataset provided by
    Centers for Disease Control and Prevention
    Description

    This city boundary shapefile was extracted from Esri Data and Maps for ArcGIS 2014 - U.S. Populated Place Areas. This shapefile can be joined to 500 Cities city-level Data (GIS Friendly Format) in a geographic information system (GIS) to make city-level maps.

