3 datasets found
  1. Data from: 3DHD CityScenes: High-Definition Maps in High-Density Point Clouds

    • data.europa.eu
    • data.niaid.nih.gov
    • +1 more
    unknown
    Updated Sep 15, 2022
    Cite
    Zenodo (2022). 3DHD CityScenes: High-Definition Maps in High-Density Point Clouds [Dataset]. https://data.europa.eu/data/datasets/oai-zenodo-org-7085090?locale=de
    Explore at:
    Available download formats: unknown (147082)
    Dataset updated
    Sep 15, 2022
    Dataset authored and provided by
    Zenodo (http://zenodo.org/)
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Overview

    3DHD CityScenes is the most comprehensive large-scale high-definition (HD) map dataset to date, annotated in the three spatial dimensions of globally referenced, high-density LiDAR point clouds collected in urban domains. Our HD map covers 127 km of road sections in the inner city of Hamburg, Germany, including 467 km of individual lanes. In total, our map comprises 266,762 individual items.

    Our corresponding paper (published at ITSC 2022) is available here. Further, we have applied 3DHD CityScenes to map deviation detection here. Moreover, we release code to facilitate the application of our dataset and the reproducibility of our research. Specifically, our 3DHD_DevKit comprises:
    - Python tools to read, generate, and visualize the dataset,
    - the 3DHDNet deep learning pipeline (training, inference, evaluation) for map deviation detection and 3D object detection.

    The DevKit is available here: https://github.com/volkswagen/3DHD_devkit.

    The dataset and DevKit have been created by Christopher Plachetka as project lead during his PhD period at Volkswagen Group, Germany. When using our dataset, you are welcome to cite:

    @INPROCEEDINGS{9921866,
      author={Plachetka, Christopher and Sertolli, Benjamin and Fricke, Jenny and Klingner, Marvin and Fingscheidt, Tim},
      booktitle={2022 IEEE 25th International Conference on Intelligent Transportation Systems (ITSC)},
      title={3DHD CityScenes: High-Definition Maps in High-Density Point Clouds},
      year={2022},
      pages={627-634}}

    Acknowledgements

    We thank the following interns for their exceptional contributions to our work.
    - Benjamin Sertolli: major contributions to our DevKit during his master's thesis
    - Niels Maier: measurement campaign for data collection and data preparation

    The European large-scale project Hi-Drive (www.Hi-Drive.eu) supports the publication of 3DHD CityScenes and encourages the general publication of information and databases facilitating the development of automated driving technologies.
    The Dataset

    After downloading, the 3DHD_CityScenes folder provides five subdirectories, which are explained briefly in the following.

    1. Dataset

    This directory contains the training, validation, and test set definitions (train.json, val.json, test.json) used in our publications. The respective files contain samples that define a geolocation and the orientation of the ego vehicle in global coordinates on the map. During dataset generation (done by our DevKit), samples are used to take crops from the larger point cloud. Also, map elements within reach of a sample are collected. Both modalities can then be used, e.g., as input to a neural network such as our 3DHDNet. To read any JSON-encoded data provided by 3DHD CityScenes in Python, you can use the following code snippet as an example.

    import json

    json_path = r"E:\3DHD_CityScenes\Dataset\train.json"
    with open(json_path) as jf:
        data = json.load(jf)
    print(data)

    2. HD_Map

    Map items are stored as lists of items in JSON format. In particular, we provide: traffic signs, traffic lights, pole-like objects, construction site locations, construction site obstacles (point-like such as cones, and line-like such as fences), line-shaped markings (solid, dashed, etc.), polygon-shaped markings (arrows, stop lines, symbols, etc.), lanes (ordinary and temporary), and relations between elements (only for construction sites, e.g., sign-to-lane associations).

    3. HD_Map_MetaData

    Our high-density point cloud used as the basis for annotating the HD map is split into 648 tiles. This directory contains the geolocation of each tile as a polygon on the map. You can view the respective tile definitions using QGIS. Alternatively, we also provide the respective polygons as lists of UTM coordinates in JSON. Files with the endings .dbf, .prj, .qpj, .shp, and .shx belong to the tile definition as a "shape file" (commonly used in geodesy) that can be viewed using QGIS. The JSON file contains the same information in a different format used by our Python API.

    4. HD_PointCloud_Tiles

    The high-density point cloud tiles are provided in global UTM32N coordinates and are encoded in a proprietary binary format. The first 4 bytes (integer) encode the number of points contained in the file. Subsequently, all point cloud values are provided as arrays: first all x-values, then all y-values, and so on. Specifically, the arrays are encoded as follows:
    - x-coordinates: 4-byte integer
    - y-coordinates: 4-byte integer
    - z-coordinates: 4-byte integer
    - intensity of reflected beams: 2-byte unsigned integer
    - ground classification flag: 1-byte unsigned integer

    After reading, the respective values have to be unnormalized. As an example, you can use the following code snippet to read the point cloud data. For visualization, you can use the pptk package, for instance.

    import numpy as np
    import pptk

    file_path = r"E:\3DHD_CityScenes\HD_PointCloud_Tiles\HH_001.bin"
    pc_dict = {}
    key_list = ['x', 'y', 'z', 'intensity', 'is_ground']
    type_list = ['<i4', '<i4', '<i4', '<u2', 'u1']
    with open(file_path, "rb") as fid:  # binary mode is required
        num_points = np.fromfile(fid, dtype='<i4', count=1)[0]
        for key, dtype in zip(key_list, type_list):
            pc_dict[key] = np.fromfile(fid, dtype=dtype, count=num_points)
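The binary layout described above (a 4-byte point count followed by one contiguous array per field) can be exercised end to end with a small round-trip sketch. The file name and sample values below are made up for illustration; only the header and array ordering follow the format description.

```python
import struct
import numpy as np

def write_tile(path, x, y, z, intensity, is_ground):
    """Write a point cloud tile in the described layout:
    a 4-byte little-endian point count, then one array per field."""
    with open(path, "wb") as fid:
        fid.write(struct.pack('<i', len(x)))           # number of points
        np.asarray(x, dtype='<i4').tofile(fid)         # all x-values first
        np.asarray(y, dtype='<i4').tofile(fid)         # then all y-values
        np.asarray(z, dtype='<i4').tofile(fid)         # then all z-values
        np.asarray(intensity, dtype='<u2').tofile(fid) # 2-byte unsigned
        np.asarray(is_ground, dtype='u1').tofile(fid)  # 1-byte flag

def read_tile(path):
    """Read a tile back into a dict of numpy arrays."""
    keys = ['x', 'y', 'z', 'intensity', 'is_ground']
    dtypes = ['<i4', '<i4', '<i4', '<u2', 'u1']
    pc = {}
    with open(path, "rb") as fid:
        n = np.fromfile(fid, dtype='<i4', count=1)[0]
        for key, dt in zip(keys, dtypes):
            pc[key] = np.fromfile(fid, dtype=dt, count=n)
    return pc

# Round trip with three hypothetical points
write_tile("tile_demo.bin", [1, 2, 3], [4, 5, 6], [7, 8, 9], [10, 11, 12], [0, 1, 0])
pc = read_tile("tile_demo.bin")
print(pc['z'])  # prints [7 8 9]
```

Writing each field as one contiguous block (rather than interleaving points) is what makes the single-pass fromfile reads above possible.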

  2. Average local taxes by assets — Departmental Map 54 Meurthe and Moselle 2015...

    • data.europa.eu
    excel xls, jpeg, pdf +1
    Cite
    DELETED DELETED, Average local taxes by assets — Departmental Map 54 Meurthe and Moselle 2015 [Dataset]. https://data.europa.eu/data/datasets/56ef07c6c751df0c9ad6e93b
    Explore at:
    Available download formats: zip (79478), pdf (3588797), excel xls (2660864), jpeg (1251950)
    Dataset authored and provided by
    DELETED DELETED
    License

    Licence Ouverte / Open Licence 1.0: https://www.etalab.gouv.fr/wp-content/uploads/2014/05/Open_Licence.pdf
    License information was derived automatically

    Description

    Here is an image of the overall municipal tax (foncier bâti + habitation, i.e. property tax on buildings plus housing tax): average tax per asset, Nancy 2014.

    To do it yourself you will need:
    - the QGIS software (free: https://www.qgis.org/fr/site/forusers/download.html),
    - a .qgs file of your department (http://www.actualitix.com/shapefiles-des-departements-de-france.html),
    - an export of tax rates (https://www.data.gouv.fr/fr/datasets/impots-locaux/ > Municipal and intercommunal data > your department > Local Direct Tax Data 2014 (XLS format)),
    - the most up-to-date INSEE population data (here 2012: http://www.insee.fr/fr/themes/detail.asp?reg_id=99&ref_id=base-cc-emploi-pop-active-2012).

    Procedure:
    - Process your data in your favorite spreadsheet (Excel or OpenOffice Calc), combining the tax and INSEE data to pull out the numbers that seem revealing to you
    - Install QGIS
    - Open the .qgs file of your department

    Add columns:
    - Right-click > Properties on the main layer
    - Go to the Fields menu (on the left)
    - Add (via the pencil icon) the desired columns (here: average housing tax per asset, average property tax per asset, and the sum of both)
    - These are reals of precision 2 and length 6
    - Save

    Insert data:
    - Right-click on the layer > "Open attribute table"
    - Select all, copy, and paste into Excel (or OpenOffice Calc)
    - Add the ad hoc formulas in Excel (SOMME.SI.ENS, i.e. SUMIFS, to recover the rate)
    - Save the desired tab as CSV (DOS) with the new values
    - In QGIS > Menu > Layer > Add a delimited text layer
    - Import the CSV
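If you prefer scripting to a spreadsheet, the SOMME.SI.ENS (SUMIFS) step can be sketched in Python. All figures, commune names, and column names below are made up for illustration; only the look-up-then-divide logic and the CSV output for QGIS follow the procedure above.

```python
import csv

# Hypothetical inputs: tax collected per commune (from the tax export, in EUR)
# and working-population counts (from the INSEE data).
tax_by_commune = {"Nancy": 61_200_000, "Toul": 45_000_000}
actives_by_commune = {"Nancy": 48_000, "Toul": 7_500}

# SUMIFS-equivalent: match each commune's figures and derive the average.
rows = []
for commune, tax in tax_by_commune.items():
    actives = actives_by_commune.get(commune)
    if actives:  # skip communes INSEE has no figure for
        rows.append({"commune": commune,
                     "avg_tax_per_active": round(tax / actives, 2)})

# Write the CSV that QGIS then imports as a delimited text layer.
with open("avg_tax_54.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["commune", "avg_tax_per_active"])
    writer.writeheader()
    writer.writerows(rows)
```

Joining on the commune name, as here, assumes the tax export and the INSEE file spell commune names identically; in practice the INSEE commune code is a safer join key.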

    Present the data:
    - To simplify, I advise you to make one layer per rate, plus layers for the sums; that way you can produce the image for the desired rate in three clicks
    - For each layer (or rate): right-click > Properties on the CSV layer
    - Use Labels to add the city name and the desired rate
    - Use Style to color as a function of a CSV field

    Print the data to PDF:
    - To print, you need to define a print template
    - In the menu, choose New Print Composer
    - Choose the format (a department at A0 is quite readable)
    - Add your legend, scale bar, and other elements
    - Print, and there you go...

    NB: this method creates some aberrations:
    - where INSEE has no figure, or figures that have shifted a lot since the survey
    - it assumes that only assets (working people) pay taxes (which is broadly fair, but not 100 % accurate)

  3. Local taxes - Departmental map 54 Meurthe et Moselle 2015

    • data.europa.eu
    pdf, zip
    Updated Jul 10, 2025
    Cite
    DELETED DELETED (2025). Local taxes - Departmental map 54 Meurthe et Moselle 2015 [Dataset]. https://data.europa.eu/data/datasets/56ed013488ee380d03e1a625?locale=en
    Explore at:
    Available download formats: pdf (2556923), pdf (2546950), zip (393104)
    Dataset updated
    Jul 10, 2025
    Dataset authored and provided by
    DELETED DELETED
    License

    Licence Ouverte / Open Licence 1.0: https://www.etalab.gouv.fr/wp-content/uploads/2014/05/Open_Licence.pdf
    License information was derived automatically

    Description

    Here is an image of the overall municipal tax rate (foncier bâti + habitation, for municipalities and inter-municipalities): http://physaphae.noip.me/Img/2015_Rate_54 ("Local tax rate 54 of 2015").

    Since the map is at the departmental level, there is no point including the departmental and national rates: they are the same everywhere and would not contribute to the comparison.

    To do it yourself you will need:
    - the QGIS software (free: https://www.qgis.org/en/site/forusers/download.html),
    - a .qgs file of your department (http://www.actualitix.com/shapefiles-des-departements-de-france.html),
    - an export of tax rates (https://www.data.gouv.fr/fr/datasets/impots-locaux/).

    Procedure:
    - Install QGIS
    - Open your department's .qgs file

    Add columns:
    - Right-click > Properties on the main layer
    - Go to the Fields menu (on the left)
    - Add (via the pencil icon) the desired columns (here: municipal tax rate, and intercommunal rates for built land and housing)
    - These are reals of precision 2 and length 4
    - Save

    Insert data:
    - Right-click on the layer > "Open attribute table"
    - Select all, copy, and paste into Excel (or OpenOffice Calc)
    - Add the ad hoc formulas in Excel (SOMME.SI.ENS, i.e. SUMIFS, to recover the rate)
    - Save the desired tab as CSV (DOS) with the new values
    - In QGIS > Menu > Layer > Add a delimited text layer
    - Import the CSV

    Present the data:
    - To simplify, I advise you to make one layer per rate; that way you can produce the image for the desired rate in three clicks
    - For each layer (or rate): right-click > Properties on the CSV layer
    - Use Labels to add the city name and the desired rate
    - Use Style to color as a function of a CSV field

    Print the data to PDF:
    - To print, you need to define a print template
    - In the menu, choose New Print Composer
    - Choose the format (a department at A0 is quite readable)
    - Add your legend, scale bar, and other elements
    - Print, et voilà...
