73 datasets found
  1. Camera-LiDAR Datasets

    • figshare.com
    zip
    Updated Aug 14, 2024
    Cite
    Jennifer Leahy (2024). Camera-LiDAR Datasets [Dataset]. http://doi.org/10.6084/m9.figshare.26660863.v1
    Explore at:
    zipAvailable download formats
    Dataset updated
    Aug 14, 2024
    Dataset provided by
    figshare
    Authors
    Jennifer Leahy
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    The datasets are original and were specifically collected for research aimed at reducing registration errors between camera and LiDAR datasets. Traditional methods often struggle to align 2D-3D data from sources with different coordinate systems and resolutions. Our collection comprises six datasets from two distinct setups, designed to enhance the versatility of our approach and improve matching accuracy across both high-feature and low-feature environments.

    Survey-Grade Terrestrial Dataset:
    • Collection Details: Data was gathered across various scenes on the University of New Brunswick campus, including low-feature walls, high-feature laboratory rooms, and outdoor tree environments.
    • Equipment: LiDAR data was captured using a Trimble TX5 3D Laser Scanner, while optical images were taken with a Canon EOS 5D Mark III DSLR camera.

    Mobile Mapping System Dataset:
    • Collection Details: This dataset was collected using our custom-built Simultaneous Localization and Multi-Sensor Mapping Robot (SLAMM-BOT) in several indoor mobile scenes to validate our methods.
    • Equipment: Data was acquired using a Velodyne VLP-16 LiDAR scanner and an Arducam IMX477 Mini camera, controlled via a Raspberry Pi board.
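    Camera-LiDAR registration of this kind ultimately reduces to projecting 3D LiDAR points into the 2D image plane under estimated intrinsics and extrinsics. Below is a minimal sketch of that projection step, not the authors' method; K, R and t stand in for hypothetical calibration values:

    import numpy as np

    def project_lidar_to_image(points_lidar, K, R, t):
        # points_lidar: (N, 3) LiDAR points; K: (3, 3) camera intrinsics;
        # R, t: rotation and translation from the LiDAR to the camera frame.
        cam = points_lidar @ R.T + t
        cam = cam[cam[:, 2] > 0]          # keep points in front of the camera
        uv = cam @ K.T
        return uv[:, :2] / uv[:, 2:3]     # perspective divide -> pixel coordinates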

  2. i.c.sens Visual-Inertial-LiDAR Dataset

    • data.uni-hannover.de
    bag, jpeg, pdf, png +2
    Updated Dec 12, 2024
    + more versions
    Cite
    i.c.sens (2024). i.c.sens Visual-Inertial-LiDAR Dataset [Dataset]. https://data.uni-hannover.de/dataset/i-c-sens-visual-inertial-lidar-dataset
    Explore at:
    txt(1049), jpeg(556618), jpeg(129333), rviz(6412), png(650007), jpeg(153522), txt(285), pdf(21788288), bag(9982003259), bag(9980268682), bag(9969171093), bag(9971699339), bag(9939783847), bag(9896857478), bag(9960305979), bag(7419679751)Available download formats
    Dataset updated
    Dec 12, 2024
    Dataset authored and provided by
    i.c.sens
    License

    Attribution-NonCommercial 3.0 (CC BY-NC 3.0): https://creativecommons.org/licenses/by-nc/3.0/
    License information was derived automatically

    Description

    The i.c.sens Visual-Inertial-LiDAR Dataset is a data set for the evaluation of dead reckoning or SLAM approaches in the context of mobile robotics. It consists of street-level monocular RGB camera images, a front-facing 180° point cloud, angular velocities, accelerations and an accurate ground truth trajectory. In total, we provide around 77 GB of data resulting from a 15-minute drive, which is split into 8 rosbags of 2 minutes (10 GB) each. In addition, the intrinsic camera parameters and the extrinsic transformations between all sensor coordinate systems are given. Details on the data and its usage can be found in the provided documentation file.

    [Image: sensor platform] https://data.uni-hannover.de/dataset/0bcea595-0786-44f6-a9e2-c26a779a004b/resource/0ff90ef9-fa61-4ee3-b69e-eb6461abc57b/download/sensor_platform_small.jpg

    Image credit: Sören Vogel

    The data set was acquired in the context of the measurement campaign described in Schoen2018. Here, a vehicle, which can be seen below, was equipped with a self-developed sensor platform and a commercially available Riegl VMX-250 Mobile Mapping System. This Mobile Mapping System consists of two laser scanners, a camera system and a localization unit containing a highly accurate GNSS/IMU system.

    [Image: measurement vehicle] https://data.uni-hannover.de/dataset/0bcea595-0786-44f6-a9e2-c26a779a004b/resource/2a1226b8-8821-4c46-b411-7d63491963ed/download/vehicle_small.jpg

    Image credit: Sören Vogel

    The data acquisition took place in May 2019 during a sunny day in the Nordstadt of Hannover (coordinates: 52.388598, 9.716389). The route we took can be seen below. This route was completed three times in total, which amounts to a total driving time of 15 minutes.

    [Image: route overview in Google Earth] https://data.uni-hannover.de/dataset/0bcea595-0786-44f6-a9e2-c26a779a004b/resource/8a570408-c392-4bd7-9c1e-26964f552d6c/download/google_earth_overview_small.png

    The self-developed sensor platform consists of several sensors. This dataset provides data from the following sensors:

    • Velodyne HDL-64 LiDAR
    • LORD MicroStrain 3DM-GQ4-45 GNSS aided IMU
    • Pointgrey GS3-U3-23S6C-C RGB camera

    To inspect the data, first start a rosmaster and launch rviz using the provided configuration file:

    roscore & rosrun rviz rviz -d icsens_data.rviz
    

    Afterwards, start playing a rosbag with

    rosbag play icsens-visual-inertial-lidar-dataset-{number}.bag --clock
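
    The bags can also be read programmatically. A minimal sketch, assuming a ROS 1 Python environment with the rosbag package installed (the file name below follows the pattern above; the topic handling is purely illustrative):

    import rosbag

    bag = rosbag.Bag("icsens-visual-inertial-lidar-dataset-1.bag")
    for topic, msg, t in bag.read_messages():
        print(topic, t.to_sec())   # e.g. filter on the IMU or image topics here
    bag.close()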
    

    Below we provide some exemplary images and their corresponding point clouds.

    [Image: exemplary camera images and corresponding point clouds] https://data.uni-hannover.de/dataset/0bcea595-0786-44f6-a9e2-c26a779a004b/resource/dc1563c0-9b5f-4c84-b432-711916cb204c/download/combined_examples_small.jpg

    Related publications:

    • R. Voges, C. S. Wieghardt, and B. Wagner, “Finding Timestamp Offsets for a Multi-Sensor System Using Sensor Observations,” Photogrammetric Engineering & Remote Sensing, vol. 84, no. 6, pp. 357–366, 2018.

    • R. Voges and B. Wagner, “RGB-Laser Odometry Under Interval Uncertainty for Guaranteed Localization,” in Book of Abstracts of the 11th Summer Workshop on Interval Methods (SWIM 2018), Rostock, Germany, Jul. 2018.

    • R. Voges and B. Wagner, “Timestamp Offset Calibration for an IMU-Camera System Under Interval Uncertainty,” in IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, Oct. 2018.

    • R. Voges and B. Wagner, “Extrinsic Calibration Between a 3D Laser Scanner and a Camera Under Interval Uncertainty,” in Book of Abstracts of the 12th Summer Workshop on Interval Methods (SWIM 2019), Palaiseau, France, Jul. 2019.

    • R. Voges, B. Wagner, and V. Kreinovich, “Efficient Algorithms for Synchronizing Localization Sensors Under Interval Uncertainty,” Reliable Computing (Interval Computations), vol. 27, no. 1, pp. 1–11, 2020.

    • R. Voges, B. Wagner, and V. Kreinovich, “Odometry under Interval Uncertainty: Towards Optimal Algorithms, with Potential Application to Self-Driving Cars and Mobile Robots,” Reliable Computing (Interval Computations), vol. 27, no. 1, pp. 12–20, 2020.

    • R. Voges and B. Wagner, “Set-Membership Extrinsic Calibration of a 3D LiDAR and a Camera,” in IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Las Vegas, NV, USA, Oct. 2020, accepted.

    • R. Voges, “Bounded-Error Visual-LiDAR Odometry on Mobile Robots Under Consideration of Spatiotemporal Uncertainties,” PhD thesis, Gottfried Wilhelm Leibniz Universität, 2020.

  3. Mobile LiDAR Data

    • figshare.com
    bin
    Updated Jan 22, 2021
    Cite
    Bin Wu (2021). Mobile LiDAR Data [Dataset]. http://doi.org/10.6084/m9.figshare.13625054.v1
    Explore at:
    binAvailable download formats
    Dataset updated
    Jan 22, 2021
    Dataset provided by
    Figshare (http://figshare.com/)
    Authors
    Bin Wu
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This is a point cloud sample dataset which was collected by a mobile LiDAR system (MLS).

  4. Detroit Street View Terrestrial LiDAR (2020-2022)

    • detroitdata.org
    • data.detroitmi.gov
    • +1 more
    Updated Apr 18, 2023
    + more versions
    Cite
    City of Detroit (2023). Detroit Street View Terrestrial LiDAR (2020-2022) [Dataset]. https://detroitdata.org/dataset/detroit-street-view-terrestrial-lidar-2020-2022
    Explore at:
    zip, gpkg, gdb, csv, kml, xlsx, arcgis geoservices rest api, html, txt, geojsonAvailable download formats
    Dataset updated
    Apr 18, 2023
    Dataset provided by
    City of Detroit
    Area covered
    Detroit
    Description

    Detroit Street View (DSV) is an urban remote sensing program run by the Enterprise Geographic Information Systems (EGIS) Team within the Department of Innovation and Technology at the City of Detroit. The mission of Detroit Street View is ‘To continuously observe and document Detroit’s changing physical environment through remote sensing, resulting in freely available foundational data that empowers effective city operations, informed decision making, awareness, and innovation.’ LiDAR (as well as panoramic imagery) is collected using a vehicle-mounted mobile mapping system.

    Due to variations in processing, index lines are not currently available for all existing LiDAR datasets, including all data collected before September 2020. Index lines represent the approximate path of the vehicle within the time extent of the given LiDAR file. The actual geographic extent of the LiDAR point cloud varies dependent on line-of-sight.

    Compressed (LAZ format) point cloud files may be requested by emailing gis@detroitmi.gov with a description of the desired geographic area, any specific dates/file names, and an explanation of interest and/or intended use. Requests will be filled at the discretion and availability of the Enterprise GIS Team. Deliverable file size limitations may apply and requestors may be asked to provide their own online location or physical media for transfer.

    LiDAR was collected using an uncalibrated Trimble MX2 mobile mapping system. The data is not quality controlled, and no accuracy assessment is provided or implied. Results are known to vary significantly. Users should exercise caution and conduct their own comprehensive suitability assessments before requesting and applying this data.

    Sample Dataset: https://detroitmi.maps.arcgis.com/home/item.html?id=69853441d944442f9e79199b57f26fe3


  5. i.c.sens Visual-Inertial-LiDAR Dataset

    • service.tib.eu
    Updated Aug 19, 2020
    + more versions
    Cite
    (2020). i.c.sens Visual-Inertial-LiDAR Dataset [Dataset]. https://service.tib.eu/ldmservice/dataset/luh-i-c-sens-visual-inertial-lidar-dataset
    Explore at:
    Dataset updated
    Aug 19, 2020
    License

    Attribution-NonCommercial 3.0 (CC BY-NC 3.0): https://creativecommons.org/licenses/by-nc/3.0/
    License information was derived automatically

    Description

    The i.c.sens Visual-Inertial-LiDAR Dataset is a data set for the evaluation of dead reckoning or SLAM approaches in the context of mobile robotics. It consists of street-level monocular RGB camera images, a front-facing 180° point cloud, angular velocities, accelerations and an accurate ground truth trajectory. In total, we provide around 77 GB of data resulting from a 15-minute drive, which is split into 8 rosbags of 2 minutes (10 GB) each. In addition, the intrinsic camera parameters and the extrinsic transformations between all sensor coordinate systems are given. Details on the data and its usage can be found in the provided documentation file. (Image credit: Sören Vogel)

    The data set was acquired in the context of the measurement campaign described in Schoen2018. Here, a vehicle was equipped with a self-developed sensor platform and a commercially available Riegl VMX-250 Mobile Mapping System, which consists of two laser scanners, a camera system and a localization unit containing a highly accurate GNSS/IMU system. (Image credit: Sören Vogel)

    The data acquisition took place in May 2019 during a sunny day in the Nordstadt of Hannover (coordinates: 52.388598, 9.716389). The route was completed three times in total, which amounts to a total driving time of 15 minutes.

    The self-developed sensor platform consists of several sensors. This dataset provides data from the following sensors:

    • Velodyne HDL-64 LiDAR
    • LORD MicroStrain 3DM-GQ4-45 GNSS-aided IMU
    • Pointgrey GS3-U3-23S6C-C RGB camera

    To inspect the data, first start a rosmaster and launch rviz using the provided configuration file:

    roscore & rosrun rviz rviz -d icsens_data.rviz

  6. Developing a SLAM-based backpack mobile mapping system for indoor mapping -...

    • b2find.eudat.eu
    • b2find.dkrz.de
    Updated Feb 21, 2022
    Cite
    (2022). Developing a SLAM-based backpack mobile mapping system for indoor mapping - Dataset - B2FIND [Dataset]. https://b2find.eudat.eu/dataset/095eb440-b958-521b-813a-a71f246781a1
    Explore at:
    Dataset updated
    Feb 21, 2022
    Description

    These files support the published journal paper and thesis on IMU- and LiDAR-based SLAM for indoor mapping. They include the datasets and functions used for point cloud generation. Date Submitted: 2022-02-21

  7. Parking lot locations and utilization samples in the Hannover Linden-Nord...

    • service.tib.eu
    Updated May 12, 2024
    + more versions
    Cite
    (2024). Parking lot locations and utilization samples in the Hannover Linden-Nord area from LiDAR mobile mapping surveys [Dataset]. https://service.tib.eu/ldmservice/dataset/luh-parking-locations-and-utilization-from-lidar-mobile-mapping-surveys
    Explore at:
    Dataset updated
    May 12, 2024
    License

    Attribution-NonCommercial 3.0 (CC BY-NC 3.0): https://creativecommons.org/licenses/by-nc/3.0/
    License information was derived automatically

    Area covered
    Hanover, Linden - Nord
    Description

    Work in progress: data might be changed. The data set contains the locations of public roadside parking spaces in the northeastern part of Hanover Linden-Nord. As a sample data set, it explicitly does not provide a complete, accurate or correct representation of the conditions! It was collected and processed as part of the 5GAPS research project on September 22nd and October 6th 2022 as a basis for further analysis, and in particular as input for simulation studies.

    Vehicle Detections: Based on the mapping methodology of Bock et al. (2015) and the processing of Leichter et al. (2021), utilization was determined from vehicle detections in segmented 3D point clouds. The corresponding point clouds were collected by driving over the area on two half-days using a LiDAR mobile mapping system, resulting in several hours between observations. Accordingly, these are only a few sample observations. The trips were made in such a way that, combined, they cover a synthetic day from about 8:00 to 20:00. The collected point clouds were georeferenced, processed, and automatically segmented semantically (see Leichter et al., 2021). To automatically extract cars, points with car labels were clustered by observation epoch and bounding boxes were estimated for the clusters as a representation of car instances (a code sketch follows at the end of this description). The boxes serve both to filter out unrealistically small and large objects, and to rudimentarily complete vehicle footprints that may not be fully captured from all sides.

    [Figure 1: Overview map of detected vehicles]

    Parking Areas
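
    As a minimal sketch of the vehicle-extraction step described above (not the project's actual pipeline; the clustering parameters and size thresholds are assumed values):

    import numpy as np
    from sklearn.cluster import DBSCAN

    def car_boxes(points_xyz):
        # points_xyz: (N, 3) points already carrying the semantic label "car"
        labels = DBSCAN(eps=0.7, min_samples=30).fit_predict(points_xyz)
        boxes = []
        for k in set(labels) - {-1}:              # -1 marks DBSCAN noise
            cluster = points_xyz[labels == k]
            lo, hi = cluster.min(axis=0), cluster.max(axis=0)
            width, depth = (hi - lo)[:2]
            if 1.0 < width < 7.0 and 1.0 < depth < 7.0:   # drop implausible sizes
                boxes.append((lo, hi))            # axis-aligned bounding box
        return boxes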

  8. Position Estimation of Mobile Mapping Imaging Sensors Using Aerial Images -...

    • b2find.eudat.eu
    Updated Oct 25, 2024
    + more versions
    Cite
    (2024). Position Estimation of Mobile Mapping Imaging Sensors Using Aerial Images - Dataset - B2FIND [Dataset]. https://b2find.eudat.eu/dataset/0779a758-aa99-507e-90dd-80765632e9d5
    Explore at:
    Dataset updated
    Oct 25, 2024
    Description

    This project aims to improve the position estimation of mobile mapping platforms. Mobile Mapping (MM) is a technique to obtain geo-information on a large scale using sensors mounted on a car or another vehicle. Under normal conditions, accurate positioning is provided by the integration of Global Navigation Satellite Systems (GNSS) and Inertial Navigation Systems (INS). However, especially in urban areas, where building structures impede a direct line-of-sight to navigation satellites or lead to multipath effects, MM-derived products, such as laser point clouds or images, lack the expected reliability and contain an unknown positioning error. This issue has been addressed by many researchers, whose aim to mitigate these effects mainly concentrates on utilising tertiary data, such as digital maps, ortho images or airborne LiDAR. These data serve as a reliable source of orientation and are used subsidiarily or as the basis for adjustment. However, these approaches show limitations regarding the achieved accuracy, the correction of error in height, the availability of tertiary data and their feasibility in difficult areas. This project addresses the aforementioned problem by employing high-resolution aerial nadir and oblique imagery as reference data. By exploiting the MM platform's approximate orientation parameters, very accurate matching techniques can be realised to extract the MM platform's positioning error. In the form of constraints, they serve as a corrective for an orientation update, which is conducted by an estimation or adjustment technique.

  9. Mobile Robot Dataset with Ouster OS1-32 LiDAR at the University of Málaga

    • zenodo.org
    sh, zip
    Updated May 8, 2025
    Cite
    Francisco Anaya Palacios; Cipriano Galindo; Cipriano Galindo; Javier González-Jiménez; Francisco Anaya Palacios; Javier González-Jiménez (2025). Mobile Robot Dataset with Ouster OS1-32 LiDAR at the University of Málaga [Dataset]. http://doi.org/10.5281/zenodo.15301791
    Explore at:
    sh, zipAvailable download formats
    Dataset updated
    May 8, 2025
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Francisco Anaya Palacios; Cipriano Galindo; Cipriano Galindo; Javier González-Jiménez; Francisco Anaya Palacios; Javier González-Jiménez
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This dataset contains data collected using an Ouster OS1-32 LiDAR sensor mounted on a Hunter 2.0 UGV (by AgileX Robotics). The robot was manually driven through the Computer Science building at the University of Málaga (Spain), covering approximately 1 km.

    The dataset includes the following files:

    • etsii_rosbag: A 2.2 GiB ROS 2 bag file containing sensory data, including RGB images, IMU, and GPS (when available). Due to size constraints, raw 3D point cloud data from the LiDAR was not included. However, it can be reconstructed using the official Ouster utilities, such as ouster-ros-extras.

    • metadata_rosbag: A lightweight ROS bag that includes the metadata necessary to decode and reconstruct the 3D point cloud from the sensor.

    • play.sh: A shell script that sequentially plays both rosbag files to facilitate data reproduction and use.
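
    For programmatic access, the ROS 2 bag can be iterated with the rosbag2_py API. A minimal sketch, assuming a ROS 2 installation whose bags use the sqlite3 storage backend (the storage id may differ, e.g. mcap, depending on how the bag was recorded):

    import rosbag2_py

    reader = rosbag2_py.SequentialReader()
    reader.open(
        rosbag2_py.StorageOptions(uri="etsii_rosbag", storage_id="sqlite3"),
        rosbag2_py.ConverterOptions(
            input_serialization_format="cdr", output_serialization_format="cdr"
        ),
    )
    while reader.has_next():
        topic, data, timestamp_ns = reader.read_next()
        print(topic, timestamp_ns)   # deserialize with rclpy.serialization as needed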

  10. Detroit Street View Panoramic Imagery

    • detroitdata.org
    • data.detroitmi.gov
    • +1 more
    Updated May 30, 2023
    Cite
    City of Detroit (2023). Detroit Street View Panoramic Imagery [Dataset]. https://detroitdata.org/dataset/detroit-street-view-panoramic-imagery
    Explore at:
    html, arcgis geoservices rest apiAvailable download formats
    Dataset updated
    May 30, 2023
    Dataset provided by
    City of Detroit
    Area covered
    Detroit
    Description
    Detroit Street View (DSV) is an urban remote sensing program run by the Enterprise Geographic Information Systems (EGIS) Team within the Department of Innovation and Technology at the City of Detroit. The mission of Detroit Street View is ‘To continuously observe and document Detroit’s changing physical environment through remote sensing, resulting in freely available foundational data that empowers effective city operations, informed decision making, awareness, and innovation.’ 360° panoramic imagery (as well as LiDAR) is collected using a vehicle-mounted mobile mapping system.

    The City of Detroit distributes 360° panoramic street view imagery from the Detroit Street View program via Mapillary.com. Within Mapillary, users can search by address, pan/zoom around the map, and load images by clicking on image points. Mapillary also provides several tools for accessing and analyzing information. Please see the Mapillary API documentation for more information about programmatic access and specific data components within Mapillary.
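
    As a hedged illustration of such programmatic access (the endpoint and field names are assumptions based on Mapillary's public Graph API v4 documentation; the image ID and token are placeholders):

    import requests

    resp = requests.get(
        "https://graph.mapillary.com/IMAGE_ID",       # placeholder image ID
        params={
            "fields": "id,geometry,captured_at",      # assumed field names
            "access_token": "MLY|...",                # placeholder token
        },
    )
    print(resp.json())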
  11. Data from: Using handheld mobile laser scanning to quantify fine-scale...

    • search.dataone.org
    • datadryad.org
    Updated Mar 13, 2025
    Cite
    Alanna Post; Brieanne Forbes; Zane Cooper; Kristi Faro; Catherine Seel; Matthew Clark; Mathias Disney; Lisa Bentley (2025). Using handheld mobile laser scanning to quantify fine-scale surface fuels and detect changes post-disturbance in Northern California forests [Dataset]. http://doi.org/10.5061/dryad.sxksn038g
    Explore at:
    Dataset updated
    Mar 13, 2025
    Dataset provided by
    Dryad Digital Repository
    Authors
    Alanna Post; Brieanne Forbes; Zane Cooper; Kristi Faro; Catherine Seel; Matthew Clark; Mathias Disney; Lisa Bentley
    Time period covered
    Jan 1, 2023
    Description

    The understory plays a critical role in the disturbance dynamics of forest ecosystems, as it can influence wildfire behavior. Unfortunately, the 3D structure of understory fuels is often difficult to quantify and model due to vegetation and substrate heterogeneity. LiDAR remote sensing can measure changes in 3D forest structure more rapidly, comprehensively, and accurately than manual approaches, but remote sensing is more frequently applied to the overstory than to the understory. Here we evaluated the use of handheld mobile laser scanning (HMLS) to measure and detect changes in fine-scale surface fuels following wildfire and timber harvest in Northern Californian forests, USA. First, the ability of HMLS to quantify surface fuels was validated by destructively sampling vegetation below 1 m with a known occupied volume within a 3D frame and comparing destructive-based volumes with HMLS-based occupied volume estimates. There was a positive linear relationship (R2 = 0.72) b...

    Data were collected in a few different ways. 3D frame data were collected by scanning a 3D frame with a handheld mobile laser scanner (HMLS) and then destructively sampling the vegetation inside. The scans were processed by the scanner's software (GeoSLAM SLAM algorithm), and the vegetation samples were oven-dried to obtain dry mass measurements. Plot-level data were collected at 11.3 m radius circular plots at 2 locations across 3 time periods; lidar scans were taken with the HMLS, and Brown's data were collected using the standard Brown's transect protocol. Brown's data were processed to extract estimates of fuel mass per area for each plot. All of the lidar scans taken with the HMLS (both frame and plot scans) were further processed in Lidar360, CloudCompare, and R with the lidR package to clip scans to the frame/plot boundary, height normalize, and voxelize the scans. Frame scans were voxelized at 4 different voxel sizes (1, 5, 10, and 25 cm), while plot scans were all voxelized at 1 ...

    Data from: Using handheld mobile laser scanning to quantify fine-scale surface fuels and detect changes post-disturbance in Northern California forests

    https://doi.org/10.5061/dryad.sxksn038g

    The dataset includes processed handheld lidar data and dry mass, from 3D frame and plot sampling. The lidar system used is a handheld mobile laser scanner (GeoSLAM's Zeb-REVO).

    Description of the data and file structure

    Sheets within the Excel file are separated based on manuscript sections. '3D Frame' includes the data collected from lidar scans and destructive sampling which was collected to validate the use of handheld lidar for vegetation monitoring. 'Plot-level' contains the total occupied voxels from the processed plot scans taken in each survey/campaign. 'Brown's' is the mass per area calculated from Brown's transects collected at the plots and the predicted mass in grams as calculated from the voxelized plot scans. 'Point Density' contains...
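
    The occupied-voxel metric referenced throughout can be sketched as follows (a minimal illustration assuming a height-normalized point cloud as an (N, 3) NumPy array; this is not the authors' processing code):

    import numpy as np

    def occupied_voxel_count(points_xyz, voxel_size=0.01):
        # Snap each point to a voxel index at the given edge length (metres),
        # then count the distinct voxels containing at least one point.
        idx = np.floor(points_xyz / voxel_size).astype(np.int64)
        return len(np.unique(idx, axis=0))

    # The frame scans were voxelized at 0.01, 0.05, 0.10 and 0.25 m edge lengths.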

  12. IILABS 3D: iilab Indoor LiDAR-based SLAM Dataset

    • rdm.inesctec.pt
    Updated Feb 27, 2025
    Cite
    (2025). IILABS 3D: iilab Indoor LiDAR-based SLAM Dataset [Dataset]. https://rdm.inesctec.pt/dataset/nis-2025-001
    Explore at:
    Dataset updated
    Feb 27, 2025
    License

    Attribution-ShareAlike 4.0 (CC BY-SA 4.0): https://creativecommons.org/licenses/by-sa/4.0/
    License information was derived automatically

    Description

    The IILABS 3D dataset is a rigorously designed benchmark intended to advance research on 3D LiDAR-based Simultaneous Localization and Mapping (SLAM) algorithms in indoor environments. It provides a robust and diverse foundation for evaluating and enhancing SLAM techniques in complex indoor settings. The dataset was recorded in the Industry and Innovation Laboratory (iiLab) and comprises synchronized data from a suite of sensors (four distinct 3D LiDARs, a 2D LiDAR, an Inertial Measurement Unit (IMU), and wheel odometry), complemented by high-precision ground truth obtained via a Motion Capture (MoCap) system.

    Project Webpage: https://jorgedfr.github.io/3d_lidar_slam_benchmark_at_iilab/
    Dataset Toolkit: https://github.com/JorgeDFR/iilabs3d-toolkit

    Data Collection Method: Sensor data was captured using the Robot Operating System (ROS) framework's rosbag record tool on a LattePanda 3 Delta embedded computer. Post-processing involved timestamp correction for the Xsens MTi-630 IMU via custom Python scripts. Ground-truth data was captured using an OptiTrack MoCap system featuring 24 high-resolution PrimeX 22 cameras. These cameras were connected via Ethernet to a primary Windows computer running the Motive software (https://optitrack.com/software/motive), which processed the camera data. This Windows computer was in turn connected via Ethernet to a secondary Ubuntu machine running the NatNet 4 ROS driver (https://github.com/L2S-lab/natnet_ros_cpp), which published the data as ROS topics that were recorded into rosbag files. Temporal synchronization between the robot platform and the ground-truth system was achieved using the Network Time Protocol (NTP). Finally, the bag files were processed using the EVO open-source Python library (https://github.com/MichaelGrupp/evo) to convert the data into TUM format and adjust the initial position offsets for accurate SLAM odometry benchmarking.

    Type of Instrument:
    • Mobile robot platform: INESC TEC MRDT Modified Hangfa Discovery Q2 Platform (https://sousarbarb.github.io/inesctec_mrdt_hangfa_discovery_q2/); R.B. Sousa, H.M. Sobreira, J.G. Martins, P.G. Costa, M.F. Silva and A.P. Moreira, "Integrating Multimodal Perception into Ground Mobile Robots," 2025 IEEE International Conference on Autonomous Robot Systems and Competitions (ICARSC 2025), Madeira, Portugal, 2025, pp. TBD, doi: TBD [manuscript accepted for publication].
    • Sensor data: Livox Mid-360, Ouster OS1-64 RevC, RoboSense RS-HELIOS-5515, and Velodyne VLP-16 (3D LiDARs); Hokuyo UST-10LX-H01 (2D LiDAR); Xsens MTi-630 (IMU); and Faulhaber 2342 wheel encoders (64:1 gear ratio, 12 Counts Per Revolution (CPR)).
    • Ground truth data: OptiTrack Motion Capture System with 24 PrimeX 22 cameras installed in Room A, Floor 0 at iilab.
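
    For reference, the TUM trajectory format produced in that last step is plain text with one pose per line ("timestamp tx ty tz qx qy qz qw"). A minimal writer sketch, illustrative only:

    def write_tum(path, poses):
        # poses: iterable of (timestamp, tx, ty, tz, qx, qy, qz, qw) tuples
        with open(path, "w") as f:
            for pose in poses:
                f.write(" ".join(f"{v:.6f}" for v in pose) + "\n")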

  13. Forest Localisation Dataset

    • researchdata.edu.au
    • data.csiro.au
    datadownload
    Updated Feb 17, 2023
    Cite
    Micheal Bruenig; Paulo Borges; Milad Ramezani; Lucas Carvalho de Lima (2023). Forest Localisation Dataset [Dataset]. http://doi.org/10.25919/FBWY-RK04
    Explore at:
    datadownloadAvailable download formats
    Dataset updated
    Feb 17, 2023
    Dataset provided by
    CSIRO (http://www.csiro.au/)
    Authors
    Micheal Bruenig; Paulo Borges; Milad Ramezani; Lucas Carvalho de Lima
    License

    Attribution-NonCommercial-ShareAlike 4.0 (CC BY-NC-SA 4.0): https://creativecommons.org/licenses/by-nc-sa/4.0/
    License information was derived automatically

    Time period covered
    Oct 8, 2021
    Description

    The dataset contains lidar, IMU and wheel odometry measurements collected using an all-electric four-wheel robotic vehicle (Gator) in a forest environment at the Queensland Centre for Advanced Technologies (QCAT, CSIRO) in Brisbane, Australia. The dataset also contains a heightmap image constructed from aerial lidar data of the same forest. This dataset allows users to run the Forest Localisation software and evaluate the results of the presented localisation method.

    Lineage: The ground-view data was collected using an all-electric four-wheel robotic vehicle equipped with a Velodyne VLP-16 laser mounted on a servo motor, with a 45-degree inclination, spinning around the vertical axis at 0.5 Hz. In addition to the lidar scans, IMU and wheel odometry measurements were also recorded. The above-canopy map (heightmap) was constructed from aerial lidar data captured using a drone also equipped with a spinning lidar sensor.

  14. Data from: Developing a SLAM-based backpack mobile mapping system for indoor...

    • phys-techsciences.datastations.nl
    bin, exe, zip
    Updated Feb 22, 2022
    Cite
    S. Karam; S. Karam (2022). Developing a SLAM-based backpack mobile mapping system for indoor mapping [Dataset]. http://doi.org/10.17026/DANS-XME-KEPM
    Explore at:
    bin(11456605), zip(21733), exe(17469035), exe(18190303), exe(447), bin(20142672), bin(62579), exe(17513963), bin(45862), exe(17284627), bin(6856377), bin(9279586), exe(17548337), exe(199), exe(17969103), bin(235037), exe(18250973), bin(192189), bin(14741220), bin(3471971), bin(127397), bin(338998), exe(23702808)Available download formats
    Dataset updated
    Feb 22, 2022
    Dataset provided by
    DANS Data Station Physical and Technical Sciences
    Authors
    S. Karam; S. Karam
    License

    https://doi.org/10.17026/fp39-0x58

    Description

    These files support the published journal paper and thesis on IMU- and LiDAR-based SLAM for indoor mapping. They include the datasets and functions used for point cloud generation. Date Submitted: 2022-02-21

  15. SilviLaser 2021 Benchmark Dataset - Terrestrial Challenge

    • researchdata.tuwien.at
    • researchdata.tuwien.ac.at
    bin, csv, zip
    Updated Jun 25, 2024
    Cite
    Markus Hollaus; Markus Hollaus; Yi-Chen Chen; Yi-Chen Chen (2024). SilviLaser 2021 Benchmark Dataset - Terrestrial Challenge [Dataset]. http://doi.org/10.48436/afdjq-ce434
    Explore at:
    bin, zip, csvAvailable download formats
    Dataset updated
    Jun 25, 2024
    Dataset provided by
    TU Wien
    Authors
    Markus Hollaus; Markus Hollaus; Yi-Chen Chen; Yi-Chen Chen
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Time period covered
    Sep 27, 2021
    Description

    This benchmark dataset was acquired during the SilviLaser 2021 conference in Vienna. The benchmark aims to demonstrate the capabilities of different terrestrial systems for capturing 3D scenes in various forest conditions. A number of universities, institutes, and companies participated and contributed their outputs to this dataset, compiled from terrestrial laser scanning (TLS), mobile laser scanning (MLS), and terrestrial photogrammetric systems (TPS). Along with the terrestrial data, an airborne laser scanning (ALS) dataset was provided as a reference.

    Eight forest plots were installed for the terrestrial challenge. Each plot was a circular area with a 25-meter radius and covered different tree species (i.e. spruce, pine, beech, white fir), forest structures (i.e. one layer, multi-layer, natural regeneration, deadwood), and age classes (~50 to 120 years). The 3D point clouds acquired by each participant cover the eight plots. In addition to point clouds, traditional in-situ data (tree position, tree species, DBH) were recorded by the organization team.

    All point clouds provided by participants were processed in the following steps: co-registration with geo-referenced data, setting a uniform coordinate reference system (CRS), and removing data located outside the plot. This work was performed with OPALS, a laser scanning data processing software developed by the Photogrammetry Group of the TU Wien Department of Geodesy and Geoinformation. Please note that some point clouds are not archived due to problems encountered during pre-processing. The final products consist of a metadata file, 3D point clouds, ALS data for reference, and corresponding digital terrain models (DTMs) derived from the ALS data using OPALS. Point clouds are in LAZ 1.4 format, and DTMs are raster models in GeoTIFF format. Furthermore, all geo-data use the CRS WGS84 / UTM zone 33N (EPSG:32633). More information (e.g. instrument, point density, and extra attributes) can be found in the file "SL21BM_TER_metadata.csv".
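
    The LAZ 1.4 point clouds can be inspected, for example, with the laspy Python library (a sketch assuming laspy 2.x with a LAZ backend such as lazrs installed; the file name is a placeholder):

    import laspy

    las = laspy.read("SL21BM_TER_plot01.laz")   # placeholder file name
    print(las.header.point_count)               # number of points in the cloud
    xyz = las.xyz                               # (N, 3) array of coordinates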

    This dataset is available to the community for a wide variety of scientific studies. These unique data sets will also form the basis for an international benchmark for parameter retrieval from different 3D recording methods.

    Acknowledgements

    This dataset was contributed by the universities/institutes/companies (alphabetical order):

    • Czech University of Life Sciences Prague
    • Forest Design
    • Green Valley International
    • RIEGL
    • Silva Tarouca Research Institute
    • Swiss Federal Institute for Forest, Snow and Landscape Research
    • Umweltdata GmbH
    • University of Natural Resources and Life Sciences
    • Wageningen University & Research

    Notes

    1. For the in-situ data, please contact Markus Hollaus for details.
    2. To perform a bulk download, please use this file to get the URL list.

  16. Lidar derived shoreline for Beaver Lake near Rogers, Arkansas, 2018

    • catalog.data.gov
    • data.usgs.gov
    • +3 more
    Updated Jul 6, 2024
    + more versions
    Cite
    U.S. Geological Survey (2024). Lidar derived shoreline for Beaver Lake near Rogers, Arkansas, 2018 [Dataset]. https://catalog.data.gov/dataset/lidar-derived-shoreline-for-beaver-lake-near-rogers-arkansas-2018
    Explore at:
    Dataset updated
    Jul 6, 2024
    Dataset provided by
    U.S. Geological Survey
    Area covered
    Beaver Lake, Arkansas, Rogers
    Description

    Beaver Lake was constructed in 1966 on the White River in the northwest corner of Arkansas for flood control, hydroelectric power, public water supply, and recreation. The surface area of Beaver Lake is about 27,900 acres and approximately 449 miles of shoreline are at the conservation pool level (1,120 feet above the North American Vertical Datum of 1988). Sedimentation in reservoirs can result in reduced water storage capacity and a reduction in usable aquatic habitat. Therefore, accurate and up-to-date estimates of reservoir water capacity are important for managing pool levels, power generation, water supply, recreation, and downstream aquatic habitat. Many of the lakes operated by the U.S. Army Corps of Engineers are periodically surveyed to monitor bathymetric changes that affect water capacity.

    In October 2018, the U.S. Geological Survey, in cooperation with the U.S. Army Corps of Engineers, completed one such survey of Beaver Lake using a multibeam echosounder. The echosounder data was combined with light detection and ranging (lidar) data to prepare a bathymetric map and a surface area and capacity table. Collection of bathymetric data in October 2018 at Beaver Lake near Rogers, Arkansas, used a marine-based mobile mapping unit that operates with several components: a multibeam echosounder (MBES) unit, an inertial navigation system (INS), and a data acquisition computer. Bathymetric data were collected using the MBES unit in longitudinal transects to provide complete coverage of the lake. The MBES was tilted in some areas to improve data collection along the shoreline, in coves, and in areas that are shallower than 2.5 meters deep (the practical limit of reasonable and safe data collection with the MBES).

    Two bathymetric datasets collected during the October 2018 survey include the gridded bathymetric point data (BeaverLake2018_bathy.zip) computed on a 3.28-foot (1-meter) grid using the Combined Uncertainty and Bathymetry Estimator (CUBE) method, and the bathymetric quality-assurance dataset (BeaverLake2018_QA.zip). The gridded point data used to create the bathymetric surface (BeaverLake2018_bathy.zip) was quality-assured with data from 9 selected resurvey areas (BeaverLake2018_QA.zip) to test the accuracy of the gridded bathymetric point data. The data are provided as comma-delimited text files that have been compressed into zip archives. The shoreline was created from bare-earth lidar resampled to a 3.28-foot (1-meter) grid spacing. A contour line representing the flood pool elevation of 1,135 feet was generated from the gridded data. The data are provided in the Environmental Systems Research Institute shapefile format and have the common root name of BeaverLake2018_1135-ft. All files in the shapefile group must be retrieved to be useable.

  17. Data from: EAARL Topography--Three Mile Creek and Mobile-Tensaw Delta,...

    • catalog.data.gov
    • data.usgs.gov
    • +3 more
    Updated Jul 6, 2024
    + more versions
    Cite
    U.S. Geological Survey (2024). EAARL Topography--Three Mile Creek and Mobile-Tensaw Delta, Alabama, 2010 [Dataset]. https://catalog.data.gov/dataset/eaarl-topography-three-mile-creek-and-mobile-tensaw-delta-alabama-2010
    Explore at:
    Dataset updated
    Jul 6, 2024
    Dataset provided by
    United States Geological Survey (http://www.usgs.gov/)
    Area covered
    Alabama, Mobile–Tensaw River Delta
    Description

    A digital elevation model (DEM) of a portion of the Mobile-Tensaw Delta region and Three Mile Creek in Alabama was produced from remotely sensed, geographically referenced elevation measurements by the U.S. Geological Survey (USGS). Elevation measurements were collected over the area (bathymetry was irresolvable) using the Experimental Advanced Airborne Research Lidar (EAARL), a pulsed laser ranging system mounted onboard an aircraft to measure ground elevation, vegetation canopy, and coastal topography.

    The system uses high-frequency laser beams directed at the Earth's surface through an opening in the bottom of the aircraft's fuselage. The laser system records the time difference between emission of the laser beam and the reception of the reflected laser signal in the aircraft. The plane travels over the target area at approximately 50 meters per second at an elevation of approximately 300 meters, resulting in a laser swath of approximately 240 meters with an average point spacing of 2-3 meters. The EAARL, developed originally by the National Aeronautics and Space Administration (NASA) at Wallops Flight Facility in Virginia, measures ground elevation with a vertical resolution of +/-15 centimeters. A sampling rate of 3 kilohertz or higher results in an extremely dense spatial elevation dataset. Over 100 kilometers of coastline can be surveyed easily within a 3- to 4-hour mission. When resultant elevation maps for an area are analyzed, they provide a useful tool to make management decisions regarding land development. For more information on lidar science and the Experimental Advanced Airborne Research Lidar (EAARL) system and surveys, see http://ngom.usgs.gov/dsp/overview/index.php and http://ngom.usgs.gov/dsp/tech/eaarl/index.php.
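
    The ranging principle described above is simple time-of-flight arithmetic; as an illustrative example (not EAARL processing code), the one-way range is half the round-trip time multiplied by the speed of light:

    C = 299_792_458.0                     # speed of light in m/s

    def tof_range_m(round_trip_seconds):
        return C * round_trip_seconds / 2.0

    # A 2 microsecond round trip corresponds to roughly 300 m of range,
    # consistent with the ~300 m flying height mentioned above.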

  18. Indoor Mobile Laser Scanning System

    • phys-techsciences.datastations.nl
    bin, c, exe, txt, zip
    Updated Nov 16, 2023
    Cite
    S Karam; S Karam (2023). Indoor Mobile Laser Scanning System [Dataset]. http://doi.org/10.17026/DANS-ZMU-3GYP
    Explore at:
    bin(484071), bin(1301), bin(74406), bin(63900), bin(2073), bin(859676), bin(157), bin(102), bin(1472), bin(2009), c(4337), bin(5208), bin(114), bin(2179), c(171530), bin(193), bin(2258), bin(558), exe(403), bin(1933), c(6072), txt(929), bin(368), bin(36591), bin(29867), bin(696101473), c(6402), bin(2472), txt(1489657), bin(1266), bin(34223), bin(171648), bin(71381), bin(3509377597), bin(29487), bin(2574), bin(4416), bin(1365), c(14454), exe(3755520), bin(372), bin(28895), c(38452), zip(5916), bin(248528), bin(33039), bin(3658813), bin(1976), bin(25343), bin(27119), bin(11408), bin(3602), bin(19382), bin(45471), zip(117310), bin(3348), bin(682930), txt(576), bin(47247), bin(2112), c(1211), bin(106), bin(13021), bin(501), bin(2392), bin(2199), bin(7654), c(42553), bin(2327), bin(30079), bin(31855), bin(175), bin(827776), bin(26527), bin(1488), bin(7765), bin(191), bin(2178), bin(2038), bin(2173), bin(86373), bin(25412), bin(2786), bin(94), bin(257784), exe(1234), bin(25935), c(18841), c(5106), bin(2293)Available download formats
    Dataset updated
    Nov 16, 2023
    Dataset provided by
    DANS Data Station Physical and Technical Sciences
    Authors
    S Karam; S Karam
    License

    https://doi.org/10.17026/fp39-0x58

    Description

    These files support the published journal paper about the indoor backpack mobile mapping system. They include point cloud conversion, segmentation, and SLAM code, as well as the code for the evaluation method published in the journal paper. The resulting laser point cloud file is also uploaded.

  19. BLE RSS dataset for fingerprinting radio map calibration

    • data.niaid.nih.gov
    • explore.openaire.eu
    Updated Sep 20, 2021
    Cite
    Marcin Kolakowski (2021). BLE RSS dataset for fingerprinting radio map calibration [Dataset]. https://data.niaid.nih.gov/resources?id=zenodo_5457590
    Explore at:
    Dataset updated
    Sep 20, 2021
    Dataset authored and provided by
    Marcin Kolakowski
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    The dataset contains Bluetooth Low Energy signal strengths measured in a fully furnished flat. The dataset was originally used in a study concerning RSS-fingerprinting-based indoor positioning systems. The data were gathered using a hybrid BLE-UWB localization system installed in the apartment and a mobile robotic platform equipped with a LiDAR. The dataset comprises power measurement results and LiDAR scans performed at 4104 points. The scans used for initial environment mapping and the power levels registered in two test scenarios are also attached.

    The set contains both raw and preprocessed measurement data. The Python code for raw data loading is supplied.

    The detailed dataset description can be found in the dataset_description.pdf file.
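
    As a hedged sketch of how such a radio map is typically used for positioning (k-nearest-neighbour matching in signal space; this is illustrative, not the code shipped with the dataset):

    import numpy as np

    def knn_locate(rss_query, map_rss, map_xy, k=3):
        # map_rss: (M, B) RSS values for B beacons at M calibration points;
        # map_xy:  (M, 2) coordinates of those calibration points.
        d = np.linalg.norm(map_rss - rss_query, axis=1)   # signal-space distance
        nearest = np.argsort(d)[:k]
        return map_xy[nearest].mean(axis=0)               # average of k best matches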

    When using the dataset, please consider citing the original paper, in which the data were used:

    M. Kolakowski, “Automated Calibration of RSS Fingerprinting Based Systems Using a Mobile Robot and Machine Learning”, Sensors, vol. 21, 6270, Sep. 2021. https://doi.org/10.3390/s21186270

  20. Mobile Integrated Profiling System (MIPS) Ceilometer Data

    • data.ucar.edu
    • ckanprod.ucar.edu
    ascii
    Updated Dec 26, 2024
    Cite
    Justin Walters; Kevin Knupp (2024). Mobile Integrated Profiling System (MIPS) Ceilometer Data [Dataset]. http://doi.org/10.26023/KW91-Y4XA-AV11
    Explore at:
    asciiAvailable download formats
    Dataset updated
    Dec 26, 2024
    Dataset provided by
    University Corporation for Atmospheric Research
    Authors
    Justin Walters; Kevin Knupp
    Time period covered
    May 19, 2003 - Jul 6, 2003
    Area covered
    Description

    This dataset contains Lidar Mobile Integrated Profiling System (MIPS) Ceilometer data from the University of Alabama in Huntsville.
