100+ datasets found
  1. Elevation from Lidar (Image Service)

    • czm-moris-mass-eoeea.hub.arcgis.com
    • gis.data.mass.gov
    • +1 more
    Updated Jul 23, 2020
    + more versions
    Cite
    MassGIS - Bureau of Geographic Information (2020). Elevation from Lidar (Image Service) [Dataset]. https://czm-moris-mass-eoeea.hub.arcgis.com/datasets/49cbba6636fa4c41a5ea162ccf1e41bc
    Explore at:
    Dataset updated
    Jul 23, 2020
    Dataset authored and provided by
    MassGIS - Bureau of Geographic Information
    Area covered
    Description

    This is a seamless bare earth digital elevation model (DEM) created from lidar terrain elevation data for the Commonwealth of Massachusetts. It represents the elevation of the surface with vegetation and structures removed. The spatial resolution of the map is 1 meter. The elevation of each 1-meter square cell was linearly interpolated from classified lidar-derived point data. This version of the DEM stores the elevation values as integers. The native VALUE field represents the elevation above/below sea level in meters. MassGIS added a FEET field to the VAT (value attribute table) to store the elevation in feet, calculated by multiplying VALUE by 3.28084. Dates of lidar data used in this DEM range from 2010 to 2015. The overlapping lidar projects were adjusted to the same projection and datum and then mosaicked, with the most recent data replacing any older data. Several very small gaps between the project areas were patched with older lidar data where necessary or with models from recent aerial photo acquisitions. See https://www.mass.gov/doc/lidar-project-areas-original/download for an index map. This DEM is referenced to the WGS_1984_Web_Mercator_Auxiliary_Sphere spatial reference. See the MassGIS datalayer page to download the data as a file geodatabase raster dataset. View this service in the Massachusetts Elevation Finder.
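
    The VALUE-to-FEET conversion described above is a single multiplication. A minimal Python sketch (the VAT rows and helper name here are hypothetical; only the field names and the 3.28084 factor come from the description):

```python
# Sketch: derive a FEET attribute from integer elevation VALUEs, as the
# MassGIS description says was done for the raster's value attribute
# table (VAT). The sample VALUEs below are hypothetical.
METERS_TO_FEET = 3.28084

def add_feet_field(vat_rows):
    """Given rows of {'VALUE': elevation_in_meters}, add a 'FEET' field."""
    for row in vat_rows:
        row['FEET'] = round(row['VALUE'] * METERS_TO_FEET, 2)
    return vat_rows

vat = [{'VALUE': -3}, {'VALUE': 0}, {'VALUE': 120}]
print(add_feet_field(vat))
```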

  2. i.c.sens Visual-Inertial-LiDAR Dataset

    • data.uni-hannover.de
    bag, jpeg, pdf, png +2
    Updated Dec 12, 2024
    + more versions
    Cite
    i.c.sens (2024). i.c.sens Visual-Inertial-LiDAR Dataset [Dataset]. https://data.uni-hannover.de/dataset/i-c-sens-visual-inertial-lidar-dataset
    Explore at:
    Available download formats: txt, png, jpeg, rviz, bag, pdf
    Dataset updated
    Dec 12, 2024
    Dataset authored and provided by
    i.c.sens
    License

    Attribution-NonCommercial 3.0 (CC BY-NC 3.0), https://creativecommons.org/licenses/by-nc/3.0/
    License information was derived automatically

    Description

    The i.c.sens Visual-Inertial-LiDAR Dataset is a data set for the evaluation of dead reckoning or SLAM approaches in the context of mobile robotics. It consists of street-level monocular RGB camera images, a front-facing 180° point cloud, angular velocities, accelerations and an accurate ground truth trajectory. In total, we provide around 77 GB of data resulting from a 15-minute drive, which is split into 8 rosbags of 2 minutes (10 GB) each. In addition, the intrinsic camera parameters and the extrinsic transformations between all sensor coordinate systems are given. Details on the data and its usage can be found in the provided documentation file.

    Image: https://data.uni-hannover.de/dataset/0bcea595-0786-44f6-a9e2-c26a779a004b/resource/0ff90ef9-fa61-4ee3-b69e-eb6461abc57b/download/sensor_platform_small.jpg (credit: Sören Vogel)

    The data set was acquired in the context of the measurement campaign described in Schoen2018. Here, a vehicle, which can be seen below, was equipped with a self-developed sensor platform and a commercially available Riegl VMX-250 Mobile Mapping System. This Mobile Mapping System consists of two laser scanners, a camera system and a localization unit containing a highly accurate GNSS/IMU system.

    Image: https://data.uni-hannover.de/dataset/0bcea595-0786-44f6-a9e2-c26a779a004b/resource/2a1226b8-8821-4c46-b411-7d63491963ed/download/vehicle_small.jpg (credit: Sören Vogel)

    The data acquisition took place on a sunny day in May 2019 in the Nordstadt of Hannover (coordinates: 52.388598, 9.716389). The route we took can be seen below. This route was completed three times in total, which amounts to a total driving time of 15 minutes.

    Image: https://data.uni-hannover.de/dataset/0bcea595-0786-44f6-a9e2-c26a779a004b/resource/8a570408-c392-4bd7-9c1e-26964f552d6c/download/google_earth_overview_small.png

    The self-developed sensor platform consists of several sensors. This dataset provides data from the following sensors:

    • Velodyne HDL-64 LiDAR
    • LORD MicroStrain 3DM-GQ4-45 GNSS aided IMU
    • Pointgrey GS3-U3-23S6C-C RGB camera

    To inspect the data, first start a rosmaster and launch rviz using the provided configuration file:

    roscore & rosrun rviz rviz -d icsens_data.rviz
    

    Afterwards, start playing a rosbag with

    rosbag play icsens-visual-inertial-lidar-dataset-{number}.bag --clock
    

    Below we provide some exemplary images and their corresponding point clouds.

    Image: https://data.uni-hannover.de/dataset/0bcea595-0786-44f6-a9e2-c26a779a004b/resource/dc1563c0-9b5f-4c84-b432-711916cb204c/download/combined_examples_small.jpg

    Related publications:

    • R. Voges, C. S. Wieghardt, and B. Wagner, “Finding Timestamp Offsets for a Multi-Sensor System Using Sensor Observations,” Photogrammetric Engineering & Remote Sensing, vol. 84, no. 6, pp. 357–366, 2018.

    • R. Voges and B. Wagner, “RGB-Laser Odometry Under Interval Uncertainty for Guaranteed Localization,” in Book of Abstracts of the 11th Summer Workshop on Interval Methods (SWIM 2018), Rostock, Germany, Jul. 2018.

    • R. Voges and B. Wagner, “Timestamp Offset Calibration for an IMU-Camera System Under Interval Uncertainty,” in IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, Oct. 2018.

    • R. Voges and B. Wagner, “Extrinsic Calibration Between a 3D Laser Scanner and a Camera Under Interval Uncertainty,” in Book of Abstracts of the 12th Summer Workshop on Interval Methods (SWIM 2019), Palaiseau, France, Jul. 2019.

    • R. Voges, B. Wagner, and V. Kreinovich, “Efficient Algorithms for Synchronizing Localization Sensors Under Interval Uncertainty,” Reliable Computing (Interval Computations), vol. 27, no. 1, pp. 1–11, 2020.

    • R. Voges, B. Wagner, and V. Kreinovich, “Odometry under Interval Uncertainty: Towards Optimal Algorithms, with Potential Application to Self-Driving Cars and Mobile Robots,” Reliable Computing (Interval Computations), vol. 27, no. 1, pp. 12–20, 2020.

    • R. Voges and B. Wagner, “Set-Membership Extrinsic Calibration of a 3D LiDAR and a Camera,” in IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Las Vegas, NV, USA, Oct. 2020, accepted.

    • R. Voges, “Bounded-Error Visual-LiDAR Odometry on Mobile Robots Under Consideration of Spatiotemporal Uncertainties,” PhD thesis, Gottfried Wilhelm Leibniz Universität, 2020.

  3. Intensity Images - USGS LiDAR

    • hub.arcgis.com
    • data-dauphinco.opendata.arcgis.com
    • +1 more
    Updated May 1, 2018
    Cite
    Dauphin County, PA (2018). Intensity Images - USGS LiDAR [Dataset]. https://hub.arcgis.com/documents/f44ca0ba1a2d4551be483da92f500442
    Explore at:
    Dataset updated
    May 1, 2018
    Dataset authored and provided by
    Dauphin County, PA
    Description

    The Dauphin County, PA 2016 QL2 LiDAR project called for the planning, acquisition, processing and derivative products of LiDAR data collected at a nominal pulse spacing (NPS) of 0.7 meters. Project specifications are based on the U.S. Geological Survey National Geospatial Program Base LIDAR Specification, Version 1.2. The data was developed based on a horizontal projection/datum of NAD83 (2011) State Plane Pennsylvania South Zone, US survey feet; NAVD1988 (Geoid 12B), US survey feet. LiDAR data was delivered in RAW flight line swath format and processed to create Classified LAS 1.4 files formatted into 711 individual 5,000-foot x 5,000-foot tiles. Tile names use the following naming schema: "YYYYXXXXPAd", where YYYY = the first 3 characters of the tile's upper left corner Y-coordinate, XXXX = the first 4 characters of the tile's upper left corner X-coordinate, PA = Pennsylvania, and d = 'N' for North or 'S' for South. Corresponding 2.5-foot gridded hydro-flattened bare earth raster tiled DEM files and intensity image files were created using the same 5,000-foot x 5,000-foot schema. Hydro-flattened breaklines were produced in Esri file geodatabase format. Continuous 2-foot contours were produced in Esri file geodatabase format. Ground conditions: LiDAR collection began in Spring 2016, when no snow was on the ground and rivers were at or below normal levels. In order to post-process the LiDAR data to meet task order specifications, Quantum Spatial established a total of 84 control points (24 calibration control points and 60 QC checkpoints). These were used to calibrate the LiDAR to known ground locations established throughout the project area.
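
    The tile naming schema quoted above can be sketched in Python. Note that the description pairs the four-letter "YYYY" placeholder with "the first 3 characters" of the Y-coordinate; the sketch follows the written rule, and the coordinates below are hypothetical:

```python
# Sketch of the tile naming schema "YYYYXXXXPAd" described above:
# first 3 characters of the upper-left Y coordinate (per the written
# spec, despite the four-letter placeholder), first 4 characters of the
# upper-left X coordinate, then "PA" and 'N'/'S'. Inputs are hypothetical
# state-plane coordinates in US survey feet.
def tile_name(upper_left_x: float, upper_left_y: float, zone: str = "S") -> str:
    y_part = str(int(upper_left_y))[:3]
    x_part = str(int(upper_left_x))[:4]
    return f"{y_part}{x_part}PA{zone}"

print(tile_name(2405000, 285000))
```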

  4. Image-grade LiDAR dataset

    • ieee-dataport.org
    Updated May 10, 2022
    Cite
    Yusheng Wang (2022). Image-grade LiDAR dataset [Dataset]. https://ieee-dataport.org/documents/image-grade-lidar-dataset
    Explore at:
    Dataset updated
    May 10, 2022
    Authors
    Yusheng Wang
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This dataset provides high-resolution image-grade LiDAR SLAM data in .bag format.

  5. OC 2017 LiDAR Image Service

    • detroitdata.org
    • portal.datadrivendetroit.org
    • +3 more
    Updated May 18, 2021
    Cite
    Oakland County, Michigan (2021). OC 2017 LiDAR Image Service [Dataset]. https://detroitdata.org/dataset/oc-2017-lidar-image-service1
    Explore at:
    Available download formats: html, ArcGIS GeoServices REST API
    Dataset updated
    May 18, 2021
    Dataset provided by
    Oakland County, Michigan
    Description

    BY USING THIS WEBSITE OR THE CONTENT THEREIN, YOU AGREE TO THE TERMS OF USE.

    The Classified Point Cloud (LAS) for the 2017 Michigan LiDAR project covers approximately 907 square miles across Oakland County. LAS data products are suitable for 1-foot contour generation. USGS LiDAR Base Specification 1.2, QL2. 19.6 cm NVA.

    This data is for planning purposes only and should not be used for legal or cadastral purposes. Any conclusions drawn from analysis of this information are not the responsibility of Sanborn Map Company. Users should be aware that temporal changes may have occurred since this dataset was collected and some parts of this dataset may no longer represent actual surface conditions. Users should not use these data for critical applications without a full awareness of its limitations.

    This service is best used directly within ArcMap or ArcGIS Pro. If the raw LiDAR points are needed, use these clients to extract project-area-sized portions. Due to the density of the data, downloading the entire county from this service is not possible. For further questions, contact the Oakland County Service Center at 248-858-8812, servicecenter@oakgov.com.

  6. Extended Evaluation of SnowPole Detection for Machine-Perceivable...

    • data.mendeley.com
    Updated Jun 30, 2025
    Cite
    Durga Prasad Bavirisetti (2025). Extended Evaluation of SnowPole Detection for Machine-Perceivable Infrastructure for Nordic Winter Conditions: A Comparative Study of Object Detection Models [Dataset]. http://doi.org/10.17632/tt6rbx7s3h.3
    Explore at:
    Dataset updated
    Jun 30, 2025
    Authors
    Durga Prasad Bavirisetti
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    In this study, we present an extensive evaluation of state-of-the-art YOLO object detection architectures for identifying snow poles in LiDAR-derived imagery captured under challenging Nordic conditions. Building upon our previous work on the SnowPole Detection dataset [1] and our LiDAR–GNSS-based localization framework [2], we expand the benchmark to include six YOLO models—YOLOv5s, YOLOv7-tiny, YOLOv8n, YOLOv9t, YOLOv10n, and YOLOv11n—evaluated across multiple input modalities. Specifically, we assess single-channel modalities (Reflectance, Signal, Near-Infrared) and six pseudo-color combinations derived by mapping these channels to RGB representations. Each model’s performance is quantified using Precision, Recall, mAP@50, mAP@50–95, and GPU inference latency. To facilitate systematic comparison, we define a composite Rank Score that integrates detection accuracy and real-time performance in a weighted formulation. Experimental results show that YOLOv9t consistently achieves the highest detection accuracy, while YOLOv11n provides the best trade-off between accuracy and inference speed, making it a promising candidate for real-time applications on embedded platforms. Among input modalities, pseudo-color combinations—particularly those fusing Near-Infrared, Signal, and Reflectance channels—outperformed single modalities across most configurations, achieving the highest Rank Scores and mAP metrics. Therefore, we recommend using multimodal LiDAR representations such as Combination 4 and Combination 5 to maximize detection robustness in practical deployments. All datasets, benchmarking code, and trained models are publicly available to support reproducibility and further research through our GitHub repository (a).
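
    The description mentions a weighted Rank Score combining detection accuracy and real-time performance but does not give the formula. A hypothetical sketch of such a weighted combination (the weights, latency budget, and normalization are assumptions for illustration, not the paper's definition):

```python
# Hypothetical composite rank score combining detection accuracy
# (mAP@50-95) with inference latency. The paper defines a weighted
# formulation; the exact weights and latency normalization here are
# assumptions, chosen only to illustrate the accuracy/speed trade-off.
def rank_score(map50_95: float, latency_ms: float,
               w_acc: float = 0.7, w_speed: float = 0.3,
               latency_budget_ms: float = 50.0) -> float:
    # Speed term: 1.0 at zero latency, 0.0 at or beyond the budget.
    speed_term = max(0.0, 1.0 - latency_ms / latency_budget_ms)
    return w_acc * map50_95 + w_speed * speed_term

# A faster model with slightly lower mAP can still rank higher.
print(rank_score(0.62, 10.0) > rank_score(0.65, 45.0))
```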

    References:

    • [1] Durga Prasad Bavirisetti, Gabriel Hanssen Kiss, Petter Arnesen, Hanne Seter, Shaira Tabassum, and Frank Lindseth. Snowpole detection: A comprehensive dataset for detection and localization using lidar imaging in Nordic winter conditions. Data in Brief, 59:111403, 2025.
    • [2] Durga Prasad Bavirisetti, Gabriel Hanssen Kiss, and Frank Lindseth. A pole detection and geospatial localization framework using lidar-GNSS data fusion. In 2024 27th International Conference on Information Fusion (FUSION), pages 1–8. IEEE, 2024.
    • (a) https://github.com/MuhammadIbneRafiq/Extended-evaluation-snowpole-lidar-dataset

  7. Shaded Relief from LiDAR (Image Service)

    • gis.data.mass.gov
    • geo-massdot.opendata.arcgis.com
    Updated Nov 23, 2021
    + more versions
    Cite
    MassGIS - Bureau of Geographic Information (2021). Shaded Relief from LiDAR (Image Service) [Dataset]. https://gis.data.mass.gov/datasets/7377a612845a493c9987216a67a9919c
    Explore at:
    Dataset updated
    Nov 23, 2021
    Dataset authored and provided by
    MassGIS - Bureau of Geographic Information
    Area covered
    Description

    This shaded relief image was generated from the lidar-based bare-earth digital elevation model (DEM). A shaded relief image provides an illustration of variations in elevation using artificial shadows. Based on a specified position of the sun, areas that would be in sunlight are highlighted and areas that would be in shadow are shaded. In this instance, the position of the sun was assumed to be 45 degrees above the northwest horizon. The shaded relief image shows areas that are not in direct sunlight as shadowed; it does not show shadows that would be cast by topographic features onto the surrounding surface. Using ERDAS IMAGINE, a 3x3 neighborhood around each pixel in the DEM was analyzed, and a comparison was made between the sun's position and the angle that each pixel faces. The pixel was then assigned a value between -1 and +1 to represent the amount of light reflected. Negative numbers and zero values represent shadowed areas, and positive numbers represent sunny areas. In ArcGIS Desktop 10.7.1, the image was converted to a JPEG 2000 format with values from 0 (black) to 255 (white). See the MassGIS datalayer page to download the data as a JPEG 2000 image file. View this service in the Massachusetts Elevation Finder. MassGIS has also published a Lidar Shaded Relief tile service (cache) hosted in ArcGIS Online.
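
    The hillshade procedure described above (3x3 neighborhood, sun 45 degrees above the northwest horizon, reflectance in [-1, +1] rescaled to 0-255) can be sketched in Python. The finite-difference normal estimate and the clamping of shadowed values to 0 are assumptions, not ERDAS IMAGINE's exact algorithm, and the sample grid is hypothetical:

```python
import math

# Sketch of a shaded-relief computation: for each interior DEM cell,
# estimate the surface normal from its 3x3 neighborhood, dot it with the
# sun direction (azimuth 315 deg = northwest, altitude 45 deg, as in the
# description), and rescale the [-1, 1] reflectance to [0, 255].
def hillshade(dem, cellsize=1.0, azimuth_deg=315.0, altitude_deg=45.0):
    az, alt = math.radians(azimuth_deg), math.radians(altitude_deg)
    sun = (math.sin(az) * math.cos(alt),   # x (east)
           math.cos(az) * math.cos(alt),   # y (north)
           math.sin(alt))                  # z (up)
    rows, cols = len(dem), len(dem[0])
    out = [[0] * cols for _ in range(rows)]
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            dzdx = (dem[i][j + 1] - dem[i][j - 1]) / (2 * cellsize)
            dzdy = (dem[i - 1][j] - dem[i + 1][j]) / (2 * cellsize)
            norm = math.sqrt(dzdx * dzdx + dzdy * dzdy + 1.0)
            n = (-dzdx / norm, -dzdy / norm, 1.0 / norm)
            refl = sum(a * b for a, b in zip(sun, n))  # in [-1, 1]
            out[i][j] = round(255 * max(0.0, refl))    # shadowed -> 0
    return out

flat = [[10.0] * 3 for _ in range(3)]
print(hillshade(flat)[1][1])  # flat ground reflects sin(altitude)
```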

  8. Open Topographic Lidar Data - Dataset - data.gov.ie

    • data.gov.ie
    Updated Oct 22, 2021
    Cite
    data.gov.ie (2021). Open Topographic Lidar Data - Dataset - data.gov.ie [Dataset]. https://data.gov.ie/dataset/open-topographic-lidar-data
    Explore at:
    Dataset updated
    Oct 22, 2021
    Dataset provided by
    data.gov.ie
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This data was collected by the Geological Survey Ireland, the Department of Culture, Heritage and the Gaeltacht, the Discovery Programme, the Heritage Council, Transport Infrastructure Ireland, New York University, the Office of Public Works and Westmeath County Council. All data formats are provided as GeoTIFF rasters but are at different resolutions. Data resolution varies depending on survey requirements. Resolutions for each organisation are as follows:

    • GSI – 1m
    • DCHG/DP/HC – 0.13m, 0.14m, 1m
    • NY – 1m
    • TII – 2m
    • OPW – 2m
    • WMCC – 0.25m

    Both a DTM and a DSM are raster data. Raster data is another name for gridded data. Raster data stores information in pixels (grid cells). Each raster grid makes up a matrix of cells (or pixels) organised into rows and columns. The grid cell size varies depending on the organisation that collected it. GSI data has a grid cell size of 1 meter by 1 meter. This means that each cell (pixel) represents an area of 1 meter squared.

  9. EarthScope Northern California LiDAR Project

    • wifire-data.sdsc.edu
    • catalog.data.gov
    • +1 more
    laz
    Updated Feb 3, 2022
    + more versions
    Cite
    OpenTopography (2022). EarthScope Northern California LiDAR Project [Dataset]. https://wifire-data.sdsc.edu/dataset/earthscope-northern-california-lidar-project
    Explore at:
    Available download formats: laz
    Dataset updated
    Feb 3, 2022
    Dataset provided by
    OpenTopography
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Area covered
    Northern California, California
    Description

    The EarthScope Northern California Lidar project acquired high resolution airborne laser swath mapping imagery along major active faults as part of the EarthScope Facility project funded by the National Science Foundation (NSF). Between this project and the previously conducted B4 project, also funded by NSF, the entire San Andreas fault system has now been imaged with high resolution airborne lidar, along with many other important geologic features. EarthScope is funded by NSF and conducted in partnership with the USGS and NASA. GeoEarthScope is a component of EarthScope that includes the acquisition of aerial and satellite imagery and geochronology. EarthScope is managed at UNAVCO. Please use the following language to acknowledge EarthScope Lidar: This material is based on services provided to the Plate Boundary Observatory by NCALM (http://www.ncalm.org). PBO is operated by UNAVCO for EarthScope (http://www.earthscope.org) and supported by the National Science Foundation (No. EAR-0350028 and EAR-0732947).

  10. Tas Imagery & LiDAR Program Index

    • data.gov.au
    html, unknown format
    Updated Apr 12, 2024
    + more versions
    Cite
    Land Tasmania (2024). Tas Imagery & LiDAR Program Index [Dataset]. https://data.gov.au/dataset/ds-listtas-d72ff038-1321-4b62-b634-54119ab3ce67
    Explore at:
    Available download formats: html, unknown format
    Dataset updated
    Apr 12, 2024
    Dataset provided by
    Land Tasmania
    Description

    The Tas Imagery and LiDAR Program Index shows the planning and progress of current and future capture of aerial imagery and LiDAR data procured through the Tasmanian Imagery Program.

  11. KUCL: Korea University Camera-LIDAR Dataset

    • zenodo.org
    • data.niaid.nih.gov
    zip
    Updated Jan 28, 2020
    Share
    FacebookFacebook
    TwitterTwitter
    Email
    Click to copy link
    Link copied
    Close
    Cite
    Jaehyeon Kang; Jaehyeon Kang; Nakju Lett Doh; Nakju Lett Doh (2020). KUCL: Korea University Camera-LIDAR Dataset [Dataset]. http://doi.org/10.5281/zenodo.2640062
    Explore at:
    Available download formats: zip
    Dataset updated
    Jan 28, 2020
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Jaehyeon Kang; Jaehyeon Kang; Nakju Lett Doh; Nakju Lett Doh
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Overview

    The Korea University Camera-LIDAR (KUCL) dataset contains images and point clouds acquired in indoor and outdoor environments for various applications (e.g., calibration of rigid-body transformation between camera and LIDAR) in robotics and computer vision communities.

    • Indoor dataset: contains 63 pairs of images and point clouds ('indoor.zip'). We collected the indoor dataset in a static indoor environment with walls, floor, and ceiling.
    • Outdoor dataset: 61 pairs of images and point clouds ('outdoor.zip'). We collected the outdoor dataset in an outdoor environment including buildings and trees.

    Setup

    The images were taken using a Point Grey Ladybug5 (specifications) camera and point clouds were acquired with a Velodyne VLP-16 LIDAR (specifications). We rigidly mounted both sensors on the sensor frame for the entire data acquisition. Each pair of images and point clouds was acquired discretely while the sensor system was kept stationary, to reduce time-synchronization problems.

    Description

    Each dataset (zip file) is organized as follows:

    • images/pano: This folder contains spherical panorama images (8000 X 4000) collected using the Ladybug5.
    • images/pinhole/cam0~cam5: These folders contain rectified pinhole images (2448 X 2048) collected using six cameras (cam0~cam5) of the Ladybug5.
    • images/pinhole/mask: This folder contains the mask (BW image) of each camera of the Ladybug5.
    • images/pinhole/cam_param_pinhole.txt: This file contains extrinsic (transformation from the Ladybug5 to each lens) and intrinsic (focal length and center) parameters of each lens of the Ladybug5. For details of Ladybug5 coordinate system, please refer to the technical application note.
    • scans: This folder contains point clouds collected using the VLP-16 LIDAR in text files. The first line of each file is the number of points (N), and the remaining lines are points and corresponding reflectivities (N X 4).
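
    A minimal parser for the scan file layout described above (first line holds the point count N, followed by N lines of x, y, z, reflectivity) could look like this in Python; the sample text is synthetic, not taken from the dataset:

```python
# Sketch: parse a VLP-16 scan text file as described above. The first
# line is the number of points (N); the remaining lines are points with
# their reflectivities (N x 4). The sample string below is hypothetical.
def read_scan(text):
    lines = text.strip().splitlines()
    n = int(lines[0])
    points = [tuple(map(float, line.split())) for line in lines[1:1 + n]]
    assert len(points) == n, "point count does not match header"
    return points

sample = "2\n1.0 2.0 3.0 0.5\n4.0 5.0 6.0 0.9\n"
print(read_scan(sample))
```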

    We also provide MATLAB functions projecting point cloud onto spherical panorama and pinhole images. Before running the following functions, please unzip the dataset file ('indoor.zip' or 'outdoor.zip') under the main directory.

    • run_pano_projection.m: This function projects points onto a spherical panorama image. Lines 19-20 select dataset and index of an image and a point cloud.
    • run_pinhole_projection.m: This function projects points onto a pinhole image. Lines 19-21 select dataset, index of an image and a point cloud, and pinhole camera index.
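
    As a rough Python analogue of run_pano_projection.m, an ideal equirectangular projection of a 3D point into the 8000 × 4000 panorama can be sketched as follows. This assumes identity extrinsics and an ideal spherical camera model; the dataset's actual calibration comes from cam_param_pinhole.txt and the GMM-based method described in the paper:

```python
import math

# Sketch of an equirectangular (spherical panorama) projection: map a 3D
# point in the camera frame to pixel coordinates of an 8000 x 4000
# panorama. Identity extrinsics and an ideal spherical model are
# assumed; the real Ladybug5 calibration is not reproduced here.
def project_to_pano(x, y, z, width=8000, height=4000):
    r = math.sqrt(x * x + y * y + z * z)
    lon = math.atan2(y, x)          # azimuth in (-pi, pi]
    lat = math.asin(z / r)          # elevation in [-pi/2, pi/2]
    u = (lon / (2 * math.pi) + 0.5) * width
    v = (0.5 - lat / math.pi) * height
    return u, v

# A point straight ahead on the horizon lands at the panorama centre.
print(project_to_pano(1.0, 0.0, 0.0))
```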

    The rigid-body transformation between the Ladybug5 and the VLP-16 in each function is acquired using our edge-based Camera-LIDAR calibration method with Gaussian Mixture Model (GMM). For the details, please refer to our paper (https://doi.org/10.1002/rob.21893).

    Citation

    Please cite the following paper when using this dataset in your work.

    • Jaehyeon Kang and Nakju L. Doh, "Automatic Targetless Camera-LIDAR Calibration by Aligning Edge with Gaussian Mixture Model," Journal of Field Robotics, vol. 37, no. 1, pp.158-179, 2020.
    • @ARTICLE {kang-2020-jfr,
      AUTHOR = {Jaehyeon Kang and Nakju Lett Doh},
      TITLE = {Automatic Targetless Camera–{LIDAR} Calibration by Aligning Edge with {Gaussian} Mixture Model},
      JOURNAL = {Journal of Field Robotics},
      YEAR = {2020},
      VOLUME = {37},
      NUMBER = {1},
      PAGES = {158--179},
      }

    License information

    The KUCL dataset is released under a Creative Commons Attribution 4.0 International License, CC BY 4.0

    Contact Information

    If you have any issues with the KUCL dataset, please contact us at kangjae07@gmail.com.

  12. Thermal infrared imaging and co-acquired lidar for portions of Lake, Harney,...

    • data.wu.ac.at
    • datadiscoverystudio.org
    Updated Dec 4, 2017
    Cite
    (2017). Thermal infrared imaging and co-acquired lidar for portions of Lake, Harney, and Malheur Counties, Oregon [Dataset]. https://data.wu.ac.at/schema/geothermaldata_org/MGVlNzJhOTktMjlkOC00ODczLWI5NjAtNDMwNWRkZTYwYzEw
    Explore at:
    Dataset updated
    Dec 4, 2017
    Area covered
    Description

    ARRA Supplemental Deliverable, Task 8: Digital Data Series Thermal Infrared Information Layer for Oregon (contains all or portions of 42120H6-Ana River; 43120C5-Christmas Lake; 43120C6-Crack In The Ground; 42120H5-Diablo Peak; 43118B1-Dowell Butte; 43118B2-Duck Creek Butte; 43120A7-Egli Rim; 43120B5-Fandango Canyon; 43118A3-Folly Farm; 43121C1-Fort Rock; 43120C4-Fossil Lake; 43121A1-Hager Mountain; 43118A3-Lambing Canyon; 43121C2-McCarty Butte; 43121B2-Oatman Flat; 43120A5-Saint Patrick Mountain; 43120C3-Sand Rock; 43120A6-Sheeplick Draw; 43121B2-Silver Lake; 42120H7-Summer Lake; 43120B8-Tuff Butte; 43118A1-Turnbull Peak), Release-1 (TIRILO-1) contains thermal infrared intensity images, rectified image frames, native image frames, and thermal infrared mosaics. Lidar ASCII point data are available in LAS format; DEMs are provided as bare-earth and highest-hit; and accompanying metadata is included. Other files include Shp files of 7.5 minute USGS quadrangles of Oregon and 1/100th USGS quadrangles of Oregon. All data are format specific to ESRI format - data must be viewed using specialty software capable of viewing .shp, geotif, and ESRI grid formats. Customers are responsible for sending DOGAMI a blank 400 GB portable external hard drive (USB 2.0 interface). Fee $150.

  13. 2018 lidar-derived imagery of karst areas in Puerto Rico at 1-meter...

    • catalog.data.gov
    • data.usgs.gov
    • +1 more
    Updated Jul 6, 2024
    + more versions
    Cite
    U.S. Geological Survey (2024). 2018 lidar-derived imagery of karst areas in Puerto Rico at 1-meter resolution [Dataset]. https://catalog.data.gov/dataset/2018-lidar-derived-imagery-of-karst-areas-in-puerto-rico-at-1-meter-resolution
    Explore at:
    Dataset updated
    Jul 6, 2024
    Dataset provided by
    United States Geological Survey (http://www.usgs.gov/)
    Area covered
    Puerto Rico
    Description

    This raster dataset contains 1-meter lidar-derived imagery of 7.5 minute quadrangles in karst areas of Puerto Rico and was created using geographic information systems (GIS) software. Lidar-derived elevation data, acquired in 2018, were used to create a 1-meter resolution working digital elevation model (DEM). To create this imagery, a hillshade was applied and a topographic position index (TPI) raster was calculated. These two rasters were uploaded into GlobalMapper, where the TPI raster was made partially transparent and overlaid the hillshade DEM. The resulting image was exported to create these 1-meter resolution lidar-derived images. The data is projected in North America Datum (NAD) 1983 (2011) UTM Zone 19N.
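
    The TPI step described above is each cell's elevation minus the mean elevation of its surrounding window. A minimal Python sketch (the 3x3 window and the sample grid are assumptions; the description does not state the window size USGS used):

```python
# Sketch of a topographic position index (TPI) computation: each cell's
# elevation minus the mean of its neighborhood. A 3x3 window is assumed
# here, and the elevation grid is hypothetical.
def tpi(dem):
    rows, cols = len(dem), len(dem[0])
    out = [[0.0] * cols for _ in range(rows)]
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            neigh = [dem[r][c]
                     for r in (i - 1, i, i + 1)
                     for c in (j - 1, j, j + 1)
                     if (r, c) != (i, j)]
            out[i][j] = dem[i][j] - sum(neigh) / len(neigh)
    return out

# A cell lower than its surroundings (e.g. a karst sinkhole) gets a
# negative TPI, which is what the transparent TPI overlay highlights.
dem = [[10, 10, 10], [10, 4, 10], [10, 10, 10]]
print(tpi(dem)[1][1])
```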

  14. LiDAR Index External (DWER-045) - Datasets - data.wa.gov.au

    • catalogue.data.wa.gov.au
    Updated Jan 23, 2018
    Cite
    (2018). LiDAR Index External (DWER-045) - Datasets - data.wa.gov.au [Dataset]. https://catalogue.data.wa.gov.au/dataset/lidar-index-external
    Explore at:
    Dataset updated
    Jan 23, 2018
    Area covered
    Western Australia
    Description

    The LiDAR Index was created to illustrate the extents of LiDAR imagery and data currently existing, in progress, or planned for the Department of Water and Environmental Regulation (DWER). Each area is delineated by a polygon with attributes denoting its general area coverage, status, file location, contractor and availability of metadata. Various datasets exist with varying degrees of accuracy, coverage and access. DWER custodial datasets can be purchased by external entities by contacting the Department of Water and Environmental Regulation.

  15. DurLAR Dataset

    • paperswithcode.com
    + more versions
    Cite
    DurLAR Dataset [Dataset]. https://paperswithcode.com/dataset/durlar
    Explore at:
    Description

    DurLAR is a high-fidelity 128-channel 3D LiDAR dataset with panoramic ambient (near infrared) and reflectivity imagery for multi-modal autonomous driving applications. Compared to existing autonomous driving task datasets, DurLAR has the following novel features:

    • High vertical resolution LiDAR with 128 channels, twice that of any existing dataset, with full 360-degree depth and range accuracy of ±2 cm at 20-50 m.
    • Ambient illumination (near infrared) and reflectivity panoramic imagery in the Mono16 format (2048 × 128 resolution), with this being the only dataset to make this provision.
    • No rolling shutter effect, as our flash LiDAR captures all 128 channels simultaneously.
    • Ambient illumination data recorded via an on-board lux meter, which is again not available in previous datasets.
    • High-fidelity GNSS/INS available via an onboard OxTS navigation unit operating at 100 Hz, receiving position and timing data from multiple GNSS constellations in addition to GPS.
    • KITTI data format adopted as the de facto dataset format, so the data can be parsed using both the DurLAR development kit and existing KITTI-compatible tools.
    • Diversity over repeated locations: the dataset was collected under diverse environmental and weather conditions over the same driving route, with additional variations in the time of day relative to environmental conditions.

    Sensor placement

    LiDAR: Ouster OS1-128 LiDAR sensor with 128-channel vertical resolution

    Stereo Camera: Carnegie Robotics MultiSense S21 stereo camera with grayscale, colour, and IR enhanced imagers, 2048x1088 @ 2MP resolution

    GNSS/INS: OxTS RT3000v3 global navigation satellite and inertial navigation system, supporting localization from GPS, GLONASS, BeiDou, Galileo, PPP and SBAS constellations

    Lux Meter: Yocto Light V3, a USB ambient light sensor (lux meter), measuring ambient light up to 100,000 lux
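
    The ambient and reflectivity panoramas described above are Mono16 (single-channel 16-bit) images at 2048 × 128. A minimal sketch of scaling such a frame down to 8 bits for display (the array here is a synthetic stand-in; real frames would be loaded from the dataset's image files):

```python
import numpy as np

# Synthetic Mono16 panorama: 128 rows (one per LiDAR channel) x 2048 columns, uint16.
rng = np.random.default_rng(0)
mono16 = rng.integers(0, 2**16, size=(128, 2048), dtype=np.uint16)

# Drop the low byte to map the 16-bit intensity range onto 8 bits for display.
mono8 = (mono16 >> 8).astype(np.uint8)
print(mono8.shape, mono8.dtype)  # (128, 2048) uint8
```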

  16. NCAR REAL Lidar Imagery [NCAR/EOL]

    • data.ucar.edu
    image
    Updated Dec 26, 2024
    Cite
    UCAR/NCAR - Earth Observing Laboratory (2024). NCAR REAL Lidar Imagery [NCAR/EOL] [Dataset]. http://doi.org/10.5065/D6TT4P9G
    Explore at:
    imageAvailable download formats
    Dataset updated
    Dec 26, 2024
    Dataset provided by
    University Corporation for Atmospheric Research
    Authors
    UCAR/NCAR - Earth Observing Laboratory
    Time period covered
    Mar 14, 2006 - May 1, 2006
    Area covered
    Description

    This dataset contains PNG images from the NCAR REAL Lidar Imagery dataset. It covers the TREX period from 20060314 to 20060501 and contains both RHI and PPI images. When ordering or browsing data, be aware of the following gap: there is no data for 20060424 through 20060428.
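
    When scripting downloads over the TREX period, the gap days can be skipped programmatically. A minimal sketch using the YYYYMMDD date stamps from the description above (everything else, including how files are actually named on the server, is an assumption):

```python
from datetime import date, timedelta

start, end = date(2006, 3, 14), date(2006, 5, 1)
gap = {date(2006, 4, d) for d in range(24, 29)}  # 20060424-28: no data

days = []
d = start
while d <= end:
    if d not in gap:
        days.append(d.strftime("%Y%m%d"))
    d += timedelta(days=1)

print(len(days), days[0], days[-1])  # 44 20060314 20060501
```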

  17. 2023 USGS Lidar: San Francisco, CA

    • fisheries.noaa.gov
    las/laz - laser +1
    Updated Jan 1, 2024
    Cite
    OCM Partners (2024). 2023 USGS Lidar: San Francisco, CA [Dataset]. https://www.fisheries.noaa.gov/inport/item/73386
    Explore at:
    las/laz - laser, not applicableAvailable download formats
    Dataset updated
    Jan 1, 2024
    Dataset provided by
    OCM Partners, LLC
    Time period covered
    Apr 20, 2023
    Area covered
    Description

    Original Product: These lidar data are processed Classified LAS 1.4 files, formatted into 654 individual 1000 m x 1000 m tiles; they were used to create intensity images, 3D breaklines, and hydro-flattened DEMs as necessary.

    Original Dataset Geographic Extent: 4 counties (Alameda, Marin, San Francisco, San Mateo) in California, covering approximately 53 total square miles.

    Original Dataset Descriptio...
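
    Deliveries tiled on a regular 1000 m grid like this are commonly indexed by each tile's lower-left corner. A minimal sketch of mapping a projected coordinate to its tile (the grid anchored at the projection origin and the sample coordinates are assumptions, not this project's actual tiling scheme):

```python
def tile_origin(x, y, tile_size=1000.0):
    """Lower-left corner of the tile containing (x, y), for a grid anchored at (0, 0)."""
    return (x // tile_size * tile_size, y // tile_size * tile_size)

# Hypothetical UTM-style coordinate, snapped to its 1000 m tile.
print(tile_origin(551234.7, 4179876.2))  # (551000.0, 4179000.0)
```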

  18. Data from: Table S1 LiDAR and Imagery Metadata

    • scholarscommons.fgcu.edu
    • figshare.com
    Updated Mar 29, 2021
    Cite
    Matthew Ware (2021). Table S1 LiDAR and Imagery Metadata [Dataset]. https://scholarscommons.fgcu.edu/esploro/outputs/dataset/Table-S1-LiDAR-and-Imagery-Metadata/99384088493106570
    Explore at:
    Dataset updated
    Mar 29, 2021
    Dataset provided by
    Figsharehttp://figshare.com/
    Authors
    Matthew Ware
    License

    Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Time period covered
    Mar 29, 2021
    Description

    Supplementary Table. Metadata for the 23 LiDAR surveys used to create a temporally- and spatially-averaged digital elevation model of nesting beaches in the Florida Panhandle. Used in Ware et al. (2021), Exposure of loggerhead sea turtle nests to waves in the Florida Panhandle.
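
    Averaging a stack of co-registered DEMs cell-by-cell, ignoring cells a given survey did not cover, can be sketched with NumPy (the tiny arrays here are synthetic stand-ins for the 23 survey rasters, with NaN marking no-data; this is an illustration of the general technique, not the paper's processing chain):

```python
import numpy as np

# Three synthetic co-registered DEM rasters; NaN marks cells a survey did not cover.
surveys = np.array([
    [[1.0, 2.0], [np.nan, 4.0]],
    [[3.0, np.nan], [5.0, 4.0]],
    [[2.0, 2.0], [7.0, np.nan]],
])

# Per-cell mean over only the surveys that actually observed each cell.
averaged = np.nanmean(surveys, axis=0)
print(averaged)  # [[2. 2.] [6. 4.]]
```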

  19. LIDAR 2017 TopoBath Intensity Image

    • hub.arcgis.com
    Updated Mar 13, 2019
    Cite
    NYC DCP Mapping Portal (2019). LIDAR 2017 TopoBath Intensity Image [Dataset]. https://hub.arcgis.com/documents/7c82be3a1d4d48158e4c6327e504dc93
    Explore at:
    Dataset updated
    Mar 13, 2019
    Dataset authored and provided by
    NYC DCP Mapping Portal
    Description

    The intensity values of the Green and NIR LiDAR laser returns from the New York City Topobathymetric LiDAR dataset. Collected between 05/03/17 and 07/26/17.

  20. World Bank - ImageCat Inc. - RIT Haiti Earthquake Lidar dataset

    • catalog.data.gov
    • cloud.csiss.gmu.edu
    • +3more
    Updated Nov 12, 2020
    + more versions
    Cite
    Rochester Institute of Technology, Center for Imaging Science (Originator); Rochester Institute of Technology, Information Products Laboratory for Emergency Response (Originator); Global Facility for Disaster Reduction and Recovery (Originator); ImageCat, Inc. (Originator); Kucera International, Inc. (Originator) (2020). World Bank - ImageCat Inc. - RIT Haiti Earthquake Lidar dataset [Dataset]. https://catalog.data.gov/dataset/world-bank-imagecat-inc-rit-haiti-earthquake-lidar-dataset
    Explore at:
    Dataset updated
    Nov 12, 2020
    Dataset provided by
    Global Facility for Disaster Reduction and Recoveryhttp://www.gfdrr.org/
    Area covered
    Haiti
    Description

    These lidar data were collected between January 21st and January 27th, 2010, in response to the January 12th magnitude 7.0 Haiti earthquake. The data collection was performed by the Center for Imaging Science at Rochester Institute of Technology (RIT) and Kucera International under sub-contract to ImageCat, Inc., and funded by the Global Facility for Disaster Reduction and Recovery (GFDRR) hosted at the World Bank. All data are available in the public domain. More information about these data can be found at the RIT Information Products Laboratory for Emergency Response (IPLER) 2010 Haiti Earthquake page.
