U.S. Government Works https://www.usa.gov/government-works
License information was derived automatically
The dataset contains GPS survey data of ground control points deployed for aerial LiDAR data collection. The targets were deployed and surveyed on 8/12/2020, and the LiDAR flight was conducted on the morning of 8/13/2020. The control points were surveyed with Leica GS14 RTK GPS equipment (S/N rover-2806883), using the Leica Smartnet Real Time Network to determine absolute position.
Small Uncrewed Aircraft Systems (sUAS) were used to collect aerial remote sensing data over Town Neck Beach, Massachusetts. The area is a highly trafficked public beach with a parking lot, boardwalk, and renourishment and dune stabilization plan. On December 19th, 2023, after a recent placement of dredged sand and dune grass plug plantings, USGS personnel collected natural (RGB) color images, lidar, check points, and ground control points. These data were processed to produce a high-resolution lidar point cloud, digital elevation models, and a natural-color orthomosaic. Data are related to USGS Field activity 2023-028-FA and support observations of coastal change and lidar data testing.
https://www.promarketreports.com/privacy-policy
The global Mobile Mapping Systems market is experiencing robust growth, projected to reach $20,740 million in 2025 and maintain a Compound Annual Growth Rate (CAGR) of 16.0% from 2025 to 2033. This expansion is driven by several key factors. The increasing adoption of autonomous vehicles and the need for highly accurate and detailed maps for navigation and advanced driver-assistance systems (ADAS) are significantly fueling market demand. Furthermore, the growth of smart cities initiatives, requiring comprehensive infrastructure mapping for efficient urban planning and management, is a major contributor. Government and public sector investments in infrastructure projects, coupled with rising demand for location-based services across various sectors like transportation and logistics, real estate, and video entertainment, are also boosting market growth. The shift towards cloud-based solutions and the integration of advanced technologies like LiDAR and GPS are further enhancing the capabilities and efficiency of mobile mapping systems, attracting broader adoption. The market is segmented by system type (Direct Mobile Mapping System and Backpack Mobile Mapping System) and application (Automobile, Transportation & Logistics, Government & Public Sector, Video Entertainment, Real Estate, Travel & Hospitality, and Other). While the Automobile sector currently holds a significant market share, the Government & Public Sector and Transportation & Logistics segments are expected to witness substantial growth due to increasing infrastructure development and the need for efficient logistics management. Competition in the market is intense, with major players including Ericsson, Microsoft, Apple, Google, and TomTom continuously innovating and expanding their product offerings to cater to the evolving demands of various industries. 
The market's geographical distribution is diverse, with North America and Europe currently leading in adoption, followed by the Asia-Pacific region, which is expected to demonstrate significant growth potential in the coming years driven by economic development and increasing urbanization. This comprehensive report analyzes the burgeoning Mobile Mapping Systems (MMS) market, projected to reach $15 billion by 2030. It delves into key trends, competitive landscapes, and growth drivers, providing invaluable insights for businesses and investors alike. The report leverages extensive market research and data analysis to provide actionable intelligence on this rapidly evolving technology. Keywords: Mobile Mapping, LiDAR, 3D Mapping, GIS, Location-Based Services, Autonomous Vehicles, Mapping Technology, Geospatial Data.
Small Uncrewed Aircraft Systems (sUAS) were used to collect aerial remote sensing data over Marsh Island, a salt marsh restoration site along New Bedford Harbor, Massachusetts. Remediation of the site will involve direct hydrological and geochemical monitoring of the system alongside the UAS remote sensing data. On October 26th, 2023, USGS personnel collected natural (RGB) color images, multispectral images, lidar, and ground control points. These data were processed to produce a high resolution lidar point cloud (LPC), digital elevation models (surface and terrain), and natural-color and multispectral reflectance image mosaics. Data collection is related to USGS Field Activity 2023-025-FA and this release only provides the UAS portion.
Attribution-NonCommercial-NoDerivs 4.0 (CC BY-NC-ND 4.0) https://creativecommons.org/licenses/by-nc-nd/4.0/
License information was derived automatically
Adverse weather conditions, low-light environments, and bumpy road surfaces pose significant challenges to SLAM in robotic navigation and autonomous driving. Existing datasets in this field predominantly rely on single sensors or combinations of LiDAR, cameras, and IMUs. However, 4D millimeter-wave radar demonstrates robustness in adverse weather, infrared cameras excel at capturing detail under low-light conditions, and depth images provide richer spatial information. Multi-sensor fusion methods also show potential for better adaptation to bumpy roads. Despite some SLAM studies incorporating these sensors and conditions, there remains a lack of comprehensive datasets addressing low-light environments and bumpy road conditions, or featuring a sufficiently diverse range of sensor data. In this study, we introduce a multi-sensor dataset covering challenging scenarios such as snowy weather, rainy weather, nighttime conditions, speed bumps, and rough terrain. The dataset includes rarely utilized sensors suited to extreme conditions, such as 4D millimeter-wave radar, infrared cameras, and depth cameras, alongside 3D LiDAR, RGB cameras, GPS, and IMU. It supports both autonomous driving and ground robot applications and provides reliable GPS/INS ground truth data, covering structured and semi-structured terrains. We evaluated various SLAM algorithms using this dataset, including methods based on RGB images, infrared images, depth images, LiDAR, and 4D millimeter-wave radar. The dataset spans a total of 18.5 km, 69 minutes, and approximately 660 GB, offering a valuable resource for advancing SLAM research under complex and extreme conditions.
This dataset provides information needed to reproduce a digital model of the Nogahabara Dune Field located in interior Alaska. The Nogahabara Dunes represent one of three active inland dune fields found in Alaska today. In an effort to update geospatial coverage of the dunes, lidar data were collected over the Nogahabara Sand Dunes in September 2015 using a 1955 Cessna 180 aircraft equipped with a Riegl LMS-Q240i laser scanner. The scanner was set to 10,000 laser shots per second with a +/- 30 degree beam sweep and flown over the Koyukuk National Wildlife Refuge's Nogahabara sand dunes. The flight pattern was designed for 50 percent overlap between adjacent swaths to achieve 2 points per square meter over the entire coverage. The lidar scanner was rigidly attached to an OXTS Inertial+2 GPS/IMU unit, which was fed by a Trimble R7 GPS receiver. The resultant survey achieved 1.4 points per square meter. Discrete-return point cloud data are available in the LAS format.
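The nominal 2 points per square meter target can be sanity-checked from the acquisition parameters. A minimal sketch: the flying height (200 m) and ground speed (45 m/s) below are hypothetical illustration values, not figures stated in the survey description:

```python
import math

def nominal_point_density(pulse_rate_hz, half_angle_deg, altitude_m,
                          ground_speed_mps, overlap_fraction):
    """Estimate nominal lidar point density (points per square meter).

    Computes the swath width for a +/- half_angle scanner at a given
    altitude, then divides pulse rate by the area swept per second,
    scaling for sidelap (50% overlap roughly doubles coverage).
    """
    swath_m = 2.0 * altitude_m * math.tan(math.radians(half_angle_deg))
    area_rate_m2_per_s = ground_speed_mps * swath_m
    single_pass = pulse_rate_hz / area_rate_m2_per_s
    return single_pass / (1.0 - overlap_fraction)

# Hypothetical flight parameters chosen to illustrate the ~2 pts/m^2 target:
density = nominal_point_density(10_000, 30.0, 200.0, 45.0, 0.5)
```

The achieved 1.4 points per square meter falling short of the nominal figure is typical once pulse dropouts and imperfect overlap are accounted for.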
LIDAR data is remotely sensed high-resolution elevation data collected by an airborne collection platform. Using a combination of laser rangefinding, GPS positioning, and inertial measurement technologies, LIDAR instruments are able to make highly detailed Digital Elevation Models (DEMs) of the earth's terrain, man-made structures, and vegetation. This data was collected over a portion of Maui and Oahu, Hawaii, with a Leica ALS-40 Aerial Lidar Sensor. Multiple returns were recorded for each pulse in addition to an intensity value. Original contact information: Contact Org: NOAA Office for Coastal Management Phone: 843-740-1202 Email: coastal.info@noaa.gov
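The laser rangefinding mentioned above converts the round-trip travel time of each pulse into a distance; since light travels out and back, the one-way range is half the path. A minimal sketch (the function name is illustrative, not from any particular library):

```python
C = 299_792_458.0  # speed of light in m/s

def pulse_range_m(round_trip_time_s):
    """Range to target from pulse round-trip time: halve the two-way path."""
    return C * round_trip_time_s / 2.0

# A return arriving 2 microseconds after emission is roughly 300 m away:
r = pulse_range_m(2e-6)
```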
Attribution-NonCommercial-ShareAlike 4.0 (CC BY-NC-SA 4.0) https://creativecommons.org/licenses/by-nc-sa/4.0/
License information was derived automatically
DTU Roadrunners Dataset

This dataset contains data from a Velodyne VLP-16 Puck lidar, a ZED 2 stereo camera, a Swift Navigation Piksi Multi RTK GPS, and odometry, captured on the DTU Roadrunners Dynamo Ecocar. The data was captured at the race track at Roskilde Racing Center (RRC) in March - July 2020 and at the DTU Autonomous Vehicle Test Track in May 2020. For more information about the DTU Roadrunners project visit http://ecocar.dk/

Rosbag Content
/clock - Time
/velodyne_packets - Raw LIDAR point cloud from Velodyne VLP-16 Puck LIDAR
/gyro_angle - Raw angle from gyroscope used for odometry
/car_pose_estimate - Odometry with fields:
msg_data->data[0] // x
msg_data->data[1] // y
msg_data->data[2] // z
msg_data->data[3] // orientation
msg_data->data[4] // speed
msg_data->data[5] // acceleration
msg_data->data[6] // driven_distance

SVO File
Stereolabs ZED 2 stereo camera data with IMU. Images can be viewed using ZED Explorer, extracted with SVO Export, or published to ROS using the ROS wrapper. See https://www.stereolabs.com/docs/installation/ for more information.

GPS CSV File
Baseline, Position, and Velocity. The "Flags" column has the following definition:
0 Invalid
1 Single Point Position (SPP)
2 Differential GNSS (DGNSS)
3 Float RTK
4 Fixed RTK
5 Undefined
6 SBAS Position

Conversion of /velodyne_packets into sensor_msgs/PointCloud2
Install velodyne drivers:
$ sudo apt-get install ros-melodic-velodyne
Launch cloud nodelet:
$ roslaunch cloud_nodelet_conversion.launch
Play bag file:
$ rosbag play X.bag
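When post-processing the GPS CSV file, the "Flags" column can be decoded with a simple lookup built from the table above. A minimal sketch (the function and dict names are illustrative, not part of any released tooling for this dataset):

```python
# Fix-mode labels for the GPS CSV "Flags" column, as documented above.
PIKSI_FIX_FLAGS = {
    0: "Invalid",
    1: "Single Point Position (SPP)",
    2: "Differential GNSS (DGNSS)",
    3: "Float RTK",
    4: "Fixed RTK",
    5: "Undefined",
    6: "SBAS Position",
}

def decode_fix_flag(flag):
    """Map an integer fix flag to its human-readable label."""
    return PIKSI_FIX_FLAGS.get(flag, "Unknown")

label = decode_fix_flag(4)  # "Fixed RTK"
```

Filtering rows to flag 4 (Fixed RTK) keeps only the centimeter-grade solutions when using the GPS track as a reference trajectory.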
This part of the data release presents topography data from northern Monterey Bay, California collected in September 2017 with a terrestrial lidar scanner.
This is collection level metadata for LAS and ASCII data files from the statewide Iowa Lidar Project. The Iowa Light Detection and Ranging (LiDAR) Project collects location and elevation (X, Y, Z) data to a set standard for the entire state of Iowa. LIDAR is defined as an airborne laser system, flown aboard rotary or fixed-wing aircraft, that is used to acquire x, y, and z coordinates of terrain and terrain features that are both manmade and naturally occurring. LIDAR systems consist of a light-emitting scanning laser, an airborne Global Positioning System (GPS) with attendant GPS base station(s), and an Inertial Measuring Unit (IMU). The laser scanning system measures ranges from the scanning laser to terrain surfaces by measuring the time it takes for the emitted light (LIDAR return) to reach the earth's surface and reflect back to the onboard LIDAR detector. The airborne GPS system ascertains the in-flight three-dimensional position of the sensor, and the IMU delivers precise information about the attitude of the sensor. The LIDAR system incorporates data from these three subsystems to produce a large cloud of points on the land surface whose X, Y, and Z coordinates are known within the specified accuracy. This collection consists of ASCII files of bare earth elevations and intensity (x,y,z,i) and, LAS (version 1.0 lidar data interchange standard) binary files that include all 1st and last returns, intensity and bare earth classification.
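The bare-earth ASCII files described above store one x,y,z,i record per line; a minimal parser sketch, assuming comma-delimited values (the actual delimiter and sample coordinates below are assumptions, not taken from the Iowa files):

```python
def parse_xyzi_line(line):
    """Parse one 'x,y,z,i' record into (x, y, z, intensity) floats."""
    x, y, z, i = (float(v) for v in line.strip().split(","))
    return x, y, z, i

# Hypothetical record: easting, northing, elevation, intensity
point = parse_xyzi_line("448734.21,4621056.87,312.44,118")
```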
This data product offers lidar, radar, camera, GPS, IMU, and odometer data collected with a historical ground truth system.
Attribution 3.0 (CC BY 3.0) https://creativecommons.org/licenses/by/3.0/
License information was derived automatically
Ayres Associates provided Wood County, Wisconsin, with lidar-based topographic mapping services in the spring of 2015 as part of WROC. The LiDAR data was collected from 3/21/2015 to 3/31/2015 using an Optech Orion H300 sensor mounted in a fixed-wing aircraft. LiDAR data was collected to support the generation of 2-foot contours to meet FEMA vertical accuracy standards. The LiDAR data was delivered according to a 5,000 foot x 5,000 foot tile schematic. The LiDAR data was calibrated using information collected at the time of flight from GPS base stations on the ground and airborne GPS/IMU in the aircraft. The calibrated LiDAR data was processed to produce a classified point cloud, bare earth DTM, DEM, DSM, contours, breaklines, and intensity images.

Hydrographic breaklines are collected using LiDARgrammetry to ensure hydroflattened water surfaces. This process involves manipulating the LiDAR data's intensity information to create a metrically sound stereo environment. From this generated "imagery", breaklines are photogrammetrically compiled. Breakline polygons are created to represent open water bodies. The LiDAR points that fall within these areas are classified as "water." Breaklines representing streams and rivers are smooth, continuous, and monotonic, and represent the water surface without any stair steps except at dams and rapids. All hydrographic breaklines include a 1-foot buffer, with the points being re-classified as Class 10 (ignored ground). TerraSolid is further used for the subsequent manual classification of the LiDAR points, allowing technicians to view the point cloud in a number of ways to ensure accuracy and consistency of points and uniformity of point coverage. The 2014 breaklines dataset contains the hydrographic breaklines necessary for terrain surface development.
These lidar point clouds and images cover, in high detail, the terrain at Great Sippewissett Marsh, Cape Cod, MA on November 2nd, 2022. USGS researchers tested different sensors that collected lidar and images for photogrammetry point cloud data using Uncrewed Aerial Systems (UAS) to look at differences in coverage and elevation accuracy. The lidar data were acquired with a YellowScan Mapper lidar scanner, which consists of the Livox Horizon scanner and an Applanix 15 inertial measurement unit, and a YellowScan VX20-100 lidar scanner, which consists of the Riegl miniVUX-1UAV scanner and an Applanix 20 inertial measurement unit. The YellowScan Mapper's Sony UMC-R10C camera and a Ricoh GRII camera were used to take photos for structure-from-motion processing and to compare point clouds. The lidar data were post-processed against a Trimble R8s base station. Smart AeroPoint ground control points (GCPs) and ground-truthing GPS points were used for vertical validation.
https://www.marketreportanalytics.com/privacy-policy
The Asia-Pacific (APAC) LiDAR market is experiencing robust growth, driven by increasing adoption across diverse sectors. The region's burgeoning automotive industry, particularly in China, India, and Japan, is a key driver, with manufacturers integrating LiDAR technology into advanced driver-assistance systems (ADAS) and autonomous vehicles. Furthermore, the expansion of smart infrastructure initiatives, including the development of smart cities and precision agriculture, is fueling demand for high-resolution mapping and surveying solutions offered by LiDAR systems. Government investments in infrastructure projects and supportive regulatory environments across several APAC nations are also contributing to market expansion. While the high initial cost of LiDAR technology and a lack of skilled professionals for installation and maintenance present challenges, the ongoing miniaturization and cost reduction of LiDAR sensors are expected to mitigate these restraints. The segment witnessing the highest growth is likely Aerial LiDAR, due to its applicability in large-scale mapping projects crucial for urban planning and infrastructure development prevalent in rapidly expanding APAC cities. Ground-based LiDAR is also showing significant growth, primarily fueled by applications in construction and surveying. Competition is intense, with both established international players and emerging domestic companies vying for market share. The market's future trajectory is highly optimistic, with a projected continued high CAGR driven by technological advancements and the region’s strong economic growth. The dominance of China within APAC will be a defining characteristic of the region's LiDAR market. Its robust domestic automotive industry and ambitious infrastructure development plans guarantee significant demand. 
India and Japan are also expected to contribute significantly, albeit at a slightly slower rate, driven by increasing investments in autonomous vehicles and modernization of their respective infrastructure. Specific regional variations within APAC are expected, reflecting differences in economic development, regulatory frameworks, and the stage of adoption of LiDAR technologies. Nonetheless, across the board, growth is predicted to outpace the global average CAGR, driven by the region's unique combination of rapid technological adoption and substantial infrastructure development needs. The market segmentation within APAC will likely mirror global trends, with Aerial and Ground-based LiDAR applications dominating, followed by robust demand for high-quality components such as GPS units and laser scanners. Recent developments include: September 2023: Toshiba Corporation announced the development of world-first advances in LiDAR technologies that secure an unmatched accuracy of 99.9% in object tracking and 98.9% in object recognition with data acquired by the LiDAR alone. The technologies also significantly improve environmental robustness and the potential for LiDAR in many applications. April 2023: Innoviz Technologies Ltd., a technology leader in high-performance, automotive-grade LiDAR sensors and perception software, announced that it signed a distribution agreement with Ascendtek Electronics Inc., an integrated provider of advanced electronic systems, to drive sales of the company's LiDAR solutions throughout the Greater China region. Key drivers for this market are: Growing Applications in the Government Sector; Increasing Adoption in the Automotive Industry. Potential restraints include: Growing Applications in the Government Sector; Increasing Adoption in the Automotive Industry. Notable trends are: Ground-based LiDAR Expected to Witness the Highest Growth.
LIDAR data is remotely sensed high-resolution elevation data collected by an airborne collection platform. Using a combination of laser rangefinding, GPS positioning, and inertial measurement technologies, LIDAR instruments are able to make highly detailed Digital Elevation Models (DEMs) of the earth's terrain, man-made structures, and vegetation. This data was collected at submeter resolution to provide nominal 1 m spacing of collected points. Two returns were recorded for each pulse in addition to an intensity value. Original contact information: Contact Org: NOAA Office for Coastal Management Phone: 843-740-1202 Email: coastal.info@noaa.gov
https://www.zionmarketresearch.com/privacy-policy
The global LiDAR market was valued at USD 2.15 billion in 2023 and is predicted to reach USD 4.88 billion by 2032, at a CAGR of 9.64% between 2024 and 2032.
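The stated figures can be roughly cross-checked with the standard CAGR formula, (end/start)^(1/years) - 1. A quick sketch, assuming the 9-year span 2023-2032 (how the report counts compounding periods is an assumption, which is why the result lands slightly below the stated 9.64%):

```python
def cagr(start_value, end_value, years):
    """Compound annual growth rate over the given number of years."""
    return (end_value / start_value) ** (1.0 / years) - 1.0

# USD 2.15 Bn (2023) growing to USD 4.88 Bn (2032):
rate = cagr(2.15, 4.88, 9)  # roughly 9.5%, close to the stated 9.64%
```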
Attribution 4.0 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
The study proposes a multi-sensor localization and real-time mapping method based on the fusion of 3D LiDAR point clouds and visual-inertial data, which addresses the decreased localization accuracy and degraded mapping in complex environments that affect the autonomous navigation of robot dogs. In the experiments conducted, the proposed method improved overall localization accuracy by 42.85% compared to the tightly coupled LiDAR-inertial odometry method using smoothing and mapping. In addition, the method achieved lower mean absolute trajectory errors and root-mean-square errors than other algorithms evaluated on the urban navigation dataset. The highest root-mean-square error recorded was 2.72 m across five sequences from a multi-modal, multi-scene ground robot dataset, which was significantly lower than competing approaches. When applied to a real robot dog, the rotational error was reduced to 1.86°, and the localization error in GPS environments was 0.89 m. Furthermore, the proposed approach closely followed the theoretical path, with the smallest average error not exceeding 0.12 m. Overall, the proposed technique effectively improves both autonomous navigation and mapping for robot dogs, significantly increasing their stability.
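The trajectory-error metrics reported above are conventionally computed as the RMSE of per-pose position errors between estimated and ground-truth trajectories (after time association and alignment). A minimal sketch assuming already-aligned 2D positions, with illustrative values rather than data from the study:

```python
import math

def trajectory_rmse(estimated, ground_truth):
    """RMSE of Euclidean position errors between paired trajectory poses."""
    assert len(estimated) == len(ground_truth)
    squared_errors = [
        sum((e - g) ** 2 for e, g in zip(est, gt))
        for est, gt in zip(estimated, ground_truth)
    ]
    return math.sqrt(sum(squared_errors) / len(squared_errors))

# Toy example: second pose is off by 1 m in y.
est = [(0.0, 0.0), (1.0, 0.0)]
gt = [(0.0, 0.0), (1.0, 1.0)]
rmse = trajectory_rmse(est, gt)  # sqrt((0 + 1) / 2), about 0.707
```

Real evaluations typically first align the estimate to ground truth (e.g. with a rigid-body fit) before computing this statistic.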
U.S. Government Workshttps://www.usa.gov/government-works
License information was derived automatically
This dataset consists of point cloud data collected in 2016 and 2017 of the lower and upper Scenic Drive landslide locations in La Honda, California. Point cloud data were collected in 2016 to establish a baseline for movement detection of past landslides. Point cloud data were collected in 2017 adjacent to and upslope of the 2016 data to document a newly formed landslide. The data were collected with a Riegl VZ400 Terrestrial Laser Scanner and georeferenced using a Leica Viva GS15 survey-grade GPS. The data are delivered as georeferenced (NAD83 UTM zone 10N ellipsoid) classified point clouds, 5 cm resolution digital elevation models, and a text file of surveyed GPS control points.
The included files are:
LH2017_Jan.laz
LH2016_Jan.laz
LH2017_5cm_DEM_be_tin.tif
LH2017_5cm_DEM_bebldg_tin.tif
LH2017_5cm_DEM_be_idp.tif
LH2016_5cm_DEM_be_tin.tif
LH2016_5cm_DEM_bebldg_tin.tif
LH2016_5cm_DEM_be_idp.tif
LH_GPS_control_points_NAD83_UTM_z10N_ell.txt
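The surveyed control points text file can be loaded with a short parser. A sketch assuming a hypothetical comma-delimited "name,easting,northing,elevation" layout with sample values invented for illustration; the actual column order and delimiter of LH_GPS_control_points_NAD83_UTM_z10N_ell.txt are not documented here:

```python
import io

def read_control_points(fobj):
    """Read 'name,easting,northing,elevation' records into a dict of tuples."""
    points = {}
    for line in fobj:
        line = line.strip()
        if not line:
            continue
        name, easting, northing, elevation = line.split(",")
        points[name] = (float(easting), float(northing), float(elevation))
    return points

# Hypothetical sample records (UTM zone 10N meters, ellipsoid heights):
sample = io.StringIO(
    "CP1,573201.12,4127455.30,155.42\n"
    "CP2,573250.88,4127490.02,158.10\n"
)
cps = read_control_points(sample)
```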
LiDAR data is remotely sensed high-resolution elevation data collected by an airborne platform. The LiDAR sensor uses a combination of laser range finding, GPS positioning, and inertial measurement technologies. LiDAR systems collect point clouds that are used to produce highly detailed Digital Elevation Models (DEMs) of the earth's terrain, man-made structures, and vegetation. The ta...
Attribution 4.0 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
The B4 Lidar Project collected lidar point cloud data of the southern San Andreas and San Jacinto Faults in southern California. Data acquisition and processing were performed by the National Center for Airborne Laser Mapping (NCALM) in partnership with the USGS and Ohio State University through funding from the EAR Geophysics program at the National Science Foundation (NSF). Optech International contributed the ALTM3100 laser scanner system. UNAVCO and SCIGN assisted in GPS ground control and continuous high rate GPS data acquisition. A group of volunteers from USGS, UCSD, UCLA, Caltech and private industry, as well as gracious landowners along the fault zones, also made the project possible. If you utilize the B4 data for talks, posters or publications, we ask that you acknowledge the B4 project. The B4 logo can be downloaded here.
A new reprocessed (classified) version of this dataset is here:
Publications associated with this dataset can be found at NCALM's Data Tracking Center