10 datasets found
  1. Robot@Home2, a robotic dataset of home environments

    • data.niaid.nih.gov
    Updated Apr 4, 2024
    Cite
    Ambrosio-Cestero, Gregorio (2024). Robot@Home2, a robotic dataset of home environments [Dataset]. https://data.niaid.nih.gov/resources?id=zenodo_3901563
    Explore at:
    Dataset updated
    Apr 4, 2024
    Dataset provided by
    Ruiz-Sarmiento, José Raul
    González-Jiménez, Javier
    Ambrosio-Cestero, Gregorio
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    The Robot-at-Home dataset (Robot@Home) is a collection of raw and processed data from five domestic settings, compiled by a mobile robot equipped with four RGB-D cameras and a 2D laser scanner. Its main purpose is to serve as a testbed for semantic mapping algorithms through the categorization of objects and/or rooms.

    This dataset is unique in three aspects:

    The provided data were captured with a rig of four RGB-D sensors with an overall field of view of 180° horizontally and 58° vertically, and with a 2D laser scanner.

    It comprises diverse and numerous data: sequences of RGB-D images and laser scans from the rooms of five apartments (87,000+ observations were collected), topological information about the connectivity of these rooms, and 3D reconstructions and 2D geometric maps of the visited rooms.

    The provided ground truth is dense, including per-point annotations of the categories of the objects and rooms appearing in the reconstructed scenarios, and per-pixel annotations of each RGB-D image within the recorded sequences.

    During the data collection, a total of 36 rooms were completely inspected, so the dataset is rich in contextual information about objects and rooms. This is a valuable feature, missing in most state-of-the-art datasets, which can be exploited by, for instance, semantic mapping systems that leverage relationships such as "pillows are usually on beds" or "ovens are not in bathrooms".

    Robot@Home2

    Robot@Home2 is an enhanced version aimed at improving usability and functionality for developing and testing mobile robotics and computer vision algorithms. It consists of three main components. Firstly, a relational database that stores the contextual information and data links, compatible with Structured Query Language (SQL). Secondly, a Python package for managing the database, including downloading, querying, and interfacing functions. Finally, learning resources in the form of Jupyter notebooks, runnable locally or on the Google Colab platform, enabling users to explore the dataset without local installations. These freely available tools are expected to make the Robot@Home dataset easier to exploit and to accelerate research in computer vision and robotics.
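    Because the dataset is backed by a SQLite database, it can be queried with standard tooling once the file is downloaded. As a minimal sketch of the kind of contextual query the description alludes to (the table and column names below are illustrative stand-ins, not the dataset's actual schema; consult the Robot@Home2 documentation for the real one):

    ```python
    import sqlite3

    # Illustrative only: a toy schema standing in for the Robot@Home2 database.
    # For the real dataset you would connect to the downloaded .db file instead.
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    CREATE TABLE rooms   (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE objects (id INTEGER PRIMARY KEY, name TEXT,
                          room_id INTEGER REFERENCES rooms(id));
    INSERT INTO rooms   VALUES (1, 'bedroom'), (2, 'kitchen');
    INSERT INTO objects VALUES (1, 'pillow', 1), (2, 'oven', 2), (3, 'bed', 1);
    """)

    # A contextual query: which objects share a room with a bed?
    rows = conn.execute("""
        SELECT o.name
        FROM objects o
        JOIN objects bed ON bed.room_id = o.room_id
        WHERE bed.name = 'bed' AND o.name != 'bed'
        ORDER BY o.name
    """).fetchall()
    print(rows)  # [('pillow',)]
    ```

    The same kind of self-join over object and room tables is what lets semantic mapping systems exploit co-occurrence relationships like "pillows are usually on beds".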

    If you use Robot@Home2, please cite the following paper:

    Gregorio Ambrosio-Cestero, Jose-Raul Ruiz-Sarmiento, Javier Gonzalez-Jimenez, The Robot@Home2 dataset: A new release with improved usability tools, in SoftwareX, Volume 23, 2023, 101490, ISSN 2352-7110, https://doi.org/10.1016/j.softx.2023.101490.

    @article{ambrosio2023robotathome2,
      title    = {The Robot@Home2 dataset: A new release with improved usability tools},
      author   = {Gregorio Ambrosio-Cestero and Jose-Raul Ruiz-Sarmiento and Javier Gonzalez-Jimenez},
      journal  = {SoftwareX},
      volume   = {23},
      pages    = {101490},
      year     = {2023},
      issn     = {2352-7110},
      doi      = {https://doi.org/10.1016/j.softx.2023.101490},
      url      = {https://www.sciencedirect.com/science/article/pii/S2352711023001863},
      keywords = {Dataset, Mobile robotics, Relational database, Python, Jupyter, Google Colab}
    }

    Version history
    v1.0.1 - Fixed minor bugs.
    v1.0.2 - Fixed some inconsistencies in directory names. These fixes were necessary to automate the generation of the next version.
    v2.0.0 - SQL-based dataset. Robot@Home v1.0.2 has been packed into a SQLite database, along with the RGB-D and scene files, which have been assembled into a hierarchically structured directory free of redundancies. Path tables are also provided to reference files in both the v1.0.2 and v2.0.0 directory hierarchies. This version was generated automatically from v1.0.2 through the toolbox.
    v2.0.1 - A forgotten foreign key pair has been added.
    v2.0.2 - The views have been consolidated as tables, which considerably improves access time.
    v2.0.3 - The previous version did not include the database; it has been uploaded in this version.
    v2.1.0 - Depth images have been updated to 16-bit. Additionally, both the RGB and depth images are now oriented in the original camera format, i.e., landscape.

  2. Mobile, Alabama and Pensacola, Florida 5-meter Bathymetry - Gulf of Mexico...

    • hub.arcgis.com
    Updated Sep 12, 2019
    Cite
    jeradk18@tamu.edu_tamu (2019). Mobile, Alabama and Pensacola, Florida 5-meter Bathymetry - Gulf of Mexico (GCOOS) [Dataset]. https://hub.arcgis.com/maps/6465ebd399554ac4b72fcb39781b584e
    Explore at:
    Dataset updated
    Sep 12, 2019
    Dataset authored and provided by
    jeradk18@tamu.edu_tamu
    Area covered
    Description

    This digital elevation model (DEM) is part of a series of DEMs produced for the National Oceanic and Atmospheric Administration Coastal Services Center's Sea Level Rise and Coastal Flooding Impacts Viewer (www.csc.noaa.gov/slr/viewer). This metadata record describes the DEM for Mobile County in Alabama and Escambia, Santa Rosa, and Okaloosa (southern coastal portion only) Counties in Florida. The DEM includes the best available lidar data known to exist at the time of DEM creation for the coastal areas of these counties that met project specifications.

    The DEM is derived from the USGS National Elevation Dataset (NED), US Army Corps of Engineers (USACE) lidar data, and lidar collected for the Northwest Florida Water Management District (NWFWMD) and the Florida Department of Emergency Management (FDEM). NED and USACE data were used only in Mobile County, AL; NWFWMD or FDEM data were used in all other areas. Hydrographic breaklines used in creating the DEM were obtained from FDEM and the Southwest Florida Water Management District (SWFWMD). The DEM is hydro-flattened such that water elevations are less than or equal to 0 meters.

    The DEM is referenced vertically to the North American Vertical Datum of 1988 (NAVD88), with vertical units of meters, and horizontally to the North American Datum of 1983 (NAD83). Its resolution is approximately 5 meters. The DEM does not include licensed data (Baldwin County, Alabama) that is unavailable for distribution to the general public; as such, its extent differs from that of the DEM used by the NOAA Coastal Services Center in creating the inundation data shown in the Sea Level Rise and Coastal Flooding Impacts Viewer (www.csc.noaa.gov/slr/viewer).

    The NOAA Coastal Services Center has developed high-resolution DEMs for use in the Center's Sea Level Rise and Coastal Flooding Impacts internet mapping application. These DEMs serve as source datasets used to derive data to visualize the impacts of inundation resulting from sea level rise along the coastal United States and its territories. The dataset is provided "as is," without warranty as to its performance, merchantable state, or fitness for any particular purpose. The entire risk associated with the results and performance of this dataset is assumed by the user. This dataset should be used strictly as a planning reference and not for navigation, permitting, or other legal purposes.

  3. Data from: Dataset "ForestScanner: A mobile application for measuring and...

    • figshare.com
    txt
    Updated May 10, 2022
    Cite
    Shinichi Tatsumi; Keiji Yamaguchi; Naoyuki Furuya (2022). Dataset "ForestScanner: A mobile application for measuring and mapping trees with LiDAR-equipped iPhone and iPad" [Dataset]. http://doi.org/10.6084/m9.figshare.19721656.v3
    Explore at:
    Available download formats: txt
    Dataset updated
    May 10, 2022
    Dataset provided by
    figshare
    Authors
    Shinichi Tatsumi; Keiji Yamaguchi; Naoyuki Furuya
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Tree diameter and coordinate data obtained by iPhone, iPad, and conventional survey methods in a 1 ha forest plot in Hokkaido, Japan (42°59'57" N, 141°23'29" E). Tatsumi, Yamaguchi, Furuya (in press) ForestScanner: A mobile application for measuring and mapping trees with LiDAR-equipped iPhone and iPad. Methods in Ecology and Evolution.

  4. City of Sioux Falls Mobile Food Vendor Areas

    • catalog.data.gov
    Updated Apr 19, 2025
    + more versions
    Cite
    City of Sioux Falls GIS (2025). City of Sioux Falls Mobile Food Vendor Areas [Dataset]. https://catalog.data.gov/dataset/city-of-sioux-falls-mobile-food-vendor-areas-083f3
    Explore at:
    Dataset updated
    Apr 19, 2025
    Dataset provided by
    City of Sioux Falls GIS
    Area covered
    Sioux Falls
    Description

    Web mapping application showing where mobile food vendors are allowed to operate within the city limits of Sioux Falls, South Dakota.

  5. Forest Localisation Dataset

    • researchdata.edu.au
    datadownload
    Updated Feb 17, 2023
    Cite
    Micheal Bruenig; Paulo Borges; Milad Ramezani; Lucas Carvalho de Lima (2023). Forest Localisation Dataset [Dataset]. http://doi.org/10.25919/FBWY-RK04
    Explore at:
    Available download formats: datadownload
    Dataset updated
    Feb 17, 2023
    Dataset provided by
    CSIRO (http://www.csiro.au/)
    Authors
    Micheal Bruenig; Paulo Borges; Milad Ramezani; Lucas Carvalho de Lima
    License

    Attribution-NonCommercial-ShareAlike 4.0 (CC BY-NC-SA 4.0): https://creativecommons.org/licenses/by-nc-sa/4.0/
    License information was derived automatically

    Time period covered
    Oct 8, 2021
    Description

    The dataset contains lidar, IMU, and wheel odometry measurements collected using an all-electric four-wheel robotic vehicle (Gator) in a forest environment at the Queensland Centre for Advanced Technologies (QCAT, CSIRO) in Brisbane, Australia. The dataset also contains a heightmap image constructed from aerial lidar data of the same forest. This dataset allows users to run the Forest Localisation software and evaluate the results of the presented localisation method. Lineage: The ground-view data were collected using an all-electric four-wheel robotic vehicle equipped with a Velodyne VLP-16 laser mounted on a servo motor at a 45° inclination, spinning around the vertical axis at 0.5 Hz. In addition to the lidar scans, IMU and wheel odometry measurements were also recorded. The above-canopy map (heightmap) was constructed from aerial lidar data captured by a drone also equipped with a spinning mobile lidar sensor.

  6. Application.

    • plos.figshare.com
    zip
    Updated Jun 2, 2023
    Cite
    Till Koebe (2023). Application. [Dataset]. http://doi.org/10.1371/journal.pone.0241981.s003
    Explore at:
    Available download formats: zip
    Dataset updated
    Jun 2, 2023
    Dataset provided by
    PLOS ONE
    Authors
    Till Koebe
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Code and data for replicating the application study. See S1 Appendix for further details. (ZIP)

  7. Alaska 2023 Forest Health Flightlines

    • arc-gis-hub-home-arcgishub.hub.arcgis.com
    • gis.data.alaska.gov
    • +2more
    Updated Mar 6, 2025
    + more versions
    Cite
    Alaska Department of Natural Resources ArcGIS Online (2025). Alaska 2023 Forest Health Flightlines [Dataset]. https://arc-gis-hub-home-arcgishub.hub.arcgis.com/maps/SOA-DNR::alaska-2023-forest-health-flightlines
    Explore at:
    Dataset updated
    Mar 6, 2025
    Dataset authored and provided by
    Alaska Department of Natural Resources ArcGIS Online
    Area covered
    Description

    This forest health dataset includes flightlines from 2023. Along these flightlines, surveyors from the Alaska Division of Forestry & Fire Protection and USDA Forest Service - Forest Health Protection document insect, disease, and abiotic damage in the forest from about 1000 feet altitude using a digital mobile sketch-mapping tablet and software. The aerial survey covers about 15% of the forests statewide each year. Note that much of the forest damage documented during these surveys does not typically result in tree or shrub mortality. Aerial survey data disclaimer: USDA Forest Service - Forest Health Protection and the Alaska Division of Forestry & Fire Protection make every attempt to accurately identify and locate forest damage. The data is offered "as is".

  8. Alaska 2023 Forest Health Damage

    • data-soa-dnr.opendata.arcgis.com
    • gis.data.alaska.gov
    • +2more
    Updated Mar 6, 2025
    + more versions
    Cite
    Alaska Department of Natural Resources ArcGIS Online (2025). Alaska 2023 Forest Health Damage [Dataset]. https://data-soa-dnr.opendata.arcgis.com/datasets/alaska-2023-forest-health-damage
    Explore at:
    Dataset updated
    Mar 6, 2025
    Dataset authored and provided by
    Alaska Department of Natural Resources ArcGIS Online
    Area covered
    Description

    This forest health dataset includes both polygon and point data from 2023. Points have a buffered area based on tree number. Surveyors from the Alaska Division of Forestry & Fire Protection and USDA Forest Service - Forest Health Protection document insect, disease, and abiotic damage in the forest from about 1000 feet altitude using a digital mobile sketch-mapping tablet and software. The aerial survey covers about 15% of the forests statewide each year. Note that much of the forest damage documented during these surveys does not typically result in tree or shrub mortality. Aerial survey data disclaimer: USDA Forest Service - Forest Health Protection and the Alaska Division of Forestry & Fire Protection make every attempt to accurately identify and locate forest damage. The data is offered "as is".

  9. Alaska 2021 Forest Health Flightlines

    • gis.data.alaska.gov
    • arc-gis-hub-home-arcgishub.hub.arcgis.com
    Updated Mar 6, 2025
    Cite
    Alaska Department of Natural Resources ArcGIS Online (2025). Alaska 2021 Forest Health Flightlines [Dataset]. https://gis.data.alaska.gov/datasets/alaska-2021-forest-health-flightlines/explore?showTable=true
    Explore at:
    Dataset updated
    Mar 6, 2025
    Dataset authored and provided by
    Alaska Department of Natural Resources ArcGIS Online
    Area covered
    Description

    This forest health dataset includes both polygon and point data from 2021. Points have a buffered area based on tree number. Surveyors from the Alaska Division of Forestry & Fire Protection and USDA Forest Service - Forest Health Protection document insect, disease, and abiotic damage in the forest from about 1000 feet altitude using a digital mobile sketch-mapping tablet and software. The aerial survey covers about 15% of the forests statewide each year. Note that much of the forest damage documented during these surveys does not typically result in tree or shrub mortality. Aerial survey data disclaimer: USDA Forest Service - Forest Health Protection and the Alaska Division of Forestry & Fire Protection make every attempt to accurately identify and locate forest damage from the air. A very small percentage of the mapped data can be ground-checked and it is possible that errors in the data exist. These data are offered 'as is'.

  10. ACT and Southern Tablelands Weed Spotter

    • gbif.org
    • researchdata.edu.au
    Updated Jul 4, 2025
    Cite
    GBIF (2025). ACT and Southern Tablelands Weed Spotter [Dataset]. http://doi.org/10.15468/cgg3lk
    Explore at:
    Dataset updated
    Jul 4, 2025
    Dataset provided by
    Atlas of Living Australia (http://www.ala.org.au/)
    Global Biodiversity Information Facility (https://www.gbif.org/)
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Area covered
    Southern Tablelands
    Description

    The Atlas of Living Australia (ALA) ACT and Southern Tablelands Weedspotter web portal and mobile phone weed mapping application (weed mapping app)

    WHAT IS IT? The web portal and associated weed mapping application allows individuals, industry, government and community groups to:

    Electronically map weed infestations in the field, including:
      What the weed species was,
      Where it was seen,
      When it was seen,
      How many plants were seen, and
      How dense the infestation was
    Record weed control activities, including:
      What type of control,
      When it was controlled,
      Which species were targeted, and
      How effective the control effort was
    Store that information on the Weedspotter web portal,
    View, manage, review and analyse this information and generate maps and reports on this information to help in future weed management planning.
    

    WHY DO WE NEED IT?

    If you are passionate about controlling weeds in our region, know how to identify and/or control weeds, and want to be part of the solution to protect farmland and nature conservation areas from weeds, then register and become involved.

    The Weedspotter web portal and weed mapping app help us to collaborate and be more effective in controlling weeds in our region, by allowing all users to:

    provide an early warning of new and emerging weeds coming into our region;
    track changes in weed occurrence over time and at different sites;
    track weed control effort: where it has occurred, what weeds have been targeted, and how effective this work has been.
    

    By understanding what is happening with weeds in our region, individuals (such as farmers), groups (such as Landcare and ParkCare groups), and government can:

    prevent new and emerging weeds from becoming a problem, and
    prioritise weed control effort in their area of interest.
    

    Identifying and controlling new and emerging weeds is particularly important in a changing climate, as weeds that previously could not survive in our climate, are starting to appear in our region and are at risk of becoming invasive and becoming the next problem weed.

    WHAT IS ON THE WEBSITE AND APP?

    Weed species of interest

    This section includes:

    Weed species of interest to the 10 local government areas and the ACT government. These are lists of weeds that NSW local government areas (LGAs) and the ACT government have identified as weeds that could become a problem, are starting to become a problem, or are already widespread in our region (the weeds of interest vary across the different LGAs and the ACT). The ACT and NSW LGAs are particularly interested in new and emerging weeds that are at risk of becoming invasive and becoming the next problem weed.
    Species profiles - each species listed in the weeds of interest includes detailed information and photos to assist users in identifying weed species in the field (using the app) or on the website.
    Identification Tool - a Weed Identification Tool that assists all users in identifying those hard-to-pick weeds (this includes look-alikes that are not weeds, including native look-alike plants) in the field (using the app) or on the website.
    

    Mapping - How to contribute

    There are three mapping tools on the website and available through the app. Each of these mapping tools allows users to photograph weeds of interest, weed infestations of interest, or weed control effort and lodge them on the portal. This will assist in confirming weed identification and also in tracking change over time.

    A general weed mapping tool - which allows users to digitally record in the field or through the portal, a single weed species at a single site.
    A more advanced weed mapping tool - which allows users to digitally record (in the field or directly into the website portal) a weed infestation, including multiple species, the infestation area, and the infestation density at one site.
    A tool for recording weed treatment/management information at a particular site.
    

    To use the mapping tools - register for the site.

    Weed sightings to date

    This area of the website allows users to look at what they have mapped and what else has been mapped in the region, by location, species, date, and type of record (single species, multi-species, or weed control effort); analyse this data; develop a report and maps; and print these documents. It also provides a link to the broader Atlas of Living Australia website and all the biodiversity data that can be accessed through it.

    HOW DO I GET HOLD OF THE MOBILE PHONE APP?

    The Atlas of Living Australia (ALA) ACT and Southern Tablelands Weedspotter mobile phone applications are available for iPhone and Android smartphones from the Apple iTunes Store or from Google Play.
