100+ datasets found
  1. no_robots

    • huggingface.co
    Updated Nov 11, 2023
    + more versions
    Cite
    Hugging Face H4 (2023). no_robots [Dataset]. https://huggingface.co/datasets/HuggingFaceH4/no_robots
    Explore at:
    Croissant (a format for machine-learning datasets; learn more at mlcommons.org/croissant)
    Dataset updated
    Nov 11, 2023
    Dataset provided by
    Hugging Face (https://huggingface.co/)
    Authors
    Hugging Face H4
    License

    Attribution-NonCommercial 4.0 (CC BY-NC 4.0), https://creativecommons.org/licenses/by-nc/4.0/
    License information was derived automatically

    Description

    Dataset Card for No Robots 🙅‍♂️🤖

    Look Ma, an instruction dataset that wasn't generated by GPTs!

      Dataset Summary
    

    No Robots is a high-quality dataset of 10,000 instructions and demonstrations created by skilled human annotators. This data can be used for supervised fine-tuning (SFT) to make language models follow instructions better. No Robots was modelled after the instruction dataset described in OpenAI's InstructGPT paper, and is comprised mostly of single-turn… See the full description on the dataset page: https://huggingface.co/datasets/HuggingFaceH4/no_robots.
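
    The dataset can be loaded directly with the Hugging Face datasets library. Below is a minimal sketch; the messages field name is an assumption based on the chat-style layout of H4 instruction datasets, not something documented in this listing:

        # Minimal sketch: load No Robots for supervised fine-tuning experiments.
        # Requires the Hugging Face datasets package (pip install datasets).
        from datasets import load_dataset

        ds = load_dataset("HuggingFaceH4/no_robots")
        print(ds)  # prints the available splits and their row counts

        first_split = next(iter(ds.values()))
        example = first_split[0]
        # "messages" is assumed; fall back to printing the whole record if absent.
        print(example.get("messages", example))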

  2. no-robots-sharegpt

    • huggingface.co
    Updated Nov 27, 2023
    Cite
    Doctor Shotgun (2023). no-robots-sharegpt [Dataset]. https://huggingface.co/datasets/Doctor-Shotgun/no-robots-sharegpt
    Explore at:
    Croissant (a format for machine-learning datasets; learn more at mlcommons.org/croissant)
    Dataset updated
    Nov 27, 2023
    Authors
    Doctor Shotgun
    License

    Attribution-NonCommercial 4.0 (CC BY-NC 4.0), https://creativecommons.org/licenses/by-nc/4.0/
    License information was derived automatically

    Description

    no-robots-sharegpt

    HuggingFaceH4/no_robots with both test and train splits combined and converted to ShareGPT format for use in common training repositories. Please refer to the original repository's dataset card for more information.

    no-robots-sharegpt.jsonl

    Original dataset converted to ShareGPT

    no-robots-sharegpt-fixed.jsonl

    Manual edits were made to ~10 dataset entries that were throwing warnings in axolotl - turns out that some of the multi-turn conversations had… See the full description on the dataset page: https://huggingface.co/datasets/Doctor-Shotgun/no-robots-sharegpt.
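
    For orientation, a ShareGPT-style record wraps each dialogue in a conversations list of from/value turns. The sketch below is a hypothetical illustration of that layout with an assumed role mapping; it is not the conversion code used in this repository:

        # Hypothetical sketch: convert a No Robots chat record to ShareGPT format.
        # The role mapping ("user" -> "human", "assistant" -> "gpt") follows the
        # common ShareGPT convention and is an assumption, not this repo's code.
        import json

        role_map = {"user": "human", "assistant": "gpt", "system": "system"}

        def to_sharegpt(record):
            return {
                "conversations": [
                    {"from": role_map[m["role"]], "value": m["content"]}
                    for m in record["messages"]
                ]
            }

        record = {"messages": [
            {"role": "user", "content": "Name three robots from fiction."},
            {"role": "assistant", "content": "R2-D2, WALL-E, and Optimus Prime."},
        ]}
        print(json.dumps(to_sharegpt(record), indent=2))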

  3. vietnamese_no_robots

    • huggingface.co
    Updated Nov 13, 2023
    Cite
    Thien Phu Nguyen (2023). vietnamese_no_robots [Dataset]. https://huggingface.co/datasets/nguyenphuthien/vietnamese_no_robots
    Explore at:
    Dataset updated
    Nov 13, 2023
    Authors
    Thien Phu Nguyen
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Vietnamese-translated version of the HuggingFaceH4/no_robots dataset

      Dataset Card for No Robots 🙅‍♂️🤖
    

    Look Ma, an instruction dataset that wasn't generated by GPTs!

      Dataset Summary
    

    No Robots is a high-quality dataset of 10,000 instructions and demonstrations created by skilled human annotators. This data can be used for supervised fine-tuning (SFT) to make language models follow instructions better. No Robots was modelled after the instruction dataset described… See the full description on the dataset page: https://huggingface.co/datasets/nguyenphuthien/vietnamese_no_robots.

  4. No-Robots-500

    • huggingface.co
    Updated Nov 13, 2024
    Cite
    Talha Rüzgar Akkuş (2024). No-Robots-500 [Dataset]. https://huggingface.co/datasets/Q-bert/No-Robots-500
    Explore at:
    Croissant (a format for machine-learning datasets; learn more at mlcommons.org/croissant)
    Dataset updated
    Nov 13, 2024
    Authors
    Talha Rüzgar Akkuş
    Description

    The Q-bert/No-Robots-500 dataset, hosted on Hugging Face and contributed by the HF Datasets community.

  5. No-Code Robot Programming Market Research Report 2033

    • growthmarketreports.com
    csv, pdf, pptx
    Updated Aug 4, 2025
    Cite
    Growth Market Reports (2025). No-Code Robot Programming Market Research Report 2033 [Dataset]. https://growthmarketreports.com/report/no-code-robot-programming-market
    Explore at:
    pdf, csv, pptx; available download formats
    Dataset updated
    Aug 4, 2025
    Dataset authored and provided by
    Growth Market Reports
    Time period covered
    2024 - 2032
    Area covered
    Global
    Description

    No-Code Robot Programming Market Outlook



    According to our latest research, the global No-Code Robot Programming market size reached USD 1.28 billion in 2024, reflecting strong momentum driven by the rapid digitalization of industrial and service sectors. With a robust compound annual growth rate (CAGR) of 23.7% projected through the forecast period, the market is anticipated to soar to approximately USD 10.28 billion by 2033. This remarkable growth is primarily fueled by the increasing need for automation, democratization of robotics, and the acute shortage of skilled programmers, which has accelerated the adoption of no-code solutions across various industries worldwide.




    The proliferation of no-code robot programming platforms can be attributed to the urgent demand for simplifying complex automation processes. As businesses strive to remain competitive, there is a growing emphasis on reducing operational costs and improving efficiency. No-code solutions empower professionals without deep technical backgrounds to program and deploy robots, thereby alleviating the dependency on highly specialized software engineers. This democratization of automation technology is opening new avenues for small and medium enterprises (SMEs) to leverage robotics, which was previously restricted due to high entry barriers associated with traditional programming methods. Furthermore, the integration of artificial intelligence and machine learning capabilities into no-code platforms is enhancing their flexibility and adaptability, making them suitable for a wider range of applications and industries.




    Another significant growth driver for the no-code robot programming market is the expanding use of collaborative robots (cobots) and service robots in diverse sectors such as healthcare, logistics, and retail. The shift towards human-robot collaboration in the workplace is necessitating intuitive interfaces that can be easily used by non-experts. No-code platforms are bridging this gap by providing drag-and-drop programming environments and pre-built templates, which significantly reduce the time and expertise required for robot deployment. Additionally, the increasing trend of mass customization in manufacturing and the surge in e-commerce activities have compelled organizations to adopt flexible automation solutions, further propelling market growth.




    The rapid digital transformation observed globally, particularly in emerging economies, is also playing a pivotal role in the expansion of the no-code robot programming market. Governments and private organizations are investing heavily in smart manufacturing initiatives and Industry 4.0 projects, which prioritize automation and digital integration. The COVID-19 pandemic further accelerated the adoption of robotics and automation as companies sought to minimize human intervention and ensure business continuity. As a result, the demand for user-friendly, scalable, and cost-effective automation solutions has surged, positioning no-code robot programming platforms as a critical enabler of future-ready enterprises.




    From a regional perspective, Asia Pacific is emerging as a dominant force in the no-code robot programming market, fueled by robust industrialization, large-scale investments in automation, and a burgeoning manufacturing sector. North America and Europe are also witnessing substantial growth, driven by technological advancements, a mature robotics ecosystem, and widespread digital adoption across industries. Meanwhile, Latin America and the Middle East & Africa are gradually catching up, supported by increasing awareness and government initiatives aimed at fostering innovation and digital skills development. Each of these regions presents unique opportunities and challenges, shaping the overall trajectory of the global no-code robot programming market.





    Component Analysis



    The no-code robot programming market is broadly segmented by component into Software and Services. The software segment comm

  6. no_robots_kn

    • aifasthub.com
    • huggingface.co
    Updated Jan 24, 2024
    Cite
    Tensoic AI (2024). no_robots_kn [Dataset]. https://www.aifasthub.com/datasets/Tensoic/no_robots_kn
    Explore at:
    Dataset updated
    Jan 24, 2024
    Dataset authored and provided by
    Tensoic AI
    License

    Attribution-NonCommercial 4.0 (CC BY-NC 4.0), https://creativecommons.org/licenses/by-nc/4.0/
    License information was derived automatically

    Description

    No Robots dataset translated to Kannada (KN), with chat and code data removed.

  7. THOR - point clouds

    • data.europa.eu
    • data.niaid.nih.gov
    • +1more
    unknown
    Updated Jan 24, 2020
    Cite
    Zenodo (2020). THOR - point clouds [Dataset]. https://data.europa.eu/data/datasets/oai-zenodo-org-3405915?locale=cs
    Explore at:
    unknown (209450287); available download formats
    Dataset updated
    Jan 24, 2020
    Dataset authored and provided by
    Zenodo (http://zenodo.org/)
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    THÖR is a dataset of human motion trajectories and eye gaze data collected in an indoor environment, with accurate ground truth for position, head orientation, gaze direction, social grouping and goals. THÖR contains sensor data collected by a 3D lidar sensor and involves a mobile robot navigating the space. In comparison to other datasets, ours has a larger variety in human motion behaviour, is less noisy, and contains annotations at higher frequencies. The dataset includes 9 separate recordings in 3 variations:

    • "One obstacle" - features one obstacle in the environment and no robot
    • "Moving robot" - features one obstacle in the environment and the moving robot
    • "Three obstacles" - features three obstacles in the environment and no robot

    THOR - point clouds is the part of the THÖR dataset containing bag files with 3D scans collected during the experiments.

    Reference: For more details, check the project website thor.oru.se or the following publication:

    @article{thorDataset2019, title={TH\"OR: Human-Robot Indoor Navigation Experiment and Accurate Motion Trajectories Dataset}, author={Andrey Rudenko and Tomasz P. Kucner and Chittaranjan S. Swaminathan and Ravi T. Chadalavada and Kai O. Arras and Achim J. Lilienthal}, journal={arXiv preprint arXiv:1909.04403}, year={2019}}
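
    The point clouds are distributed as ROS bag files, so a ROS 1 environment can inspect them. The sketch below is hypothetical; the bag file name and the PointCloud2 message type are assumptions, not taken from the dataset documentation:

        # Hypothetical sketch: list topics and peek at a point-cloud message in a bag file.
        # Requires a ROS 1 environment with the rosbag package; the file name is assumed.
        import rosbag

        with rosbag.Bag("thor_recording.bag") as bag:
            info = bag.get_type_and_topic_info()
            for topic, topic_info in info.topics.items():
                print(topic, topic_info.msg_type, topic_info.message_count)

            # sensor_msgs/PointCloud2 is the usual message type for 3D lidar scans.
            for topic, msg, t in bag.read_messages():
                if msg._type == "sensor_msgs/PointCloud2":
                    print(t.to_sec(), msg.width * msg.height, "points")
                    break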

  8. Dataset of artists who created Robot 2 from the Technological Dreams Series:...

    • workwithdata.com
    Updated May 8, 2025
    Cite
    Work With Data (2025). Dataset of artists who created Robot 2 from the Technological Dreams Series: no 1, Robots project (Model) [Dataset]. https://www.workwithdata.com/datasets/artists?f=1&fcol0=j0-artwork&fop0=%3D&fval0=Robot+2+from+the+Technological+Dreams+Series%3A+no+1%2C+Robots+project+%28Model%29&j=1&j0=artworks
    Explore at:
    Dataset updated
    May 8, 2025
    Dataset authored and provided by
    Work With Data
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This dataset is about artists. It has 2 rows and is filtered to rows where the artwork is Robot 2 from the Technological Dreams Series: no 1, Robots project (Model). It features 9 columns including birth date, death date, country, and gender.

  9. THOR - people tracks

    • data.europa.eu
    • zenodo.org
    unknown
    Updated Jul 3, 2025
    + more versions
    Cite
    Zenodo (2025). THOR - people tracks [Dataset]. https://data.europa.eu/data/datasets/oai-zenodo-org-3382145?locale=da
    Explore at:
    unknown (17401149); available download formats
    Dataset updated
    Jul 3, 2025
    Dataset authored and provided by
    Zenodo (http://zenodo.org/)
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    THÖR is a dataset of human motion trajectories and eye gaze data collected in an indoor environment, with accurate ground truth for position, head orientation, gaze direction, social grouping and goals. THÖR contains sensor data collected by a 3D lidar sensor and involves a mobile robot navigating the space. In comparison to other datasets, ours has a larger variety in human motion behaviour, is less noisy, and contains annotations at higher frequencies. The dataset includes 13 separate recordings in 3 variations:

    • "One obstacle" - features one obstacle in the environment and no robot
    • "Moving robot" - features one obstacle in the environment and the moving robot
    • "Three obstacles" - features three obstacles in the environment and no robot

    THOR - people tracks is the part of the THÖR dataset containing the ground truth positions of people in the environment, including information about head orientation. The data are available in three formats:

    • mat - Matlab binary file
    • TSV - text file
    • bag - ROS bag file

    MAT files

    • File - [char] Path to original QTM file
    • Timestamp - [string] Date and time of the start of the data collection
    • Start Frame - [char] 1
    • Frames - [double] Number of frames in the file
    • FrameRate - [double] Number of frames per second
    • Events - [struct] 0
    • Trajectories - [struct] 3D position of observed reflective markers
      • Labeled - [struct] Markers belonging to the tracked agents:
        • Count - [double] Number of tracked markers
        • Labels - [cell] List of marker labels
        • Data - [double] Array of dimension {Count}x4x{Frames}; contains the 3D position of each marker and the residue
    • RigidBodies - [struct] 6D pose of the helmet, corresponding to head position and orientation:
      • Bodies - [double] Number of tracked bodies
      • Name - [cell] Body names
      • Positions - [double] Array of dimension {Bodies}x3x{Frames}; contains the position of the centre of mass of the markers defining the rigid body
      • Rotations - [double] Array of dimension {Bodies}x9x{Frames}; contains the rotation matrix describing the orientation of the rigid body
      • RPYs - [double] Array of dimension {Bodies}x3x{Frames}; contains the orientation of the rigid body described as RPY angles
      • Residual - [double] Array of dimension {Bodies}x1x{Frames}; contains the residual for each rigid body

    TSV files (3D data)

    File header:

    • NO_OF_FRAMES - number of frames in the file
    • NO_OF_CAMERAS - number of cameras tracking markers
    • NO_OF_MARKERS - number of tracked markers
    • FREQUENCY - tracking frequency [Hz]
    • NO_OF_ANALOG - number of analog inputs
    • ANALOG_FREQUENCY - frequency of analog input
    • DESCRIPTION - --
    • TIME_STAMP - the beginning of the data recording
    • DATA_INCLUDED - the type of data included
    • MARKER_NAMES - names of tracked markers

    Column names:

    • Frame - frame ID
    • Time - frame timestamp
    • [marker name] [C] - coordinate of a [marker name] along the [C] axis

    TSV files (6D data)

    File header:

    • NO_OF_FRAMES - number of frames in the file
    • NO_OF_CAMERAS - number of cameras tracking markers
    • NO_OF_MARKERS - number of tracked markers
    • FREQUENCY - tracking frequency [Hz]
    • NO_OF_ANALOG - number of analog inputs
    • ANALOG_FREQUENCY - frequency of analog input
    • DESCRIPTION - --
    • TIME_STAMP - the beginning of the data recording
    • DATA_INCLUDED - the type of data included
    • BODY_NAMES - names of tracked rigid bodies

    Column names:

    • Frame - frame ID
    • Time - frame timestamp

    The columns are grouped according to the rigid body. Each group starts with the name of the rigid body and is followed by the position of the centre of mass and the orientation expressed as RPY angles and a rotation matrix.

    Reference: For more details, check the project website thor.oru.se or the following publication:

    @article{thorDataset2019, title={TH\"OR: Human-Robot Indoor Navigation Experiment and Accurate Motion Trajectories Dataset}, author={Andrey Rudenko and Tomasz P. Kucner and Chittaranjan S. Swaminathan and Ravi T. Chadalavada and Kai O. Arras and Achim J. Lilienthal}, journal={arXiv preprint arXiv:1909.04403}, year={2019}}
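
    As a starting point, the 3D-data TSV files can be read with pandas once the metadata header is skipped. The sketch below is hypothetical: the file name and the assumption that the metadata ends at the MARKER_NAMES line come from the field list above, not from tested code:

        # Hypothetical sketch: read a THOR 3D-data TSV file with pandas.
        # The file name and the position of the metadata header are assumptions;
        # inspect the file first and adjust skiprows if the layout differs.
        import pandas as pd

        path = "thor_people_tracks_3d.tsv"

        # Count metadata lines (NO_OF_FRAMES, ..., MARKER_NAMES) before the data table.
        with open(path) as f:
            header_lines = 0
            for line in f:
                header_lines += 1
                if line.startswith("MARKER_NAMES"):
                    break

        df = pd.read_csv(path, sep="\t", skiprows=header_lines)
        print(df.columns[:6])  # Frame, Time, then per-marker coordinate columns
        print(df.head())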

  10. Demands by people in Japan for living with pet-like social robots 2020

    • statista.com
    Updated Jul 23, 2025
    Cite
    Statista (2025). Demands by people in Japan for living with pet-like social robots 2020 [Dataset]. https://www.statista.com/statistics/1243996/japan-demands-living-with-pet-like-communication-robots/
    Explore at:
    Dataset updated
    Jul 23, 2025
    Dataset authored and provided by
    Statista (http://statista.com/)
    Time period covered
    Nov 5, 2020 - Nov 12, 2020
    Area covered
    Japan
    Description

    In November 2020, around **** percent of respondents in Japan stated that they would not like to live with a pet-like communication robot no matter what. Practical concerns prevailed among people who considered living with such a robot under certain circumstances. For friend-like communication robots and family member-like or partner-like communication robots, conversational skills were more important.

  11. Demands by people in Japan for living with family member-like social robots...

    • statista.com
    Updated Jul 23, 2025
    + more versions
    Cite
    Statista (2025). Demands by people in Japan for living with family member-like social robots 2020 [Dataset]. https://www.statista.com/statistics/1244024/japan-demands-living-with-family-member-like-communication-robots/
    Explore at:
    Dataset updated
    Jul 23, 2025
    Dataset authored and provided by
    Statista (http://statista.com/)
    Time period covered
    Nov 5, 2020 - Nov 12, 2020
    Area covered
    Japan
    Description

    In November 2020, around **** percent of respondents in Japan stated that they would not like to live with a family member-like or partner-like communication robot no matter what. People considering living with such a robot under certain circumstances valued the skill to speak naturally to humans the most.

  12. OpenAXES Example Robot Dataset

    • data.uni-hannover.de
    zip,csv
    Updated Jul 24, 2023
    + more versions
    Cite
    Institut fuer Mikroelektronische Systeme (2023). OpenAXES Example Robot Dataset [Dataset]. https://data.uni-hannover.de/dataset/openaxes-example-robot-dataset
    Explore at:
    zip, csv; available download formats
    Dataset updated
    Jul 24, 2023
    Dataset authored and provided by
    Institut fuer Mikroelektronische Systeme
    License

    Attribution-ShareAlike 3.0 (CC BY-SA 3.0), https://creativecommons.org/licenses/by-sa/3.0/
    License information was derived automatically

    Description

    This is an example dataset recorded using version 1.0 of the open-source-hardware OpenAXES IMU. Please see the github repository for more information on the hardware and firmware. Please find the most up-to-date version of this document in the repository

    This dataset was recorded using four OpenAXES IMUs mounted on the segments of a robot arm (UR5 by Universal Robots). The robot arm was programmed to perform a calibration movement, then trace a 2D circle or triangle in the air with its tool center point (TCP), and return to its starting position, at four different speeds from 100 mm/s to 250 mm/s. This results in a total of 8 different scenarios (2 shapes times 4 speeds). The ground truth joint angle and TCP position values were obtained from the robot controller. The calibration movement at the beginning of the measurement allows for calculating the exact orientation of the sensors on the robot arm.

    The IMUs were configured to send the raw data from the three gyroscope axes and the six accelerometer axes to a PC via BLE with 16 bit resolution per axis and 100 Hz sample rate. Since no data packets were lost during this process, this dataset allows comparing and tuning different sensor fusion algorithms on the recorded raw data while using the ground truth robot data as a reference.

    In order to visualize the results, the quaternion sequences from the IMUs were applied to the individual segments of a 3D model of the robot arm. The end of this kinematic chain represents the TCP of the virtual model, which should ideally move along the same trajectory as the ground truth, barring the accuracy of the IMUs. Since the raw sensor data of these measurements is available, the calibration coefficients can also be applied ex-post.

    Since there are 6 joints but only 4 IMUs, some redundancy must be exploited. The redundancy comes from the fact that each IMU has 3 rotational degrees of freedom, but each joint has only one:

    • The data for q0 and q1 are both derived from the orientation of the "humerus" IMU.
    • q2 is the difference between the orientation of the "humerus" and "radius" IMUs.
    • q3 is the difference between the orientation of the "radius" and "carpus" IMUs.
    • q4 is the difference between the orientation of the "carpus" and "digitus" IMUs.
    • The joint q5 does not influence the position of the TCP, only its orientation, so it is ignored in the evaluation.
    • Of course, difference here means not the subtraction of the quaternions but the rotational difference, which is R1 * inv(R0) for two quaternions (or rotations) R0 and R1. The actual code works a bit differently, but this describes the general principle (see the sketch below).
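
    A minimal sketch of that rotational difference with SciPy; the quaternion values are made up for illustration, and the dataset's actual code may differ:

        # Minimal sketch: a joint angle as the rotational difference between two
        # IMU orientations, R1 * inv(R0). Quaternion values are illustrative only;
        # SciPy expects (x, y, z, w) ordering.
        import math
        from scipy.spatial.transform import Rotation as R

        r0 = R.from_quat([0.0, 0.0, 0.0, 1.0])              # e.g. "humerus" IMU
        r1 = R.from_quat([0.0, 0.0, 0.3826834, 0.9238795])  # e.g. "radius" IMU

        diff = r1 * r0.inv()  # rotational difference R1 * inv(R0)
        angle = math.degrees(diff.magnitude())
        print(f"relative joint rotation: {angle:.1f} degrees")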

    Data

    • Data recorded from the IMUs is in the directory measure_raw-2022-09-15/, one folder per scenario. In those folders, there is one CSV file per IMU.
    • Data recorded from the robot arm is in the directory measure_raw-2022-09-15/robot/, one CSV and MAT file per scenario.
    • Some photos and videos of the recording process can be found in Media. Videos are stored in git lfs.

    Evaluation

    The file openaxes-example-robot-dataset.ipynb is provided to play around with the data in the dataset and demonstrate how the files are read and interpreted. To use the notebook, set up a Python 3 virtual environment and therein install the necessary packages with pip install -r requirements.txt. In order to view the graphs contained in the ipynb file, you will most likely have to trust the notebook beforehand, using the following command:

    jupyter trust openaxes-example-robot-dataset.ipynb
    

    Beware: This notebook is not a comprehensive evaluation and any results and plots shown in the file are not necessarily scientifically sound evidence of anything.

    The notebook will store intermediate files in the measure_raw-2022-09-15 directory, like the quaternion files calculated by the different filters, or the files containing the reconstructed TCP positions. All intermediate files should be ignored by the file measure_raw-2022-09-15/.gitignore.

    The generated intermediate files are also provided in the file measure_raw-2022-09-15.tar.bz2, in case you want to inspect the generated files without running the notebook.

    Tools

    A number of tools are used in the evaluation notebook. Below is a short overview, but not a complete specification. If you need to understand the input and output formats for each tool, please read the code.

    • The file calculate-quaternions.py is used in the evaluation notebook to compute different attitude estimation filters like Madgwick or VQF on the raw accelerometer and gyroscope measurements at 100 Hz.
    • The directory madgwick-filter contains a small C program that applies the original Madgwick filter to a CSV file containing raw measurements and prints the results. It is used by calculate-quaternions.py.
    • The file calculate-robot-quaternions.py calculates a CSV file of quaternions equivalent to the IMU quaternions from a CSV file containing the joint angles of the robot.
    • The program dsense_vis mentioned in the notebook is used to calculate the 3D model of the robot arm from quaternions and determine the mounting orientations of the IMUs on the robot arm. This program will be released at a future date. In the meantime, the output files of dsense_vis are provided in the file measure_raw-2022-09-15.tar.bz2, which contains the complete content of the measure_raw-2022-09-15 directory after executing the whole notebook. Just unpack this archive and merge its contents with the measure_raw-2022-09-15 directory. This allows you to explore the reconstructed TCP files for the filters implemented at the time of publication.
  13. Process and robot data from a two robot workcell representative performing...

    • catalog.data.gov
    • data.nist.gov
    • +1more
    Updated Mar 14, 2025
    + more versions
    Cite
    National Institute of Standards and Technology (2025). Process and robot data from a two robot workcell representative performing representative manufacturing operations. [Dataset]. https://catalog.data.gov/dataset/process-and-robot-data-from-a-two-robot-workcell-representative-performing-representative-
    Explore at:
    Dataset updated
    Mar 14, 2025
    Dataset provided by
    National Institute of Standards and Technology (http://www.nist.gov/)
    Description

    This data set is captured from a robot workcell that is performing activities representative of several manufacturing operations. The workcell contains two 6-degree-of-freedom robot manipulators, where one robot is performing material handling operations (e.g., transport parts into and out of a specific work space) while the other robot is performing a simulated precision operation (e.g., the robot touching the center of a part with a tool tip that leaves a mark on the part). This precision operation is intended to represent a precise manufacturing operation (e.g., welding, machining). The goal of this data set is to provide robot level and process level measurements of the workcell operating within nominal parameters. There are no known equipment or process degradations in the workcell.

    The material handling robot will perform pick and place operations, including moving simulated parts from an input area to in-process work fixtures. Once parts are placed in/on the work fixtures, the second robot will interact with the part in a specified precise manner. In this specific instance, the second robot has a pen mounted to its tool flange and is drawing the NIST logo on a surface of the part. When the precision operation is completed, the material handling robot will then move the completed part to an output.

    This suite of data includes process data and performance data, including timestamps. Timestamps are recorded at predefined state changes and events on the PLC and robot controllers, respectively. Each robot controller and the PLC have their own internal clocks and, due to hardware limitations, the timestamps recorded on each device are relative to their own internal clocks. All timestamp data collected on the PLC is available for real-time calculations and is recorded. The timestamps collected on the robots are only available as recorded data for post-processing and analysis. The timestamps collected on the PLC correspond to 14 part state changes throughout the processing of a part. Timestamps are recorded when PLC-monitored triggers are activated by internal processing (PLC trigger origin) or after the PLC receives an input from a robot controller (robot trigger origin). Records generated from PLC-originated triggers include parts entering the work cell, assignment of robot tasks, and parts leaving the work cell. PLC-originating triggers are activated by either internal algorithms or sensors which are monitored directly in the PLC Inputs/Outputs (I/O). Records generated from a robot-originated trigger include when a robot begins operating on a part, when the task operation is complete, and when the robot has physically cleared the fixture area and is ready for a new task assignment. Robot-originating triggers are activated by PLC I/O.

    Process data collected in the workcell are the variable pieces of process information. This includes the input location (single option in the initial configuration presented in this paper), the output location (single option in the initial configuration presented in this paper), the work fixture location, the part number counted from startup, and the part type (task number for drawing robot). Additional information on the context of the workcell operations and the captured data can be found in the attached files, which include a README.txt, along with several noted publications.

Disclaimer: Certain commercial entities, equipment, or materials may be identified or referenced in this data, or its supporting materials, in order to illustrate a point or concept. Such identification or reference is not intended to imply recommendation or endorsement by NIST; nor does it imply that the entities, materials, equipment or data are necessarily the best available for the purpose. The user assumes any and all risk arising from use of this dataset.

  14. Endoscope Robot Report

    • datainsightsmarket.com
    doc, pdf, ppt
    Updated May 21, 2025
    + more versions
    Cite
    Data Insights Market (2025). Endoscope Robot Report [Dataset]. https://www.datainsightsmarket.com/reports/endoscope-robot-1768973
    Explore at:
    doc, ppt, pdf; available download formats
    Dataset updated
    May 21, 2025
    Dataset authored and provided by
    Data Insights Market
    License

    https://www.datainsightsmarket.com/privacy-policy

    Time period covered
    2025 - 2033
    Area covered
    Global
    Variables measured
    Market Size
    Description

    The global endoscope robot market is experiencing robust growth, driven by the increasing prevalence of minimally invasive surgeries, technological advancements leading to enhanced precision and dexterity, and the rising demand for improved patient outcomes. The market, estimated at $2 billion in 2025, is projected to experience a Compound Annual Growth Rate (CAGR) of approximately 15% from 2025 to 2033, reaching an estimated value exceeding $6 billion by 2033. Key growth drivers include the increasing adoption of robotic-assisted surgery across various surgical specialties – urology, gynecology, and general surgery being prominent examples. The multi-hole mirror robot segment currently holds the largest market share, owing to its established presence and versatility, but the single-hole and no-hole mirror robot segments are witnessing significant growth fueled by their potential for smaller incisions, reduced trauma, and faster recovery times. Major players like Intuitive Surgical, Medtronic, and CMR Surgical are driving innovation and expanding their product portfolios, fostering competition and further accelerating market growth. Geographic expansion, particularly in emerging markets with burgeoning healthcare infrastructure, is contributing significantly to the market's expansion. Restraints include high initial investment costs associated with robotic systems, the need for specialized training for surgeons, and potential regulatory hurdles in certain regions. Despite the challenges, the long-term outlook for the endoscope robot market remains positive. Continued technological advancements focusing on improved image quality, enhanced haptic feedback, and the development of more affordable and user-friendly systems will further fuel market penetration. The increasing integration of artificial intelligence and machine learning is expected to improve surgical precision and efficiency, thereby strengthening the market's growth trajectory. Furthermore, the rising adoption of tele-surgery and remote robotic procedures holds immense potential for future market expansion, especially in underserved areas. The market segmentation across different applications and robot types allows for targeted growth strategies and addresses the varied needs of different surgical specialties, contributing to the overall market dynamism.

  15. Roobots For Frc Rooobots Dataset

    • universe.roboflow.com
    zip
    Updated Dec 8, 2023
    Share
    FacebookFacebook
    TwitterTwitter
    Email
    Click to copy link
    Link copied
    Close
    Cite
    Adam K (2023). Roobots For Frc Rooobots Dataset [Dataset]. https://universe.roboflow.com/adam-k/roobots-dataset-for-frc-rooobots/dataset/6
    Explore at:
    zip; available download formats
    Dataset updated
    Dec 8, 2023
    Dataset authored and provided by
    Adam K
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Variables measured
    Robots Bounding Boxes
    Description

    WARNING: very thrown together and will be redone

    Initial RoboFlow Dataset for (Roobot) Object Detection in First Robotics Detection

    Description: This dataset serves as a starting collection of images from robotics events, designed to train object detection models. The images have been sourced from a variety of different FIRST Robotics competitions across a couple of years and showcase robots in action, spectators, and the dynamic environments typically found in such events.

    Please note, this is an initial, hastily assembled dataset and is not comprehensive in its coverage nor optimal in its annotation quality. Our aim was to create a starting point for further development. Over time, we plan to refine the annotations, increase the diversity of the images, and improve the overall quality of the dataset. We encourage users to provide feedback and contribute to the dataset's evolution.

    Current State:

    Number of Images: 921
    Annotations: Bounding boxes with basic labels
    Classes: Red Robot, Blue Robot, Robot (No Alliance Visible in Image)

  16. Degradation Measurement of Robot Arm Position Accuracy

    • catalog.data.gov
    • data.nist.gov
    Updated Jul 29, 2022
    Cite
    National Institute of Standards and Technology (2022). Degradation Measurement of Robot Arm Position Accuracy [Dataset]. https://catalog.data.gov/dataset/degradation-measurement-of-robot-arm-position-accuracy-be76e
    Explore at:
    Dataset updated
    Jul 29, 2022
    Dataset provided by
    National Institute of Standards and Technology (http://www.nist.gov/)
    Description

    The dataset contains both the robot's high-level tool center position (TCP) health data and controller-level components' information (i.e., joint positions, velocities, currents, temperatures). The datasets can be used by users (e.g., software developers, data scientists) who work on robot health management (including accuracy) but have limited or no access to robots that can capture real data. The datasets can support the:

    • Development of robot health monitoring algorithms and tools
    • Research of technologies and tools to support robot monitoring, diagnostics, prognostics, and health management (collectively called PHM)
    • Validation and verification of the industrial PHM implementation. For example, the verification of a robot's TCP accuracy after the work cell has been reconfigured, or whenever a manufacturer wants to determine if the robot arm has experienced a degradation.

    For data collection, a trajectory is programmed for the Universal Robot (UR5), approaching and stopping at randomly-selected locations in its workspace. The robot moves along this preprogrammed trajectory under different conditions of temperature, payload, and speed. The TCP position (x, y, z) of the robot is measured by a 7-D measurement system developed at NIST. Differences are calculated between the measured positions from the 7-D measurement system and the nominal positions calculated from the nominal robot kinematic parameters. The results are recorded within the dataset.

    Controller-level sensing data are also collected from each joint (direct output from the controller of the UR5) to understand the influence of temperature, payload, and speed on position degradation. Controller-level data can be used for root cause analysis of the robot performance degradation, by providing joint positions, velocities, currents, accelerations, torques, and temperatures. For example, the cold-start temperatures of the six joints were approximately 25 degrees Celsius. After two hours of operation, the joint temperatures increased to approximately 35 degrees Celsius. Control variables are listed in the header file in the data set (UR5TestResult_header.xlsx). If you'd like to comment on this data and/or offer recommendations on future datasets, please email guixiu.qiao@nist.gov.
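
    A minimal sketch of the accuracy comparison described above; the array values are illustrative, and the real column layout should be taken from UR5TestResult_header.xlsx:

        # Hypothetical sketch: per-sample TCP position error between measured and
        # nominal positions. Values are illustrative, not from the dataset files.
        import numpy as np

        # N x 3 arrays of (x, y, z) TCP positions in millimetres.
        nominal = np.array([[400.0, 150.0, 300.0],
                            [420.0, 140.0, 310.0]])
        measured = np.array([[400.2, 149.8, 300.1],
                             [420.4, 139.7, 310.3]])

        errors = np.linalg.norm(measured - nominal, axis=1)
        print("per-point error [mm]:", np.round(errors, 3))
        print("mean error [mm]:", round(errors.mean(), 3))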

  17. no_robots_dutch

    • huggingface.co
    Updated Dec 20, 2023
    Cite
    Bram Vanroy (2023). no_robots_dutch [Dataset]. https://huggingface.co/datasets/BramVanroy/no_robots_dutch
    Explore at:
    Croissant (a format for machine-learning datasets; learn more at mlcommons.org/croissant)
    Dataset updated
    Dec 20, 2023
    Authors
    Bram Vanroy
    License

    Apache License, v2.0, https://www.apache.org/licenses/LICENSE-2.0
    License information was derived automatically

    Description

    Dataset Card for No Robots Dutch

      Citation
    

    If you use this dataset, GEITje 7B Ultra (SFT) or any of its derivatives or quantizations, please cite the following paper: @misc{vanroy2024geitje7bultraconversational, title={GEITje 7B Ultra: A Conversational Model for Dutch}, author={Bram Vanroy}, year={2024}, eprint={2412.04092}, archivePrefix={arXiv}, primaryClass={cs.CL}, url={https://arxiv.org/abs/2412.04092}, }

      Dataset… See the full description on the dataset page: https://huggingface.co/datasets/BramVanroy/no_robots_dutch.
    
  18. AI Companion Robots Report

    • datainsightsmarket.com
    doc, pdf, ppt
    Updated May 26, 2025
    + more versions
    Cite
    Data Insights Market (2025). AI Companion Robots Report [Dataset]. https://www.datainsightsmarket.com/reports/ai-companion-robots-609157
    Explore at:
    pdf, doc, ppt; available download formats
    Dataset updated
    May 26, 2025
    Dataset authored and provided by
    Data Insights Market
    License

    https://www.datainsightsmarket.com/privacy-policy

    Time period covered
    2025 - 2033
    Area covered
    Global
    Variables measured
    Market Size
    Description

    The AI companion robot market is experiencing robust growth, driven by increasing demand for elderly care solutions, the rising prevalence of loneliness, and advancements in artificial intelligence and robotics technologies. The market, estimated at $2 billion in 2025, is projected to achieve a Compound Annual Growth Rate (CAGR) of 15% from 2025 to 2033, reaching an estimated market value of $7 billion by 2033. Key drivers include technological improvements leading to more sophisticated and affordable robots, increasing consumer awareness of the benefits of AI companions, and the expanding integration of these robots into healthcare and assistive living environments. Market segmentation includes robots for elderly care, children's education and entertainment, and those providing companionship for individuals with disabilities. The leading companies, such as Ubtech Robotics, Hanson Robotics, and Emotix, are actively innovating to enhance functionalities, including natural language processing, emotion recognition, and personalized interaction capabilities. However, challenges remain, including concerns about data privacy, ethical considerations surrounding AI interactions, and the need to overcome high initial costs for consumers. Despite these restraints, the long-term outlook for the AI companion robot market remains positive. The continued miniaturization and affordability of robotic components, combined with evolving AI algorithms enabling more human-like interactions, will further fuel market expansion. Growth will be particularly strong in regions with aging populations, such as North America, Europe, and East Asia, where the demand for elderly care solutions is substantial. Furthermore, the emergence of new applications in education, therapeutic interventions, and personal assistance will broaden the market's appeal and contribute to its continued expansion throughout the forecast period. The market is also expected to see increased collaboration between technology companies and healthcare providers, leading to the development of more integrated and effective solutions.

  19. Data from: User Experience in Social Robots

    • figshare.com
    xlsx
    Updated Jun 21, 2021
    Cite
    Selina Demi (2021). User Experience in Social Robots [Dataset]. http://doi.org/10.6084/m9.figshare.14550870.v2
    Explore at:
    xlsx; available download formats
    Dataset updated
    Jun 21, 2021
    Dataset provided by
    figshare
    Authors
    Selina Demi
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Social robots are increasingly penetrating our daily lives. They are used in various domains, such as healthcare, education, home use and manufacturing. However, introducing this technology into conventional environments is not trivial. For users to accept social robots, a positive user experience is vital, and it should be considered a critical part of the robot development process. This may potentially lead to wider use of social robots and strengthen their diffusion in society. The goal of this study is to summarize the extant literature that is focused on user experience in social robots, and to identify the challenges and benefits of UX evaluation in social robots. To achieve this goal, the authors carried out a systematic literature review that relies on the guidelines for secondary studies in software engineering. Our findings revealed that the most common methods to evaluate UX in social robots are interviews and surveys. UX evaluations were found to be beneficial in providing early feedback and consequently in handling errors at an early stage. However, despite the importance of UX in social robots, robot developers often neglect to set UX goals due to lack of knowledge or lack of time. This study emphasizes the need for robot developers to acquire the required theoretical and practical knowledge on how to perform a successful UX evaluation.

  20. Robot List OA-Statistics

    • data.niaid.nih.gov
    Updated Jan 24, 2020
    Cite
    Recke, Marco (2020). Robot List OA-Statistics [Dataset]. https://data.niaid.nih.gov/resources?id=zenodo_8573
    Explore at:
    Dataset updated
    Jan 24, 2020
    Dataset provided by
    Hitzler, Matthias
    Recke, Marco
    License

    CC0 1.0 Universal Public Domain Dedication, https://creativecommons.org/publicdomain/zero/1.0/
    License information was derived automatically

    Description

    Standard Robot Exclusion List used by OA-Statistics: a compilation of some popular black-lists.

    Status of this document

    This document represents a consensus of the German project OA-Statistik. It is not an official standard backed by a standards body, nor owned by any commercial organisation. It is not enforced by anybody, and there is no guarantee that all current and future robots will use it. It has also been open for discussion.
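
    A minimal, hypothetical sketch of how such a robot list can be applied when filtering access-log statistics; the one-pattern-per-line file format and the example patterns are assumptions, not the actual OA-Statistik list:

        # Hypothetical sketch: drop log entries whose user agent matches a robot-list entry.
        # The example patterns are illustrative; a real run would load the OA-Statistik list.
        import re

        def load_robot_patterns(path):
            # Assumed format: one case-insensitive pattern per line, '#' for comments.
            with open(path) as f:
                lines = [ln.strip() for ln in f if ln.strip() and not ln.startswith("#")]
            return [re.compile(p, re.IGNORECASE) for p in lines]

        def is_robot(user_agent, patterns):
            return any(p.search(user_agent) for p in patterns)

        patterns = [re.compile(p, re.IGNORECASE) for p in ("googlebot", "crawler", "spider")]
        print(is_robot("Mozilla/5.0 (compatible; Googlebot/2.1)", patterns))      # True
        print(is_robot("Mozilla/5.0 (Windows NT 10.0) Firefox/126.0", patterns))  # False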
