Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Open X-Embodiment Dataset (unofficial)
An RLDS dataset for training VLA (vision-language-action) models.
Use this dataset
Download the dataset via Hugging Face, or prepare it yourself.
The code is modified from rlds_dataset_mod. We upload the processed dataset in this repository ❤… See the full description on the dataset page: https://huggingface.co/datasets/WeiChow/VLATrainingDataset.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
PhysicalAI-Robotics-GR00T-X-Embodiment-Sim
GitHub repo: Isaac GR00T N1. We provide a set of datasets used for post-training of GR00T N1. Each dataset is a collection of trajectories from different robot embodiments and tasks.
Cross-embodied bimanual manipulation: 9k trajectories
Dataset Name	# Trajectories
bimanual_panda_gripper.Threading	1000
bimanual_panda_hand.LiftTray	1000
bimanual_panda_gripper.ThreePieceAssembly	1000
… See the full description on the dataset page: https://huggingface.co/datasets/nvidia/PhysicalAI-Robotics-GR00T-X-Embodiment-Sim.
PR2 opening fridge doors
To use this dataset:
import tensorflow_datasets as tfds
ds = tfds.load('utokyo_pr2_opening_fridge_converted_externally_to_rlds', split='train')
for ex in ds.take(4):
  print(ex)
See the guide for more information on tensorflow_datasets.
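Each RLDS example returned above is an episode: a dictionary whose 'steps' field holds the per-timestep records. The sketch below shows that nesting in pure Python, using the standard RLDS step fields; the observation keys and values here are hypothetical placeholders, not data from the PR2 dataset.

```python
# A mock RLDS-style episode: 'steps' holds per-timestep dicts with the
# standard RLDS fields (is_first/is_last/is_terminal mark episode boundaries).
episode = {
    "steps": [
        {"observation": {"image": None}, "action": [0.0, 0.1],
         "reward": 0.0, "is_first": True,  "is_last": False, "is_terminal": False},
        {"observation": {"image": None}, "action": [0.0, 0.0],
         "reward": 1.0, "is_first": False, "is_last": True,  "is_terminal": True},
    ]
}

# Iterate over steps the same way you would over the nested structure
# yielded by tfds.load(...).
episode_return = sum(step["reward"] for step in episode["steps"])
print(episode_return)  # → 1.0
```

Real datasets converted to RLDS keep these boundary flags, which is what lets downstream training code split a flat step stream back into episodes.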
MIT License: https://opensource.org/licenses/MIT
License information was derived automatically
RoVI-Book Dataset
🎉 CVPR 2025 🎉 Official dataset for Robotic Visual Instruction
This is an example demonstrating the RoVI-Book dataset, adapted from the Open X-Embodiment dataset. The bottom displays the proportion of each task type.
Paper: Robotic Visual Instruction
Project Page: https://robotic-visual-instruction.github.io/
Code: https://github.com/RoboticsVisualInstruction/RoVI-Book
Introduction
The RoVI-Book dataset is introduced alongside Robotic… See the full description on the dataset page: https://huggingface.co/datasets/yanbang/rovibook.
What is this?
This is an example RLDS dataset builder adapted from https://github.com/kpertsch/rlds_dataset_builder for training Interleave-VLAs. This repo aims to illustrate the data format of Interleave-VLA.
How to use?
We assume the interleaved Open X-Embodiment data has first been extracted and stored as pickles; the path to the pickle folder should be set in config.py. Next, set dataset_name in openx_interleave.py to your desired dataset. Finally, run tfds build --overwrite
in this… See the full description on the dataset page: https://huggingface.co/datasets/Interleave-VLA/rlds_dataset_builder_interleave.
Apache License, v2.0: https://www.apache.org/licenses/LICENSE-2.0
License information was derived automatically
This dataset was created using LeRobot.
Dataset Structure
meta/info.json:
{
  "codebase_version": "v2.1",
  "robot_type": "franka",
  "total_episodes": 8612,
  "total_frames": 1137459,
  "total_tasks": 24,
  "total_videos": 34448,
  "total_chunks": 9,
  "chunks_size": 1000,
  "fps": 10,
  "splits": { "train": "0:8612" },
  "data_path": "data/chunk-{episode_chunk:03d}/episode_{episode_index:06d}.parquet",
  "video_path": … See the full description on the dataset page: https://huggingface.co/datasets/IPEC-COMMUNITY/fmb_dataset_lerobot.
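Given these info.json fields, the on-disk location of an episode's parquet file follows from the data_path template: episodes are grouped into chunks of chunks_size. Below is a minimal sketch of resolving that template; episode_path is a hypothetical helper, and assigning chunks by integer division is an assumption (though it is consistent with total_chunks = 9 for 8612 episodes at chunks_size = 1000).

```python
# Resolve the parquet path for an episode using the data_path template
# from meta/info.json. Chunk assignment by integer division is an assumption.
def episode_path(episode_index: int, chunks_size: int = 1000) -> str:
    episode_chunk = episode_index // chunks_size
    return "data/chunk-{episode_chunk:03d}/episode_{episode_index:06d}.parquet".format(
        episode_chunk=episode_chunk, episode_index=episode_index
    )

print(episode_path(0))     # data/chunk-000/episode_000000.parquet
print(episode_path(8611))  # data/chunk-008/episode_008611.parquet (last train episode)
```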
Open X-Embodiment: Robotic learning datasets and RT-X models