# Let Dataset

| Entity Passport | |
|---|---|
| Registry ID | hf-dataset--lejurobotics--let_dataset |
| License | CC-BY-NC-SA-4.0 |
| Provider | huggingface |
# LET: Full-Size Humanoid Robot Real-World Dataset
## 📋 Table of Contents
- Key Features
- Hardware Platform
- Usage Guide
- Tasks and Data Overview
- Dataset
- Data Access
- Data Communication Group
- Citation
- License
## ✨ Key Features
- Large-scale, real-world, full-size humanoid robot multi-view, multi-modal data, continuously updated
- Covers multiple domains including industry, home, medical, and service, with 31 sub-task scenarios
- Includes 117 atomic skills such as grasping, bimanual operation, and tool use, with a total duration of over 1,000 hours
- Expert-labeled and human-verified data to ensure high quality
- Provides a complete toolchain spanning data conversion, model training, inference, and validation
Demo media (captions only; videos not embedded here): assembly line sorting, daily table cleaning, and assembly line sorting (dexterous hand), each shown from the left hand, head, and right hand camera views.
## 🤖 Hardware Platform
The main hardware platform is the Kuavo 4 Pro (and its wheeled variant), with the following features:
- Robot parameters: Height 1.66 m, weight 55 kg, supports hot-swappable batteries
- Motion control: 40 degrees of freedom, max walking speed 7 km/h, supports bipedal autonomous SLAM
- Generalization: Supports multi-modal large models (e.g., Pangu, DeepSeek, ChatGPT), with 20+ atomic skills
## 🚀 Usage Guide
Tool Repository
We provide a complete tool repository, including:
- Data conversion tool (rosbag2lerobot): Convert rosbag files to formats suitable for model training
- Two imitation learning models: Diffusion Policy and ACT
- Model training scripts
- Code and deployment instructions for both real robots and simulation environments
For details, see the open-source repository: kuavo_data_challenge.
## 🔬 Tasks and Data Overview
This dataset covers scenarios including automobile factories, FMCG, hotel services, 3C factories, life services, and logistics, with multi-modal observations (RGB, depth, joint states, etc.) and a rich set of atomic skills (grasping, bimanual operation, tool use, etc.).
Demo media (captions only; videos not embedded here): consumer goods sorting, simulation data demonstration, assembly feeding.
Semantic Labels
The LET dataset decomposes complex tasks into a series of atomic action steps with clear semantics, using standardized annotation methods to provide sub-task level timelines and natural language annotations for each task.
Each data entry is accompanied by multi-dimensional semantic label information, including:
- Object labels: industrial parts, tableware, daily utensils, medicines, etc.
- Skill labels: grasp, place, rotate, push, pull, press, etc.
- Task and scene identifiers: unified task-name coding; the scene dimension distinguishes the semantic context of the operation
- End effector type: records actions performed by gripper and dexterous hand separately
- Language description: e.g., "Pick up the medicine box from the conveyor belt and place it on the designated tray", supporting natural language and action alignment modeling
Data Statistics
LET dataset statistics are as follows:
The accompanying figures (not embedded here) show the data type distribution, scene distribution, task distribution, task duration distribution, and distribution of atomic skills.
## 📦 Dataset
Dataset Directory Structure
```
.
├── hdf5
│   ├── real
│   │   ├── Labelled
│   │   │   ├── customer_check_in-P4-dex_hand
│   │   │   ├── deliver_room_card-P4-dex_hand
│   │   │   ├── deliver_water_bottle-P4-dex_hand
│   │   │   ├── loading_of_large_tooling-P4-dex_hand
│   │   │   ├── loading_of_small_tooling-P4-dex_hand
│   │   │   ├── more_coil_sorting-P4-dex_hand
│   │   │   ├── more_FMCG_loading-P4-dex_hand
│   │   │   ├── more_goods_orders-P4-dex_hand
│   │   │   ├── more_scan_code_for_weighing-P4-dex_hand
│   │   │   ├── parts_offline-P4-dex_hand
│   │   │   ├── quick_sort-P4-leju_claw
│   │   │   ├── rubbish_sorting-P4-leju_claw
│   │   │   ├── shop_oversale-P4-leju_claw
│   │   │   ├── single_coil_sorting-P4-dex_hand
│   │   │   ├── single_FMCG_loading-P4-dex_hand
│   │   │   ├── single_goods_orders-P4-dex_hand
│   │   │   ├── single_scan_code_for_weighing-P4-dex_hand
│   │   │   ├── SPS_parts_grab-P4-leju_claw
│   │   │   ├── SPS_parts_sorting-P4-dex_hand
│   │   │   └── task_mass_check-P4-leju_claw
│   │   └── Unlabelled
│   │       ├── assembly_line_sorting-P4-leju_claw
│   │       ├── clothing_storage-P4-leju_claw
│   │       ├── countertop_cleaning-P4-leju_claw
│   │       ├── deliver_room_card-P4-dex_hand
│   │       ├── desktop_decluttering-P4-leju_claw
│   │       ├── drug_finishing-P4-leju_claw
│   │       ├── express_delivery_sorting-P4-leju_claw
│   │       ├── express_logistics_scenario-P4-leju_claw
│   │       ├── loading_of_large_tooling-P4-dex_hand
│   │       ├── loading_of_small_tooling-P4-dex_hand
│   │       ├── loading_of_small_tooling-P4-leju_claw
│   │       ├── more_coil_sorting-P4-dex_hand
│   │       ├── more_FMCG_loading-P4-dex_hand
│   │       ├── more_goods_orders-P4-dex_hand
│   │       ├── more_goods_orders-P4-leju_claw
│   │       ├── more_scan_code_for_weighing-P4-dex_hand
│   │       ├── parts_offline-P4-dex_hand
│   │       ├── parts_off_line-P4-leju_claw
│   │       ├── quick_sort-P4-leju_claw
│   │       ├── rubbish_sorting-P4-leju_claw
│   │       ├── shop_oversale-P4-leju_claw
│   │       ├── single_coil_sorting-P4-dex_hand
│   │       ├── single_FMCG_loading-P4-leju_claw
│   │       ├── single_goods_orders-P4-dex_hand
│   │       ├── SMT_tray_rack_blanking-P4-leju_claw
│   │       ├── SPS_parts_grab-P4-leju_claw
│   │       ├── SPS_parts_sorting-P4-dex_hand
│   │       ├── SPS_parts_sorting-P4-leju_claw
│   │       ├── standardized_feeding_for_FMCG-P4-dex_hand
│   │       └── task_mass_check-P4-leju_claw
│   └── sim
│       └── Unlabelled
│           ├── bottle_flip-P4-claw(Rq2f85)
│           ├── package_weighing-P4-claw(Rq2f85)
│           ├── SPS_parts_sorting-P4-claw(Rq2f85)
│           └── target_placement-P4-claw(Rq2f85)
└── rosbag
    ├── real
    │   ├── Labelled      // Same task structure as HDF5.
    │   └── Unlabelled    // Same task structure as HDF5.
    └── sim
        └── Unlabelled    // Same task structure as HDF5.
```
Data Format
ROSbag Data Format
| Topic Type | Topic Name | Message Type | Main Fields / Description |
|---|---|---|---|
| Camera RGB Image | /cam_x/color/image_raw/compressed | sensor_msgs/CompressedImage | x is h/l/r, for head/left wrist/right wrist camera respectively; header (message header with timestamp, sequence, frame, etc.), format (image encoding format), data (image data) |
| Camera Depth Image | /cam_x/depth/image_rect_raw/compressed | sensor_msgs/CompressedImage | x is h/l/r, for head/left wrist/right wrist camera respectively; header (message header), format (encoding format), data (image data) |
| Arm Trajectory Control | /kuavo_arm_traj | sensor_msgs/JointState | header (message header), name (joint name list, 14 joints, arm_joint_1~arm_joint_14), position (desired joint position, structure same as raw sensor data items 12-25) |
| Raw Sensor Data | /sensors_data_raw | kuavo_msgs/sensorsData | sensor_time (timestamp), joint_data (joint data: position, velocity, acceleration, current), imu_data (IMU data: gyroscope, accelerometer, quaternion), end_effector_data (end effector data, currently unused) |
| Dexterous Hand Position (Real Robot) | /control_robot_hand_position | kuavo_msgs/robotHandPosition | left_hand_position (left hand 6D, 0 open, 100 closed), right_hand_position (right hand 6D, 0 open, 100 closed) |
| Dexterous Hand State (Real Robot) | /dexhand/state | sensor_msgs/JointState | name (12 joint names), position (12 joint positions, first 6 for left hand, last 6 for right hand), velocity (12 joint velocities), effort (12 joint currents) |
| Gripper Control (Real Robot) | /leju_claw_command | kuavo_msgs/leju_claw_command | name (length 2, left_claw/right_claw), position (length 2, 0 open, 100 closed), velocity (length 2, target velocity, default 50), effort (length 2, target current in A, default 1) |
| Gripper State (Real Robot) | /leju_claw_state | kuavo_msgs/lejuClawState | state (int8[2], left/right gripper state, see details below), data (kuavo_msgs/endEffectorData, contains gripper position, velocity, current) |
| Simulation Gripper Control | /gripper/command | sensor_msgs/JointState | header (message header), position (length 2, 0 open, 255 closed) |
| Simulation Gripper State | /gripper/state | sensor_msgs/JointState | header (message header), position (length 2, 0 open, 0.8 closed) |
| Robot Position Command | /cmd_pose_world | geometry_msgs/Twist | linear.x/y/z (translation in world frame in m), angular.x/y/z (rotation in world frame in radians) |
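The camera topics in the table differ only in the camera slot letter and the stream. A small sketch (the helper name is ours, not part of the dataset tooling) that reproduces the naming scheme:

```python
# Hypothetical helper: build the compressed image topic names from the
# table above; cam is "h" (head), "l" (left wrist) or "r" (right wrist).
def camera_topic(cam: str, stream: str) -> str:
    if cam not in ("h", "l", "r"):
        raise ValueError(f"unknown camera slot: {cam!r}")
    if stream == "color":
        return f"/cam_{cam}/color/image_raw/compressed"
    if stream == "depth":
        return f"/cam_{cam}/depth/image_rect_raw/compressed"
    raise ValueError(f"unknown stream: {stream!r}")
```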
Detailed Field Descriptions
/cam_x/color/image_raw/compressed, /cam_x/depth/image_rect_raw/compressed:
- header (std_msgs/Header): Message header with timestamp, sequence number, frame information
- format (string): Image encoding format
- data (uint8[]): Image data
/kuavo_arm_traj:
- header: Message header
- name: Joint name list, 14 joints named arm_joint_1~arm_joint_14
- position: Desired joint positions; structure same as raw sensor data items 12-25
/sensors_data_raw:
- sensor_time (time): Timestamp
- joint_data (kuavo_msgs/jointData): Joint data including position, velocity, acceleration, current
  - Data order:
    - First 12 items are lower-limb motor data:
      - Indices 0–5: left leg (l_leg_roll, l_leg_yaw, l_leg_pitch, l_knee, l_foot_pitch, l_foot_roll)
      - Indices 6–11: right leg (r_leg_roll, r_leg_yaw, r_leg_pitch, r_knee, r_foot_pitch, r_foot_roll)
    - Next 14 items are arm motor data:
      - Indices 12–18: left arm (l_arm_pitch, l_arm_roll, l_arm_yaw, l_forearm_pitch, l_hand_yaw, l_hand_pitch, l_hand_roll)
      - Indices 19–25: right arm (r_arm_pitch, r_arm_roll, r_arm_yaw, r_forearm_pitch, r_hand_yaw, r_hand_pitch, r_hand_roll)
    - Last 2 items are head motor data: head_yaw, head_pitch
  - Units: position in radians, velocity in rad/s, acceleration in rad/s², current in amperes (A)
- imu_data (kuavo_msgs/imuData): IMU data including gyroscope (gyro, rad/s), accelerometer (acc, m/s²), and quaternion (quat, IMU orientation)
- end_effector_data (kuavo_msgs/endEffectorData): End effector data, currently unused
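The 28-entry ordering of joint_data (12 leg + 14 arm + 2 head) can be captured as an index-to-name table. This sketch is reconstructed only from the ordering documented above and is not code shipped with the dataset:

```python
# Index-to-name table for /sensors_data_raw joint_data, reconstructed from
# the documented ordering (indices 0-11 legs, 12-25 arms, 26-27 head).
LEG = ["leg_roll", "leg_yaw", "leg_pitch", "knee", "foot_pitch", "foot_roll"]
ARM = ["arm_pitch", "arm_roll", "arm_yaw", "forearm_pitch",
       "hand_yaw", "hand_pitch", "hand_roll"]

JOINT_NAMES = (
    [f"l_{j}" for j in LEG] + [f"r_{j}" for j in LEG]    # 0-11: legs
    + [f"l_{j}" for j in ARM] + [f"r_{j}" for j in ARM]  # 12-25: arms
    + ["head_yaw", "head_pitch"]                         # 26-27: head
)
```

For example, `JOINT_NAMES[12]` is `l_arm_pitch`, matching the statement that items 12–18 are the left arm.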
/control_robot_hand_position:
- left_hand_position (float[6]): Left hand 6D, each element in [0,100]; 0 fully open, 100 fully closed
- right_hand_position (float[6]): Right hand 6D, same meaning as above
/dexhand/state:
- name (string[12]): 12 joint names
- position (float[12]): 12 joint positions; first 6 for left hand, last 6 for right hand
- velocity (float[12]): 12 joint velocities; first 6 for left hand, last 6 for right hand
- effort (float[12]): 12 joint currents; first 6 for left hand, last 6 for right hand
/leju_claw_command:
- name (string[2]): left_claw, right_claw
- position (float[2]): Left/right gripper target position, [0,100]; 0 open, 100 closed
- velocity (float[2]): Target velocity, [0,100], default 50
- effort (float[2]): Target current in A, default 1
/leju_claw_state:
- state (int8[2]): Left/right gripper state, with the following meanings:
  - -1: Error (execution anomaly)
  - 0: Unknown (default initialization state)
  - 1: Moving
  - 2: Reached target position
  - 3: Object grasped
- data (kuavo_msgs/endEffectorData): Contains gripper position, velocity, current; structure same as /leju_claw_command
/gripper/command (simulation):
- header: Message header
- position (float[2]): Left/right gripper target position, [0,255]; 0 open, 255 closed
/gripper/state (simulation):
- header: Message header
- position (float[2]): Left/right gripper current position, [0,0.8]; 0 open, 0.8 closed
/cmd_pose_world (simulation Task 4 only):
- linear.x/y/z (float): Translation in the world frame, in meters
- angular.x/y/z (float): Rotation in the world frame, in radians
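Note that the "fully closed" scale differs across gripper interfaces: 100 for the real claw and dexterous hand, 255 for the simulation command, 0.8 for the simulation state. A small sketch (interface labels are ours, not topic names) that maps each to a common 0–1 closed fraction:

```python
# "Fully closed" raw value per gripper interface, per the field
# descriptions above; the dictionary keys are our own labels.
CLOSED_VALUE = {
    "real_claw": 100.0,    # /leju_claw_command, /control_robot_hand_position
    "sim_command": 255.0,  # /gripper/command
    "sim_state": 0.8,      # /gripper/state
}

def closed_fraction(value: float, interface: str) -> float:
    """Map a raw gripper value to a 0..1 closed fraction, clamped."""
    return min(max(value / CLOSED_VALUE[interface], 0.0), 1.0)
```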
HDF5 Data Format
```
├── cameras
│   ├── hand_left              // Left hand camera
│   │   ├── color              // RGB image info
│   │   │   └── data           // RGB image data (by timestamp)
│   │   └── depth              // Depth image info
│   │       └── data           // Depth data
│   ├── hand_right             // Right hand camera
│   │   ├── color              // RGB image info
│   │   │   └── data           // RGB data
│   │   └── depth              // Depth image info
│   │       └── data           // Depth data
│   └── head                   // Head camera
│       ├── color              // RGB image info
│       │   └── data           // RGB image data
│       └── depth              // Depth image info
│           └── data           // Depth data
├── joints                     // Joint data
│   ├── action                 // Desired joint values
│   │   ├── arm                // Arm
│   │   │   ├── position       // N(rows)*14(cols); N=frames, 14=DoF for both arms (7 per arm)
│   │   │   └── velocity       // Desired joint velocity
│   │   ├── effector           // End effector
│   │   │   └── position       // N(rows)*2(cols); N=frames, 2=left/right gripper open/close
│   │   ├── head               // Head
│   │   │   ├── position       // N(rows)*2(cols); N=frames, 2=2 DoF (pitch/yaw)
│   │   │   └── velocity       // Joint velocity
│   │   └── leg                // Leg
│   │       ├── position       // N(rows)*12(cols)
│   │       └── velocity       // Joint velocity
│   └── state                  // Actual joint values
│       ├── arm                // Arm
│       │   ├── position       // N(rows)*14(cols); N=frames, 14=DoF for both arms (7 per arm)
│       │   └── velocity       // Joint velocity
│       ├── effector           // End effector
│       │   └── position       // N(rows)*2(cols); N=frames, 2=left/right gripper open/close
│       ├── head               // Head
│       │   ├── position       // N(rows)*2(cols); N=frames, 2=2 DoF (pitch/yaw)
│       │   └── velocity       // Joint velocity
│       └── leg                // Leg
│           ├── position       // N(rows)*12(cols)
│           └── velocity       // Joint velocity
├── parameters                 // Sensor extrinsics
│   └── camera
│       ├── hand_left.json     # Left hand camera intrinsics/extrinsics
│       ├── hand_right.json    # Right hand camera intrinsics/extrinsics
│       └── head.json          # Head camera intrinsics/extrinsics
└── metadata.json              # Collection metadata: device, end effector type, camera frame rate, joint info, etc.
```
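Within an HDF5 episode file, the joint arrays above are addressable by path (with h5py one would read, say, `h5py.File(path)["joints/state/arm/position"]`). The sketch below only builds the expected key paths from the layout shown, so it assumes nothing beyond the tree above:

```python
# Enumerate the joint dataset paths implied by the HDF5 layout above.
def joint_paths() -> list:
    groups = {
        "arm": ("position", "velocity"),
        "effector": ("position",),
        "head": ("position", "velocity"),
        "leg": ("position", "velocity"),
    }
    return [
        f"joints/{kind}/{part}/{field}"
        for kind in ("action", "state")       # desired vs. actual values
        for part, fields in groups.items()
        for field in fields
    ]
```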
Label Format
Label information is stored in a JSON file with the same name as the data file. Example:
```json
{
  "loaction": "Yangtze River Delta Integrated Demonstration Zone Intelligent Robot Training Center",
  "primaryScene": "Default primary scene",
  "primarySceneCode": "default_level_one_scene",
  "secondaryScene": "3C factory scene",
  "secondarySceneCode": "3C factory manufacturing",
  "tertiaryScene": "Coil sorting",
  "tertiarySceneCode": "Coil sorting",
  "initSceneText": "Coils of various colors are placed in the middle of the table, material boxes are placed on both sides of the table, and the robot is located at the back of the table",
  "englishInitSceneText": "Coils of various colors are placed in the middle of the table, material boxes are placed on both sides of the table, and the robot is located at the back of the table",
  "taskGroupName": "Single coil sorting",
  "taskGroupCode": "single_coil_sorting",
  "taskName": "7-22-Coil classification",
  "taskCode": "XQFL_11",
  "deviceSn": "P4-209",
  "taskPrompt": "",
  "marks": [
    {
      "taskId": "1947326026455584768",
      "markStart": "2025-07-22 9:18:39.640",
      "markEnd": "2025-07-22 9:18:39.814",
      "duration": 0.233,
      "startPosition": 0.7363737795977026,
      "endPosition": 0.769568869806783,
      "skillAtomic": "pick",
      "skillDetail": "Pick up the coil from the table",
      "enSkillDetail": "pick coil from table",
      "markType": "step"
    }
  ]
}
```
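As a minimal sketch, the `marks` list in such a label file can be aggregated into per-skill durations using only the fields shown in the example (the function name is ours):

```python
import json
from collections import defaultdict

def skill_durations(label_text: str) -> dict:
    """Sum annotated step durations (seconds) per atomic skill."""
    label = json.loads(label_text)
    totals = defaultdict(float)
    for mark in label.get("marks", []):
        totals[mark["skillAtomic"]] += mark["duration"]
    return dict(totals)
```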
## 📥 Data Access
- Official request: You can request access by contacting the official email [email protected].
- Public platforms: The LET dataset will be publicly released on major platforms such as Openloong, ModelScope, and Hugging Face for the convenience of developers and researchers worldwide.
## 💬 Data Communication Group
- Data communication QQ group: 1043359345
## 📖 Citation
If you use this dataset in your research, please cite it according to the platform from which you accessed it:
Citation for Hugging Face

```bibtex
@misc{LET2025,
  title={LET: Full-Size Humanoid Robot Real-World Dataset},
  author={LejuRobotics},
  year={2025},
  howpublished={\url{https://huggingface.co/datasets/LejuRobotics/let_dataset}}
}
```

Citation for ModelScope

```bibtex
@misc{LET2025,
  title={LET: Full-Size Humanoid Robot Real-World Dataset},
  author={LejuRobotics},
  year={2025},
  howpublished={\url{https://www.modelscope.cn/datasets/lejurobot/let_dataset}}
}
```

Citation for Atomgit AI

```bibtex
@misc{LET2025,
  title={LET: Full-Size Humanoid Robot Real-World Dataset},
  author={LejuRobotics},
  year={2025},
  howpublished={\url{https://ai.atomgit.com/lejurobot/let_dataset}}
}
```
## 📄 License
All data and code in this repository are released under CC BY-NC-SA 4.0.