Robotics: Science and Systems (RSS) 2025
Project Page | Paper | Documentation | 30-min Talk
Han Zhang1,2, Songbo Hu1, Zhecheng Yuan1,2,3, Huazhe Xu1,2,3
1Tsinghua University, 2Shanghai Qi Zhi Institute, 3Shanghai AI Lab
- 2025/04/28 — Initial commit.
- 2025/05/11 — Added embedded firmware repository.
- 2025/06/01 — Added MakerWorld link, PCBA files, and Onshape model link.
- 2025/06/22 — Updated the tutorial and added the Python scripts.
Tested on: Ubuntu 20.04 LTS
conda create -n DOGlove python=3.9.19
conda install -c conda-forge mujoco
pip install -r requirements.txt

Follow the official installation guide:
- Download the installer: Linux Download
- Grant permission:
sudo chmod 775 DynamixelWizard2Setup_x64
- Run the installer:
./DynamixelWizard2Setup_x64
- Follow the prompts to complete installation.
- Add your user to the dialout group to access the USB port:
sudo usermod -aG dialout <your_account_id> # You can find your ID using: whoami
- Reboot to apply the changes:
sudo reboot
conda activate DOGlove
python servo.py
python glove_mcu.py
python fk.py
python fk_ik_core.py

Application Note: New Hand Models
model_spec.scale is a uniform FK-to-IK scale factor applied to all fingertips.
Tune this first for each model:
- Set model_spec.offset = (0, 0, 0) and all per-tip adjustments to zero.
- Move one finger through a wide range of motion.
- Increase scale if the IK motion amplitude is too small, or decrease it if too large.
Only tune global and per-tip translations after scale is close.
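Conceptually, the scale factor is a single multiplier applied to every FK fingertip position before the translation terms. A minimal sketch of that mapping, assuming a dict-style `model_spec` (the field names follow this note; the function and data layout are illustrative, not the repository's actual API):

```python
import numpy as np

# Hypothetical model spec mirroring the fields described above.
model_spec = {
    "scale": 1.0,                  # uniform FK-to-IK scale: tune this first
    "offset": np.zeros(3),         # global XYZ offset: keep zero while tuning scale
    "tip_position_adjustments": {  # per-tip residuals: keep zero while tuning scale
        "thumb": np.zeros(3),
        "index": np.zeros(3),
    },
}

def fk_to_ik_target(tip_name, fk_tip_pos, spec):
    """Map an FK fingertip position to an IK target position."""
    p = spec["scale"] * np.asarray(fk_tip_pos, dtype=float)
    p = p + spec["offset"]
    p = p + spec["tip_position_adjustments"][tip_name]
    return p
```

Because scale multiplies every tip uniformly, an amplitude error shows up on all fingers at once, which is why it should be dialed in before any translation term.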
Application Note: Alignment Tuning
After scale is set, tune translation terms in this order:
1. model_spec.offset for global XYZ alignment across all fingertips.
2. model_spec.tip_position_adjustments[tip] for residual per-tip errors.
Use offset to correct shared drift across all fingers. Keep per-tip adjustments
small, and use them only for model-specific fingertip bias.
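One way to separate the two terms is to measure the residual between desired and reached fingertip positions: the mean residual over all tips is the shared drift (offset), and whatever remains per tip is that tip's adjustment. A hedged sketch of this decomposition (function and variable names are illustrative, not from the repository):

```python
import numpy as np

def estimate_translation_terms(desired, reached):
    """Split fingertip error into a global offset plus per-tip adjustments.

    desired, reached: dicts mapping tip name -> XYZ position.
    """
    tips = sorted(desired)
    residuals = {t: np.asarray(desired[t], float) - np.asarray(reached[t], float)
                 for t in tips}
    # Shared drift across all fingers -> candidate for model_spec.offset
    offset = np.mean([residuals[t] for t in tips], axis=0)
    # Remaining model-specific bias -> candidate per-tip adjustments
    per_tip = {t: residuals[t] - offset for t in tips}
    return offset, per_tip
```

By construction the per-tip terms sum to zero, which keeps them small and reserves the offset for drift common to the whole hand.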
python tools/udp_record.py --duration <record_time> --output <path, e.g. recordings/udp_capture_test.jsonl>

Records current packets from hardware.
python tools/udp_replay.py --capture <path, e.g. recordings/udp_capture_test.jsonl>

Stands in for glove_mcu.py and servo.py, sending the recorded packets to simulate hardware movement.
python tools/udp_reply_render.py --capture <path, e.g. recordings/udp_capture_test.jsonl>

Renders recorded packets in MuJoCo.
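Since captures use the .jsonl extension, each line can be inspected with standard tooling. A minimal reader sketch, assuming only the one-JSON-object-per-line layout implied by the extension (the packet field contents are not specified here):

```python
import json

def load_capture(path):
    """Read a .jsonl capture: one JSON object (packet) per line."""
    packets = []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if line:  # skip blank lines
                packets.append(json.loads(line))
    return packets
```

For example, `load_capture("recordings/udp_capture_test.jsonl")` returns the recorded packets as a list of dicts for offline analysis.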
See tracker.md
This repository is released under the MIT license. See LICENSE for more details.
- Our wrist tracking code is adapted from HTC Vive Tracker Python API.
- Our Franka control code is adapted from UMI and Data Scaling Laws.
- Our 3D diffusion policy implementation is adapted from 3D Diffusion Policy and DemoGen.
- The teleoperation baseline (AnyTeleop) is implemented from Dex Retargeting.
Contact Han Zhang if you have any questions or suggestions.
If you find our work useful, please consider citing:
@article{zhang2025doglove,
title={DOGlove: Dexterous Manipulation with a Low-Cost Open-Source Haptic Force Feedback Glove},
author={Zhang, Han and Hu, Songbo and Yuan, Zhecheng and Xu, Huazhe},
journal={arXiv preprint arXiv:2502.07730},
year={2025}
}