A unified ROS 2 driver and teleoperation bridge for HTC Vive hardware (Trackers, Controllers, and Base Stations). This package provides a standalone Dockerized workflow for workspace calibration and robot teleoperation.
| Teleoperation | Workspace Calibration |
|---|---|
| *(demo image)* | *(demo image)* |
- ROS 2 Vive Controller
- HTC Vive Base Stations (Lighthouse): At least one is mandatory for tracking.
- HTC Vive Controller or Tracker: For teleoperation and calibration.
- SteamVR Compatible Dongle (or HMD): Required to connect the devices to the PC.
- Linux (Ubuntu 22.04 recommended)
- Docker & NVIDIA Container Toolkit
- Python 3 (for build scripts)
- Steam Account (Username and Password required for headless SteamVR installation)
This project uses a multi-stage Docker build to separate the heavy SteamVR dependencies from your daily development code. You do not need to install ROS 2 or SteamVR on your host machine.
We provide a Python script to handle the multi-stage build arguments automatically.
- Export your Steam credentials (required to download the headless SteamVR server):

```bash
export STEAM_USER="your_username"
export STEAM_PASSWORD="your_password"
```

- Run the build script:

```bash
# Builds the development image (mounts local code)
python3 scripts/build_docker.py --target dev
```

The `run_docker.py` script handles GPU passthrough, USB device mapping, and ROS 2 network configuration (`ROS_DOMAIN_ID`). It also provides aliases for common tasks.
Basic Usage:
```bash
python3 scripts/run_docker.py [COMMAND] --domain [ID]
```
Available Commands:
| Command | Description |
|---|---|
| `calibrate` | Launches the workspace calibration tool with RViz. |
| `teleop` | Launches the main teleoperation bridge (driver + bridge). |
| `g1` | Launches the G1 Humanoid dual-arm teleop configuration. |
| `tiago` | Launches the TIAGo dual-arm teleop configuration. |
| *(empty)* | Starts a generic bash shell inside the container. |
Example:
```bash
# Run teleoperation on Domain ID 1
python3 scripts/run_docker.py teleop --domain 1
```
The driver handles three critical tasks:
- Haptic Safety (The Virtual Fence): It monitors the controller's position relative to the calibrated `workspace` parameters. If the controller enters the "padding" zone near a wall, it triggers a haptic vibration pulse (2 ms) to warn the user.
- Jitter Reduction: It uses a OneEuro filter to smooth out the tracking data, significantly reducing high-frequency jitter in the robot's motion.
- Haptic & Logic Separation: It treats the hardware as a `JointState` source (buttons) and a `PoseStamped` source (tracking).
- OneEuro Filter: Balances low-latency response with high-speed smoothing using three parameters: `mincutoff`, `beta`, and `dcutoff`.
- Workspace Markers: Publishes a semi-transparent red cube to RViz representing the "Safe Zone" defined during calibration.
- Serialized Hardware: Binds to specific controllers using their unique hardware serial numbers (e.g., `LHR-4BB3817E`).
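For reference, the OneEuro (1€) filter used for jitter reduction can be sketched in a few lines of plain Python. This is a generic implementation of the published algorithm, not this package's actual code; the three parameters match those listed above:

```python
import math

class OneEuroFilter:
    """Minimal 1-Euro filter: an adaptive low-pass for noisy tracking signals."""

    def __init__(self, mincutoff=1.0, beta=0.0, dcutoff=1.0):
        self.mincutoff = mincutoff  # baseline cutoff (Hz): lower = smoother at rest
        self.beta = beta            # speed coefficient: higher = less lag at speed
        self.dcutoff = dcutoff      # cutoff for the derivative estimate
        self.x_prev = None
        self.dx_prev = 0.0
        self.t_prev = None

    @staticmethod
    def _alpha(cutoff, dt):
        # Exponential smoothing factor for a given cutoff frequency and timestep
        tau = 1.0 / (2.0 * math.pi * cutoff)
        return 1.0 / (1.0 + tau / dt)

    def __call__(self, x, t):
        if self.x_prev is None:
            self.x_prev, self.t_prev = x, t
            return x
        dt = t - self.t_prev
        # Smooth the derivative first
        dx = (x - self.x_prev) / dt
        a_d = self._alpha(self.dcutoff, dt)
        dx_hat = a_d * dx + (1.0 - a_d) * self.dx_prev
        # Adapt the cutoff to the estimated speed, then smooth the signal
        cutoff = self.mincutoff + self.beta * abs(dx_hat)
        a = self._alpha(cutoff, dt)
        x_hat = a * x + (1.0 - a) * self.x_prev
        self.x_prev, self.dx_prev, self.t_prev = x_hat, dx_hat, t
        return x_hat
```

Intuitively, `mincutoff` sets how aggressively slow motion is smoothed, while `beta` relaxes the smoothing as the hand speeds up, trading jitter for lag only when lag would be noticeable.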
The driver is typically launched via `vive_teleop.launch.py`. This launch file is designed to be robust against OpenVR initialization race conditions.
```bash
# Run with default serials
python3 scripts/run_docker.py teleop

# Run with specific hardware serials
python3 scripts/run_docker.py teleop --serial_right "LHR-12345678"
```
The launch file includes a 2-second TimerAction delay between starting the left and right controller drivers. This prevents Inter-Process Communication (IPC) conflicts within the OpenVR runtime when multiple nodes attempt to initialize the driver simultaneously.
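The staggered startup could look like the following in a ROS 2 launch file. This is an illustrative sketch of the `TimerAction` pattern, not the actual contents of `vive_teleop.launch.py`; the package and executable names are assumptions:

```python
from launch import LaunchDescription
from launch.actions import TimerAction
from launch_ros.actions import Node

def generate_launch_description():
    left = Node(
        package='vive_teleop', executable='vive_driver',  # names assumed
        namespace='vive/left',
        parameters=[{'serial': 'LHR-97752221'}],
    )
    # Delay the right driver by 2 s so only one node initializes OpenVR at a time
    right = TimerAction(period=2.0, actions=[
        Node(
            package='vive_teleop', executable='vive_driver',
            namespace='vive/right',
            parameters=[{'serial': 'LHR-4BB3817E'}],
        ),
    ])
    return LaunchDescription([left, right])
```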
| Argument | Default | Description |
|---|---|---|
| `rviz` | `true` | Automatically launches RViz with a predefined config. |
| `serial_left` | `LHR-97752221` | Hardware ID for the left-hand controller. |
| `serial_right` | `LHR-4BB3817E` | Hardware ID for the right-hand controller. |
| `tracking_reference` | `LHB-DFA5BD2C` | The Base Station used as the world origin. |
Each driver node (Left/Right) publishes to its own namespace.
| Topic | Type | Description |
|---|---|---|
| `vive/left/pose` | `geometry_msgs/PoseStamped` | Filtered 6-DOF position and orientation. |
| `vive/left/workspace_marker` | `visualization_msgs/Marker` | The visual boundary box in RViz. |
The driver publishes button inputs as a `sensor_msgs/JointState` message to `vive/left/joint_states`. The `position` array contains the values for the following keys:
| Index | Name | Type | Range | Description |
|---|---|---|---|---|
| 0 | `trigger` | Analog | 0.0 – 1.0 | The index finger trigger. Used for the clutch. |
| 1 | `trackpad_x` | Analog | -1.0 – 1.0 | Horizontal touch position on the round pad. |
| 2 | `trackpad_y` | Analog | -1.0 – 1.0 | Vertical touch position on the round pad. |
| 3 | `grip` | Digital | 0.0 / 1.0 | The side grip buttons (squeezing the handle). |
| 4 | `menu` | Digital | 0.0 / 1.0 | The small button above the trackpad. |
| 5 | `trackpad_touched` | Digital | 0.0 / 1.0 | True if the thumb is touching the pad. |
| 6 | `trackpad_pressed` | Digital | 0.0 / 1.0 | True if the trackpad is physically clicked down. |
| Vive Button Map |
|---|
| *(image)* |
> Note: Digital buttons are published as floats (`0.0` for False, `1.0` for True) to maintain consistency within the `JointState` message standard.
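On the consumer side, the parallel `name`/`position` arrays of the `JointState` message can be zipped into a dictionary so buttons are looked up by name rather than by index. This is a hypothetical helper, not part of the package; the name list mirrors the index table above:

```python
# Button names in the same order as the index table above
BUTTON_NAMES = [
    "trigger", "trackpad_x", "trackpad_y", "grip",
    "menu", "trackpad_touched", "trackpad_pressed",
]

def buttons_from_joint_state(names, positions):
    """Zip JointState's parallel name/position arrays into a dict."""
    return dict(zip(names, positions))

def is_pressed(buttons, key, threshold=0.5):
    """Digital buttons arrive as floats (0.0 / 1.0); treat >= threshold as pressed."""
    return buttons.get(key, 0.0) >= threshold
```

In a real `rclpy` subscriber callback, `msg.name` and `msg.position` would be passed in directly.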
This node implements a Clutch Mechanism (Deadman Switch) to allow for safe, intuitive teleoperation by separating translation and rotation logic.
To provide the most intuitive experience for the operator, the bridge treats position and orientation differently:
- Relative Translation (The Mouse Metaphor): Position is calculated as a delta (Δ) from the moment the clutch is engaged. This allows you to "ratchet" the robot's position, moving it large distances through multiple small controller strokes.
- Absolute Orientation (The Mirror Metaphor): Rotation is not relative. For intuitive control, the robot's end-effector orientation is mapped to match the controller's orientation directly. This ensures that if you tilt the controller 45°, the robot hand tilts 45°, maintaining a consistent mental map for the operator.
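The two mappings can be summarized in a small sketch. This is illustrative only: the actual bridge also handles TF2 frame alignment, which is omitted here, and all positions are assumed to be expressed in the same reference frame:

```python
class ClutchMapper:
    """Relative translation + absolute orientation (frame lookups omitted)."""

    def __init__(self):
        self.engaged = False
        self.ctrl_ref = None  # controller position at clutch engagement
        self.ee_ref = None    # end-effector position at clutch engagement

    def engage(self, ctrl_pos, ee_pos):
        """Trigger pressed: store reference points for the delta."""
        self.engaged = True
        self.ctrl_ref = tuple(ctrl_pos)
        self.ee_ref = tuple(ee_pos)

    def release(self):
        """Trigger released: robot holds pose, hand moves freely."""
        self.engaged = False

    def target(self, ctrl_pos, ctrl_quat):
        """Target EE pose: reference position plus hand displacement;
        orientation mirrors the controller directly (absolute)."""
        if not self.engaged:
            return None
        pos = tuple(e + (c - r) for e, c, r in
                    zip(self.ee_ref, ctrl_pos, self.ctrl_ref))
        return pos, ctrl_quat
```

Releasing and re-engaging the clutch resets the references, which is what makes the "ratcheting" motion possible.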
To ensure the robot moves in the direction you expect, the controller's internal axes must be understood. The image below shows the coordinate system of the Vive Controller used by the driver:
| Vive Coordinate System |
|---|
| *(image)* |
- Z-axis: Points "out" from the controller tip.
- X-axis: Points to the right side of the controller.
- Y-axis: Points "up" through the trackpad.
Teleoperation Workflow:
- Idle State: The bridge ignores controller movement.
- Activation (Clutch): Press the `trigger`. The node saves the current EE position as a reference point.
- Execution: Move the controller. The robot translates based on your hand's displacement and rotates to mirror your hand's orientation.
- Repositioning: Release the trigger. Your hand can move freely while the robot stays locked in its current pose.
- PointStamped Button Mapping: All analog and digital button data is published as `geometry_msgs/PointStamped` (with the value in the `.x` field). This allows for high compatibility with diverse robot control stacks.
- TF2 Integration: Uses standard ROS 2 transform lookups to anchor movement to the robot's coordinate system (e.g., `base_link` or `pelvis`).
- Frequency Control: Can be set to a fixed rate (e.g., 30 Hz) or event-driven mode (`-1.0`), where it publishes only when new data arrives.
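The frequency setting could be reduced to a simple publish-gating predicate. This is a sketch under assumed semantics, not the bridge's actual code:

```python
def make_publish_policy(frequency):
    """Return a predicate deciding whether to publish given current and
    last-publish timestamps (seconds).

    frequency > 0    -> fixed-rate: publish at most every 1/frequency seconds.
    frequency == -1.0 -> event-driven: publish on every new sample.
    """
    if frequency == -1.0:
        return lambda now, last: True
    period = 1.0 / frequency
    return lambda now, last: (now - last) >= period
```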
The package includes pre-configured launch files for complex platforms, demonstrating how to remap topics and frames for specific hardware.
Designed for the dual-arm TIAGo robot. It maps VR buttons to the TIAGo gripper actions.
- Reference Frame: `base_link`
- Target Frames: `gripper_left_grasping_frame` / `gripper_right_grasping_frame`
- Logic: Maps the `menu` button directly to the gripper command topics.
A specialized setup for high-speed humanoid teleoperation.
- Reference Frame: `pelvis`
- Target Frames: `left_hand_point_contact` / `right_hand_point_contact`
- Logic: Uses event-driven publishing (frequency `-1.0`) to minimize latency for the humanoid's whole-body controller.
- Custom Serials: Overrides the default hardware IDs directly in the launch file to match specific lab hardware.
To launch a specific robot configuration inside the Docker container:
```bash
# For TIAGo
python3 scripts/run_docker.py tiago

# For Unitree G1
python3 scripts/run_docker.py g1
```



