What You're Actually Getting
The Unitree G1 EDU is not a toy. It is a 127 cm, 35+ kg bipedal humanoid with real compute, real sensors, and a real potential to fall on things you care about. Before you start configuring DDS domains and flashing firmware, it helps to understand the architecture you are working with.
The G1 uses a dual-computer architecture. The primary development computer is an NVIDIA Jetson Orin NX (either the 8 GB or 16 GB variant, depending on your tier). This handles high-level perception, planning, and any custom applications you deploy. The secondary computer is an internal microcontroller unit (MCU) running Unitree's proprietary low-level control firmware. You do not touch the MCU directly. Instead, you communicate with it through Unitree's SDK via CycloneDDS, a real-time publish-subscribe middleware.
Out of the box, the G1 can stand, walk on flat surfaces, and stream sensor data. It cannot autonomously navigate, pick up objects, or respond to voice commands. Those capabilities are what you build. This guide will get you from unboxing to your first walking demo with verified sensor data flowing to your development machine.
Pre-Setup Checklist
Do not skip this section. Getting the environment right before you power on the robot saves hours of debugging later.
Development Machine
- Operating System: Ubuntu 22.04 LTS (not 20.04, not 24.04). ROS2 Humble has first-class support only for 22.04.
- RAM: 16 GB minimum, 32 GB recommended if you plan to run perception stacks alongside your control code.
- GPU: An NVIDIA GPU with CUDA support is strongly recommended for any vision or learning workloads. RTX 3060 or better.
- Python: 3.10 specifically. The unitree_sdk2_python package pins to this version. Using 3.11 or 3.12 will cause import failures.
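Since the interpreter pin is the single most common setup mistake, a guard at the top of your scripts catches it before you hit cryptic import failures. A minimal sketch (the 3.10 pin comes from the SDK; the helper name is our own):

```python
import sys

def interpreter_matches(required=(3, 10)):
    """True when the running interpreter's major.minor equals `required` --
    the version the unitree_sdk2_python native bindings are built against."""
    return sys.version_info[:2] == required
```

Call it early and fail loudly, e.g. `if not interpreter_matches(): sys.exit("Need Python 3.10")`.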
Network Equipment
- A dedicated router or managed switch. Do not use your office Wi-Fi. The G1's Jetson Orin connects via Ethernet, and you want a clean, low-latency link between your dev machine and the robot.
- Two Ethernet cables (Cat6 or better).
- The Jetson Orin also supports Wi-Fi, but expect higher latency and occasional packet loss. Always use a wired connection for initial setup.
Physical Space
- A minimum of 3m x 3m of clear floor space with no obstacles. More is better.
- Flat, hard flooring. Carpet increases friction unpredictably and can cause the robot to stumble during early locomotion tests.
- A safety harness or a second person acting as a spotter. The G1 is heavy enough to damage itself and your floor if it falls.
Software Prerequisites
Install these on your development machine before the robot arrives:
# Install ROS2 Humble
sudo apt update && sudo apt install -y software-properties-common
sudo add-apt-repository universe
sudo apt update && sudo apt install -y curl
sudo curl -sSL https://raw.githubusercontent.com/ros/rosdistro/master/ros.key \
-o /usr/share/keyrings/ros-archive-keyring.gpg
echo "deb [arch=$(dpkg --print-architecture) \
signed-by=/usr/share/keyrings/ros-archive-keyring.gpg] \
http://packages.ros.org/ros2/ubuntu \
$(. /etc/os-release && echo $UBUNTU_CODENAME) main" | \
sudo tee /etc/apt/sources.list.d/ros2.list > /dev/null
sudo apt update
sudo apt install -y ros-humble-desktop
# Source the setup
echo "source /opt/ros/humble/setup.bash" >> ~/.bashrc
source ~/.bashrc
# Install CycloneDDS
sudo apt install -y ros-humble-rmw-cyclonedds-cpp
echo "export RMW_IMPLEMENTATION=rmw_cyclonedds_cpp" >> ~/.bashrc
source ~/.bashrc
Unboxing and Physical Setup
The G1 ships in a reinforced flight case. Inside, the robot is secured in custom foam with the limbs folded. You will also find a power adapter, Ethernet cable, emergency stop (e-stop) remote, and a quick-start card that you can safely ignore in favor of this guide.
Power-On Sequence
- Remove the robot from the case and place it on the floor in a seated or supported position. Do not attempt to stand it up yet.
- Inspect the robot for any shipping damage: loose cables, cracked housings, anything that looks wrong.
- Connect the charger and charge to at least 80% before first use. A full charge takes approximately 2 hours.
- Locate the power button on the robot's back panel. Press and hold for 3 seconds. You will hear a startup chime and see LEDs illuminate on the torso.
- The Jetson Orin takes approximately 30 seconds to boot. Wait for the network interface to come up before proceeding.
Keep the e-stop remote in your hand at all times during testing. One press kills motor power instantly. Practice using it before running any locomotion commands.
Network Configuration
The G1's Jetson Orin NX has a default static IP of 192.168.123.15 on its Ethernet interface. Your development machine needs to be on the same subnet.
Setting a Static IP on Your Dev Machine
# Set a static IP on your Ethernet interface (replace enp3s0 with yours).
# Note: "ip addr add" is temporary and does not survive a reboot; use
# netplan or NetworkManager for a persistent configuration.
sudo ip addr add 192.168.123.100/24 dev enp3s0
sudo ip link set enp3s0 up
# Verify connectivity
ping -c 3 192.168.123.15
If the ping succeeds, your physical network is good. Now configure CycloneDDS so your DDS topics can flow between machines.
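The robot and your dev machine must sit in the same /24 network for discovery to work at all. If you want to sanity-check addresses programmatically, Python's standard ipaddress module does it in a few lines (a throwaway sketch, not part of the SDK):

```python
import ipaddress

ROBOT_IP = "192.168.123.15"  # the G1's default static IP

def on_robot_subnet(dev_ip, robot_ip=ROBOT_IP, prefix=24):
    """True when dev_ip sits in the same /prefix network as the robot."""
    net = ipaddress.ip_network(f"{robot_ip}/{prefix}", strict=False)
    return ipaddress.ip_address(dev_ip) in net
```

For example, `on_robot_subnet("192.168.123.100")` should hold with the addresses used in this guide.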
CycloneDDS Configuration
Create a file at ~/cyclonedds.xml with the following content:
<?xml version="1.0" encoding="UTF-8"?>
<CycloneDDS xmlns="https://cdds.io/config"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="https://cdds.io/config">
<Domain id="any">
<General>
<Interfaces>
<NetworkInterface name="enp3s0"
priority="default"
multicast="true" />
</Interfaces>
<AllowMulticast>default</AllowMulticast>
<MaxMessageSize>65500B</MaxMessageSize>
</General>
<Discovery>
<ParticipantIndex>auto</ParticipantIndex>
<MaxAutoParticipantIndex>120</MaxAutoParticipantIndex>
</Discovery>
</Domain>
</CycloneDDS>
Then export the config path:
export CYCLONEDDS_URI=file://$HOME/cyclonedds.xml
# Add to .bashrc for persistence
echo 'export CYCLONEDDS_URI=file://$HOME/cyclonedds.xml' >> ~/.bashrc
Important: Replace enp3s0 with your actual Ethernet interface name. Run ip link show to find it. Getting this wrong is the number one reason DDS discovery fails.
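If you would rather query interface names from Python than parse ip link output, the standard library exposes them directly on Linux (a convenience sketch; the function name is our own):

```python
import socket

def interface_names():
    """Return this host's network interface names (Linux/POSIX only).
    Use one of these verbatim in the NetworkInterface element of
    cyclonedds.xml."""
    return [name for _, name in socket.if_nameindex()]
```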
SDK Installation
Python SDK
The Python SDK is the fastest way to get started. Install it in a virtual environment to avoid dependency conflicts:
# Create and activate a virtual environment with Python 3.10
python3.10 -m venv ~/g1_env
source ~/g1_env/bin/activate
# Install the SDK from source
git clone https://github.com/unitreerobotics/unitree_sdk2_python.git
cd unitree_sdk2_python
pip install -e .
# Verify the installation (a clean import confirms the native bindings load)
python -c "import unitree_sdk2py; print('unitree_sdk2py import OK')"
C++ SDK
For latency-critical applications, use the C++ SDK:
# Install dependencies
sudo apt install -y cmake build-essential libcyclonedds-dev
# Clone and build
git clone https://github.com/unitreerobotics/unitree_sdk2.git
cd unitree_sdk2
mkdir build && cd build
cmake ..
make -j$(nproc)
sudo make install
Common Pitfalls
- Wrong Python version: If you see a ModuleNotFoundError for native extensions, you are likely running Python 3.11+. The native bindings are compiled for 3.10.
- Missing CycloneDDS: The SDK depends on the CycloneDDS C library, not just the ROS2 wrapper. If you get linker errors, install libcyclonedds-dev separately.
- Permission denied on USB: If accessing sensors via USB, add your user to the dialout group (sudo usermod -aG dialout $USER), then log out and back in.
ROS2 Bridge Setup
The ROS2 bridge translates Unitree's DDS topics into standard ROS2 topics, giving you access to the full ROS2 ecosystem for visualization, navigation, and manipulation.
# Clone the bridge
mkdir -p ~/ros2_ws/src && cd ~/ros2_ws/src
git clone https://github.com/unitreerobotics/unitree_ros2.git
# Build
cd ~/ros2_ws
colcon build --symlink-install
source install/setup.bash
# Launch the bridge
ros2 launch unitree_ros2 bridge.launch.py
Once the bridge is running, you should see topics populating. Verify with:
# List all available topics
ros2 topic list
# Expected output includes:
# /joint_states
# /imu/data
# /camera/color/image_raw
# /camera/depth/image_rect_raw
# /lidar/points
# /odom
# /robot_state
# Check the IMU is publishing at 500 Hz
ros2 topic hz /imu/data
# Inspect a single joint state message
ros2 topic echo /joint_states --once
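Under the hood, ros2 topic hz is just averaging inter-arrival times. If you are logging message stamps yourself, the same estimate is one line of arithmetic (a hypothetical helper, not part of the bridge):

```python
def estimate_rate_hz(stamps):
    """Estimate publish rate from a sorted list of timestamps in seconds."""
    if len(stamps) < 2:
        return 0.0
    # N messages span N-1 intervals
    return (len(stamps) - 1) / (stamps[-1] - stamps[0])
```

Feeding it 501 stamps spaced 2 ms apart should report roughly the IMU's expected 500 Hz.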
Sensor Verification
The G1 EDU ships with three primary sensors. Verify each one individually before running any locomotion.
Intel RealSense D435i (Depth Camera)
The D435i is mounted in the robot's head. It provides RGB, stereo depth, and an integrated IMU.
# Install RealSense ROS2 wrapper if not already present
sudo apt install -y ros-humble-realsense2-camera
# Check the camera feed
ros2 topic echo /camera/color/image_raw --field header --once
# For a visual check, use RViz2
rviz2 &
# Add an Image display, set topic to /camera/color/image_raw
# Add a PointCloud2 display, set topic to /camera/depth/color/points
If the camera is not publishing, SSH into the Jetson and verify the RealSense node is running: ssh unitree@192.168.123.15 (default password is typically 123).
Livox MID-360 LiDAR
The MID-360 provides 360-degree point clouds for mapping and obstacle avoidance.
# Check LiDAR point cloud topic
ros2 topic hz /lidar/points
# Should publish at approximately 10 Hz
# Visualize in RViz2
# Add PointCloud2 display, set topic to /lidar/points
# Set Fixed Frame to "base_link"
IMU
The onboard IMU is critical for balance and state estimation. It should be publishing at 500 Hz.
# Verify IMU data
ros2 topic echo /imu/data --once
# Check that angular velocity and linear acceleration fields
# contain non-zero values when you gently tilt the robot
If any sensor is not publishing, check the USB connections on the Jetson Orin. The most common hardware issue we see is a loose USB-C cable between the RealSense and the Jetson.
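One more cheap IMU check: a stationary IMU should report an acceleration magnitude of roughly 1 g (about 9.81 m/s²). Anything far off suggests mis-scaled units or a dead sensor. A quick test you can run on echoed values (the tolerance is our own guess, not a Unitree spec):

```python
import math

GRAVITY = 9.81  # m/s^2

def accel_plausible(accel_xyz, tol=1.5):
    """True when a stationary IMU's acceleration magnitude is within
    `tol` m/s^2 of 1 g."""
    mag = math.sqrt(sum(a * a for a in accel_xyz))
    return abs(mag - GRAVITY) <= tol
```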
Basic Locomotion Test
This is where it gets real. Before running any movement commands, read this entire section.
Safety Precautions
- Have the e-stop remote in hand and tested.
- Clear all obstacles in a 3-meter radius around the robot.
- Have a second person acting as a spotter, ready to catch the robot if it loses balance.
- Start in a low-power stance mode before transitioning to walk. Never jump directly to full locomotion.
Standing Up
from unitree_sdk2py.core.channel import ChannelFactoryInitialize
from unitree_sdk2py.go2.sport.sport_client import SportClient
import time
# Initialize DDS
ChannelFactoryInitialize(0, "enp3s0")
# Create the sport client
client = SportClient()
client.Init()
# Command the robot to stand
# This transitions from sitting to a stable standing posture
client.StandUp()
time.sleep(3)
print("Robot is standing. Press Ctrl+C to sit down.")
try:
while True:
time.sleep(0.1)
except KeyboardInterrupt:
client.StandDown()
time.sleep(2)
print("Robot is seated.")
First Steps
Once the robot is standing stably, you can command forward walking:
# Command forward velocity (m/s)
# Start very slow: 0.1 m/s forward, no lateral, no rotation
client.Move(0.1, 0.0, 0.0)
time.sleep(3)
# Stop
client.Move(0.0, 0.0, 0.0)
time.sleep(1)
# Sit back down
client.StandDown()
What to expect: The robot will take small, deliberate steps forward. The gait is pre-programmed and stable on flat surfaces. You will hear servo motors adjusting in real time. The motion is not smooth like a video demo; it is mechanical and precise. That is normal.
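During early tests it is worth clamping every command before it reaches client.Move(), so a typo cannot send the robot sprinting. A hedged sketch; the limits here are conservative placeholders, not Unitree-published maxima:

```python
def clamp_velocity(vx, vy, vyaw, max_lin=0.3, max_yaw=0.5):
    """Clamp linear (m/s) and yaw (rad/s) velocity commands to safe
    test limits before they reach the robot."""
    def clamp(v, limit):
        return max(-limit, min(limit, v))
    return clamp(vx, max_lin), clamp(vy, max_lin), clamp(vyaw, max_yaw)
```

Then `client.Move(*clamp_velocity(0.1, 0.0, 0.0))` behaves exactly like the unclamped call for sane inputs, but caps anything out of range.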
Common Pitfalls and Troubleshooting
1. DDS Discovery Fails ("No topics found")
Cause: CycloneDDS is not configured to use the correct network interface, or a firewall is blocking multicast traffic.
Fix: Double-check your cyclonedds.xml network interface name. Disable the firewall temporarily: sudo ufw disable. Ensure both machines are on the same subnet.
2. Python SDK Import Error
Cause: Running Python 3.11 or 3.12 instead of 3.10. The native extension modules are version-pinned.
Fix: Install Python 3.10 via deadsnakes PPA: sudo add-apt-repository ppa:deadsnakes/ppa && sudo apt install python3.10 python3.10-venv.
3. Robot Falls Immediately on StandUp
Cause: Usually a low battery or an uneven surface. The balance controller needs a minimum voltage to maintain torque.
Fix: Charge to at least 50%. Move to a flat, hard surface. Ensure the robot is placed squarely before sending the stand command.
4. Camera Feed Laggy or Dropping Frames
Cause: Wi-Fi bandwidth is insufficient for raw image streaming, or DDS QoS settings are mismatched.
Fix: Switch to wired Ethernet. If you must use Wi-Fi, reduce the camera resolution in the RealSense node config. Alternatively, run your perception pipeline directly on the Jetson Orin rather than streaming images off-board.
5. "Permission denied" When Sending Commands
Cause: The robot's safety system rejects commands from unrecognized DDS participants, typically because the domain ID does not match.
Fix: Ensure your DDS domain ID matches the robot's. The default is 0. Pass it explicitly in ChannelFactoryInitialize(0, "enp3s0").
What's Next
You now have a G1 EDU standing, walking, and streaming sensor data to your development machine. Here is where it gets interesting.
The G1 is a blank canvas. Unitree provides the hardware and a basic locomotion controller. Everything else, from autonomous navigation to object manipulation to multi-robot coordination, is your software to build.
Common next steps:
- SLAM and navigation: Use the Livox MID-360 and RealSense to build maps and navigate autonomously with Nav2.
- Manipulation: If you have the G1 with dexterous hands, integrate grasp planning with MoveIt2.
- Custom skills: Build task-specific behaviors using the SDK's high-level API or reinforcement learning.
- Sim-to-real transfer: Train policies in Isaac Sim or MuJoCo and deploy to the real robot.
Need help building on the G1 EDU? We work with it every day.
Book a Discovery Call