Phase 2 / Datastory / March 2026

When Language
Learned to Move

An experiment connecting Large Language Models to robotic simulators through the Model Context Protocol — from first install to dual-arm coordination.


What happens when you give an AI the ability to control physical machines through natural language? We built the pipeline — from Claude Code through MCP to ROS2, Gazebo, MuJoCo, and PyBullet — and discovered both the power and the friction of connecting language models to the physical world.

4 Simulators Attempted
3 Successfully Connected
16+ ROS2 Topics Live
1 Unified MCP Pipeline
The Architecture
Natural Language → Claude Code → MCP Server → rosbridge / PyBullet → Robot Simulator
One sentence in, robot motion out. The protocol layer makes it composable.

The Thesis

The Model Context Protocol (MCP) is Anthropic's open standard for connecting AI to external tools. ROS2 is the industry standard for robot communication. The hypothesis was simple: connect them, and you get natural language robot control with zero robot code changes.

The experiment spanned multiple sessions across March 2026, starting from a bare Ubuntu 24.04 installation and ending with Claude orchestrating dual-arm block handoffs in simulation — reasoning about why it chose each action, not just executing commands.

"I chose handoff at x=-0.05 instead of center because the loaded arm does more work — shifting the point reduces its travel distance."
— Claude's reasoning log during block handoff

The Journey

1
ROS2 Foundation + Gazebo
fail

ROS2 Humble incompatible with Ubuntu 24.04

Humble targets Jammy (22.04). GPG key errors on Noble. Pivoted to ROS2 Jazzy.

ok

ROS2 Jazzy installed

Full desktop install on Ubuntu 24.04. All core tools working.

sudo apt install -y ros-jazzy-desktop
fail

TurtleBot3 packages silently rolled back

A wildcard install that included gazebo-ros-pkgs failed, and apt rolled back the turtlebot3 packages along with it. Fixed by installing each package explicitly.

ok

Gazebo Harmonic + TurtleBot3 launched

Mobile robot visible in simulation. 13 ROS2 topics active, including /cmd_vel, /scan, /odom, /imu, and /joint_states.

ros2 launch turtlebot3_gazebo turtlebot3_world.launch.py
debug

Robot invisible in Gazebo — moved 1,443 meters offscreen

Commands were publishing successfully but the robot had traveled far from camera view. Gazebo session wasn't restarting cleanly — background processes needed to be killed with pkill.

ok

Robot movement confirmed visible

After clean restart with fresh clock (sec: 64), robot spinning and driving confirmed in Gazebo. TwistStamped messages working at 20Hz.

Gazebo TurtleBot3 Demo Recording
Claude Code controlling mobile robot via MCP → ROS2
2
MCP Bridge + Claude Code
ok

ros-mcp-server v3.0.1 installed

FastMCP-based server by robotmcp. Translates LLM tool calls to ROS2 WebSocket commands via rosbridge on port 9090.

git clone https://github.com/robotmcp/ros-mcp-server.git
ok

Claude Code connected via MCP

Native installer on Linux. MCP server registered. First command: "What ROS topics are available?" returned all 13 topics with message types.

claude mcp add ros-mcp-server uv -- --directory ~/ros-mcp-server run server.py

"What ROS topics are available?" — a natural language question that traverses Claude Code → MCP → WebSocket → ROS2 and returns live robot data. The full pipeline, working.

First successful LLM → Robot query
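Under the hood, the MCP server speaks the rosbridge v2 JSON protocol over that WebSocket. A minimal stdlib-only sketch of the frames it exchanges — the helper names here are ours for illustration, not the server's actual API:

```python
import json

def rosbridge_publish(topic: str, msg: dict) -> str:
    """Build a rosbridge v2 'publish' frame, as sent over ws://localhost:9090."""
    return json.dumps({"op": "publish", "topic": topic, "msg": msg})

def rosbridge_subscribe(topic: str, msg_type: str) -> str:
    """Build a rosbridge v2 'subscribe' frame; rosbridge streams matching messages back."""
    return json.dumps({"op": "subscribe", "topic": topic, "type": msg_type})

# A "turn in place" command for the TurtleBot3, roughly as the MCP server emits it:
spin = rosbridge_publish("/cmd_vel", {
    "header": {"frame_id": "base_link"},
    "twist": {"linear": {"x": 0.0}, "angular": {"z": 0.5}},
})
```

The same three-field envelope (`op`, `topic`, `msg`/`type`) carries every command in the pipeline, which is what makes the bridge layer so thin.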
3
MuJoCo Robotic Arm
ok

MuJoCo 3.5.0 installed

Clean pip install. Physics engine running.

fail

mujoco_ros2_control C++ build failed

ResourceManager constructor API changed in Jazzy's ros2_control. Community package not updated.

ok

Custom Python bridge: 90 lines replaced a C++ package

Headless MuJoCo simulation publishing to ROS2 topics. Three new topics: /mujoco/joint_states, /mujoco/joint_commands, /mujoco/end_effector_pos.
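The bridge's core idea fits in a short sketch: step MuJoCo headlessly on a ROS2 timer and republish state each tick. The node name, scene file, and limits below are illustrative assumptions, not the exact 90 lines from the experiment:

```python
def clamp(values, lo, hi):
    """Keep commanded joint targets inside actuator limits before applying them."""
    return [min(max(v, lo), hi) for v in values]

def main():
    # Heavy imports deferred so the helper above works without ROS2/MuJoCo installed.
    import mujoco
    import rclpy
    from rclpy.node import Node
    from sensor_msgs.msg import JointState
    from std_msgs.msg import Float64MultiArray

    class MujocoBridge(Node):
        def __init__(self):
            super().__init__("mujoco_bridge")
            self.model = mujoco.MjModel.from_xml_path("arm.xml")  # assumed scene file
            self.data = mujoco.MjData(self.model)
            self.pub = self.create_publisher(JointState, "/mujoco/joint_states", 10)
            self.create_subscription(Float64MultiArray, "/mujoco/joint_commands",
                                     self.on_cmd, 10)
            self.create_timer(0.02, self.step)  # 50 Hz physics + publish loop

        def on_cmd(self, msg):
            # Apply clamped joint targets as actuator controls.
            self.data.ctrl[:] = clamp(list(msg.data), -3.14, 3.14)

        def step(self):
            mujoco.mj_step(self.model, self.data)
            out = JointState()
            out.position = [float(q) for q in self.data.qpos]
            self.pub.publish(out)

    rclpy.init()
    rclpy.spin(MujocoBridge())

if __name__ == "__main__":
    main()
```

No C++ build system, no ros2_control plugin API to chase across releases — just two topics and a timer.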

fail

MuJoCo viewer segfault with threads

mujoco.viewer.launch() is not thread-safe with ROS2 spin. Crashed after 2 seconds.

ok

Fixed with launch_passive() + viewer.sync()

Thread-safe passive viewer. 3-DOF robotic arm visible and controllable via Claude through same MCP pipeline.
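The working pattern, roughly: launch_passive() leaves stepping to your own loop, and viewer.sync() pushes each new state to the window, while launch() blocks and owns the loop itself — the segfault path. A sketch with an assumed scene file and our own real-time pacing helper:

```python
import time

def remaining(start: float, timestep: float, now=None) -> float:
    """Time left in this physics step; sleep this long to run near real time."""
    now = time.monotonic() if now is None else now
    return max(0.0, timestep - (now - start))

def run_viewer(xml_path="arm.xml"):
    # MuJoCo imported lazily so the pacing helper works without the package.
    import mujoco
    import mujoco.viewer
    model = mujoco.MjModel.from_xml_path(xml_path)
    data = mujoco.MjData(model)
    # launch_passive() returns a handle we step ourselves; sync() is safe here.
    with mujoco.viewer.launch_passive(model, data) as viewer:
        while viewer.is_running():
            start = time.monotonic()
            mujoco.mj_step(model, data)
            viewer.sync()
            time.sleep(remaining(start, model.opt.timestep))
```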

ok

Dual-simulator control achieved

Claude controlling TurtleBot3 in Gazebo AND MuJoCo arm simultaneously through one MCP connection.

MuJoCo Robotic Arm Demo
Claude Code sending joint commands via MCP → ROS2 → MuJoCo
4
NVIDIA Isaac Sim (Attempted)
partial

Isaac Sim 5.1 installed but unstable

Requires Python 3.11 (Ubuntu 24.04 ships 3.12). EULA accepted and the GUI opened, but it kept crashing and the SimulationApp API returned None. 8 GB of VRAM sits right at the minimum threshold. The ROS2 bridge never became functional.

5
Multi-Arm Coordination (MuJoCo + PyBullet)
ok

MuJoCo two-arm scene loaded

Two 5-DOF arms with parallel-jaw grippers, table, 3 colored blocks (red/green/blue), handoff zone, target zone. Touch sensor XML errors fixed by removing broken references.

ok

MuJoCo local demo ran with AI reasoning

Pre-scripted demo completed full pick → handoff → place sequence for all 3 blocks. Reasoning log showed non-obvious discoveries about handoff positioning and block ordering.

ok

FastMCP server connected to Claude Code

Claude can observe scene, pick blocks, execute handoffs, place blocks — all through natural language. MuJoCo viewer open live alongside Claude Code.

partial

IK reaching issues in MuJoCo

Arms move but don't reach exact block positions. Custom Jacobian-based IK not precise enough for reliable grasping. Commands report success but blocks not actually gripped.
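For context, a custom Jacobian solver typically iterates a damped-least-squares update, where the damping term trades convergence speed against exactly the kind of residual imprecision hit here. A generic sketch of one update step, not the experiment's actual solver:

```python
import numpy as np

def dls_step(J, err, damping=0.05):
    """One damped-least-squares IK update: dq = J^T (J J^T + lambda^2 I)^-1 e.

    J   : (m, n) end-effector Jacobian
    err : (m,)   Cartesian position error
    Returns a joint-angle increment dq of shape (n,).
    """
    JJt = J @ J.T
    return J.T @ np.linalg.solve(JJt + damping**2 * np.eye(JJt.shape[0]), err)
```

With damping too low the update oscillates near singularities; too high and the arm stops short of the block — which is why a one-call, battle-tested IK routine was the pragmatic next move.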

MuJoCo Multi-Arm Demo
Multi-arm coordination attempt with MuJoCo before the PyBullet pivot
ok

Pivoted to PyBullet — built-in IK + Panda arms

PyBullet's calculateInverseKinematics works out of the box with pre-built Franka Panda arm URDFs. Two arms facing each other, colored blocks on table, target box zone. Claude Code picks blocks via MCP.
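A minimal sketch of that one-call IK with the Panda URDF bundled in pybullet_data. The end-effector link index, poses, and the mirror helper are our illustrative assumptions, not the experiment's exact values:

```python
def mirror(pos):
    """Target for the facing arm: mirror x across the table midline (assumed at x=0)."""
    x, y, z = pos
    return (-x, y, z)

def pick_demo(block_pos=(0.4, 0.0, 0.05)):
    # PyBullet imported lazily; franka_panda/panda.urdf ships with pybullet_data.
    import pybullet as p
    import pybullet_data
    p.connect(p.DIRECT)  # headless; use p.GUI to watch
    p.setAdditionalSearchPath(pybullet_data.getDataPath())
    p.setGravity(0, 0, -9.81)
    arm = p.loadURDF("franka_panda/panda.urdf",
                     basePosition=[-0.6, 0, 0], useFixedBase=True)
    EE_LINK = 11  # commonly used Panda end-effector link index
    # Built-in IK: one call from Cartesian target to joint angles.
    joints = p.calculateInverseKinematics(arm, EE_LINK, block_pos)
    for j, q in enumerate(joints[:7]):  # 7 arm joints; the rest drive the fingers
        p.setJointMotorControl2(arm, j, p.POSITION_CONTROL, targetPosition=q)
    for _ in range(240):  # let the position controller converge (~1 s of sim time)
        p.stepSimulation()
    p.disconnect()
```

The second arm runs the same call against `mirror(block_pos)`, which is what makes the facing-arms handoff geometry cheap to express.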

partial

Pick works, placement in box still tuning

Arm successfully picks blocks. Handoff and box placement need timing refinement — blocks slip during transfer. The coordination logic is correct; grip/release timing is the remaining challenge.

PyBullet Multi-Arm Demo
Block picking works; box placement is still being tuned in the PyBullet stage.

What the AI Discovered

The most striking output wasn't the robot motion — it was the reasoning. When Claude orchestrated multi-block sorting, it generated non-obvious optimization insights that a human engineer might not have considered in a first pass.

"Picking block_green second (not block_blue) because block_blue's cylindrical shape might roll and knock into block_green, toppling it. Removing the tall, unstable block_green first reduces risk of cascade failures."

Block ordering strategy — collision avoidance reasoning

"Shifting handoff to y=0.05 for the rectangular block. This gives Arm B a better approach angle — approaching from the y-axis aligns the gripper with the block's wider face (4cm) rather than the narrow face, creating a more stable grasp."

Shape-aware handoff positioning

"For the cylinder, raising handoff height to z=0.57. Higher handoff reduces the chance of the cylinder slipping during transfer — gravity assists the receiving gripper's closure."

Physics-aware transfer optimization

Final Status

Component | Status | Version | Key Insight
ROS2 Jazzy | working | Jazzy Jalisco | Use Jazzy for Ubuntu 24.04, not Humble
Gazebo + TurtleBot3 | working | Harmonic | Kill background gz processes between restarts
rosbridge | working | port 9090 | Single point connecting MCP to ROS2
ROS MCP Server | working | 3.0.1 | FastMCP + stdio transport to Claude Code
Claude Code | working | v2.1.81 | Native Linux installer, MCP client
MuJoCo (single arm) | working | 3.5.0 | launch_passive() for a thread-safe viewer
MuJoCo (two-arm) | partial | 3.5.0 | Scene loads, reasoning works, IK imprecise
PyBullet (two-arm) | partial | latest | Built-in IK works, handoff timing needs tuning
NVIDIA Isaac Sim | failed | 5.1.0 | Ubuntu 24.04 not supported, 8 GB VRAM insufficient

Constraints & Possibilities

Constraints Discovered

  • Ubuntu 24.04 is too new for most robotics packages — the ecosystem targets 22.04
  • RTX 2000 Ada (8GB VRAM) cannot run Isaac Sim reliably
  • No official Claude Desktop for Linux — CLI-only via Claude Code
  • MuJoCo's viewer is not thread-safe — requires launch_passive()
  • Custom IK solvers are unreliable — use simulators with built-in IK (PyBullet)
  • Isaac Sim requires exactly Python 3.11 — not 3.10, not 3.12
  • rosbridge is a single point of failure for the entire pipeline

What's Now Possible

  • Natural language robot control — "move forward, turn left, pick the red block"
  • Multi-robot orchestration through one LLM
  • AI reasoning about physical actions — not just executing, but explaining why
  • Any MCP-compatible LLM can control any ROS robot — zero code changes
  • Shape-aware and physics-aware manipulation strategies discovered by AI
  • Voice → LLM → MCP → Robot pipeline within reach
  • Add Blender MCP for 3D design + Home Assistant MCP for smart devices

Key Takeaways

1. The protocol layer is the breakthrough. MCP between the LLM and the simulator means any AI can control any robot. The intelligence and the hardware are fully decoupled. Today it's Claude; tomorrow it's any model.

2. Python bridges beat C++ packages for AI-driven robotics. When mujoco_ros2_control failed to compile, a 90-line Python script achieved the same result. When MuJoCo's IK was imprecise, PyBullet's built-in IK worked in one function call. Speed of iteration > speed of execution.

3. The reasoning log is more valuable than the motion. Anand's AlphaFold analogy applies: the environment isn't the breakthrough — what the AI discovers inside it is. When Claude explains "I pick the cube first because it's the most stable base for stacking," that's the moment a demo becomes a product.

4. Ubuntu 22.04 remains the safe bet. Every compatibility issue traced back to being on 24.04. For production robotics work, stay on 22.04 until the ecosystem catches up.

Two arms, three blocks, one language model.
The robot doesn't just move — it reasons.
HP ZBook Power G11
Machine
RTX 2000 Ada · 8GB
GPU
580.95.05
Driver
Ubuntu 24.04
OS
ROS2 Jazzy
Middleware