
SO-101 Arm | Seeed Studio

Over the last few years I have made everything from an autonomous bipedal robot to a time machine, but the one thing I always thought my workshop was lacking was a maniacal sentient robot with access to neurotoxins. Thankfully, Seeed Studio were able and willing to help me out…


Hugging Face is a popular platform that makes artificial intelligence easier for everyone to use. It hosts thousands of pre-trained models that can understand text, recognize images, and even work with audio and video. Instead of starting from scratch, you can grab a model from Hugging Face and use it in your own projects with just a few lines of code. It’s become a go-to place for beginners and experts alike who want to experiment with AI and share their work with a friendly, growing community.
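
To give a sense of how little code is involved, here is a minimal, illustrative example using the Transformers library from Hugging Face (the model is downloaded automatically on first run, and the test sentence is just a placeholder):

pip install "transformers[torch]"
python -c "from transformers import pipeline; print(pipeline('sentiment-analysis')('The SO-101 arm is great fun to build.'))"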

As an extension of this ethos, the SO-101 robotic arm is a great piece of hardware for bringing projects to life in the real world. It’s designed to be simple to set up and flexible enough to handle different tasks, like moving objects, interacting with tools, or responding to sensors. Because it’s both approachable and precise, the SO-101 makes it easy to explore robotics—even if you’re just starting out. When you combine an intelligent AI model from Hugging Face with the SO-101 arm, you open up exciting possibilities: a robot that can see, understand, and act on its surroundings.

I was recently approached by Seeed Studio and they very kindly offered to send me their version of the SO-101 arm so that I could try it out and see how well it worked.

Assembly

The build process is relatively simple considering the end result. Each step of the arm assembly is documented clearly, not only on the Hugging Face website but also within the Wiki for the Seeed build (see below for references).

Similarly, the servo calibration is straightforward, with a few notable steps:

  1. The LeRobot software should be installed on your computer. This involves a few steps but should be simple enough if you stick to the guide. Be warned: skipping a step may lead to some interesting errors (ask me how I know…)
  2. Each servo must be given an ID. The installed software has a script to manage this: you simply connect each servo (one at a time) and follow the instructions in the script. Just make sure you pay attention to the gear ratios for the leader arm.
  3. Once the servo IDs are configured, connect the servos in series to the two driver boards. Again, be careful to get the order correct, otherwise the software will address the wrong joints.
  4. Calibrate the range of motion. While the calibration command is running, move each arm through the full range of motion of every joint. This defines how far each servo is allowed to rotate and prevents damage to the servos and other components (see the example commands after this list).
  5. Run the teleoperation command. This will allow you to control the follower arm from the leader arm in real time.
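
For reference, the ID assignment and calibration steps above map onto LeRobot commands that look roughly like the following. Treat this as a sketch: module names and flags change between LeRobot releases, and the port and arm ID shown are simply the same example values used in the References section below.

python -m lerobot.setup_motors \
    --robot.type=so101_follower \
    --robot.port=/dev/ttyACM0

python -m lerobot.calibrate \
    --robot.type=so101_follower \
    --robot.port=/dev/ttyACM0 \
    --robot.id=my_awesome_follower_arm

The same two commands are then repeated for the leader arm, swapping the --robot.* arguments for --teleop.type=so101_leader, --teleop.port and --teleop.id.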

Machine Learning

The real power of this project lies not only in teleoperation, but also in utilising machine learning to give the SO-101 arm a sense of intelligence. Instead of being limited to following direct commands from a joystick or interface, the arm can learn patterns, adapt to new situations, and even make decisions based on what it sees or senses. This is where platforms like Hugging Face come into play, providing ready-to-use models that can process images, recognize objects, or understand instructions in natural language. By connecting these models to the SO-101, the arm can go beyond simple motion and start behaving in ways that feel purposeful and responsive.

For example, with computer vision models, the arm can detect objects on a table and decide how to pick them up without the user manually guiding every step. With reinforcement learning, the arm can practice tasks—like stacking blocks or sorting items—improving its performance over time just like a person would learn through repetition. Natural language models even make it possible to control the arm with plain speech, turning a complex robotic system into something that feels intuitive and interactive.
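
In practice, the most common way to teach the SO-101 a task with LeRobot is imitation learning: you record teleoperated demonstrations as a dataset, then train a policy (such as ACT) on that data and run it on the follower arm. Below is a hedged sketch of what the recording step can look like, reusing the arm IDs from the References section and a placeholder Hugging Face dataset name; camera configuration is omitted and the exact flags vary between LeRobot releases.

python -m lerobot.record \
    --robot.type=so101_follower \
    --robot.port=/dev/ttyACM0 \
    --robot.id=my_awesome_follower_arm \
    --teleop.type=so101_leader \
    --teleop.port=/dev/ttyACM1 \
    --teleop.id=my_awesome_leader_arm \
    --dataset.repo_id=<your_hf_username>/so101_pick_place \
    --dataset.num_episodes=50 \
    --dataset.single_task="Pick up the block"

Once a dataset exists, LeRobot's training script can fit a policy to it, and the resulting model can be pushed back to Hugging Face, closing the loop between the arm and the platform.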

This blend of robotics and machine learning creates opportunities for all kinds of creative projects. Imagine a home assistant that can help tidy up, a lab tool that can carefully handle experiments, or an educational robot that learns alongside students. Each of these becomes possible by combining the precise mechanics of the SO-101 arm with the flexible intelligence provided by machine learning. Rather than being a static piece of hardware, the arm transforms into a platform for exploring how AI can shape the way we interact with the physical world.

References

Generating GLaDOS voice samples from text

curl https://glados.c-net.org/generate?text=Hello%20Again -L --retry 30 --get --fail -o hello_again.wav

Setup and Run the SO-101 Arm after installation

Activate lerobot

conda activate lerobot

Find Port

python -m lerobot.find_port

Teleoperate

python -m lerobot.teleoperate \
    --robot.type=so101_follower \
    --robot.port=/dev/ttyACM0 \
    --robot.id=my_awesome_follower_arm \
    --teleop.type=so101_leader \
    --teleop.port=/dev/ttyACM1 \
    --teleop.id=my_awesome_leader_arm

This post is licensed under CC BY 4.0 by the author.