In this video, I share my experience from the past couple of weeks working with the SO-ARM100, a 3D-printable robotic arm from TheRobotStudio, together with Hugging Face's LeRobot library, which makes controlling the arm straightforward.
🔧 What you’ll see in this video:
– The process of 3D printing most of the arm’s parts at home 🖨️
– Setting up and using teleoperation mode, where the follower arm mirrors the leader arm’s movements 🕹️
– The real magic: ✨ using imitation learning to train a neural network so the robot can operate autonomously and complete real-world tasks (a rough sketch of the idea is below)!
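To make the imitation-learning part a bit more concrete, here is a minimal, self-contained sketch of the idea: a small network that maps the two camera images plus the current joint angles to target joint angles, trained to imitate the actions recorded while teleoperating with the leader arm. This is not LeRobot's actual API or training pipeline (the library ships its own policies and scripts); all class names, camera names, and shapes here are purely illustrative assumptions.

```python
# Conceptual behavior-cloning sketch, NOT LeRobot's API; names and shapes are illustrative.
import torch
import torch.nn as nn

class BCPolicy(nn.Module):
    """Maps two camera views + current joint angles to target joint angles."""
    def __init__(self, num_joints: int = 6):
        super().__init__()
        # Tiny CNN shared by both camera views (real setups use stronger encoders).
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, 5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Sequential(
            nn.Linear(32 * 2 + num_joints, 128), nn.ReLU(),
            nn.Linear(128, num_joints),
        )

    def forward(self, cam_top, cam_wrist, joints):
        feats = torch.cat(
            [self.encoder(cam_top), self.encoder(cam_wrist), joints], dim=-1
        )
        return self.head(feats)  # predicted target joint angles

def train_step(policy, optimizer, batch):
    """One behavior-cloning step: imitate the actions recorded during teleoperation."""
    pred = policy(batch["cam_top"], batch["cam_wrist"], batch["joints"])
    loss = nn.functional.mse_loss(pred, batch["leader_action"])  # follower imitates leader
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

if __name__ == "__main__":
    policy = BCPolicy()
    optimizer = torch.optim.Adam(policy.parameters(), lr=1e-4)
    # Dummy batch standing in for frames recorded while teleoperating the arm.
    batch = {
        "cam_top": torch.rand(8, 3, 96, 96),
        "cam_wrist": torch.rand(8, 3, 96, 96),
        "joints": torch.rand(8, 6),
        "leader_action": torch.rand(8, 6),
    }
    print("loss:", train_step(policy, optimizer, batch))
```

In practice, LeRobot provides ready-made imitation-learning policies and training scripts, so you don't write this loop yourself; the sketch just shows what "learning to copy the teleoperated demonstrations" means.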
Watch as I build the arm, teach it to pick up a toy and place it in a basket, and then see it in action after training, using video feeds from two cameras. The result? A robot capable of completing the task autonomously! 💪
If you’re into robotics, this project is definitely worth exploring. It’s a great AI-first, open-source robotics initiative, and I’m already planning to 3D-print another arm 🤩 for more complex experiments and integrations with my other projects.
💡 Resources:
The arm (SO-ARM100): https://github.com/TheRobotStudio/SO-ARM100
LeRobot library by Hugging Face: https://github.com/huggingface/lerobot