
At the Soft Robotics Lab at ETH Zurich, the scene looks less like a traditional engineering workshop and more like a child’s playroom mixed with a science museum.
Foam blocks, colorful toys and stuffed animals sit beside sensors, cables and robotic fingers.
This unusual mix has a serious purpose: teaching robots how to move, grip and adapt like living beings.
The lab is led by Professor Robert Katzschmann, who believes the future of robotics lies in copying nature. Instead of building robots from hard metal parts driven by motors, his team designs hands and bodies inspired by human and animal anatomy.
Their latest robotic hands use artificial tendons, much like those in human fingers, rather than motors inside each joint. This makes the hands softer, more flexible and better able to handle different objects.
Katzschmann’s goal is to create robots that are not stiff and fragile, but gentle and adaptable.
By combining soft materials with rigid structures, his team is building machines that can safely interact with the world around them, whether that means picking up a bottle, sorting packages or working alongside people.
A key part of this work is machine learning. In the past, robots were controlled by precise equations and detailed instructions written by engineers. That approach worked well on factory assembly lines, where tasks are repetitive and predictable.
But it breaks down in messy, real-world situations. Even something simple for humans, like placing different-shaped bottles into a crate, can confuse a traditional robot.
To solve this, Katzschmann’s team trains robots by example. Researchers wear special gloves with sensors and demonstrate tasks like grasping objects. Cameras record every movement, creating rich training data.
This data is used to train advanced machine-learning models, similar in structure to the systems behind modern AI chat tools. After training, the robotic hand can successfully grasp objects it has never seen before.
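In rough terms, learning from demonstrations boils down to fitting a model that maps what the robot senses to the hand motions a human demonstrated. The sketch below shows that idea in Python; the data format, network sizes and function names are illustrative assumptions, not the lab's actual software.

```python
# Illustrative sketch of learning a grasping policy from demonstrations
# (behavior cloning). Assumes a dataset of (observation, hand_pose) pairs
# recorded from sensor gloves and cameras; all names here are hypothetical.
import torch
import torch.nn as nn

class GraspPolicy(nn.Module):
    """Maps an observation vector (e.g. object and camera features)
    to target finger or tendon commands."""
    def __init__(self, obs_dim=64, act_dim=16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(obs_dim, 256), nn.ReLU(),
            nn.Linear(256, 256), nn.ReLU(),
            nn.Linear(256, act_dim),
        )

    def forward(self, obs):
        return self.net(obs)

def train_from_demonstrations(policy, demos, epochs=10, lr=1e-3):
    """Fit the policy to imitate recorded human demonstrations."""
    opt = torch.optim.Adam(policy.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        for obs, target_action in demos:      # batches of recorded pairs
            loss = loss_fn(policy(obs), target_action)
            opt.zero_grad()
            loss.backward()
            opt.step()
    return policy
```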
This research led to the creation of Mimic Robotics, a start-up founded in 2024. The company aims to bring these intelligent, flexible robotic hands into factories and logistics centers, where adaptability is increasingly important.
Another ETH Zurich researcher, Professor Stelian Coros, focuses on the “brain” of robots. He works on algorithms that help robots learn through experience. One method, called reinforcement learning, allows robots to improve by trial and error, receiving rewards when they perform a task well. Just as humans learn sports by practice rather than observation alone, robots must physically try actions to truly learn them.
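The core of trial-and-error learning is a simple loop: try an action, measure the reward, and keep the changes that improve it. The Python sketch below illustrates that loop with a toy one-dimensional task and a basic hill-climbing update, a deliberately simplified stand-in for the full reinforcement-learning systems such labs use; every name and number in it is a placeholder.

```python
# Toy sketch of learning by trial and error with a reward signal.
# The environment, policy and update rule are all simplified placeholders.
import numpy as np

class ReachTarget:
    """Toy 1-D task: push a point toward position 0 to earn reward."""
    def reset(self):
        self.x = np.random.uniform(-1.0, 1.0)
        return np.array([self.x])

    def step(self, action):
        self.x += 0.1 * float(np.clip(action, -1, 1))
        reward = -abs(self.x)               # closer to the target = higher reward
        done = abs(self.x) < 0.05
        return np.array([self.x]), reward, done

def run_episode(env, weights, max_steps=50):
    """Try a policy (here just a linear gain) and sum the reward it earns."""
    obs, total = env.reset(), 0.0
    for _ in range(max_steps):
        action = weights @ obs              # linear policy: action = w * x
        obs, reward, done = env.step(action)
        total += reward
        if done:
            break
    return total

# Trial and error: perturb the policy, keep changes that raise the reward.
# (A single noisy comparison; real systems average over many trials.)
env, weights = ReachTarget(), np.zeros(1)
for trial in range(200):
    candidate = weights + 0.1 * np.random.randn(1)
    if run_episode(env, candidate) > run_episode(env, weights):
        weights = candidate
```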
Coros and his team gather huge amounts of data using teleoperation, where a human remotely controls a robot, and motion-capture systems borrowed from the film industry. This helps robots move in ways that feel natural and human-like, which is essential if they are to work safely with people.
At the Robotics Systems Lab, led by Professor Marco Hutter, learning happens on an even larger scale. Thousands of virtual robots are trained at the same time in computer simulations. Thanks to powerful graphics processors, researchers can now generate massive amounts of training data in hours instead of years. Much of this training happens in the cloud, but some computing power is placed directly inside robots so they can still function when internet access is limited, such as during disaster response.
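The speed of this approach comes from applying the same physics update to thousands of simulated robots in a single array operation. The sketch below illustrates the pattern with plain NumPy standing in for a GPU-based simulator; the robot count, timestep and dynamics are placeholder values.

```python
# Toy illustration of simulating many robots in parallel: one vectorized
# physics update advances thousands of states at once, the pattern that
# GPU-based simulators exploit. All numbers here are placeholders.
import numpy as np

num_robots = 4096                      # thousands of virtual robots at once
dt = 0.01                              # simulation timestep in seconds

pos = np.zeros((num_robots, 3))        # one 3-D position per robot
vel = np.random.randn(num_robots, 3)   # random initial velocities

def step_all(pos, vel, actions, dt):
    """Advance every simulated robot by one timestep in a single call."""
    vel = vel + actions * dt           # apply each robot's commanded force
    pos = pos + vel * dt               # integrate motion for all robots
    return pos, vel

for _ in range(1000):                  # 1,000 steps of experience per robot
    actions = np.random.randn(num_robots, 3)   # stand-in for a learned policy
    pos, vel = step_all(pos, vel, actions, dt)
```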
Despite all this progress, researchers agree that AI alone is not enough. Robots are physical machines, not just software. Coros argues that combining learning with basic physics models is more efficient than relying on enormous amounts of data. Understanding how a ball flies through the air, for example, allows a robot to throw accurately without endless practice.
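A worked example makes the point: the standard projectile-range formula already tells a robot roughly how fast to throw, with no practice at all. The snippet below applies it; the numbers are illustrative and air resistance is ignored.

```python
# Sketch of using a basic physics model instead of trial and error: the
# projectile-range formula R = v^2 * sin(2*theta) / g gives the launch
# speed needed to throw an object a given distance. Values are examples.
import math

g = 9.81                                   # gravitational acceleration, m/s^2

def launch_speed(target_range, angle_deg=45.0):
    """Speed needed to throw an object `target_range` metres
    at the given launch angle, ignoring air resistance."""
    theta = math.radians(angle_deg)
    return math.sqrt(target_range * g / math.sin(2 * theta))

print(launch_speed(3.0))                   # about 5.4 m/s to throw 3 m at 45 degrees
```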
Back in Katzschmann’s lab, biologists and chemists are even experimenting with artificial muscles and tendon-like materials. Katzschmann believes that copying nature’s designs is essential for truly versatile robots. Muscles provide softness, bones provide strength, and together they create systems far more adaptable than traditional machines.
In the end, the future of robotics may depend not just on smarter algorithms, but on building bodies that learn, move and feel a little more like our own.


