Imagine robots that can watch us and then do exactly what we do, in real time. They could help with daily tasks around the house or at work without needing a lot of instructions beforehand.
However, making robots that can learn by watching humans hasn’t been easy. One big challenge is that robots and humans move differently because of how their bodies are built.
A team of researchers from U2IS, ENSTA Paris, has been working on a new way to help robots mimic human movements more closely. They've designed a program that copies human actions while taking the robot's different body into account. Their work is still in the early stages, but it's quite promising.
The researchers’ program treats copying human movements as a three-part process. First, it watches and interprets the human’s movements.
Then, it translates those movements into something the robot can do, taking the robot’s own way of moving into account. Finally, the robot uses these translated movements to figure out how to move itself.
For example, if you wave your hand, the program first identifies how your arm and hand are moving. Next, it calculates how a robot, with its different joints and limits, can make a similar waving motion. Finally, the robot uses this information to wave its own hand.
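To make these three steps concrete, here is a very simplified Python sketch of a watch-translate-move loop. The function names and the toy "clamp to joint limits" translation are illustrative assumptions, not the researchers' actual code; in their system the translation step is learned.

```python
# A very simplified sketch of the three-stage loop described above.
# The names and the toy "clamping" translation are illustrative assumptions,
# not the researchers' actual code.

def perceive(frame):
    """Stage 1: estimate how the human's body is moving.
    Here each frame is assumed to already contain estimated joint angles."""
    return frame["human_joint_angles"]

def retarget(human_angles, robot_joint_limits):
    """Stage 2: translate human joint angles into values the robot can reach.
    The real system learns this mapping; clamping to the robot's joint limits
    is just a stand-in to show where that translation happens."""
    return [max(low, min(high, angle))
            for angle, (low, high) in zip(human_angles, robot_joint_limits)]

def execute(robot, robot_angles):
    """Stage 3: command the robot to move to the translated pose."""
    robot.move_to(robot_angles)  # assumes the robot object exposes a move_to call

def imitate(robot, video_stream, robot_joint_limits):
    """Watch, translate, and move, repeated for every frame."""
    for frame in video_stream:
        human_angles = perceive(frame)
        robot_angles = retarget(human_angles, robot_joint_limits)
        execute(robot, robot_angles)
```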
Creating this kind of program is tricky because robots and humans are built differently, and there is very little data that directly pairs a human movement with the matching robot movement.
The researchers have been experimenting with a type of artificial intelligence that teaches itself to translate human movements into robot movements without ever seeing such matched pairs, as sketched in the example below.
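To give a sense of how a system can learn this translation without matched human-robot examples, here is a generic sketch of one common approach: two small neural networks that encode human and robot poses into a shared space and are trained only to reconstruct their own data. This is an illustration of the general idea under our own assumptions, not the team's exact architecture or losses.

```python
# A generic sketch of learning a human-to-robot translation without paired
# examples: two autoencoders that share a latent space (PyTorch). This shows
# the general idea only, not the paper's exact architecture.

import torch.nn as nn
import torch.nn.functional as F

def mlp(in_dim, out_dim, hidden=128):
    return nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU(),
                         nn.Linear(hidden, out_dim))

class SharedLatentTranslator(nn.Module):
    def __init__(self, human_dim, robot_dim, latent_dim=32):
        super().__init__()
        self.enc_human = mlp(human_dim, latent_dim)   # human pose -> shared space
        self.dec_human = mlp(latent_dim, human_dim)   # shared space -> human pose
        self.enc_robot = mlp(robot_dim, latent_dim)   # robot pose -> shared space
        self.dec_robot = mlp(latent_dim, robot_dim)   # shared space -> robot pose

    def human_to_robot(self, human_pose):
        # Translate by encoding a human pose and decoding it as a robot pose.
        return self.dec_robot(self.enc_human(human_pose))

    def reconstruction_loss(self, human_batch, robot_batch):
        # Self-supervised training signal: each domain only has to reconstruct
        # itself through the shared space, so no human-robot pairs are needed.
        human_rec = self.dec_human(self.enc_human(human_batch))
        robot_rec = self.dec_robot(self.enc_robot(robot_batch))
        return F.mse_loss(human_rec, human_batch) + F.mse_loss(robot_rec, robot_batch)

# Usage sketch (dimensions are hypothetical):
# model = SharedLatentTranslator(human_dim=17 * 3, robot_dim=7)
# loss = model.reconstruction_loss(human_batch, robot_batch)
# loss.backward()
```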
Unfortunately, the program isn’t perfect yet. In tests, it didn’t perform as well as the researchers hoped. This means that while the idea is good, the technology isn’t ready to be used in real-life robots just yet.
The team isn’t giving up, though. They plan to do more experiments to figure out what’s not working and how to fix it. They believe that with some adjustments, their program can get better at helping robots imitate human movements accurately.
Their research suggests that even though it’s challenging, using this kind of artificial intelligence could eventually make it possible for robots to learn from watching us. This could open up new possibilities for robots to assist in our daily lives.
The team is hopeful that with further work, they can overcome the current limitations and bring us closer to having robots that can learn by imitation.
The research findings can be found on arXiv.
Copyright © 2024 Knowridge Science Report. All rights reserved.