Imagine a robot saying, “I’ll have you eating out of the palm of my hand.”
It seems unlikely, if only because most robots don't even have palms. Creating robots that grip and grasp objects as deftly as humans do has been a long-standing challenge.
Now, researchers at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) have made a breakthrough with a new robotic hand design.
This design, detailed in a paper published on the arXiv preprint server, includes a sophisticated robotic palm called GelPalm.
GelPalm is designed to mimic the soft, flexible nature of human hands.
It has a gel-based sensor embedded in the palm: red, green, and blue LEDs illuminate an object pressed against the gel from different directions, and a camera captures the reflections to reconstruct a detailed 3D model of the object's surface, allowing the robot to handle objects with delicate precision.
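The multi-color illumination scheme described above is the classic setup for photometric-stereo tactile sensing: each color channel acts as a separate light source, so per-pixel brightness in the three channels constrains the surface normal, and the normals can then be integrated into a height map. Below is a minimal sketch of that idea; the LED directions, the Lambertian-reflectance assumption, and the crude cumulative-sum integration are illustrative assumptions, not details from the paper.

```python
import numpy as np

# Hypothetical unit direction vectors for the red, green, and blue LEDs.
# A real sensor would calibrate these; the values here are assumptions.
LIGHTS = np.array([
    [ 0.8,  0.0, 0.6],   # red LED
    [-0.4,  0.7, 0.6],   # green LED
    [-0.4, -0.7, 0.6],   # blue LED
])

def normals_from_rgb(image):
    """Estimate per-pixel surface normals from an (H, W, 3) RGB image
    of the gel, assuming Lambertian shading: I_c = L_c . n for each
    color channel c (standard photometric stereo)."""
    h, w, _ = image.shape
    intensities = image.reshape(-1, 3).T           # shape (3, H*W)
    # Solve LIGHTS @ n = I for every pixel at once (least squares).
    n, *_ = np.linalg.lstsq(LIGHTS, intensities, rcond=None)
    n = n.T.reshape(h, w, 3)
    norm = np.linalg.norm(n, axis=2, keepdims=True)
    return n / np.clip(norm, 1e-8, None)           # unit normals

def depth_from_normals(normals):
    """Integrate normals into a rough height map by cumulatively
    summing the surface gradients (a crude stand-in for proper
    Poisson/Frankot-Chellappa integration)."""
    nz = np.clip(normals[..., 2], 1e-3, None)
    gx = -normals[..., 0] / nz                     # dz/dx
    gy = -normals[..., 1] / nz                     # dz/dy
    return np.cumsum(gx, axis=1) + np.cumsum(gy, axis=0)
```

For a flat, untouched gel the recovered normals all point straight out of the surface and the height map is constant; contact with an object perturbs the per-channel shading and shows up as local relief.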
To complement the palm, the team also developed robotic fingers named ROMEO (RObotic Modular Endoskeleton Optical).
These fingers are made from flexible materials and use sensing technology similar to the palm's. ROMEO fingers have "passive compliance," meaning they naturally adjust to applied forces without needing extra motors or controls.
This helps increase the surface area in contact with objects, allowing the robot to grip and hold items more securely. These fingers are created using cost-effective 3D printing.
GelPalm not only improves the robot’s dexterity but also makes interactions with objects safer.
This is especially useful for applications like human-robot collaboration, prosthetics, and biomedical uses where a gentle touch is essential.
Most previous robotic designs have focused on making fingers more dexterous.
However, Sandra Q. Liu, the lead designer of GelPalm and a recent MIT Ph.D. graduate, shifted the focus to the palm. Liu and her team drew inspiration from human hands, which combine rigid bones with soft, compliant tissue.
By merging rigid structures with flexible materials, they aimed to replicate the adaptive abilities of human hands.
One of the major advantages is that GelPalm doesn’t require extra motors to change shape—the natural compliance of the materials allows it to conform around objects automatically, just like human palms do.
The team tested the palm design extensively. Liu compared the tactile sensing performance of blue LEDs and white LEDs integrated into the ROMEO fingers. Both systems produced high-quality 3D tactile reconstructions.
The crucial experiment was to see how well different palm configurations could grasp objects.
The team pressed plastic shapes covered in paint against four palm types: rigid, structurally compliant, gel compliant, and their dual compliant design. The results showed that the dual compliant design, combining structural and material flexibility, provided significantly better grip.
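The paint-imprint comparison above amounts to measuring how much of each palm's surface actually touched the object. One simple way to quantify such an imprint is to threshold a grayscale image of it and compute the covered fraction; this sketch is an illustration of that kind of measurement, not the paper's actual analysis, and the threshold value is an assumption.

```python
import numpy as np

def contact_area_fraction(imprint, threshold=0.5):
    """Return the fraction of pixels covered by paint in a 2D
    grayscale imprint image with values in [0, 1]. Pixels brighter
    than `threshold` count as contact; the threshold is an assumed
    parameter, not a value from the study."""
    contact = imprint > threshold
    return contact.mean()

def rank_palms(imprints, threshold=0.5):
    """Given a dict of palm-name -> imprint image, return the palm
    names sorted from largest to smallest contact area."""
    scores = {name: contact_area_fraction(img, threshold)
              for name, img in imprints.items()}
    return sorted(scores, key=scores.get, reverse=True)
```

With imprints from the four palm types (rigid, structurally compliant, gel compliant, dual compliant), `rank_palms` would order them by contact area, the metric under which the dual compliant design performed best.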
However, there are challenges. Integrating enough sensory technology into the palm without making it bulky or complex is difficult. The use of camera-based tactile sensors introduces size and flexibility issues.
The team suggests developing more flexible materials for mirrors and enhancing sensor integration to maintain functionality without compromising usability.
“This work is remarkable because it introduces a purposefully designed, useful palm that combines articulation and sensing,” said Matei Ciocarlie, an associate professor at Columbia University who was not involved in the study.
“Most robot palms lack either feature. This is a significant innovation.”
Liu hopes this development will lead to more advanced robotic hands that blend soft and rigid elements with tactile sensitivity in the next five to ten years. She emphasized the importance of making this technology low-cost and easy to manufacture to encourage widespread innovation.
Ted Adelson, the John and Dorothy Wilson Professor of Vision Science at MIT and a CSAIL member, is the senior author of the paper. Liu’s dream is that sharing this knowledge will inspire further advancements in the field, ultimately leading to robotic hands that interact with the world as skillfully as human hands do.