Blossom robots can be constructed by users from handcrafted materials, making each one a little bit different.
A few years ago, when social robots began appearing in stores and homes, Guy Hoffman wondered why they all looked so much alike.
“I noticed a lot of them had a very similar kind of feature – white and plasticky, designed like consumer electronic devices,” said Hoffman, assistant professor and the Mills Family Faculty Fellow in the Sibley School of Mechanical and Aerospace Engineering.
“Especially when these social robots were marketed to be part of our families, I thought it would be strange to all have identical family members.”
He envisioned robots built from warmer, homier materials, such as wood and wool; he also imagined robots that could be customized by their owners, so each would be unique.
A friend gave him crocheted models of his robots, and he thought: What if the robot itself were crocheted? So he learned to crochet.
Then he watched another friend crochet part of the robot far faster than he could. “That made me think people who are not engineers could also participate in making a robot,” he said.
These ideas led Hoffman to create Blossom – a simple, expressive, inexpensive robot platform that could be made from a kit and creatively outfitted with handcrafted materials.
“We wanted to empower people to build their own robot, but without sacrificing how expressive it is,” said Hoffman, senior author of “Blossom: A Handcrafted Open-Source Robot,” published in March in the Association for Computing Machinery Transactions on Human-Robot Interaction.
“Also, it’s nice to have every robot be a little bit different. If you knit your robot, every family would have their own robot that would be unique to them.”
Blossom’s mechanical design – developed with Michael Suguitan, a doctoral student in Hoffman’s lab and first author of the paper – is centered on a floating “head” platform using strings and cables for movement, making its gestures more flexible and organic than those of a robot composed of rigid parts.
Blossom can be controlled by moving a smartphone using an open-source puppeteering app; the robot’s movements resemble bouncing, stretching and dancing.
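To give a sense of how that puppeteering might translate into motion: the phone reports its orientation, and each cable motor in the floating head platform takes its target from a blend of those angles. The sketch below is purely illustrative, assuming a hypothetical four-motor layout and made-up motor names rather than Blossom's actual open-source code:

```python
# Purely illustrative sketch: the phone streams its orientation and
# each cable motor in the floating head platform is set from a blend
# of pitch, roll and yaw. The four-motor layout and all names here
# are assumptions, not Blossom's actual software.

MOTOR_RANGE = (-45.0, 45.0)  # assumed per-motor travel, in degrees

def clamp(value, lo, hi):
    return max(lo, min(hi, value))

def head_pose_to_motors(pitch, roll, yaw):
    """Map phone orientation (degrees) to four cable-motor angles.

    Tilting the phone forward/back leans the head (pitch), tilting it
    side to side rolls the head, and twisting it turns the whole
    platform (a shared yaw offset on every cable).
    """
    targets = {
        "tower_front": pitch + yaw,   # front/back cables oppose on pitch
        "tower_back": -pitch + yaw,
        "tower_left": roll + yaw,     # left/right cables oppose on roll
        "tower_right": -roll + yaw,
    }
    return {name: clamp(angle, *MOTOR_RANGE) for name, angle in targets.items()}

# Phone tilted 20 degrees forward and 10 degrees to the right:
print(head_pose_to_motors(pitch=20.0, roll=10.0, yaw=0.0))
```

Because the head floats on compliant cables rather than rigid joints, even this simple blending yields the organic, bouncing quality the designers describe.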
The cost of the parts needed to assemble a Blossom is less than $250, and researchers are currently working on a Blossom kit made entirely of cardboard, which would be even cheaper.
Partly because of its simplicity, Blossom has a variety of potential uses, Hoffman said. Human-robot interaction researchers who aren’t engineers could build their own from a kit to use in studies. And because the robot is easy to interact with and to help build, it could be used to teach children about robotics.
In a case study, children ages 4-8 had a chance to control and make accessories for Blossom at a science fair.
Some children created accessories, such as appendages or jewelry, while others controlled the robot so the new items could be attached, illustrating how Blossom could inspire collaboration.
“The children also had additional expectations of the robot’s movement, such as making it locomote and jump. These expectations were emphasized by the fact that several children chose to make appendages such as legs and wings,” the authors wrote.
In the coming months, Blossom will be used by the Upper Grand school district in Ontario, Canada, to help teach math to fourth-graders, Hoffman said.
He said his team also has been working on an algorithm to make Blossom react to YouTube videos – performing a particular dance in response to a particular song, for instance. The work builds on previous research showing that a robot’s reaction while listening to a song can influence a human’s own reaction to it.
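As a rough illustration of how such a music-reactive behavior might be wired up (not the team’s actual algorithm), one could estimate a song’s tempo and loop a gesture on the beat. Here librosa’s beat tracker is a real library call, while play_gesture() and the gesture names are hypothetical placeholders:

```python
# Rough sketch of a music-reactive behavior: estimate the track's
# tempo with librosa (a real audio library), then loop a gesture at
# roughly one move per detected beat. play_gesture() and the gesture
# names are hypothetical stand-ins, not the team's actual algorithm.
import time

import librosa

def play_gesture(name):
    # Stand-in for a command that would drive the robot's motors.
    print(f"performing gesture: {name}")

def dance_to_track(audio_path):
    y, sr = librosa.load(audio_path)
    tempo, beats = librosa.beat.beat_track(y=y, sr=sr)
    period = 60.0 / float(tempo)  # seconds per beat

    # Faster songs get a bouncier move; slower ones a gentle sway.
    gesture = "bounce" if float(tempo) > 120 else "sway"
    for _ in beats:               # one gesture per detected beat
        play_gesture(gesture)
        time.sleep(period)

# dance_to_track("song.mp3")  # any local audio file
```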
This might be particularly useful in modeling behavior for children with autism, Hoffman said.
“It’s meant to be a flexible kit that is also very low cost. Especially if we can make it out of cardboard, you could make it very inexpensively,” he said. “Because of computation becoming so powerful, it could be a really open-ended way for people to do whatever they want with robotics.”
The work was partly supported by a grant from Google Creative Robotics.
Written by Melanie Lefkowitz.
DOI: https://doi.org/10.1145/3310356