
A new study suggests that something as simple as adding eyes to a humanoid robot can dramatically change how people perceive it.
Researchers from Tampere University in Finland and the University of Bremen in Germany found that robots with eyes are more likely to be seen as having a mind, emotions, and intentions compared with robots that lack them.
Humans naturally use eyes and gaze to understand others. Eye contact helps us detect attention, interpret feelings, and judge whether someone is thinking or aware.
Because of this, eyes play a powerful role in social interaction.
The researchers wanted to know whether the same effect applies when people look at robots.
The team focused on what scientists call “mind perception,” which refers to the tendency to believe that another being has thoughts and feelings.
This perception has two main parts. The first is agency, which includes abilities such as thinking, planning, and controlling actions. The second is experience, which refers to the ability to feel emotions like pain, joy, or fear.
To test how eyes influence these perceptions, the researchers used artificial intelligence to create many realistic images of humanoid robots.
Each robot was produced in two versions: one with eyes and one without. Some robots had childlike features while others looked more adult, and the eyes were either displayed on a screen or built into the robot’s face.
Participants in the study were shown these images and asked to judge how much each robot seemed capable of thinking and feeling.
Across the board, robots with eyes were rated higher on both agency and experience. In other words, people judged the robots with eyes to be more intelligent, more self-aware, and more capable of emotion.
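The comparison described above amounts to rating each robot on the two dimensions of mind perception and contrasting the eyes and no-eyes conditions. As a minimal illustration only, the sketch below uses entirely hypothetical rating values on a 1-to-7 scale (the study's actual data and analysis are not reproduced here):

```python
# Illustrative sketch of a two-condition mind-perception comparison.
# All rating values are hypothetical, not the study's data.
from statistics import mean

# Hypothetical 1-7 ratings from three raters per condition and dimension.
ratings = {
    "eyes":    {"agency": [5.1, 4.8, 5.4], "experience": [4.2, 4.5, 4.0]},
    "no_eyes": {"agency": [3.9, 4.1, 3.6], "experience": [2.8, 3.1, 2.9]},
}

def condition_means(data):
    """Mean rating per dimension for each condition."""
    return {cond: {dim: mean(vals) for dim, vals in dims.items()}
            for cond, dims in data.items()}

means = condition_means(ratings)
for dim in ("agency", "experience"):
    diff = means["eyes"][dim] - means["no_eyes"][dim]
    print(f"{dim}: eyes minus no-eyes difference = {diff:+.2f}")
```

With these made-up numbers, both differences come out positive, mirroring the pattern the article describes: eyed robots score higher on agency and on experience.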
The researchers also conducted a second experiment that did not rely on deliberate, self-reported judgments. Instead, it measured reactions that happen automatically and quickly, before people have time to reflect. Even at this early stage of processing, the presence of eyes made people perceive the robots as more mind-like.
This suggests that the effect of eyes operates deep within our brain’s social systems.
The findings have important implications for the design of future robots. As humanoid machines become more common in homes, workplaces, and healthcare settings, how people perceive these machines will influence how comfortable people feel interacting with them. Robots that seem more human-like may encourage trust, cooperation, and empathy.
However, the research also raises ethical questions. If people start to treat robots as if they have feelings or awareness, designers must consider how these perceptions affect human behavior and decision-making.
The study shows that eyes are not just a cosmetic feature. They are a powerful signal that can shape how we interpret a robot’s intelligence, emotions, and moral status.
As technology continues to blur the line between machines and living beings, something as small as a pair of eyes may determine whether a robot feels like a tool—or something closer to a companion.