Robots can read human emotions in real-time

In a new study, researchers improved artificial intelligence (AI) used in interactive video games and demonstrated that robots can read human emotions in real time.

The study was done by researchers at Case Western Reserve University.

The team suggests that robots can tell what humans are feeling and thinking just by “looking” at their faces.

The new finding may help develop more sensitive machines to detect changes in a person’s emotional health or mental state.

In the study, the team developed new technology that can correctly identify human emotions from facial expressions almost instantly.

This is much faster than previous results from other groups.

Recognizing emotions quickly is an important part of communication, and even a slight delay can feel awkward.

It’s hard enough for humans to figure out what someone feels based solely on their facial expressions or body language. It is even harder for robots to do it.

In the study, the team combined two pre-processing video filters with another pair of existing programs to reduce the response time.

This tech helps the robot identify emotions based on more than 3,500 variations in human facial expression.
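
The study does not name the specific filters, so the following is only a plausible sketch of that kind of pre-processing stage, written in Python with the open-source OpenCV library; the Haar-cascade face detector, histogram equalization, and 48x48 output size are illustrative assumptions, not the team’s actual pipeline.

```python
# Illustrative sketch only: the paper does not specify its filters, so this
# uses common OpenCV steps (face cropping, grayscale conversion, histogram
# equalization, resizing) as stand-ins for the pre-processing stage.
import cv2

# Haar-cascade face detector shipped with opencv-python
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def preprocess_frame(frame):
    """Crop the largest detected face and normalize it for a classifier."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None  # no face in this frame
    # Keep only the largest face (w * h) to cut per-frame work
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])
    face = gray[y:y + h, x:x + w]
    face = cv2.equalizeHist(face)      # filter 1: normalize lighting
    face = cv2.resize(face, (48, 48))  # filter 2: fixed input size
    return face
```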

They also used deep learning to process the vast amount of information once those data were entered into the software and classified.

And because most human expressions can be readily sorted into seven categories (happiness, anger, sadness, disgust, surprise, fear, and neutral), robots can identify emotions reliably.
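
To make the seven-way classification concrete, here is a minimal deep-learning sketch in PyTorch; the layer sizes and 48x48 input resolution are assumptions chosen to match the pre-processing sketch above, since the study’s actual network is not described here.

```python
# Minimal sketch of a seven-class emotion classifier; the study's real
# architecture is not published here, so all layer sizes are assumptions.
import torch
import torch.nn as nn

EMOTIONS = ["happiness", "anger", "sadness", "disgust",
            "surprise", "fear", "neutral"]

class EmotionCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),  # 48x48 -> 24x24
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),  # 24x24 -> 12x12
        )
        self.classifier = nn.Linear(32 * 12 * 12, len(EMOTIONS))

    def forward(self, x):
        x = self.features(x)
        x = x.flatten(start_dim=1)
        return self.classifier(x)  # logits over the seven categories

# Usage: classify one pre-processed 48x48 face crop
model = EmotionCNN().eval()
face = torch.rand(1, 1, 48, 48)  # stand-in for a real pre-processed frame
with torch.no_grad():
    label = EMOTIONS[model(face).argmax(dim=1).item()]
print(label)
```

Keeping the output to just seven classes is part of what makes fast, reliable recognition tractable despite the huge variety of raw expressions.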

The team suggests that this categorization helps robots process facial expressions, because humans can produce more than 10,000 distinct expressions and each person has a unique way of showing emotion. Social background and culture can influence that, too.

Now the team is working on another machine-learning-based approach to facial emotion recognition.

They hope their findings can help develop a personal robot that accurately notices significant changes in a person through daily interaction.

This may help detect mental illnesses such as depression early.

The leader of the study is Kiju Lee, the Nord Distinguished Assistant Professor in mechanical and aerospace engineering at the Case School of Engineering.

The study was presented at the 2018 IEEE Games, Entertainment, and Media Conference.

Copyright © 2019 Knowridge Science Report. All rights reserved.