Robots Reading Feelings
April 26, 2019 | Case Western Reserve University
Estimated reading time: 3 minutes

Researchers boast 98% accuracy for robots recognizing facial cues; could improve video gaming today, health care tomorrow.
Robots are getting smarter—and faster—at knowing what humans are feeling and thinking just by “looking” into their faces, a development that might one day allow more emotionally perceptive machines to detect changes in a person’s health or mental state.
Researchers at Case Western Reserve University say they're improving the artificial intelligence (AI) that now powers interactive video games and will soon enhance the next generation of personalized robots likely to coexist alongside humans.
And the Case Western Reserve robots are doing it in real time.
New machines developed by Kiju Lee, the Nord Distinguished Assistant Professor in mechanical and aerospace engineering at the Case School of Engineering, and graduate student Xiao Liu, are correctly identifying human emotions from facial expressions 98% of the time—almost instantly. Other researchers had achieved similar accuracy, but their robots often responded too slowly.
“Even a three-second pause can be awkward,” Lee said. “It’s hard enough for humans—and even harder for robots—to figure out what someone feels based solely on their facial expressions or body language. All the layers and layers of technology—including video capture—to do this also unfortunately slows down the response.”
Lee and Liu accelerated the response time by combining two pre-processing video filters with another pair of existing programs to help the robot classify emotions based on more than 3,500 variations in human facial expression.
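The article doesn't detail the filters or classifier, but the two-stage idea—cheap pre-processing on each frame, then a classifier on the reduced data—can be sketched as below. The filter and classifier internals here are illustrative placeholders, not the CWRU implementation.

```python
# Hypothetical sketch: lightweight pre-processing filters run on each
# video frame before a (stand-in) classifier assigns an emotion label.

EMOTIONS = ["neutral", "happiness", "anger", "sadness",
            "disgust", "surprise", "fear"]

def normalize_brightness(frame):
    """Scale pixel intensities into the 0-1 range (illustrative filter #1)."""
    lo, hi = min(frame), max(frame)
    if hi == lo:
        return [0.0 for _ in frame]
    return [(p - lo) / (hi - lo) for p in frame]

def downsample(frame, factor=2):
    """Keep every `factor`-th pixel to cut classifier workload (filter #2)."""
    return frame[::factor]

def classify(features, labels):
    """Placeholder classifier: picks a label from the strongest feature.
    A trained deep-learning model would go here in a real system."""
    best = max(range(len(features)), key=lambda i: features[i])
    return labels[best % len(labels)]

raw_frame = [12, 200, 34, 90, 250, 18, 77, 140]   # fake 8-pixel "frame"
reduced = downsample(normalize_brightness(raw_frame))
print(classify(reduced, EMOTIONS))  # -> "anger"
```

The point of the sketch is the ordering: by shrinking and normalizing the frame first, the expensive classification step sees far less data, which is how pre-processing can cut response time.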
But that’s hardly the extent of our facial variation: Humans can register more than 10,000 expressions, and each person also has a unique way of revealing many of those emotions, Lee said.
But “deep-learning” computers can process vast amounts of information once those data are entered into the software and classified.
And, thankfully, the most common expressive features among humans are easily divided into seven emotions: neutral, happiness, anger, sadness, disgust, surprise and fear—even accounting for variations among different backgrounds and cultures.
Photo caption: Lee presents to a crowd at the opening of the new Smart Living Lab at Ohio Living Breckenridge Village in Willoughby, Ohio.
This recent work by Lee and Liu, unveiled at the 2018 IEEE Games, Entertainment, and Media Conference, could lead to a host of applications when combined with advances by dozens of other researchers in the AI field, Lee said.
The two are also now working on another machine-learning-based approach for facial emotion recognition, which so far has achieved over 99% accuracy with even higher computational efficiency.
Someday, a personal robot may be able to accurately notice significant changes in a person through daily interaction—even to the point of detecting early signs of depression, for example.
“The robot could be programmed to catch it early and help with simple interventions, like music and video, for people in need of social therapies,” Lee said. “This could be very helpful for older adults who might be suffering from depression or personality changes associated with aging.”
Lee is planning to explore the potential use of social robots for social and emotional intervention in older adults through collaboration with Ohio Living Breckenridge Village. Senior residents there are expected to interact with a user-friendly, socially interactive robot and help test accuracy and reliability of the embedded algorithms.
Another Future Possibility
A social robot that learns the subtler facial changes in someone on the autism spectrum—and helps “teach” humans to accurately recognize emotions in each other.
“These social robots will take some time to catch on in the U.S.,” Lee said. “But in places like Japan, where there is a strong culture around robots, this is already beginning to happen. In any case, our future will be side-by-side with emotionally intelligent robots.”