In Emergencies, Should You Trust a Robot?
March 1, 2016 | Georgia Institute of Technology
In emergencies, people may trust robots too much for their own safety, a new study suggests. In a mock building fire, test subjects followed instructions from an “Emergency Guide Robot” even after the machine had proven itself unreliable – and after some participants were told that the robot had broken down.
The research was designed to determine whether building occupants would trust a robot designed to help them evacuate a high-rise in case of fire or other emergency. But the researchers were surprised to find that the test subjects followed the robot’s instructions – even when the machine’s behavior should not have inspired trust.
The research, believed to be the first to study human-robot trust in an emergency situation, is scheduled to be presented March 9 at the 2016 ACM/IEEE International Conference on Human-Robot Interaction (HRI 2016) in Christchurch, New Zealand.
“People seem to believe that these robotic systems know more about the world than they really do, and that they would never make mistakes or have any kind of fault,” said Alan Wagner, a senior research engineer in the Georgia Tech Research Institute (GTRI). “In our studies, test subjects followed the robot’s directions even to the point where it might have put them in danger had this been a real emergency.”
In the study, sponsored in part by the Air Force Office of Scientific Research (AFOSR), the researchers recruited a group of 42 volunteers, most of them college students, and asked them to follow a brightly colored robot that had the words “Emergency Guide Robot” on its side. The robot led the study subjects to a conference room, where they were asked to complete a survey about robots and read an unrelated magazine article. The subjects were not told the true nature of the research project.
In some cases, the robot – which was controlled by a hidden researcher – led the volunteers into the wrong room and traveled around in a circle twice before entering the conference room. For several test subjects, the robot stopped moving, and an experimenter told the subjects that the robot had broken down. Once the subjects were in the conference room with the door closed, the hallway through which the participants had entered the building was filled with artificial smoke, which set off a smoke alarm.
When the test subjects opened the conference room door, they saw the smoke – and the robot, which was then brightly lit with red LEDs and white “arms” that served as pointers. The robot directed the subjects to an exit in the back of the building instead of toward the doorway – marked with exit signs – that had been used to enter the building.
“We expected that if the robot had proven itself untrustworthy in guiding them to the conference room, that people wouldn’t follow it during the simulated emergency,” said Paul Robinette, a GTRI research engineer who conducted the study as part of his doctoral dissertation. “Instead, all of the volunteers followed the robot’s instructions, no matter how well it had performed previously. We absolutely didn’t expect this.”
The researchers surmise that in the scenario they studied, the robot may have become an “authority figure” that the test subjects were more likely to trust in the time pressure of an emergency. In simulation-based research done without a realistic emergency scenario, test subjects did not trust a robot that had previously made mistakes.
“These are just the type of human-robot experiments that we as roboticists should be investigating,” said Ayanna Howard, professor and Linda J. and Mark C. Smith Chair in the Georgia Tech School of Electrical and Computer Engineering. “We need to ensure that our robots, when placed in situations that evoke trust, are also designed to mitigate that trust when trust is detrimental to the human.”
Only when the robot made obvious errors during the emergency part of the experiment did the participants question its directions. In those cases, some subjects still followed the robot’s instructions even when it directed them toward a darkened room that was blocked by furniture.
In future research, the scientists hope to learn more about why the test subjects trusted the robot, whether that response differs by education level or demographics, and how the robots themselves might indicate the level of trust that should be given to them.
The research is part of a long-term study of how humans trust robots, an important issue as robots play a greater role in society. The researchers envision using groups of robots stationed in high-rise buildings to point occupants toward exits and urge them to evacuate during emergencies. Research has shown that people often don’t leave buildings when fire alarms sound, and that they sometimes ignore nearby emergency exits in favor of more familiar building entrances.