Muscle Signals Can Pilot a Robot
April 28, 2020 | MIT News Office
Estimated reading time: 4 minutes
CSAIL's Conduct-A-Bot system uses muscle signals to cue a drone’s movement, enabling more natural human-robot communication.
Albert Einstein famously postulated that “the only real valuable thing is intuition,” arguably one of the most important keys to understanding intention and communication.
But intuitiveness is hard to teach — especially to a machine. Looking to improve this, a team from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) came up with a method that dials us closer to more seamless human-robot collaboration. The system, called “Conduct-A-Bot,” uses human muscle signals from wearable sensors to pilot a robot’s movement.
“We envision a world in which machines help people with cognitive and physical work, and to do so, they adapt to people rather than the other way around,” says Professor Daniela Rus, director of CSAIL, deputy dean of research for the MIT Stephen A. Schwarzman College of Computing, and co-author on a paper about the system.
To enable seamless teamwork between people and machines, electromyography and motion sensors are worn on the biceps, triceps, and forearms to measure muscle signals and movement. Algorithms then process the signals to detect gestures in real time, without any offline calibration or per-user training data. The system uses just two or three wearable sensors, and nothing in the environment, greatly reducing the barrier for casual users to interact with robots.
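To make the real-time detection concrete, here is a minimal sketch of one plausible first stage: rectifying streamed EMG samples into a smoothed envelope and flagging muscle activation against a resting baseline. The sample rate, window length, and threshold gain are assumptions for illustration, not parameters from the CSAIL system.

```python
# Illustrative sketch, not the CSAIL pipeline: rectify streamed EMG samples
# into a sliding-window RMS envelope, then flag activation against a resting
# baseline. Sample rate, window length, and the gain of 3 are assumed values.
from collections import deque

import numpy as np

SAMPLE_RATE_HZ = 1000           # assumed EMG sampling rate
window = deque(maxlen=200)      # ~200 ms sliding window at 1 kHz

def emg_envelope(sample: float) -> float:
    """Rectify the newest sample and return the RMS over the sliding window."""
    window.append(abs(sample))
    return float(np.sqrt(np.mean(np.square(window))))

def is_tensed(envelope: float, resting_baseline: float, gain: float = 3.0) -> bool:
    """Treat the muscle as activated when the envelope clears gain x baseline."""
    return envelope > gain * resting_baseline
```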
While Conduct-A-Bot could potentially be used for various scenarios, including navigating menus on electronic devices or supervising autonomous robots, for this research the team used a Parrot Bebop 2 drone, although any commercial drone could be used.
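The research does not specify which control software was used; purely as a hedged illustration, the sketch below forwards recognized gesture names to a Bebop 2 through the open-source pyparrot library (an assumption, not the team's stated tooling), mapping each gesture to a roll/pitch/yaw/vertical command.

```python
# A hedged sketch only: the paper does not name a control library. Here the
# open-source pyparrot package (an assumption) forwards recognized gestures
# to a Bebop 2 as roll/pitch/yaw/vertical commands in the range [-100, 100].
from pyparrot.Bebop import Bebop

GESTURE_TO_MOTION = {
    "wave_left":    dict(roll=-30, pitch=0,  yaw=0,   vertical_movement=0),
    "wave_right":   dict(roll=30,  pitch=0,  yaw=0,   vertical_movement=0),
    "wave_up":      dict(roll=0,   pitch=0,  yaw=0,   vertical_movement=30),
    "wave_down":    dict(roll=0,   pitch=0,  yaw=0,   vertical_movement=-30),
    "move_forward": dict(roll=0,   pitch=30, yaw=0,   vertical_movement=0),
    "rotate_cw":    dict(roll=0,   pitch=0,  yaw=50,  vertical_movement=0),
    "rotate_ccw":   dict(roll=0,   pitch=0,  yaw=-50, vertical_movement=0),
}

bebop = Bebop()
if bebop.connect(num_retries=3):
    bebop.safe_takeoff(timeout=5)
    for gesture in ["wave_left", "move_forward", "stop"]:  # stand-in stream
        motion = GESTURE_TO_MOTION.get(gesture)
        if motion:  # unmapped gestures such as "stop" issue no command,
            bebop.fly_direct(duration=1, **motion)  # so the drone hovers
    bebop.safe_land(timeout=5)
    bebop.disconnect()
```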
By detecting actions like rotational gestures, clenched fists, tensed arms, and activated forearms, Conduct-A-Bot can move the drone left, right, up, down, and forward, as well as allow it to rotate and stop.
If you gestured toward the right to a friend, they could likely interpret that they should move in that direction. Similarly, if you wave your hand to the left, the drone follows suit and makes a left turn.
In tests, the drone correctly responded to 82 percent of over 1,500 human gestures when it was remotely controlled to fly through hoops. The system also correctly identified approximately 94 percent of cued gestures when the drone was not being controlled.
“Understanding our gestures could help robots interpret more of the nonverbal cues that we naturally use in everyday life,” says Joseph DelPreto, lead author on the new paper. “This type of system could help make interacting with a robot more similar to interacting with another person, and make it easier for someone to start using robots without prior experience or external sensors.”
This type of system could eventually target a range of applications for human-robot collaboration, including remote exploration, assistive personal robots, or manufacturing tasks like delivering objects or lifting materials.
These intelligent tools are also consistent with social distancing, and could potentially open up a realm of future contactless work. For example, you could imagine machines controlled by humans safely cleaning a hospital room or dropping off medications, while letting us humans stay at a safe distance.
Muscle signals can often provide information about states that are hard to observe from vision, such as joint stiffness or fatigue.
For example, if you watch a video of someone holding a large box, you might have difficulty guessing how much effort or force was needed — and a machine would also have difficulty gauging that from vision alone. Using muscle sensors opens up possibilities to estimate not only motion, but also the force and torque required to execute that physical trajectory.
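As a toy illustration of that idea, the sketch below maps a normalized EMG amplitude to a rough force estimate using an assumed linear calibration between a user's resting level and maximum voluntary contraction; the mapping and coefficients are invented for the example, not drawn from the research.

```python
# Toy illustration: with an assumed linear calibration between a user's
# resting EMG level and their maximum voluntary contraction (MVC), a smoothed
# EMG amplitude yields a rough force estimate that vision alone could not.
import numpy as np

def estimate_force(emg_rms: float, rest_rms: float, mvc_rms: float,
                   max_force_newtons: float = 300.0) -> float:
    """Map normalized EMG amplitude onto an invented 0..max force scale.

    rest_rms and mvc_rms would come from a quick per-user calibration; the
    linear mapping and 300 N ceiling are assumptions for this example.
    """
    activation = np.clip((emg_rms - rest_rms) / (mvc_rms - rest_rms), 0.0, 1.0)
    return float(activation * max_force_newtons)
```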
For the gesture vocabulary currently used to control the robot, the movements were detected as follows (a rough code sketch appears after the list):
- stiffening the upper arm to stop the robot (similar to briefly cringing when seeing something going wrong): biceps and triceps muscle signals;
- waving the hand left/right and up/down to move the robot sideways or vertically: forearm muscle signals (with the forearm accelerometer indicating hand orientation);
- fist clenching to move the robot forward: forearm muscle signals; and
- rotating clockwise/counterclockwise to turn the robot: forearm gyroscope.
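The sketch below shows one way such a dispatch could look in code. All thresholds, units, and feature names are illustrative assumptions; the actual system relies on the learned classifiers described next.

```python
# Rough dispatch sketch for the vocabulary above. Thresholds, units, and
# feature names are assumptions for illustration, not values from the paper.
def classify_gesture(biceps_rms: float, triceps_rms: float, forearm_rms: float,
                     gyro_z_rad_s: float, wrist_state: str, palm_down: bool):
    """Return a vocabulary gesture name, or None for non-gesture motion.

    wrist_state is 'flexed', 'extended', or 'neutral' (predicted from forearm
    EMG by the neural network described below); palm_down comes from the
    forearm accelerometer's gravity reading.
    """
    # Stop: co-contraction of biceps and triceps (stiffened upper arm).
    if biceps_rms > 0.5 and triceps_rms > 0.5:
        return "stop"
    # Turn: sustained forearm rotation picked up by the gyroscope.
    if abs(gyro_z_rad_s) > 2.0:
        return "rotate_cw" if gyro_z_rad_s > 0 else "rotate_ccw"
    # Forearm activity: either a fist clench or a directional wave.
    if forearm_rms > 0.5:
        if wrist_state == "neutral":
            return "move_forward"  # clenched fist, no wrist deflection
        # Hand orientation decides whether the wave reads as sideways or vertical.
        if palm_down:
            return "wave_down" if wrist_state == "flexed" else "wave_up"
        return "wave_left" if wrist_state == "flexed" else "wave_right"
    return None
```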
Machine learning classifiers detected the gestures using the wearable sensors. Unsupervised classifiers processed the muscle and motion data and clustered it in real time to learn how to separate gestures from other motions. A neural network also predicted wrist flexion or extension from forearm muscle signals.
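As a rough sketch of the unsupervised idea, the example below incrementally clusters incoming muscle-feature frames into two groups (gesture versus background motion) as data streams in, so no per-user training set is collected beforehand. MiniBatchKMeans from scikit-learn stands in for the paper's classifiers, which are not specified here.

```python
# Sketch of online, unsupervised separation of gestures from other motion.
# MiniBatchKMeans is a stand-in; the paper's actual classifiers are not
# described in this article.
import numpy as np
from sklearn.cluster import MiniBatchKMeans

clusterer = MiniBatchKMeans(n_clusters=2, random_state=0)

def update_and_label(feature_window: np.ndarray) -> int:
    """Incrementally fit on the latest feature window and label its last frame.

    feature_window: (n_frames, n_features) array, e.g. per-channel EMG RMS
    plus accelerometer features. The higher-energy cluster can then be
    interpreted as "gesture" and the other as ordinary motion.
    """
    clusterer.partial_fit(feature_window)
    return int(clusterer.predict(feature_window[-1:])[0])
```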
The system essentially calibrates itself to each person's signals while they're making gestures that control the robot, making it faster and easier for casual users to start interacting with robots.
In the future, the team hopes to expand the tests to include more subjects. And while the movements for Conduct-A-Bot cover common gestures for robot motion, the researchers want to extend the vocabulary to include more continuous or user-defined gestures. Eventually, the hope is to have the robots learn from these interactions to better understand the tasks and provide more predictive assistance or increase their autonomy.
“This system moves one step closer to letting us work seamlessly with robots so they can become more effective and intelligent tools for everyday tasks,” says DelPreto. “As such collaborations continue to become more accessible and pervasive, the possibilities for synergistic benefit continue to deepen.”
Reprinted with permission of MIT News