Penn State Developing Worker-centered Human Robot Partnerships
May 13, 2021 | Pennsylvania State University
In the future, humans may interact with artificially intelligent heavy machines, self-optimizing collaborative robots, unmanned terrestrial and aerial vehicles, and other autonomous systems, according to a team of Penn State engineers.
With the help of humans, these intelligent robots could perform strenuous and repetitive physical activities such as lifting heavy objects, delivering materials to the workers, monitoring the progress of construction projects, tying rebars, or laying bricks to build masonry walls.
However, this partnership can pose new safety challenges to workers, especially in the unstructured and dynamic environments of construction sites.
To a robot, a human operator is an unfailing partner. To a human, the robot’s apparent level of awareness, intelligence and motorized precision can deviate substantially from its actual capabilities, leading to unbalanced trust.
This creates a need to redesign collaborative construction robots so that they can monitor workers’ mental and physical stress and adjust their performance accordingly, according to Houtan Jebelli, assistant professor of architectural engineering.
Robots on construction sites are different from other industrial robots because they need to operate in highly fragmented and rugged workspaces with different layouts and equipment. In these environments, safe and successful delivery of work is not possible without human intervention, according to Jebelli.
This research on human-robot collaboration enables interaction between humans and construction robots using brainwaves as indicators of workers’ mental activity, and it is the first of its kind to integrate this technology with human-robot adaptation. The perceptual cues obtained from the brainwaves can also be used to develop a brain-computer interface (BCI) approach that creates “hands-free” communication between construction robots and humans, mitigating the limitations of traditional robot control systems in other industries, said Jebelli.
"Once we capture workers' cognitive load, we try to transfer this information into the robot so that the collaborative robot can monitor workers' cognitive load," Jebelli added.
Whenever the cognitive load is recognized to be higher than a specific threshold, the robot will reduce its pace to provide a safer environment for the workers, said Jebelli. This response could help design a collaborative robotic system that understands the human partner’s mental state and hopefully improve workers’ safety and productivity in the long term. The team published their results in two papers in Automation in Construction.
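The threshold-based adaptation described above can be sketched as follows. This is a minimal illustration, assuming a cognitive-load score normalized to [0, 1]; the threshold value and speed levels are invented for the example and are not parameters reported by the researchers.

```python
# Hypothetical sketch of threshold-based robot adaptation: when the
# worker's estimated cognitive load exceeds a threshold, the robot
# slows down. All numeric values are illustrative assumptions.

def adjust_robot_speed(cognitive_load: float,
                       threshold: float = 0.7,
                       normal_speed: float = 1.0,
                       reduced_speed: float = 0.4) -> float:
    """Return the robot's speed factor for the current cognitive load.

    cognitive_load is assumed to be a score in [0, 1] derived from
    EEG features; values above the threshold trigger the slowdown.
    """
    return reduced_speed if cognitive_load > threshold else normal_speed
```

In a real system the load score would be updated continuously from the EEG stream, with smoothing to avoid oscillating between speeds.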
They also proposed a BCI-based system to operate a robot remotely.
“The ability to control a robot by merely imagining the commands can open new avenues to designing hands-free robotic systems in hazardous environments where humans require their hands to retain their balance and perform an action,” said Mahmoud Habibnezhad, a postdoctoral fellow conducting research with Jebelli.
The researchers capture workers’ brainwave signals with a wearable electroencephalogram (EEG) device and convert these signals into robotic commands.
“In our research, first we trained the subjects with a motor imagery experiment,” said Yizhi Liu, doctoral student of architectural engineering. “The signal is then collected through EEG sensors and a spatial feature extraction technique called common spatial pattern.”
He explained that participants view images of specific actions, such as workers grabbing bricks with their right hands, and then imagine performing these actions. For example, when a subject imagines their right hand grabbing something, the right cortex of their brain generates a higher EEG signal than their left-brain area. The researchers used machine learning to train the system to recognize the EEG patterns participants produce while imagining the actions. The translated signals are then transferred as digital commands to the robots through ROS, the Robot Operating System, Liu added.
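As a rough illustration of this classification step, the toy sketch below contrasts signal variance between two hemisphere channels, following the article's description that imagining a right-hand grasp produces a stronger signal over the right cortex. This is a deliberately simplified stand-in: real common spatial pattern (CSP) analysis learns spatial filters from covariance matrices of multichannel EEG, and the single-channel variance rule here is an assumed simplification.

```python
# Toy stand-in for the CSP + classifier pipeline: compare signal
# variance (a crude band-power proxy) between left- and right-
# hemisphere EEG channels. Channel semantics and the decision rule
# are illustrative assumptions, not the published method.

def variance(samples):
    """Population variance of one channel's sample window."""
    mean = sum(samples) / len(samples)
    return sum((x - mean) ** 2 for x in samples) / len(samples)

def classify_motor_imagery(left_hemisphere, right_hemisphere):
    """Classify an imagined hand movement from two EEG channel windows.

    Per the description above, a stronger (here, higher-variance)
    right-hemisphere signal is taken to indicate imagined right-hand
    movement, and vice versa.
    """
    if variance(right_hemisphere) > variance(left_hemisphere):
        return "RIGHT_HAND"
    return "LEFT_HAND"
```

A practical pipeline would instead band-pass filter the EEG, apply learned CSP filters across all channels, and feed log-variance features to a trained classifier.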
For the BCI system to continuously interpret brainwave signals from workers in near real-time, the researchers used three key elements — a wearable EEG device, a signal-interpretation application program interface (API), and a cloud server. The wearable EEG device captures the brainwave signals and sends them to the cloud server, and then the API begins generating commands.
The researchers created a network of channels between workers’ wearable biosensors and the robots using ROS, which acts as the middleware connecting the different systems. Through these channels, commands such as right-hand movement, left-hand movement and a stop signal can be easily sent to the robot. Supporting more nuanced commands requires more data, but it improves the performance of the system and the teleoperation of the robot, according to Jebelli.
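A minimal sketch of this command-routing layer, assuming a publish/subscribe design in the style of ROS topics. The stub publisher, topic names and command strings are hypothetical stand-ins; a real implementation would use rospy or rclpy publishers over actual ROS topics.

```python
# Sketch of the middleware layer: decoded mental commands are routed
# to per-command channels (modeled on ROS topics). The StubPublisher
# records messages instead of sending them, so the routing logic is
# testable without a ROS installation. All names are assumptions.

class StubPublisher:
    """Stands in for a ROS publisher; records messages instead of sending."""
    def __init__(self, topic):
        self.topic = topic
        self.sent = []

    def publish(self, message):
        self.sent.append(message)

# Hypothetical mapping from decoded commands to robot-control topics.
COMMAND_TOPICS = {
    "RIGHT_HAND": "/robot/move_right",
    "LEFT_HAND": "/robot/move_left",
    "STOP": "/robot/stop",
}

def route_command(command, publishers):
    """Send a decoded command to its channel; return the topic used."""
    topic = COMMAND_TOPICS[command]
    publishers[topic].publish(command)
    return topic
```

Adding a more nuanced command in this design means adding an entry to the mapping and a corresponding channel, which mirrors the article's point that a richer command set needs more training data per command.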
“We developed a brain-computer interface system, which we can think of as a person trying to learn a new language who doesn’t yet know how to generate commands,” he said. “We try to connect different commands with some predefined patterns of their brainwaves.”
With more commands, the researchers can train and improve the performance of the system, according to Jebelli. These commands include tasks such as controlling the robot, stopping it, or triggering a predefined work plan, such as delivering material from point A to point B, by thinking about specific tasks in the command dictionary.
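The "dictionary" idea of expanding one decoded command into a predefined work plan could look like the following sketch; the plan names and their steps are invented for illustration and are not from the published system.

```python
# Hypothetical command dictionary: one decoded mental command expands
# into a predefined sequence of robot actions. Plan names and step
# strings are illustrative assumptions.

WORK_PLANS = {
    "DELIVER_MATERIAL": [
        "pick_up(point_A)",
        "navigate(point_B)",
        "drop_off(point_B)",
    ],
    "STOP": ["halt()"],
}

def expand_command(command):
    """Return the action sequence for a decoded command.

    Unknown commands fall back to halting, on the assumption that a
    safety-critical system should default to a safe state.
    """
    return WORK_PLANS.get(command, ["halt()"])
```

This keeps the BCI vocabulary small (one thought per plan) while still letting the robot execute multi-step tasks.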
“This is a framework that we tested out for one robot, that is a proof-of-concept that the framework is working,” said Habibnezhad. “We can improve the framework by using different robots or drones or different systems. We can improve the accuracy of the control by using more commands and trying to extract more patterns and defining different controls.”
11/27/2024 | Andy Shaughnessy, Design007 MagazineAt the Anaheim Electronics & Manufacturing Show in October, I had the opportunity to talk with some new PCB designers, including Jon Smith of Frontgrade Aethercomm. During the Anaheim show, John Watson, a PCB design instructor at Palomar College, led a panel of his past and present students, including Jon, who shared his story of switching from a construction career to PCB design in a matter of months, courtesy of Watson’s Palomar College design curriculum.