UW Roboticists Learn to Teach Robots from Babies
December 2, 2015 | University of Washington
Babies learn about the world by exploring how their bodies move in space, by grabbing toys and pushing things off tables, and by watching and imitating what adults are doing. But when roboticists want to teach a robot a task, they typically either write code or physically move the robot’s arm or body to show it how to perform the action.
Now a collaboration between University of Washington developmental psychologists and computer scientists has demonstrated that robots can “learn” much like kids — by amassing data through exploration, watching a human do something and determining how to perform that task on its own.
“You can look at this as a first step in building robots that can learn from humans in the same way that infants learn from humans,” said senior author Rajesh Rao, a UW professor of computer science and engineering.
“If you want people who don’t know anything about computer programming to be able to teach a robot, the way to do it is through demonstration — showing the robot how to clean your dishes, fold your clothes, or do household chores. But to achieve that goal, you need the robot to be able to understand those actions and perform them on their own.”
The research, which combines child development research from the UW’s Institute for Learning & Brain Sciences Lab (I-LABS) with machine learning approaches, was published in a paper in November in the journal PLOS ONE.
A collaboration between UW developmental psychologists and computer scientists aims to enable robots to learn in the same way that children naturally do. The team used research on how babies follow an adult’s gaze to “teach” a robot to perform the same task. (University of Washington)
In the paper, the UW team developed a new probabilistic model aimed at solving a fundamental challenge in robotics: building robots that can learn new skills by watching people and imitating them.
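The paper itself does not spell out the model here, but the core idea of learning by watching, inferring a goal, and then acting, can be illustrated with a toy example. The sketch below is a hypothetical, simplified form of Bayesian goal inference (sometimes called inverse planning); the goal names, positions, and the `BETA` rationality parameter are all illustrative assumptions, not details from the UW system.

```python
import math

# Hypothetical sketch (not the paper's actual model): infer which goal an
# agent is pursuing by watching its moves, via Bayesian goal inference.
# States and goals are points on a 1-D line; the agent is assumed to act
# "noisily rationally", preferring moves that reduce distance to its goal.

GOALS = {"toy": 0, "cup": 10}   # candidate goal positions (assumed)
BETA = 1.0                      # rationality: higher = less noisy (assumed)

def move_likelihood(pos, move, goal):
    """P(move | position, goal) via a softmax over distance reduction."""
    scores = {m: -abs((pos + m) - goal) for m in (-1, +1)}
    z = sum(math.exp(BETA * s) for s in scores.values())
    return math.exp(BETA * scores[move]) / z

def infer_goal(start, moves):
    """Posterior over goals after observing a sequence of +/-1 moves."""
    posterior = {g: 1.0 / len(GOALS) for g in GOALS}  # uniform prior
    pos = start
    for m in moves:
        for g, gpos in GOALS.items():
            posterior[g] *= move_likelihood(pos, m, gpos)
        pos += m
    total = sum(posterior.values())
    return {g: p / total for g, p in posterior.items()}

# Watching an agent step repeatedly toward position 0 makes "toy"
# the most probable goal.
post = infer_goal(start=5, moves=[-1, -1, -1])
```

Once the most likely goal is identified, a robot could plan its own action sequence toward that goal rather than copying the adult’s exact motions, which echoes the infants’ behavior described below.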
The roboticists collaborated with UW psychology professor and I-LABS co-director Andrew Meltzoff, whose seminal research has shown that children as young as 18 months can infer the goal of an adult’s actions and develop alternate ways of reaching that goal themselves.
In one example, infants saw an adult try to pull apart a barbell-shaped toy, but the adult failed to achieve that goal because the toy was stuck together and his hands slipped off the ends. The infants watched carefully and then decided to use alternate methods — they wrapped their tiny fingers all the way around the ends and yanked especially hard — duplicating what the adult intended to do.