A.I. Will Prepare Robots for the Unknown
June 22, 2017 | NASA
Estimated reading time: 3 minutes

How do you get a robot to recognize a surprise?
That's a question artificial intelligence researchers are mulling, especially as A.I. begins to change space research.
A new article in the journal Science Robotics offers an overview of how A.I. has been used to make discoveries on space missions. The article, co-authored by Steve Chien and Kiri Wagstaff of NASA's Jet Propulsion Laboratory, Pasadena, California, suggests that autonomy will be a key technology for the future exploration of our solar system, where robotic spacecraft will often be out of communication with their human controllers.
In a sense, space scientists are doing field research virtually, with the help of robotic spacecraft.
"The goal is for A.I. to be more like a smart assistant collaborating with the scientist and less like programming assembly code," said Chien, a senior research scientist on autonomous space systems. "It allows scientists to focus on the 'thinking' things -- analyzing and interpreting data -- while robotic explorers search out features of interest."
Science is driven by noticing the unexpected, which is easier for a trained human who knows when something is surprising. For robots, this means having a sense of what's "normal" and using machine learning techniques to detect statistical anomalies.
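The idea of flagging statistical anomalies against a known baseline can be sketched in a few lines. This is a hypothetical illustration, not flight software: it assumes "normal" is summarized by the mean and standard deviation of prior readings, and flags anything beyond a z-score threshold as a surprise.

```python
import statistics

def flag_anomalies(observations, baseline, threshold=3.0):
    """Report observations that deviate from the expected baseline.

    `baseline` is a list of values representing what is "normal";
    any observation more than `threshold` standard deviations from
    the baseline mean is returned as a (index, value, z-score) tuple.
    """
    mean = statistics.mean(baseline)
    stdev = statistics.stdev(baseline)
    surprises = []
    for i, value in enumerate(observations):
        z = abs(value - mean) / stdev
        if z > threshold:
            surprises.append((i, value, z))
    return surprises

# Hypothetical brightness readings; index 3 is an eruption-like spike.
baseline = [10.0, 10.2, 9.8, 10.1, 9.9, 10.0]
readings = [10.1, 9.9, 10.0, 25.0, 10.2]
print(flag_anomalies(readings, baseline))  # only index 3 is flagged
```

Real pipelines use far richer models (multivariate features, learned classifiers), but the core decision — compare against an expectation, flag large deviations — is the same.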
"We don't want to miss something just because we didn't know to look for it," said Wagstaff, a principal data scientist with JPL's machine learning group. "We want the spacecraft to know what we expect to see and recognize when it observes something different."
Spotting unusual features is one use of A.I. But there's an even more complex use that will be essential for studying ocean worlds, like Jupiter's moon Europa.
"If you know a lot in advance, you can build a model of normality -- of what the robot should expect to see," Wagstaff said. "But for new environments, we want to let the spacecraft build a model of normality based on its own observations. That way, it can recognize surprises we haven't anticipated."
Imagine, for example, A.I. spotting plumes erupting on ocean worlds. These eruptions can be spontaneous and could vary greatly in how long they last. A.I. could enable a passing spacecraft to reprioritize its operations and study these phenomena "on the fly," Chien said.
JPL has led the development of several key examples for space A.I. Dust devils swirling across the Martian surface were imaged by NASA's Opportunity rover using a program called WATCH. That program later evolved into AEGIS, which helps the Curiosity rover's ChemCam instrument pick new laser targets that meet its science team's parameters without needing to wait for interaction with scientists on Earth. AEGIS can also fine-tune the pointing of the ChemCam laser.
Closer to home, A.I. software called the Autonomous Sciencecraft Experiment studied volcanoes, floods and fires while on board Earth Observing-1, a satellite managed by NASA's Goddard Space Flight Center, Greenbelt, Maryland. EO-1's Hyperion instrument also used A.I. to identify sulfur deposits on the surface of glaciers -- a task that could be important for places like Europa, where sulfur deposits would be of interest as potential biosignatures.
A.I. allows a spacecraft to prioritize the data it collects, balancing constraints like power supply and limited data storage. Autonomous management of systems like these is being prototyped for NASA's Mars 2020 rover (which will also use AEGIS for picking laser targets).
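One common way to prioritize under a fixed storage or downlink budget is a greedy value-per-byte selection. A hypothetical sketch (the product names, scores, and sizes are invented for illustration; mission planners use far more elaborate schedulers):

```python
def prioritize_downlink(products, capacity_bytes):
    """Greedily select data products under a storage/downlink budget.

    Each product is a (name, science_value, size_bytes) tuple; the
    products with the best science value per byte are kept until the
    budget is exhausted.
    """
    ranked = sorted(products, key=lambda p: p[1] / p[2], reverse=True)
    selected, used = [], 0
    for name, value, size in ranked:
        if used + size <= capacity_bytes:
            selected.append(name)
            used += size
    return selected

# Hypothetical products: a surprise observation outranks routine data.
products = [
    ("plume_image", 9.0, 400),
    ("routine_scan", 2.0, 300),
    ("dust_devil_clip", 6.0, 500),
    ("calibration", 1.0, 100),
]
print(prioritize_downlink(products, 1000))
```

With a 1000-byte budget, the high-value plume image is kept and the low-ratio routine scan is dropped, mirroring how autonomy lets a spacecraft spend scarce resources on the most scientifically interesting data.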
While autonomy offers exciting new advantages to science teams, both Chien and Wagstaff stressed that A.I. has a long way to go.
"For the foreseeable future, there's a strong role for high-level human direction," Wagstaff said. "But A.I. is an observational tool that allows us to study science that we couldn't get otherwise."
Andrew Good
Jet Propulsion Laboratory, Pasadena, Calif.
818-393-2433
andrew.c.good@jpl.nasa.gov