One Giant Leap for Lunar Landing Navigation
September 19, 2019 | NASA | Estimated reading time: 6 minutes

When Apollo 11’s lunar module, Eagle, landed on the Moon on July 20, 1969, it first flew over an area littered with boulders before touching down at the Sea of Tranquility. The site had been selected based on photos collected over two years as part of the Lunar Orbiter program.
But the “sensors” that ensured Eagle was in a safe spot before touching down – those were the eyes of NASA Astronaut Neil Armstrong.
“Eagle’s computer didn’t have a vision-aided system to navigate relative to the lunar terrain, so Armstrong was literally looking out the window to figure out where to touch down,” said Matthew Fritz, principal investigator for a terrain relative navigation system being developed by Draper of Cambridge, Massachusetts. “Now, our system could become the ‘eyes’ for the next lunar lander module to help target the desired landing location.”
This week, that system will be tested in the Mojave Desert of California on a launch and landing of Masten Space Systems’ Xodiac rocket. The rocket is scheduled to take off Wednesday, Sept. 11.
The rocket flight is made possible with support from NASA’s Flight Opportunities program managed by NASA’s Armstrong Flight Research Center in Edwards, California, and the Game Changing Development program overseen by NASA’s Langley Research Center in Hampton, Virginia. It marks the first test of the system with both a descent altitude and a landing trajectory similar to what is expected on a lunar mission.
But what is terrain relative navigation? And why is it so important to NASA’s Artemis program, which aims to return American astronauts to the Moon by 2024, and to future human missions to Mars?
Without a capability like GPS, which is designed to help us navigate on Earth, a lander must determine its location much as a driver does: by comparing visual cues (e.g., road signs, important buildings, notable landmarks) against those same cues identified on a road map.
“We have onboard satellite maps loaded onto the flight computer and a camera acts as our sensor,” explained Fritz. “The camera captures images as the lander flies along a trajectory and those images are overlaid onto the preloaded satellite maps that include unique terrain features. Then by mapping the features in the live images, we’re able to know where the vehicle is relative to the features on the map.”
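The matching step Fritz describes can be illustrated with a toy sketch. This is not Draper's actual algorithm; the feature names, coordinates, and the simple offset-averaging scheme below are all illustrative assumptions. The idea is that each terrain feature recognized in the live camera image, and also present in the preloaded satellite map, yields one estimate of where the vehicle is; combining the estimates fixes the vehicle's position relative to the map.

```python
# Toy terrain-relative-navigation sketch (hypothetical data and method,
# not Draper's flight algorithm). Each matched feature has a known position
# on the preloaded map and an observed offset from the camera's boresight,
# expressed in the same map units. Every match gives one estimate of the
# vehicle's ground position; averaging reduces per-feature measurement noise.

def estimate_position(map_features, observed_offsets):
    """map_features: {name: (x, y)} known positions from the satellite map.
    observed_offsets: {name: (dx, dy)} where each feature appears relative
    to the camera center, already scaled to map units.
    Returns the averaged (x, y) estimate of the vehicle's position."""
    matched = [name for name in observed_offsets if name in map_features]
    if not matched:
        raise ValueError("no live-image features matched the onboard map")
    xs, ys = [], []
    for name in matched:
        mx, my = map_features[name]
        dx, dy = observed_offsets[name]
        # If a feature at map position (mx, my) appears offset by (dx, dy)
        # from the camera center, the vehicle is over (mx - dx, my - dy).
        xs.append(mx - dx)
        ys.append(my - dy)
    return sum(xs) / len(xs), sum(ys) / len(ys)

# Example: three craters identified in both the map and the live image.
lunar_map = {"crater_a": (100.0, 200.0),
             "crater_b": (140.0, 180.0),
             "crater_c": (90.0, 160.0)}
image_obs = {"crater_a": (10.0, 20.0),
             "crater_b": (50.0, 0.0),
             "crater_c": (0.0, -20.0)}
print(estimate_position(lunar_map, image_obs))  # → (90.0, 180.0)
```

A real system must also recognize the features in the first place (the hard computer-vision problem) and account for camera attitude, altitude, and lens geometry when converting pixel offsets to map units; the sketch assumes all of that is already done.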
When astronauts return to the Moon by 2024, a camera-aided terrain relative navigation system will provide real-time, precise mapping of the lunar surface with images laid over preloaded satellite maps on the lander’s onboard computer. The image on the left, taken during a 2019 drone flight over California's Mojave Desert, shows terrain features identified by the navigation system's camera. These are matched to known features identified in satellite images on the onboard computer. Credits: Draper
While the Apollo Guidance Computer was a revolutionary feat of engineering for its time, today’s technology would certainly have been welcome assistance. With the computer sounding alarms and Eagle quickly running out of fuel, Armstrong was doing his best to find a safe parking spot.
So, it’s no surprise that NASA and commercial partners are relying on the most advanced technology to upgrade navigation for future robotic and crewed missions to the Moon. The agency is developing a suite of precision landing technologies for possible use on future commercial lunar landers. NASA is already buying services for robotic Moon deliveries and is planning to ask American companies to build the next generation human landing systems.
The agency’s work to develop navigation sensors and related technologies falls under a larger effort now referred to as SPLICE, or the Safe and Precise Landing – Integrated Capabilities Evolution project. SPLICE has evolved out of other NASA projects dating back to the early 2000s, all created to develop an integrated suite of landing and hazard avoidance capabilities for planetary missions. Contributions hail from several commercial efforts and multiple NASA centers.
Terrain relative navigation is key to the overall SPLICE effort, which also includes navigation Doppler lidar, hazard detection lidar, and a high-performance onboard computer. Working together, the full suite of capabilities promises to give future crewed missions much safer and more precise descents and landings on the lunar surface.