One Giant Leap for Lunar Landing Navigation
September 19, 2019 | NASA | Estimated reading time: 6 minutes

When Apollo 11’s lunar module, Eagle, landed on the Moon on July 20, 1969, it first flew over an area littered with boulders before touching down at the Sea of Tranquility. The site had been selected based on photos collected over two years as part of the Lunar Orbiter program.
But the “sensors” that ensured Eagle was in a safe spot before touching down – those were the eyes of NASA Astronaut Neil Armstrong.
“Eagle’s computer didn’t have a vision-aided system to navigate relative to the lunar terrain, so Armstrong was literally looking out the window to figure out where to touch down,” said Matthew Fritz, principal investigator for a terrain relative navigation system being developed by Draper of Cambridge, Massachusetts. “Now, our system could become the ‘eyes’ for the next lunar lander module to help target the desired landing location.”
This week, that system will be tested in the Mojave Desert of California on a launch and landing of Masten Space Systems’ Xodiac rocket. The rocket is scheduled to take off Wednesday, Sept. 11.
The rocket flight is made possible with support from NASA’s Flight Opportunities program managed by NASA’s Armstrong Flight Research Center in Edwards, California, and the Game Changing Development program overseen by NASA’s Langley Research Center in Hampton, Virginia. It marks the first test of the system with both a descent altitude and a landing trajectory similar to what is expected on a lunar mission.
But what is terrain relative navigation? And why is it so important to NASA’s Artemis program to return American astronauts to the Moon by 2024, and future human missions to Mars?
Without capabilities like GPS, which was designed to help us navigate on Earth, determining a lander’s location is much like driving a car without satellite navigation: the driver compares visual cues seen along the road (road signs, prominent buildings, notable landmarks) with the same features marked on a road map.
“We have onboard satellite maps loaded onto the flight computer and a camera acts as our sensor,” explained Fritz. “The camera captures images as the lander flies along a trajectory and those images are overlaid onto the preloaded satellite maps that include unique terrain features. Then by mapping the features in the live images, we’re able to know where the vehicle is relative to the features on the map.”
When astronauts return to the Moon by 2024, a camera-aided terrain relative navigation system will provide real-time, precise mapping of the lunar surface with images laid over preloaded satellite maps on the lander’s onboard computer. The image on the left, taken during a 2019 drone flight over California's Mojave Desert, shows terrain features identified by the navigation system's camera. These are matched to known features identified in satellite images on the onboard computer. Credits: Draper
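As a rough illustration of the feature-matching idea Fritz describes, the sketch below detects terrain features in a camera frame, matches them against a preloaded map image, and projects the frame’s center into map coordinates. It is a minimal example built on the open-source OpenCV library, not Draper’s flight software; the feature detector, thresholds, and file names are all illustrative assumptions.

```python
# Minimal sketch of camera-to-map feature matching for terrain relative
# navigation (illustrative only; not Draper's or NASA's implementation).
# Assumes OpenCV (cv2) and NumPy; the image file names are hypothetical.
import cv2
import numpy as np

def estimate_position(camera_img, map_img):
    """Estimate where the camera view sits within a preloaded map image.

    Returns the (x, y) pixel location in the map corresponding to the
    center of the camera frame, or None if too few features match.
    """
    orb = cv2.ORB_create(nfeatures=2000)          # detect corner-like terrain features
    kp_cam, des_cam = orb.detectAndCompute(camera_img, None)
    kp_map, des_map = orb.detectAndCompute(map_img, None)
    if des_cam is None or des_map is None:
        return None

    # Match descriptors and keep the strongest correspondences.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_cam, des_map), key=lambda m: m.distance)[:200]
    if len(matches) < 10:
        return None

    src = np.float32([kp_cam[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_map[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

    # Robustly fit the camera-to-map transform, rejecting outlier matches.
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    if H is None:
        return None

    # Project the camera frame's center into map coordinates.
    h, w = camera_img.shape[:2]
    center = np.float32([[[w / 2, h / 2]]])
    x, y = cv2.perspectiveTransform(center, H)[0, 0]
    return float(x), float(y)

if __name__ == "__main__":
    cam = cv2.imread("descent_frame.png", cv2.IMREAD_GRAYSCALE)   # hypothetical files
    sat = cv2.imread("satellite_map.png", cv2.IMREAD_GRAYSCALE)
    print(estimate_position(cam, sat))
```

In practice, a flight system would fold fixes like this into a navigation filter alongside inertial and lidar measurements rather than use them directly.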
While the Apollo Guidance Computer was a revolutionary feat of engineering for its time, today’s technology would certainly have been welcome assistance. With the computer sounding alarms and Eagle quickly running out of fuel, Armstrong was doing his best to find a safe parking spot.
So, it’s no surprise that NASA and commercial partners are relying on the most advanced technology to upgrade navigation for future robotic and crewed missions to the Moon. The agency is developing a suite of precision landing technologies for possible use on future commercial lunar landers. NASA is already buying services for robotic Moon deliveries and is planning to ask American companies to build the next generation human landing systems.
The agency’s work to develop navigation sensors and related technologies falls under a larger effort now referred to as SPLICE, or the Safe and Precise Landing – Integrated Capabilities Evolution project. SPLICE has evolved out of other NASA projects dating back to the early 2000s, all created to develop an integrated suite of landing and hazard avoidance capabilities for planetary missions. Contributions hail from several commercial efforts and multiple NASA centers.
Terrain relative navigation is key to the overall SPLICE effort, which also includes navigation Doppler lidar, hazard detection lidar, and a high-performance onboard computer. Working together, the full suite of capabilities promises to give future crewed missions much safer and more precise descents and landings on the lunar surface.
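To make the “working together” point concrete, the sketch below shows how an onboard estimator might fuse a terrain relative navigation position fix with a Doppler lidar velocity measurement using a standard linear Kalman filter. The state layout, update rate, and noise values are placeholder assumptions chosen for illustration; they are not SPLICE parameters or flight software.

```python
# Conceptual sketch: fusing a terrain relative navigation (TRN) position fix
# with a navigation Doppler lidar velocity measurement in a generic linear
# Kalman filter. All numbers are made-up placeholders for illustration.
import numpy as np

dt = 0.1                                   # assumed filter update period, seconds
# State: [x, y, z, vx, vy, vz] in a landing-site frame.
F = np.eye(6)
F[:3, 3:] = dt * np.eye(3)                 # constant-velocity propagation
Q = 1e-3 * np.eye(6)                       # process noise (placeholder)

H_trn = np.hstack([np.eye(3), np.zeros((3, 3))])   # TRN measures position
H_ndl = np.hstack([np.zeros((3, 3)), np.eye(3)])   # Doppler lidar measures velocity
R_trn = (5.0 ** 2) * np.eye(3)             # ~5 m position noise (placeholder)
R_ndl = (0.2 ** 2) * np.eye(3)             # ~0.2 m/s velocity noise (placeholder)

def predict(x, P):
    return F @ x, F @ P @ F.T + Q

def update(x, P, z, H, R):
    y = z - H @ x                           # measurement residual
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
    return x + K @ y, (np.eye(6) - K @ H) @ P

# One filter cycle with both sensors reporting:
x, P = np.zeros(6), 100.0 * np.eye(6)
x, P = predict(x, P)
x, P = update(x, P, np.array([120.0, -40.0, 500.0]), H_trn, R_trn)   # TRN fix
x, P = update(x, P, np.array([-1.0, 0.5, -30.0]), H_ndl, R_ndl)      # lidar velocity
print(x)
```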