Artificial Intelligence Helps Improve NASA’s Eyes on the Sun
July 26, 2021 | NASA | Estimated reading time: 4 minutes

A group of researchers is using artificial intelligence techniques to calibrate some of NASA’s images of the Sun, helping improve the data that scientists use for solar research. The new technique was published in the journal Astronomy & Astrophysics on April 13, 2021.
A solar telescope has a tough job. Staring at the Sun takes a harsh toll, with a constant bombardment by a never-ending stream of solar particles and intense sunlight. Over time, the sensitive lenses and sensors of solar telescopes begin to degrade. To ensure the data such instruments send back is still accurate, scientists recalibrate periodically to make sure they understand just how the instrument is changing.
Launched in 2010, NASA’s Solar Dynamics Observatory, or SDO, has provided high-definition images of the Sun for over a decade. Its images have given scientists a detailed look at various solar phenomena that can spark space weather and affect our astronauts and technology on Earth and in space. The Atmospheric Imagery Assembly, or AIA, is one of two imaging instruments on SDO and looks constantly at the Sun, taking images across 10 wavelengths of ultraviolet light every 12 seconds. This creates a wealth of information of the Sun like no other, but – like all Sun-staring instruments – AIA degrades over time, and the data needs to be frequently calibrated.
Seven of the ultraviolet wavelengths observed by the AIA on NASA’s SDO. The top row shows images taken in May 2010; the bottom row shows images from 2019, without any corrections, illustrating how the instrument degraded over time.
Since SDO’s launch, scientists have used sounding rockets to calibrate AIA. Sounding rockets are smaller rockets that typically carry only a few instruments and take short flights into space – usually only 15 minutes. Crucially, sounding rockets fly above most of Earth’s atmosphere, allowing instruments on board to see the ultraviolet wavelengths measured by AIA. These wavelengths of light are absorbed by Earth’s atmosphere and can’t be measured from the ground. To calibrate AIA, scientists attach an ultraviolet telescope to a sounding rocket and compare its data to the measurements from AIA. They can then make adjustments to account for any changes in AIA’s data.
There are some drawbacks to the sounding rocket method of calibration. Sounding rockets can only launch so often, but AIA is constantly looking at the Sun. That means the calibration slowly drifts in the gaps between sounding rocket flights.
“It’s also important for deep space missions, which won’t have the option of sounding rocket calibration,” said Dr. Luiz Dos Santos, a solar physicist at NASA’s Goddard Space Flight Center in Greenbelt, Maryland, and lead author on the paper. “We’re tackling two problems at once.”
Virtual calibration
With these challenges in mind, scientists decided to look at other options to calibrate the instrument, with an eye towards constant calibration. Machine learning, a technique used in artificial intelligence, seemed like a perfect fit.
As the name implies, machine learning requires a computer program, or algorithm, to learn how to perform its task.
First, researchers needed to train a machine learning algorithm to recognize solar structures and how to compare them using AIA data. To do this, they give the algorithm images from sounding rocket calibration flights and tell it the correct amount of calibration each needs. After enough of these examples, they give the algorithm similar images and check whether it identifies the correct calibration. With enough data, the algorithm learns to identify how much calibration is needed for each image.
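The training loop described above is, at its core, supervised regression: images with known calibration labels in, a predicted degradation factor out. The following is a deliberately minimal sketch of that idea, assuming degradation acts as a simple multiplicative dimming of each image and using a one-variable least-squares fit as a stand-in for the paper’s actual neural network. All data here is synthetic; none of it is real AIA imagery or the authors’ model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for AIA frames: synthetic images whose true
# brightness statistics are stable over time.
def make_image(degradation):
    base = rng.gamma(shape=2.0, scale=100.0, size=(64, 64))  # pristine frame
    return base * degradation  # a degraded sensor dims the whole frame

# "Training": from sounding-rocket-style labeled examples, learn the mapping
# from a simple image statistic (mean brightness) to the degradation factor.
train_factors = np.linspace(1.0, 0.3, 20)  # known calibration labels
train_means = np.array([make_image(f).mean() for f in train_factors])
slope, intercept = np.polyfit(train_means, train_factors, deg=1)

# "Inference": estimate the degradation of a new, unlabeled image,
# then apply the predicted correction.
test_image = make_image(0.55)
estimate = slope * test_image.mean() + intercept
calibrated = test_image / estimate
```

The real model works on image structure across wavelengths rather than a single brightness statistic, but the shape of the problem – labeled calibration examples in, a correction factor out – is the same.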
Because AIA looks at the Sun in multiple wavelengths of light, researchers can also use the algorithm to compare specific structures across the wavelengths and strengthen its assessments.
To start, they taught the algorithm what a solar flare looks like by showing it solar flares across all of AIA’s wavelengths until it could recognize flares in all different types of light. Once the program can recognize a solar flare without any degradation, it can then determine how much degradation is affecting AIA’s current images and how much calibration each needs.
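Once per-channel degradation factors are in hand, applying them is straightforward: divide each channel’s image by its factor so the same structure looks consistent across wavelengths again. This sketch uses real AIA wavelength names but entirely illustrative factor values and synthetic frames; it is not the mission’s actual calibration pipeline.

```python
import numpy as np

# Hypothetical per-channel degradation factors, as a trained model might
# predict them (channel names are real AIA wavelengths in angstroms;
# the numbers are illustrative only).
predicted = {"94": 0.82, "171": 0.64, "304": 0.41}

# Simulated degraded frames: a pristine brightness of 100 dimmed per channel.
degraded = {ch: np.full((4, 4), 100.0) * f for ch, f in predicted.items()}

# Applying the predicted correction restores consistent brightness, so the
# same solar structure is comparable across every wavelength again.
corrected = {ch: img / predicted[ch] for ch, img in degraded.items()}
```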
“This was the big thing,” Dos Santos said. “Instead of just identifying it on the same wavelength, we’re identifying structures across the wavelengths.”
This means researchers can be more sure of the calibration the algorithm identified. Indeed, when comparing their virtual calibration data to the sounding rocket calibration data, the machine learning program was spot on.
Two rows of images of the Sun. The top row grows darker and harder to see, while the bottom row stays consistently bright and visible.
With this new process, researchers are poised to constantly calibrate AIA’s images between calibration rocket flights, improving the accuracy of SDO’s data for researchers.
Machine learning beyond the Sun
Researchers have also been using machine learning to better understand conditions closer to home.
One group of researchers, led by Dr. Ryan McGranaghan, Principal Data Scientist and Aerospace Engineer at ASTRA LLC and NASA Goddard Space Flight Center, used machine learning to better understand the connection between Earth’s magnetic field and the ionosphere, the electrically charged part of Earth’s upper atmosphere. By applying data science techniques to large volumes of data, they developed a new model that helped them better understand how energized particles from space rain down into Earth’s atmosphere, where they drive space weather.
As machine learning advances, its scientific applications will expand to more and more missions. In the future, this may mean that deep space missions – which travel to places where calibration rocket flights aren’t possible – can still be calibrated and continue returning accurate data, even as they travel ever greater distances from Earth.