Deep Learning Attitude Sensor Provides Real-Time Image Recognition From Satellite Orbit
January 17, 2019 | Tokyo Tech | Estimated reading time: 6 minutes

Researchers at Tokyo Institute of Technology (Tokyo Tech) have developed a low-cost star tracker and Earth sensor built from commercially available components. The star tracker is designed for micro-satellites and will undergo calibration observations, operation verification tests, and long-term performance monitoring in orbit. Embodying the concept of edge computing, the Earth camera performs image recognition in orbit using a simple AI that identifies land use and vegetation distribution. The acquired topography data will also be used to evaluate a novel 3-axis attitude estimation method. The star tracker and Earth sensor are installed on the Japan Aerospace Exploration Agency's (JAXA) Epsilon-4 rocket, scheduled for launch on January 17, 2019 from the Uchinoura Space Center in Kagoshima Prefecture, Japan.
Research Goals
To perform functions such as communicating with ground stations and directing solar cell paddles toward the sun for power generation and temperature control, satellites use attitude sensors to determine their orientation (attitude). The Tokyo Tech research group led by Assistant Professor Yoichi Yatsu has developed a star tracker and an Earth sensor that use deep learning to determine attitude in space. Because there is no ground in space to indicate direction, the star tracker constantly tracks multiple fixed stars to achieve high accuracy, while the Earth sensor estimates attitude from images of the Earth.
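As a rough illustration of the star-tracker principle (a sketch only, not the algorithm flown on DLAS), attitude can be recovered by finding the rotation that best maps catalog star directions onto the directions measured in the camera frame, a classic formulation known as Wahba's problem. The function name and the SVD-based solution below are illustrative assumptions.

```python
import numpy as np

def attitude_from_star_vectors(body_vecs, inertial_vecs, weights=None):
    """Estimate the rotation matrix R (inertial -> body) that best maps
    catalog star directions onto observed star directions.

    Solves Wahba's problem with the SVD method.
    body_vecs, inertial_vecs: (N, 3) arrays of unit vectors.
    """
    if weights is None:
        weights = np.ones(len(body_vecs))
    # Attitude profile matrix B = sum_i w_i * b_i * r_i^T
    B = (weights[:, None] * body_vecs).T @ inertial_vecs
    U, _, Vt = np.linalg.svd(B)
    # Enforce a proper rotation (det = +1)
    d = np.sign(np.linalg.det(U) * np.linalg.det(Vt))
    return U @ np.diag([1.0, 1.0, d]) @ Vt

# Example: recover a known rotation from three noiseless star directions
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    r = rng.normal(size=(3, 3))
    r /= np.linalg.norm(r, axis=1, keepdims=True)   # catalog (inertial) unit vectors
    theta = np.radians(30.0)
    R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                       [np.sin(theta),  np.cos(theta), 0.0],
                       [0.0, 0.0, 1.0]])
    b = (R_true @ r.T).T                             # observed (body) unit vectors
    print(np.allclose(attitude_from_star_vectors(b, r), R_true))  # True
```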
This Deep Learning Attitude Sensor (DLAS) was developed with three goals in mind. The first is to demonstrate that a low-cost star tracker made from inexpensive, high-performance, commercially available components can operate effectively in space. The plan is to capture images of stars in orbit under various conditions to calibrate the sensor system, determine attitude using novel algorithms, and demonstrate long-term operation over a test period of one year.
The second goal is to test real-time image recognition using deep learning in orbit. The Earth is photographed by two compact visible-light cameras incorporated in the baffle of the star tracker, and each 8-megapixel image is processed in about 4 seconds by a specially developed high-speed, lightweight image identification algorithm. Vegetation and land use are classified into nine categories, including green terrain, deserts, oceans, clouds, and outer space. This will be the first demonstration of real-time image recognition in space using deep learning. In orbit, more than 1,000 images will be taken as learning data and transferred to the ground for use in satellite image application tests. The third goal is to apply this image identification technology and evaluate methods for estimating 3-axis attitude by comparing land features in regions not obscured by clouds against map data prerecorded in the onboard computer.
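To make the recognition step concrete, here is a minimal sketch of how such a lightweight, tile-based classifier could look in PyTorch. The category names, tile size, and network architecture are placeholder assumptions for illustration; they are not the model actually flown on DLAS.

```python
import numpy as np
import torch
import torch.nn as nn

# Hypothetical category list; the article names green terrain, deserts,
# oceans, clouds, and outer space among the nine classes.
CATEGORIES = ["green_terrain", "desert", "ocean", "cloud", "space",
              "urban", "snow_ice", "bare_soil", "inland_water"]

class TinyTileClassifier(nn.Module):
    """Small CNN that labels fixed-size tiles cut from a full frame,
    so a large (e.g. 8-megapixel) image can be processed tile by tile."""
    def __init__(self, num_classes=len(CATEGORIES)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

def classify_frame(frame, model, tile=64):
    """Split an H x W x 3 uint8 frame into tiles and return a per-tile label map,
    which is far smaller than the raw image and cheap to store or downlink."""
    h, w = frame.shape[:2]
    tiles, coords = [], []
    for y in range(0, h - tile + 1, tile):
        for x in range(0, w - tile + 1, tile):
            tiles.append(frame[y:y + tile, x:x + tile])
            coords.append((y // tile, x // tile))
    batch = torch.from_numpy(np.stack(tiles)).permute(0, 3, 1, 2).float() / 255.0
    with torch.no_grad():
        labels = model(batch).argmax(dim=1)
    label_map = np.zeros((h // tile, w // tile), dtype=np.int64)
    for (r, c), lab in zip(coords, labels.tolist()):
        label_map[r, c] = lab
    return label_map
```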
Hardware development for DLAS was completed in April 2018, and the unit was incorporated into the Innovative Satellite Technology Demonstration-1 (RAPIS-1) satellite developed by JAXA. After about six months of system environment testing and operation rehearsals, it will be launched from the JAXA Uchinoura Space Center in Kagoshima Prefecture on Epsilon-4 on January 17, 2019, and will enter a Sun-synchronous orbit at an altitude of 500 km. DLAS operation is scheduled to start after the RAPIS-1 checkout is complete; once initial operation is confirmed, each sensor system will undergo calibration for about one month, after which the one-year mission operation will begin.
JAXA entrusted development of the RAPIS-1 satellite and its control system, as well as satellite operation, to Axelspace Corporation, a startup founded by people involved in nanosatellite development at the University of Tokyo and Tokyo Tech. This marks a major turning point for Japanese space development, which until now has been led by major electronics manufacturers, and a memorable flight for those involved in private space development, which began with palm-sized satellites at universities fifteen years ago.
Background
It has been fifteen years since the world's first CubeSats, CUTE-I and XI-IV developed by Tokyo Tech and the University of Tokyo, were launched in 2003. Nanosatellites were initially used for education, but after 2010 they began to be actively used in commercial space projects. Today, more than 200 are launched annually, establishing a new global space industry worth 1 billion USD. Since the engineering demonstration satellite CUTE-I, research groups from the Kawai Laboratory and Matsunaga Laboratory at Tokyo Tech have pursued space science observation with nanosatellites and have led the field through the development, launch, and operation of three nano-sized observation satellites (Cute-1.7+APD, Cute-1.7+APD Unit 2, and TSUBAME).
Research Details
In recent years, thanks to the development of communication networks (the Internet) and computer technologies linking satellites with the whole Earth, the new field of "time domain astronomy" has opened up, involving research on short-term astronomical phenomena that suddenly appear and disappear. The symbolic target is the gravitational wave event detected in August 2017, whose electromagnetic counterpart was identified when radio, infrared, and optical telescopes, observation satellites, and neutrino detectors around the world observed simultaneously in response to a position alert from the gravitational wave detectors. Dr. Yatsu's team plans to monitor a super-wide field of 100 square degrees in UV light, for which there are almost no prior observations, with the aim of discovering the initial activity of short-term astrophysical phenomena such as gravitational wave sources and unknown astrophysical events.
Satellites are needed because most UV light is blocked by the atmosphere, but obtaining sharp images of faint stars requires high attitude stability. In addition, limited satellite communication bandwidth makes it difficult to transfer all image data to the ground instantaneously. Therefore, to enable detailed follow-up observation in combination with the terrestrial telescopes mentioned above, practical measures need to be developed, such as performing image analysis on the satellite and transmitting only the analysis results, for example the accurate position and brightness of a target astronomical object. Accomplishing such an advanced observation mission requires a highly accurate star tracker, an advanced on-board computer that can be mounted on a satellite, and an automatic image analysis technique that makes use of them.
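A hedged sketch of the "transmit only the analysis results" idea: detect bright sources in a frame and report just their centroid positions and fluxes. The threshold, background estimate, and function names are assumptions for illustration, not the team's flight code.

```python
import numpy as np
from scipy import ndimage

def extract_sources(image, nsigma=5.0):
    """Return (x, y, flux) for bright sources in an image so that only these
    few numbers, not the full frame, need to be downlinked.

    Rough sketch: sigma-threshold the frame, label connected pixels,
    and compute a flux-weighted centroid for each blob.
    """
    background = np.median(image)
    noise = np.std(image)
    mask = image > background + nsigma * noise
    labels, nsrc = ndimage.label(mask)
    results = []
    for i in range(1, nsrc + 1):
        ys, xs = np.nonzero(labels == i)
        weights = image[ys, xs] - background
        flux = weights.sum()
        cx = np.average(xs, weights=weights)
        cy = np.average(ys, weights=weights)
        results.append((cx, cy, flux))
    return results
```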
Astronomical observation from a space telescope requires data processing programs for cosmic-ray (noise) removal, star image detection, star pattern matching, and high-accuracy attitude calculation; the same processing pipeline can serve as a star tracker, the attitude sensor of an artificial satellite. Unfortunately, there is currently a lack of attitude-sensor suppliers for nanosatellites in Japan, so the research team has been developing flight hardware with a view toward commercializing its on-board components.
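One step of that pipeline, star pattern matching, can be sketched very crudely as follows: angular separations between stars are invariant under rotation, so observed pairs can be matched against catalog pairs by separation alone. This is only a toy illustration of the idea (practical star trackers use triangle or pyramid matching together with brightness information), and all names here are assumptions.

```python
import numpy as np
from itertools import combinations

def match_star_pairs(obs_vecs, cat_vecs, tol=1e-3):
    """Match observed star pairs to catalog pairs by comparing inter-star
    angular separations, which do not change under attitude rotation.
    Returns candidate ((obs_i, obs_j), (cat_i, cat_j)) pairings."""
    def pair_angles(vecs):
        angles = {}
        for (i, a), (j, b) in combinations(list(enumerate(vecs)), 2):
            angles[(i, j)] = np.arccos(np.clip(np.dot(a, b), -1.0, 1.0))
        return angles

    obs_pairs = pair_angles(obs_vecs)
    cat_pairs = pair_angles(cat_vecs)
    matches = []
    for (oi, oj), oa in obs_pairs.items():
        for (ci, cj), ca in cat_pairs.items():
            if abs(oa - ca) < tol:
                matches.append(((oi, oj), (ci, cj)))
    return matches
```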
The in-orbit image recognition test, on the other hand, was originally inspired by ground-based weather identification tests. The Kawai Laboratory currently operates robotic telescopes with 50 cm diameter mirrors in Yamanashi and Okayama Prefectures to observe gravitational wave events, gamma-ray bursts, and similar phenomena, but observations were often suspended because the weather in Japan is unstable. Research on a deep-learning cloud identifier was therefore conducted to improve observation efficiency. The cloud identifier, developed in cooperation with Tokyo Tech School of Computing professors Koichi Shinoda and Nakamasa Inoue, achieved extremely high accuracy even at the initial prototype stage, and it was recognized that the approach could be applied to instantaneous image identification on orbiting satellites, prompting this study.
Until now, analysis of satellite images using AI has generally meant carefully analyzing large numbers of images accumulated in ground data servers using supercomputers. Real-time image recognition on orbiting satellites, i.e. at the computing "edge", has the potential to greatly change the value and operation of the nanosatellite platform. For example, a huge amount of information quickly loses its value if it cannot be evaluated promptly, such as in defense, disaster monitoring, and debris tracking. This research was embarked upon to achieve autonomous detection by satellites, in place of human eyes.