Using a Camera to Spot and Track Drones
August 17, 2017 | EPFL
Estimated reading time: 3 minutes

EPFL researchers have shown that a simple camera can detect and track flying drones. Plus, the lightweight, energy-efficient and inexpensive technology could be installed directly on the drones themselves and enhance safety in the skies.
The rising number of drones in airspace poses numerous challenges. Topping that list is our ability to simply detect these small unmanned aerial vehicles. Periodic near-misses between drones and large airplanes raise the specter of disaster, and the drones themselves often lack the necessary technology to locate other moving objects. To address these issues, EPFL researchers have developed algorithms capable of detecting and tracking small flying objects using a simple camera. The proof of concept was conducted as part of a PhD dissertation, and a real-time detection and collision avoidance system is now being developed in a project funded by the Commission for Technology and Innovation (CTI).
Today’s collision avoidance systems operate actively: an airplane in flight calculates its position, altitude and course, and communicates this information to other aircraft using the same technology. Those aircraft can then evaluate the risk of a collision based on their own positioning data and, if necessary, alert the pilot. But this system is only effective as long as all aircraft are equipped with the same technology. In reality, drones often lack such systems, which are costly, heavy and power-hungry.
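To make that principle concrete, the sketch below shows, in very simplified form, the kind of calculation a cooperative system can run once two aircraft have exchanged positions and velocities: project both straight-line courses forward, find the moment of closest approach, and flag the encounter if the predicted separation falls below a safety radius. The function names, the 150 m threshold and the straight-line assumption are illustrative choices for this sketch, not part of any actual avionics system.

```python
# Illustrative sketch only: a simplified closest-point-of-approach check of the
# kind a cooperative collision avoidance system might perform. All names and
# thresholds are assumptions made for this example.
import numpy as np

def time_to_closest_approach(p_own, v_own, p_other, v_other):
    """Time (s) at which two aircraft on straight-line courses are closest."""
    dp = np.asarray(p_other, float) - np.asarray(p_own, float)   # relative position (m)
    dv = np.asarray(v_other, float) - np.asarray(v_own, float)   # relative velocity (m/s)
    denom = np.dot(dv, dv)
    if denom < 1e-9:                      # same velocity: separation never changes
        return 0.0
    return max(0.0, -np.dot(dp, dv) / denom)

def collision_risk(p_own, v_own, p_other, v_other, min_separation=150.0):
    """True if the predicted minimum separation falls below a safety radius (m)."""
    t = time_to_closest_approach(p_own, v_own, p_other, v_other)
    own_future = np.asarray(p_own, float) + t * np.asarray(v_own, float)
    other_future = np.asarray(p_other, float) + t * np.asarray(v_other, float)
    return np.linalg.norm(own_future - other_future) < min_separation

# Example: two aircraft on converging courses at the same altitude
print(collision_risk(p_own=[0, 0, 1000], v_own=[60, 0, 0],
                     p_other=[3000, 100, 1000], v_other=[-60, 0, 0]))  # True
```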
Artificial intelligence and deep learning
A camera can thus be an effective, non-cooperative (i.e., not every aircraft must be equipped with it) addition to that system, provided the camera can successfully detect a flying drone. Therein lay the obstacle that researchers at EPFL's Computer Vision Laboratory (CVLAB) sought to overcome. The biggest challenge for a moving camera is to spot another moving object. This is much more difficult on a drone than it is on a car, which only moves in two dimensions. Drones move in three dimensions, and the camera is called on to detect objects against the sky or the ground, depending on the angle of sight. Plus, drones need to locate objects as quickly as possible, such as when they are still fuzzy black dots against a dark forest. And the fact that no two drones look alike anymore – new models are constantly being developed – meant that the researchers had to find a way to teach the camera to recognize all sorts of drones.
In his thesis, Artem Rozantsev showed that these challenges can be overcome. The first step consisted in using artificial intelligence and deep learning to teach the camera to recognize drones. His method combined information on both appearance (types of drones, position, etc.) and motion (movement in the camera's field of view), as neither alone was capable of achieving sufficiently reliable detection. He therefore proposed a machine-learning technique that operates on spatio-temporal cubes of image intensities where individual patches are aligned using a regression-based motion stabilization algorithm.
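As a rough illustration of what classifying such spatio-temporal cubes could look like, the sketch below stacks a short sequence of aligned patches and scores it with a small 3D convolutional network. This is not the detector described in the thesis: the architecture is an assumption, and the regression-based motion stabilization is reduced to a placeholder function.

```python
# Hypothetical sketch: score a stack of motion-stabilized patches ("cube")
# as drone vs. background with a small 3D CNN. Architecture and interfaces
# are assumptions for illustration, not the method from the thesis.
import torch
import torch.nn as nn

class CubeClassifier(nn.Module):
    """Classifies a T x H x W patch stack as 'drone' vs 'background'."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool3d(2),
            nn.Conv3d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool3d(1),
        )
        self.classifier = nn.Linear(32, 2)   # two classes: drone / not drone

    def forward(self, cubes):                # cubes: (batch, 1, T, H, W)
        x = self.features(cubes).flatten(1)
        return self.classifier(x)

def stabilize(cube):
    """Placeholder for the regression-based motion stabilization that aligns
    the candidate object across frames before classification."""
    return cube

# Sliding-window use: cut candidate cubes out of the video and score each one.
model = CubeClassifier()
cube = stabilize(torch.randn(1, 1, 8, 32, 32))   # one candidate cube (8 frames, 32x32)
scores = model(cube)                              # logits for the two classes
```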
Real-time performance and accuracy
But the recognition algorithm on its own was not enough. To train a detector to recognize all types of drones in all kinds of positions, it has to have "seen" as many as possible. Existing databases of drone images, however, are limited. So Rozantsev filled in the gaps by generating realistic synthetic images. The generated images, which are based only on a small set of real examples and a coarse 3D model of the object, are used together with the real examples to train the detector. A key ingredient of his method is that the generated images are as close as possible to the real ones – not in terms of image quality, but according to the features used by the machine-learning algorithm.
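The toy sketch below illustrates only the data-augmentation idea: a handful of real patches is padded with many generated ones so the detector sees far more variation than the real data contains. In the actual method the synthetic images are rendered from a coarse 3D model of the drone; here a dark elliptical blob composited onto background clutter merely stands in for that rendering.

```python
# Toy, runnable sketch of padding scarce real training patches with synthetic
# ones. The blob generator below is a stand-in for rendering a coarse 3D model,
# purely for illustration.
import numpy as np

def render_synthetic_patch(size=32, rng=np.random.default_rng()):
    """Generate one fake 'distant drone' patch: a dark blob over clutter."""
    patch = rng.uniform(0.4, 0.9, (size, size))            # background clutter
    cy, cx = rng.integers(8, size - 8, 2)                   # random position
    ry, rx = rng.integers(2, 5, 2)                          # random apparent size
    yy, xx = np.ogrid[:size, :size]
    mask = ((yy - cy) / ry) ** 2 + ((xx - cx) / rx) ** 2 <= 1.0
    patch[mask] *= rng.uniform(0.1, 0.4)                    # darken the blob
    return patch.astype(np.float32)

def build_training_set(real_patches, n_synthetic=10000):
    """Mix scarce real positives with many synthetic ones (label 1 = drone).
    Background-only negatives would be added the same way in practice."""
    synthetic = [render_synthetic_patch() for _ in range(n_synthetic)]
    return [(p, 1) for p in real_patches] + [(p, 1) for p in synthetic]
```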
The researchers managed to develop a reliable algorithm capable of detecting a drone using a lightweight camera similar to those found in smartphones. The aim of the project, now financed by the CTI, is to train a detector using an even larger data set to improve its real-time performance and accuracy. EPFL's CVLAB researchers are working on this in collaboration with FLARM Technology AG, a leading supplier of affordable collision avoidance technology for civil aviation. The first commercial models are expected to be released next year.