DeepFly3D: The Deep-Learning Way to Design Fly-Like Robots
October 11, 2019 | EPFL
EPFL scientists have developed a deep-learning based motion-capture software that uses multiple camera views to model the movements of a fly in three dimensions. The ultimate aim is to use this knowledge to design fly-like robots.
“Just think about what a fly can do,” says Professor Pavan Ramdya, whose lab at EPFL’s Brain Mind Institute, with the lab of Professor Pascal Fua at EPFL’s Institute for Computer Science, led the study. “A fly can climb across terrain that a wheeled robot would not be able to.”
Flies aren’t exactly endearing to humans. We rightly associate them with less-than-appetizing experiences in our daily lives. But there is an unexpected path to redemption: Robots. It turns out that flies have some features and abilities that can inform a new design for robotic systems.
“Unlike most vertebrates, flies can climb nearly any terrain,” says Ramdya. “They can stick to walls and ceilings because they have adhesive pads and claws on the tips of their legs. This allows them to basically go anywhere. That's interesting also because if you can rest on any surface, you can manage your energy expenditure by waiting for the right moment to act.”
It was this vision of extracting the principles that govern fly behavior to inform the design of robots that drove the development of DeepFly3D, a motion-capture system for the fly Drosophila melanogaster, a model organism that is nearly ubiquitously used across biology.
In Ramdya’s experimental setup, a fly walks on top of a tiny floating ball—like a miniature treadmill—while seven cameras record its every movement. The fly’s top side is glued onto an unmovable stage so that it always stays in place while walking on the ball. Nevertheless, the fly “believes” that it is moving freely.
The collected camera images are then processed by DeepFly3D, a deep-learning software developed by Semih Günel, a PhD student working with both Ramdya’s and Fua’s labs. “This is a fine example of where an interdisciplinary collaboration was necessary and transformative,” says Ramdya. “By leveraging computer science and neuroscience, we’ve tackled a long-standing challenge.”
Different poses of the fruit fly Drosophila melanogaster are captured by multiple cameras and processed with the DeepFly3D software. Credit: P. Ramdya, EPFL.
What’s special about DeepFly3D is that it can infer the 3D pose of the fly—or even other animals—meaning it can automatically make behavioral measurements at unprecedented resolution for a variety of biological applications. The software doesn’t need to be calibrated manually, and it uses the camera images themselves to detect and correct errors in its estimates of the fly’s pose. Finally, it also uses active learning to improve its own performance.
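The core geometric idea behind combining several camera views into a 3D pose is triangulation: each camera's 2D detection of a keypoint constrains where that keypoint can sit in space, and intersecting those constraints yields the 3D position. As a rough sketch of the principle (not the actual DeepFly3D implementation; the function name and interface here are illustrative assumptions), the standard linear "direct linear transform" approach looks like this:

```python
import numpy as np

def triangulate(P_list, uv_list):
    """Linear (DLT) triangulation of one keypoint from several views.

    P_list  : list of 3x4 camera projection matrices, one per camera
    uv_list : list of (u, v) pixel detections of the same keypoint
    Returns the 3D point minimizing the algebraic reprojection error.
    """
    rows = []
    for P, (u, v) in zip(P_list, uv_list):
        # Each view contributes two linear constraints on the
        # homogeneous 3D point X:
        #   u * (P[2] @ X) = P[0] @ X   and   v * (P[2] @ X) = P[1] @ X
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    A = np.asarray(rows)
    # The solution is the right singular vector associated with the
    # smallest singular value of the stacked constraint matrix.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # de-homogenize
```

With seven cameras the system is heavily over-determined, which is what lets disagreement between views flag and correct detection errors.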
DeepFly3D opens up a way to efficiently and accurately model the movements, poses, and joint angles of a fruit fly in three dimensions. This may inspire a standard way to automatically model 3D pose in other organisms as well.
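Once 3D keypoints are available, quantities like joint angles fall out of simple vector geometry. A minimal sketch of that step, assuming three consecutive triangulated leg keypoints (the helper name and interface are illustrative, not the DeepFly3D API):

```python
import numpy as np

def joint_angle(p_prev, p_joint, p_next):
    """Angle in radians at p_joint, formed by the segments to p_prev
    and p_next. Inputs are 3D keypoints, e.g. consecutive joints of a
    fly leg recovered from triangulated pose estimates."""
    a = np.asarray(p_prev, dtype=float) - np.asarray(p_joint, dtype=float)
    b = np.asarray(p_next, dtype=float) - np.asarray(p_joint, dtype=float)
    cos_theta = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    # Clip to guard against floating-point values just outside [-1, 1]
    return np.arccos(np.clip(cos_theta, -1.0, 1.0))
```

Tracking such angles frame by frame is what turns raw camera footage into the kind of kinematic description a roboticist can act on.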
“The fly, as a model organism, balances tractability and complexity very well,” says Ramdya. “If we learn how it does what it does, we can have important impact on robotics and medicine and, perhaps most importantly, we can gain these insights in a relatively short period of time.”