Swiss research inspired the design of the drone technology behind the Ingenuity helicopter, which flew on Mars earlier this week.
Davide Scaramuzza, who leads the Robotics and Perception Group at the University of Zurich, calls the flight “a great success” and says the next step for Martian drones is to explore places that might harbor life and could even shelter humans who might travel there someday.
Lately, all eyes have been on NASA's Perseverance rover mission to Mars and its helicopter drone Ingenuity, which recently completed the first autonomous flight beyond Earth's atmosphere. The mission's central goal is to look for signs of past microbial life on the Red Planet, if it ever existed. Drones could also significantly advance exploration by diving into cavities such as lava tubes that could be habitable.
Scaramuzza has been working on autonomous drone technology with onboard cameras and without GPS – like the craft used on Mars – since 2009, and his lab is currently collaborating with NASA on future Mars helicopter missions. He spoke to SWI swissinfo.ch about what drones may be able to tell us about the Red Planet, including the exploration of lava tubes.
SWI swissinfo.ch: How did Ingenuity's flight on Mars go? What does it mean from a scientific point of view?
Davide Scaramuzza: It was a great success. It demonstrates the first powered autonomous flight on another planet. This is remarkable because the density of the Martian atmosphere is about 1% of Earth's at sea level. To give you an idea, flying near the Martian surface is like flying 30 kilometres above Earth!
It was also a partially autonomous flight – mission control sends a route, which the helicopter then executes on its own. Again, this is remarkable, given that 99% of commercial drones still navigate using GPS. It shows that camera-based autonomous navigation technology, which I have been working on since 2004, is really coming of age!
At this turning point for the technology, what does your work on autonomous drones for space missions involve?
My lab's main goal is to make autonomous drones fly better than human pilots. I have been developing algorithms that enable autonomous drones to perform complex tasks, like exploring and mapping unknown environments for search-and-rescue missions, as well as outperforming human pilots. These technologies are already playing a role today in search-and-rescue missions, the inspection of complex infrastructure and the delivery of goods, and they will play an important role in future space missions, where drones could enter, explore and map lava tubes on other planets.
Can you share some details about your lab’s collaboration with NASA?
Everything we do in my lab is about autonomous navigation of drones using only onboard cameras. Cameras are cheap and lightweight, which is ideal for mini drones. We are currently collaborating with the NASA Jet Propulsion Laboratory to investigate the suitability of event-based cameras* for future Mars helicopter missions. Event-based cameras are a novel type of camera with higher dynamic range, higher temporal resolution, and lower power consumption than standard cameras. Thanks to these advantages, event-based cameras promise to broaden the operational capabilities of future Mars helicopter missions.
* Event cameras are bio-inspired vision sensors that respond to local changes in brightness.
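To make this concrete, here is a toy model of how an event camera produces data (a simplified sketch for illustration, not any real sensor's interface): instead of outputting full frames, only pixels whose log-brightness changes beyond a threshold emit an event, with a polarity indicating brightening or darkening.

```python
import numpy as np

def generate_events(prev_frame, frame, threshold=0.2):
    """Emit (x, y, polarity) events where log-brightness changed
    by more than `threshold` -- a simplified event-camera model."""
    # Real event sensors respond to relative (log-intensity) change.
    log_prev = np.log(prev_frame.astype(float) + 1e-3)
    log_curr = np.log(frame.astype(float) + 1e-3)
    diff = log_curr - log_prev
    events = []
    ys, xs = np.nonzero(np.abs(diff) > threshold)
    for x, y in zip(xs, ys):
        polarity = 1 if diff[y, x] > 0 else -1
        events.append((int(x), int(y), polarity))
    return events

# A pixel brightening past the threshold fires a positive event;
# unchanged pixels fire nothing -- the output is sparse, which is
# why event cameras need so little power and bandwidth.
prev = np.full((4, 4), 50, dtype=np.uint8)
curr = prev.copy()
curr[1, 2] = 120   # one pixel gets brighter
print(generate_events(prev, curr))  # → [(2, 1, 1)]
```

The sparsity is the key point: a static scene produces no data at all, and fast motion produces events with microsecond-scale timing rather than blurred frames.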
How could the use of drones support the search for life on Mars, and what’s the significance of lava tubes for that search? What can drones do that rovers can’t?
Drones can cover longer distances in much less time than a rover. In the future, swarms of drones will be used both to identify environments suitable for hosting human missions and to search for microorganisms.
According to recent studies, life on Mars might hide in the ice inside lava tubes, which were formed through volcanic processes. Drones are the ideal way to enter and explore lava tubes in future Mars missions, and event-based cameras could play a crucial role for three reasons: their higher dynamic range would make it possible to analyse lava tube entry points during flights over them; their lower power consumption may increase flight endurance; and their higher temporal resolution reduces motion blur significantly during fast motion, which means that the drone does not need to slow down in the dark.
Lava tubes are also ideal environments to host human life (they can shield humans from cosmic radiation, dust accumulation, temperature fluctuations and micrometeorites), and drones are the ideal robots to explore them, because some lava tubes may be deep and difficult or impossible to climb, even for very skilled rovers.
What was your group's role in making Ingenuity's flight possible?
In 2009, when I was still a postdoctoral researcher at ETH Zurich's Autonomous Systems Lab, my team and I demonstrated the first autonomous flight – take-off, navigation from A to B, and landing – of a mini drone using only a camera and an inertial sensor (no GPS).
That work was the first demonstration of a drone moving autonomously by camera alone – every system up to then was GPS-based, as are 99% of current drones.
On Mars there is obviously no GPS. Ingenuity's algorithm is inspired by that work in that it also uses a single camera and an inertial sensor. Moreover, Ingenuity's first flight replicated exactly what we demonstrated in the first experiment in 2009 mentioned above: takeoff, hovering, landing.
How does a drone get its bearings even millions of miles away from Earth? What are the biggest technical hurdles to overcome?
A rover (like Perseverance) is somewhat easier to teleoperate from Earth than a drone (despite the several-minute delay in communication) because its wheels are continuously in contact with the ground while the robot “waits” for the next command from Earth.
A drone (like Ingenuity) is more difficult to control because it is very sensitive to air turbulence, which requires control commands to be sent dozens of times a second. These commands are not sent from Earth but directly by the drone’s onboard “autopilot”, or the drone guidance software.
The drone's autopilot works in two stages: first, it fuses information from its onboard sensors (an inertial measurement unit, an altimeter and a downward-facing camera) to estimate the drone's 3D position and orientation relative to where it started. Then it uses that estimated position and orientation to follow a pre-computed path.
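The first stage, sensor fusion, can be illustrated with a toy one-dimensional example (a hypothetical sketch, not Ingenuity's actual estimator): a complementary filter that blends smooth but drifting IMU-based height changes with noisy but absolute altimeter readings.

```python
def fuse_altitude(imu_deltas, altimeter, alpha=0.9):
    """One-dimensional complementary filter: the IMU gives smooth
    short-term height changes (but drifts over time), the altimeter
    gives absolute height (but is noisy). Each step: predict with
    the IMU delta, then nudge the estimate toward the altimeter."""
    estimate = altimeter[0]
    history = [estimate]
    for delta, alt in zip(imu_deltas, altimeter[1:]):
        predicted = estimate + delta                      # IMU prediction
        estimate = alpha * predicted + (1 - alpha) * alt  # altimeter correction
        history.append(estimate)
    return history

# Drone climbing from 0 m to 1 m in two steps: the IMU deltas agree
# with the altimeter, so the fused estimate tracks both smoothly.
print(fuse_altitude([0.5, 0.5], [0.0, 0.5, 1.0]))
```

The real estimator is far richer (full 3D pose, visual odometry from the camera), but the principle is the same: combine complementary sensors so each one's weakness is covered by another's strength.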
This pre-computed path is the only piece of information sent from Earth by NASA engineers. It consists of a sequence of waypoints at which they specify the position, orientation, and velocity that the drone must reach.
An example of a pre-computed path: take off and reach a height of one metre; fly straight and horizontally for three metres at a constant speed of one metre per second; and finally, land.
The Ingenuity helicopter flight was a technology demonstration whose sole purpose was to show that a drone can hover and fly a short path in the thin Martian atmosphere.
In the future, more advanced drones should be able to execute high-level commands, such as "go take a picture of that rock" or "enter that lava tube, build a 3D map of it, and come back".
These are, however, high-level capabilities that are generally still confined to research labs and are not yet robust enough to be deployed on Earth. So we need to wait a few more years for this vision to become reality.