In 2018, Gavin Ananda joined Zipline, fresh off his PhD program in aircraft and wind turbine design. He was an aerodynamicist, not a sound person. But Zipline’s perception team was miking drones to see if they could use acoustics to detect and avoid other aircraft, and “I got nerd-sniped to join the cause,” he says.

Five years later, Zipline is leading the way in onboard Detect and Avoid, also called DAA, which is helping Zipline safely scale from hundreds of Zips in seven countries to hundreds of thousands all over the world.

“Zipline is one of a number of companies that have spent years trying to figure out a technology solution that can fit on a drone to safely navigate the airspace and maintain separation from other aircraft,” Ananda says. “We’ve done it – and not the way you might expect.”

We’re just listening

If using microphones on a drone to listen for other aircraft is such a promising technology, you might ask, why isn’t it everywhere?

Listening to detect objects isn’t new. During World War I, soldiers put their ears up to giant funnels to hear aircraft coming over the horizon. Submarines are regularly tracked by the sounds they make. But Zipline is uniquely motivated to perfect and scale this technology for detecting aircraft from drones, for a few reasons.

Soldiers in WWI used giant listening horns to locate enemy aircraft.

First, Zips fly in the band of airspace about 300-400 feet above the ground. They share the air with small, private planes that, unlike commercial jets, aren’t all required to transmit their location. About 30% don’t — these are known as “non-cooperative” aircraft — and their system for avoiding other aircraft is the pilot looking out the window.

Second, Zipline tried an exhaustive range of other technologies, and none of them could both reliably detect other aircraft and fit on the drone. So even though listening was a hard problem, the team went for it.

“At every step along the way, you really have to challenge conventional wisdom to find a way to make it work,” says Zipline engineer Rohit Sant. “The first question we get a lot of times is: ‘How do you send out sound and capture the reflections?’ They assume we’re using echolocation,” Sant says. “And we say, ‘We’re not — we’re listening.’”

More conventional methods, such as radar and lidar, won’t work on drones. Because drones have to navigate complex airspace, they need a long-range, 360-degree detection system, and radar or lidar systems with that capability would weigh more than the drone.

Cameras, by themselves, can’t solve the problem either. Though they capture accurate data, they struggle to see a plane even half as far away as microphones can “hear” one, and the difference is exacerbated in cloudy weather.

This makes intuitive sense. Picture standing on the ground on a nice day — you’ll hear an aircraft before you see it. On an overcast day you may not be able to see it at all.

But listening while standing on the ground is different from listening on a drone flying 60 miles per hour. “We had to solve a catch-22 situation, where the Zip is using the airflow to fly — that’s how flight works,” Sant says. “But that airflow, moving across a microphone, is noisy.”

Most people have experienced this type of wind noise. Think about riding in a moving car, talking to a friend in the back seat. Now think about sticking your head out the window and trying to continue the conversation — anything you’d try to say would be drowned out by the sound of wind flowing over your ears.

Sorting signal from noise

The team needed to solve the wind noise problem in both the hardware and the software of the system.

Starting with hardware, the DAA team designed and tested dozens of microphone shapes, sizes and configurations. Eventually, they landed on long, slim microphones shaped like drinking straws to reduce both noise and drag, and placed eight of them along the wing of the drone.

The microphones pick up noise from the air rushing over them as well as sound emitted by other aircraft. Separating the two was a problem for software.

That software system starts with a method called beamforming. Beamforming works on the principle that sound from a specific source will reach each of the eight microphones at a slightly different time. “Let’s say you record one sound at one microphone,” Sant explains. “And then, we record the same sound at the second microphone a fraction of a second later.” Because you know the speed of sound, and the distance between each microphone, “you can compute the angle from which the sound came.”

Sound from an emitter reaches microphones on the Zip’s wing at different times. With that information, engineers can determine the angle from which the sound came.
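To make the geometry concrete, here is a minimal sketch of that time-difference-of-arrival idea, assuming an idealized pair of microphones, a distant source, and clean signals. The function names and constants are illustrative, not Zipline’s implementation.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C

def delay_between(sig_a: np.ndarray, sig_b: np.ndarray, sample_rate: float) -> float:
    """Estimate how much later the same sound arrives at microphone B than at
    microphone A, using the peak of the cross-correlation of the two recordings."""
    corr = np.correlate(sig_b, sig_a, mode="full")
    lag_samples = int(np.argmax(corr)) - (len(sig_a) - 1)
    return lag_samples / sample_rate  # seconds; positive means B heard it later

def bearing_from_delay(delay_s: float, mic_spacing_m: float) -> float:
    """For a far-away source the wavefront is nearly planar, so
    delay = spacing * cos(angle) / c. Invert that to get the angle (degrees)
    between the source direction and the line through the two microphones."""
    cos_theta = np.clip(SPEED_OF_SOUND * delay_s / mic_spacing_m, -1.0, 1.0)
    return float(np.degrees(np.arccos(cos_theta)))
```

A single pair of microphones only narrows the source down to a cone of possible directions; an array of eight, like the one on the Zip’s wing, yields many pairwise delays that together pin down an actual bearing.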
With that information, fed into machine learning models Zipline has developed over years, the sound of air traffic such as a propeller-driven plane will emerge, creating a high level of certainty that a specific aircraft is approaching the Zip from a certain angle.

Next, the team needed to build a tracker, Zipline’s tool for creating a comprehensive mental map of the drone’s environment. “The tracker turns acoustic data into a series of hypotheses about the location of objects in the Zip’s environment and the certainty of their location, tracked over time,” says Zipline engineer Dan Bronstein. “Every fraction of a second, the tracker gets more information, which it compares against its previous hypotheses.”

The tracker represents these hypotheses as a cloud of particles that look like dots. Each particle is a single hypothesis about an emitter’s location, direction, velocity and type. Over a few seconds, particles representing less likely emitter locations fade, leaving a cluster of particles in the approximate direction of the emitter.

Zipline’s tracker, a component of its on-board DAA system, detects another aircraft.
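Bronstein’s description maps closely onto a classic particle filter. The toy sketch below shows the predict-update-resample loop that makes unlikely hypotheses fade; here each particle is just a bearing and a bearing rate, and every number (particle count, noise levels) is invented for illustration rather than taken from Zipline’s tracker.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1_000

# Each particle is one hypothesis: bearing to the emitter (degrees) and how
# fast that bearing is drifting (degrees per second).
particles = np.column_stack([rng.uniform(0.0, 360.0, N), rng.normal(0.0, 5.0, N)])
weights = np.full(N, 1.0 / N)

def tracker_step(measured_bearing_deg: float, dt: float = 0.1) -> float:
    """Fold one new acoustic bearing measurement into the cloud of hypotheses."""
    global particles, weights
    # Predict: advance each hypothesis forward in time, with process noise.
    particles[:, 0] = (particles[:, 0] + particles[:, 1] * dt
                       + rng.normal(0.0, 0.5, N)) % 360.0
    # Update: hypotheses that agree with the measurement gain weight.
    err = (particles[:, 0] - measured_bearing_deg + 180.0) % 360.0 - 180.0
    weights *= np.exp(-0.5 * (err / 5.0) ** 2)  # assumed 5-degree sensor noise
    weights /= weights.sum()
    # Resample: unlikely hypotheses fade away; likely ones multiply into a cluster.
    keep = rng.choice(N, size=N, p=weights)
    particles = particles[keep]
    weights = np.full(N, 1.0 / N)
    # Report the cloud's consensus bearing (circular mean, in degrees).
    rad = np.radians(particles[:, 0])
    return float(np.degrees(np.arctan2(np.sin(rad).mean(), np.cos(rad).mean())) % 360.0)

# Feeding a steady stream of bearings, e.g. tracker_step(87.0) every 0.1 s,
# collapses the cloud into a tight cluster around the true direction.
```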
Based on that 3-D probabilistic map, Zipline’s motion planning capability guides the drone to take one of a series of actions: it can stay the course, deviate from its course to increase overall safety, or return home.

The key to getting the entire system ready to operate in the airspace is a ton of testing and mountains of data. “Perception relies on deep learning models that are trained on data,” says Ananda. “You’re only as good as how much data you have. And studying that data is key for understanding the true complexity of the problem.”

First, the team trained its deep learning model on data that represented the types of encounters Zips would have in the airspace. “We run millions of scenarios,” said James Ferrese, a systems engineer at Zipline. “We vary all sorts of parameters in simulation, like the wind, the relative geometry of other aircraft, the terrain underneath us, the routes we fly, and the type of air traffic we encounter. What’s most important is that we capture the parameters that matter in simulation and that we vary those parameters in a conservative yet realistic way.”
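As a rough illustration of that kind of scenario randomization, the sketch below samples encounter parameters and feeds each draw to a simulator. The parameter names, ranges, and the run_simulation function are hypothetical stand-ins, not Zipline’s actual simulation interface, and the run count is trimmed from “millions” for the example.

```python
import random

random.seed(0)

def sample_scenario() -> dict:
    """Draw one randomized encounter; names and ranges are invented for illustration."""
    return {
        "wind_mps":          random.uniform(0.0, 15.0),
        "intruder_type":     random.choice(["piston_single", "helicopter", "glider"]),
        "closure_angle_deg": random.uniform(0.0, 360.0),
        "terrain":           random.choice(["flat", "rolling", "mountainous"]),
        "route_altitude_ft": random.uniform(300.0, 400.0),
    }

def run_simulation(scenario: dict) -> bool:
    """Stand-in for a real flight simulator: would return whether the simulated
    Zip detected and avoided the intruder in time. Stubbed out here."""
    return True  # placeholder result

runs = 10_000
detections = sum(run_simulation(sample_scenario()) for _ in range(runs))
print(f"detected and avoided in time: {detections / runs:.1%}")
```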
Next, Zipline engineers validated the system, which had been tested in simulation, against data from test flights. The team tested how acoustic perception performed across tens of thousands of flight-test encounters with aircraft flown by hired pilots.

A Zip uses acoustic DAA to avoid a hired pilot flying a helicopter.

“Today, acoustic DAA has been fully integrated into Zipline’s comprehensive on-board safety system,” says Matt Lubbers, Zipline’s head of systems engineering and safety systems. “It’s the culmination of five years of intense testing and research, developed in close partnership with the FAA.”

“At the end of the day, DAA is about safety,” says Zipline CTO Keenan Wyrobek. “Acoustic perception may not have been an obvious choice to solve DAA. It definitely wasn’t the easy choice. But after years of testing and refining the technology, we’re the only ones who’ve solved it. That’s why we’ve been able to put a reliable, long-range, 360-degree perception system on Zipline drones.”