Not All Drone Detectors are Created Equally
How the Boston Marathon could have been better protected from drones
- By Phil Wheat, Zain Naboulsi
- Jun 01, 2015
On a recent Monday morning, about 1 million spectators and
30,000 runners turned out for the Boston Marathon, two years
after the bombing tragedy that impacted so many people’s lives.
An article from BostonHerald.com titled, “Commissioner: ‘No
threats out there’ for race,” mentioned that the city had deployed
a number of extra security measures, and one of those was the use of about 10
drone-detection units.
Protecting people from drones is a forward-thinking initiative for public security
and is absolutely the right thing to do. Perhaps the detectors made people feel
a bit more protected from a variety of amorphous threats, too. Unfortunately, the
reality is that Boston’s drone detection was less effective than it might have been.
Let’s take a look at the situation.
The system DroneShield deployed uses audio detection, very similar to the gunshot-detection
systems in use in a number of large metropolitan areas. Widespread deployment
has validated that approach for gunshots, but detecting a drone is a different and much
more difficult problem. Gunshot detectors listen for the loud, sharp report of a shot:
a single pulse of sound that is extremely loud and very simple to hear.
Drones, however, are much more complex. A drone is quiet compared to a
gunshot: most are visually and audibly noticeable within a few feet, but that drops
off quickly as they get farther away. The noise they make is so variable
that each type of drone has a unique signature, and an individual drone changes
its sound depending on whether it is hovering or moving, and even on whether its
propeller blades are worn or nicked.
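To make the contrast concrete, here is a minimal sketch in Python, using synthetic signals and illustrative numbers rather than anything taken from a real detector: a gunshot can be flagged with a bare amplitude threshold, while a quiet, variable drone hum has to be compared against a library of stored spectral signatures.

```python
import numpy as np

SAMPLE_RATE = 8000  # Hz; illustrative value, not from any deployed system

def detect_gunshot(samples, threshold=0.8):
    """A gunshot is a single, very loud pulse: a bare amplitude
    threshold over a short window is usually enough to flag it."""
    return np.max(np.abs(samples)) > threshold

def spectral_signature(samples, bands=32):
    """Reduce a clip to normalized energy per frequency band.
    Drone hums are tonal, so their energy concentrates in a few bands."""
    spectrum = np.abs(np.fft.rfft(samples))
    chunks = np.array_split(spectrum, bands)
    energy = np.array([c.sum() for c in chunks])
    return energy / (energy.sum() + 1e-12)

def matches_known_drone(samples, known_signatures, min_similarity=0.9):
    """Compare the clip's signature against a library of known drone
    signatures (one per airframe and flight state, since each sounds different)."""
    sig = spectral_signature(samples)
    for name, ref in known_signatures.items():
        similarity = np.dot(sig, ref) / (np.linalg.norm(sig) * np.linalg.norm(ref))
        if similarity >= min_similarity:
            return name
    return None

# Synthetic examples: a sharp pulse vs. a quiet 200 Hz "rotor hum".
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
gunshot = np.zeros_like(t)
gunshot[100:120] = 1.0                       # loud impulse
hum = 0.05 * np.sin(2 * np.pi * 200 * t)     # quiet tone

library = {"quadcopter_hover": spectral_signature(hum)}
print(detect_gunshot(gunshot), detect_gunshot(hum))   # True False
print(matches_known_drone(hum, library))              # quadcopter_hover
```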
Hearing the drone is just part of the challenge; recognizing it in a noisy environment
is almost impossible. Computer programs are adept at matching
sounds against stored audio patterns; this is how YouTube automatically detects
unlicensed songs on its site. But in those cases, the audio track is the
only sound and is therefore isolated. If other sounds are mixed in, it becomes
much more difficult to make a match.
For example, if you listen to a YouTube video shot in public, where a song is mixed
in with regular day-to-day noise, you’ll likely find the song isn’t tagged: its audio
differs enough from the pattern the software is looking for that no match is made.
The same problem exists when detecting drones in locations with plenty of ambient noise.
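The effect of ambient noise can be shown in a few lines. The sketch below, again Python over synthetic audio and not any vendor’s algorithm, matches a stored drone-hum template by normalized cross-correlation and then repeats the match with increasing levels of crowd-like noise mixed in; the match confidence falls as the noise rises.

```python
import numpy as np

rng = np.random.default_rng(0)
SAMPLE_RATE = 8000  # Hz, illustrative

t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
template = 0.05 * np.sin(2 * np.pi * 200 * t[:2000])   # stored drone-hum pattern

def match_confidence(recording, template):
    """Peak of the normalized cross-correlation between the recording
    and the stored template: 1.0 is a perfect match, near 0 is no match."""
    r = recording - recording.mean()
    s = template - template.mean()
    corr = np.correlate(r, s, mode="valid")
    denom = np.linalg.norm(s) * np.sqrt(
        np.convolve(r * r, np.ones(len(s)), mode="valid"))
    return float(np.max(corr / (denom + 1e-12)))

# A clean recording containing the hum, then the same recording with
# increasing "crowd noise" mixed in.
clean = np.concatenate([np.zeros(3000), template, np.zeros(3000)])
for noise_level in (0.0, 0.05, 0.2):
    noisy = clean + noise_level * rng.standard_normal(len(clean))
    print(f"noise {noise_level:>4}: confidence {match_confidence(noisy, template):.2f}")
# The confidence starts near 1.0 and falls toward chance as the noise grows.
```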
DroneShield’s founder, Brian Hearing, acknowledges this weakness, saying he is
eager to see how effectively the sensors filter out crowd and other noises.
He will be lucky to hear anything but a clash of noises.
What about once the drone is detected? The DroneShield system comes with
net guns that were given to police officers, the same type scientists often use
to capture birds for tagging. This seems like a great idea, except that the range of
the nets is generally 50 feet or less; a drone essentially has to be stationary and
quite close to the officer to be caught.
Finally, how do cities go about protecting the public from malicious drones,
and why do we care?
We are detection experts and we take our job very seriously. We
know how the various types of drone detectors work and don’t work. We have spent a
great deal of time on this and, full disclosure, we sell a product called Drone Detector.
Ours is a system that leverages multiple methods to detect whether a drone is in use
and, if so, what information can be determined about it. We use audio, too, but we
amplify the detector’s ability by adding radio frequency and GPS location services,
so we can spot a drone much farther out, at roughly 400 meters. Once we find the
drone, we can find the operator.
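As a rough illustration of what combining methods buys you, the sketch below fuses confidence scores from hypothetical audio, RF and GPS-telemetry detectors into a single alert decision. The sensor names, weights and threshold are assumptions made for the example, not the actual Drone Detector design.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SensorReading:
    sensor: str        # "audio", "rf", or "gps"
    confidence: float  # 0.0 .. 1.0, how sure this sensor is that a drone is present
    bearing_deg: Optional[float] = None  # direction estimate, if the sensor has one

# Illustrative weights: an RF control-link detection tends to be more specific
# than audio in a loud crowd, so it gets more weight here. Purely an assumption.
WEIGHTS = {"audio": 0.25, "rf": 0.45, "gps": 0.30}
ALERT_THRESHOLD = 0.5  # assumed cutoff

def fused_score(readings):
    """Weighted average of per-sensor confidences; sensors that did not
    report simply contribute nothing."""
    num = sum(WEIGHTS[r.sensor] * r.confidence for r in readings)
    den = sum(WEIGHTS[r.sensor] for r in readings)
    return num / den if den else 0.0

def should_alert(readings):
    return fused_score(readings) >= ALERT_THRESHOLD

# Audio alone is ambiguous in a crowd, but audio plus an RF control-link
# detection pushes the fused score over the alert threshold.
audio_only = [SensorReading("audio", 0.4)]
audio_plus_rf = [SensorReading("audio", 0.4),
                 SensorReading("rf", 0.8, bearing_deg=95.0)]
print(should_alert(audio_only), should_alert(audio_plus_rf))   # False True
```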
As drones change and evolve, people will need to continually reassess detection systems
to ensure that they work as effectively as possible. We encourage everyone interested
in this space to run competitive evaluations
and determine what works best in
their area and for their specific needs.
This article originally appeared in the June 2015 issue of Security Today.
About the Authors
Phil Wheat is a co-founder and the CTO of Drone Labs, the company behind Drone Detector.
Zain Naboulsi is a co-founder and the CEO of Drone Labs.