The Flaws and Dangers of Facial Recognition

The best way to prevent your facial identity from being stolen is to limit facial recognition to airport and border security use cases

When dealing with airport and border security, we need databases, we need to share information, and we need law enforcement. However, for day-to-day authentication use cases like the IoT, we should turn to physiological biometrics that rely on unique live signals. Such live signals allow for effective authentication while at the same time protecting privacy and democracy.

Problem: Facial Recognition Can be Spoofed and Hacked

By 2020, more than one billion smartphones are expected to feature facial recognition (Counterpoint Research). In 2017, when Apple announced Face ID as one of the headline features of the iPhone X, it was not long before other mobile phone companies followed suit. Users merely look at their smartphone screen and it unlocks, making it the most convenient contactless mobile authentication to date, and it quickly overtook the coveted fingerprint sensor. However, it did not come without flaws.

As facial recognition quickly became ubiquitous, study after study exposed its vulnerabilities. Researchers from the University of Toronto used adversarial learning to beat one neural network with another. According to the study, adjusting only a few pixels at the corner of a person's eye or mouth was enough to make a face unrecognizable to the recognition technology. Apple has set the highest standard for facial recognition with Face ID, developing the TrueDepth camera system, which maps your face with an infrared camera, flood illuminator, and dot projector and uses the resulting 3D images to authenticate you. However, not every device withstands extensive testing. The Dutch consumer organization Consumentenbond found that 42 of the 110 devices it tested could be unlocked with a picture of the device's owner; Lenovo/Motorola, LG, Nokia, Samsung, and BlackBerry phones were all compromised.
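To make the attack concrete, the sketch below applies a fast-gradient-sign perturbation to a toy classifier. It is a hypothetical illustration only, not the Toronto researchers' method: the model, the image, and the perturbation budget (epsilon) are stand-ins, and a real attack like the one described above would confine its changes to a few pixels rather than nudging the whole image.

```python
# Hypothetical adversarial-perturbation sketch (FGSM-style) on a toy model.
# Everything here is a stand-in chosen for illustration, not a real face system.
import torch
import torch.nn as nn
import torch.nn.functional as F

model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 64 * 64, 2))  # toy "face detector"
model.eval()

image = torch.rand(1, 3, 64, 64, requires_grad=True)  # stand-in face image
label = torch.tensor([1])                              # class 1 = "face recognized"

# Compute the gradient of the loss with respect to the input pixels.
loss = F.cross_entropy(model(image), label)
loss.backward()

# Nudge each pixel slightly in the direction that increases the loss.
epsilon = 0.03
adversarial = (image + epsilon * image.grad.sign()).clamp(0.0, 1.0)

# To a human the two images look nearly identical, yet the prediction can flip.
print("original prediction:   ", model(image).argmax(dim=1).item())
print("adversarial prediction:", model(adversarial).argmax(dim=1).item())
```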

Not only has facial recognition been spoofed and hacked, but its reliance on databases adds further vulnerabilities, including widespread breaches. 2018 alone produced a string of historic data breaches; how can we trust that our facial identity is protected in this climate? Another important question: what is stopping big companies from selling this information to the highest bidder? Nothing. In fact, Amazon offers Rekognition, which allows clients to build their own facial recognition systems. According to Amazon's blog post, the Washington County Sheriff's Office has been using Amazon Rekognition since 2016 to "reduce the identification time of reported suspects from two to three days down to minutes and had apprehended their first suspect within a week by using their new system."

Facial recognition databases can compromise democracy or feed big data, or, far worse, they can simply be wrong. In a study by the ACLU, Amazon's Rekognition software incorrectly matched 28 members of Congress, identifying them as people who had previously been arrested for a crime. Of those 28 wrongly identified members, 40 percent were people of color.

Solution: Turn to Physiological Biometrics for Everyday Use Cases to Protect Facial Identity

Facial recognition has the potential to be dangerous. In practice, it can be hacked or spoofed, its databases can be breached or sold, and sometimes it is simply not effective; as such, we should restrict it to use cases where it is genuinely warranted, such as airport and border security. There, facial recognition needs to query databases to make sure that someone boarding a plane is not on a no-fly list, upholding the level of security we expect when traveling. But security and safety are not synonymous. As a society, we need to define which biometric solution will be most successful for each given use case.

When we talk about biometric authentication for the IoT, we need to act safely, because every connected device is exposed to online threats. We cannot rely on facial recognition that can be compromised by a mere picture of the device's owner or tricked by an adversarial neural network. And what happens if someone steals your facial identity? You cannot simply "cancel" your face the way you would a stolen credit card.

We turn to the brain for answers. According to an article in Fast Company, researchers at Binghamton University combined the way the human brain reacts to stimuli with each person's unique brain structure to create a "brain password," a biometric solution that draws on the brain's "inexhaustible source of secure passwords." Still in its infancy, the technique currently depends on 32 electrical sensors placed on the head; in the future those sensors could be built into a headset to take accurate readings. There are, however, less invasive ways to obtain this neural information. Neuro-muscular data can be captured with the high-sensitivity kinetic sensors, built on Micro-Electro-Mechanical Systems (MEMS), already present in standard mobile devices. Extracting this information can yield a stable, unique neural signature with the potential to act as our key to the IoT.
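As a rough illustration of how such a signature might be derived, the sketch below band-pass filters simulated accelerometer samples around the physiological hand-tremor band and summarizes their spectrum. It is a minimal, assumption-laden sketch, not Aerendir's NeuroPrint algorithm: the sampling rate, band edges, feature choice, and matching threshold are all invented for the example.

```python
# Illustrative sketch: derive a spectral "tremor signature" from MEMS
# accelerometer data. Parameters below are assumptions, not a real product.
import numpy as np
from scipy import signal

FS = 200  # Hz, assumed accelerometer sampling rate

def tremor_signature(accel: np.ndarray) -> np.ndarray:
    """accel: 1-D array of acceleration samples from a hand-held device."""
    # Isolate the physiological tremor band (roughly 8-12 Hz is often cited).
    b, a = signal.butter(4, [8, 12], btype="bandpass", fs=FS)
    tremor = signal.filtfilt(b, a, accel)
    # Power spectral density as a compact, comparable feature vector.
    freqs, psd = signal.welch(tremor, fs=FS, nperseg=256)
    return psd / psd.sum()  # normalize so signatures are comparable

# Enrollment vs. authentication would then compare signatures, e.g. by distance.
enrolled = tremor_signature(np.random.randn(FS * 10))  # stand-in for a real recording
probe = tremor_signature(np.random.randn(FS * 10))
distance = np.linalg.norm(enrolled - probe)
print("match" if distance < 0.05 else "no match")      # threshold is illustrative
```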

At Aerendir we believe the future of biometrics should be as frictionless as facial recognition but as strong as "brain passwords"; with this in mind, NeuroPrint was born. While our NeuroPrint technology can extract a unique neural signal from any muscle in the body, we started with the hands because of their natural connection to mobile devices. We are currently working on adding sensors and microcontrollers to the seat of a car; the possibilities are endless.

The body can truly become our own personal password, our digital identity. Our brain provides a solution that authenticates us while shielding us from predatory actors: if our neural signal is equivalent to a one-million-character password, we can safely encrypt all of our activities and communications, should we decide to do so.
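To illustrate the idea of turning such a signal into a usable secret, the sketch below stretches a hypothetical, already-stabilized neural signature into a symmetric encryption key with a standard key-derivation function. In practice a raw biometric signal is noisy and would need error correction or a fuzzy extractor first; the byte string, salt, and iteration count here are assumptions, not Aerendir's implementation.

```python
# Hedged illustration: derive a symmetric key from a (hypothetical) stable
# biometric byte string using a standard KDF. Not a real NeuroPrint workflow.
import hashlib

neural_signature = b"stand-in bytes for a stabilized neuro-muscular signature"
salt = b"per-device-random-salt"  # would be random and stored on the device

# PBKDF2-HMAC-SHA256 stretches the signature into a 256-bit key.
key = hashlib.pbkdf2_hmac("sha256", neural_signature, salt, iterations=200_000)
print(key.hex())  # key material that could drive AES-GCM or similar encryption
```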

Of course, other physiological biometrics could be used as well, including heartbeat and voice, but relying on the body's own physiological signals seems to be the most promising and SAFE way to avoid the dangers described above.

Safety is not security. The IoT needs safety. Safety is, by definition, something we as users should control. The IoT has to be user-centric to be the powerful tool it is bound to be; it should never become the door of a prison, which it could if we allow facial recognition to enter every facet of our lives.

This article originally appeared in the March 2019 issue of Security Today.
