Viral FaceApp Draws Concerns From Users About Data Privacy

The app, which is based out of Russia, was popular for its ability to make users look older. But critics say there’s a catch.

An app that uses artificial intelligence to make users look older, younger or a different gender went viral over the past week, with everyone from Kevin Hart to K-pop stars BTS posting selfies of their aged faces. 

But almost as quickly as the trend caught on, the reality of privacy concerns caught up with people who downloaded FaceApp, an application with over 80 million active users that was created by a Russia-based startup in early 2017.

Among the concerns raised by cybersecurity experts, journalists, lawmakers and app users alike: little was known about whether FaceApp uploads users’ photos to the cloud, or whether the app could access all of the photos on a phone even when the user had not granted access to the entire photo library.

The photo library behavior is actually built into Apple’s operating system: iOS lets users hand specific photos to an app even if they have not granted it permission to access the entire library, TechCrunch reported.
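
FaceApp’s behavior here matches a standard iOS pattern: an app can receive a user-selected image through the system picker without ever requesting read access to the photo library. The Swift sketch below, using UIKit’s UIImagePickerController, is a minimal illustration of that pattern; the wrapping view controller and method names are hypothetical, and this is not FaceApp’s actual code.

```swift
import UIKit

// Minimal sketch of the permission model TechCrunch described: the system
// photo picker hands the app only the image the user explicitly selects,
// with no Photos-library permission prompt. The view controller and method
// names here are hypothetical; this is not FaceApp's code.
final class PhotoPickerViewController: UIViewController,
    UIImagePickerControllerDelegate, UINavigationControllerDelegate {

    // Present the system picker; on iOS 11 and later this does not require
    // photo library authorization just to read a single selection.
    func pickPhoto() {
        let picker = UIImagePickerController()
        picker.sourceType = .photoLibrary
        picker.delegate = self
        present(picker, animated: true)
    }

    // The app receives only the chosen image, not library-wide access.
    func imagePickerController(_ picker: UIImagePickerController,
                               didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
        let selectedImage = info[.originalImage] as? UIImage
        picker.dismiss(animated: true)
        // selectedImage could then be uploaded for editing, which is what
        // FaceApp says it does with user-selected photos.
        _ = selectedImage
    }
}
```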

However, the answer to the cloud issue is not as simple. FaceApp told TechCrunch and other media outlets that most of the processing that powers its transformations of people’s faces is done in the cloud.

In its statement, the company said it only uploads photos selected by the user for editing.

“We might store an uploaded photo in the cloud,” FaceApp said. “The main reason for that is performance and traffic: we want to make sure that the user doesn’t upload the photo repeatedly for every edit operation.”

The startup added that “most images” are deleted from its servers within 48 hours of upload and that it accepts requests from users who want all of their data removed from FaceApp’s servers. It said the company’s support team is “currently overloaded” with such requests.

In addition, FaceApp responded to concerns about the company’s location in Russia and the potential for the Russian government to access users’ facial data. Although the company’s core research and development team is located in Russia, the user data is not transferred to Russia, according to FaceApp.

“We don’t sell or share any user data with any third parties,” the statement said.

The assurances did little to quiet concerns among lawmakers and users. Sen. Chuck Schumer (D-N.Y.), the Senate minority leader, sent a letter to the Federal Bureau of Investigation and the Federal Trade Commission on Wednesday asking the agencies to investigate the app’s claims about how it protects user data.

Schumer pointed out that the terms and conditions that users agree to when they use FaceApp allow the company to use or publish content shared with the application, including a username or real name, without notifying or paying users.

“I have serious concerns regarding both the protection of the data that is being aggregated as well as whether users are aware of who may have access to it,” Schumer wrote.

Schumer also focused on FaceApp’s location in Russia and on whether the company “provides access” to the data of American citizens to third parties or foreign governments.

“It would be deeply troubling if the sensitive personal information of U.S. citizens was provided to a hostile foreign power actively engaged in cyber hostilities against the United States,” Schumer wrote.

He added: “In the age of facial recognition technology as both a surveillance and security use, it is essential that users have the information they need to ensure their personal and biometric data remains secure, including from hostile foreign nations.”

It remains to be seen how the privacy concerns will affect FaceApp’s popularity, but tech experts advise users to think more carefully about what apps they download and what information they share.

“I completely understand that it's nearly impossible to protect your data around the web,” tech journalist Charlie Warzel wrote on Twitter. “But downloading/not downloading apps is a really concrete way to protect your privacy. It's a rare situation where the user is in control.”
