We can’t get enough of FaceApp. But should we be giving away the rights to our faces?

Celebrities and ordinary people alike are flocking to FaceApp. Photo via Instagram

Amid the filtered and Photoshopped images of glamorous fashion models, social media feeds this past week have been inundated with photos of aging faces, thanks to a viral smartphone app that allows users to upload a headshot and get a glimpse of what they’d look like in their golden years. 

FaceApp has been criticized by some privacy analysts and politicians, who have expressed concerns about its terms of service, which give the Russian development team behind the app the right to use photos uploaded by users for commercial purposes.

But users—celebrities included—can’t get enough of FaceApp, helping it go viral with a staggering 12.7 million downloads since early July.

We asked David Choffnes, an assistant professor at Northeastern’s Khoury College of Computer Sciences whose research includes designing solutions to internet security and privacy, to break down why users find the app so appealing, the risks associated with it, and measures we can take to protect our privacy online.

It seems like just yesterday when we had the public outcry over the Cambridge Analytica scandal. Have we not learned anything from that?

I think that there’s a combination of factors when people make decisions about what to use when they’re online. Thinking about privacy is not something that we as a society have been taught to prioritize, and it’s also not something that’s innate to us. We often value convenience, entertainment, and trust in our friends and peers when they share something with us; you know, we’re social creatures. There are a number of factors that are baked into us as human beings that these kinds of technologies exploit.

And then on top of that, the regulations in the U.S. and how companies are required to disclose information can play a role. It’s not the case that you get a big warning when you open the app that says, ‘Hey, this data can be used in any way.’ I mean, you get fine print, [but] nobody reads the fine print.

What are the privacy risks associated with this app? 

The obvious privacy concern is that your face is now in the hands of a company that can use it however it wants, according to its terms and conditions. So let’s say they just Photoshop you into an image, or make a deep fake, for example, where you’re doing something compromising or embarrassing.

There’s the potential that they could sell that data. If we go in a different direction, let’s say health insurance. Maybe as a result of this picture they could tell that you’re in the sun too much and you’re likely to get skin cancer, so your health insurance rates are going to go up because you’re at higher risk for costing them money down the road. 

There are security concerns, too. We use our faces now to unlock devices or log into them, so you’ve just given one of your key credentials to someone else for free. And the pictures you upload might contain more than just your own face, so you may be handing other people’s faces to this company as well, and again, they can do whatever they want with them according to their terms.

They could sell faces to the Department of Homeland Security and U.S. Immigration and Customs Enforcement, which could try to use this technology to identify undocumented immigrants for deportation.

How warranted is the panic over Russians being behind the app?

It may not matter, but in the context of current geopolitics, there is strong evidence, including successful prosecutions, of parties from Russia that tried to influence our elections; this was essentially state-sponsored activity. So now a Russian company that may or may not have ties to the Russian government has access to faces, and may be able to use them in an influence campaign: when they set up fake accounts, they can convince you that they’re real people because they’re using real faces. That’s an example of something they could do, and why it’s concerning that this is in the hands of a Russian company.

What are some things we should be looking for the next time we find ourselves enticed by a viral app?

Whenever someone asks you to do something online, whether it’s visiting a site or handing over information, and that information could be text you enter, an image, or an audio recording, you’ve got to ask yourself why they’re collecting that data, and whether you’re okay with them having it.

It’s one of those things where I think consumers need to be much more skeptical of what’s online. Just because something sounds fun and your friends have done it does not mean that you should necessarily do it.

For media inquiries, please contact Shannon Nargi at s.nargi@northeastern.edu or 617-373-5718.