Somewhere on the internet lives a hidden collection of photos depicting Woodrow Hartzog and his family. The blog is technically public. But it’s only accessible to a small audience of family and friends, as intended.
“Theoretically, anyone could find it,” says Hartzog, a law and computer science professor who specializes in data protection and privacy. Still, he says, he’d be surprised if anyone reading this could track it down. It’s not searchable on Google, and he doesn’t use full names in the blog posts. Obscured by the vast lattice of other sites that make up the World Wide Web, his blog is effectively private.
People constantly use this needle-in-a-haystack effect to protect their identities on- and offline, says Hartzog. For example, you would never vent about your boss in a small office space. But you might air your complaints at a crowded restaurant where the chances of your boss overhearing you, or even seeing you, are much lower.
As surveillance technology becomes more abundant, though, Hartzog fears the protective shroud of obscurity is at risk. With facial recognition technology being introduced in airports and deployed by law enforcement officers, getting lost in the crowd is becoming much harder.
“The value we place on obscurity runs headfirst into the concept of facial recognition,” says Hartzog, who advocates for an outright ban of these technologies. “I can’t envision any reality where people are better off thanks to facial recognition technology, even with permission requirements in place.”
In a law review article published earlier this year, Hartzog and his co-author, Evan Selinger, a philosophy professor at Rochester Institute of Technology, argue that even regulations such as consent requirements for facial recognition are not sufficient to protect people’s privacy.
Consent agreements are a broken regulatory tool in technology, and facial recognition is no exception, he says. (Even Hartzog accepts the “I Agree” button without reading the Terms of Service.)
Individual consent for facial recognition is particularly hairy because it jeopardizes collective privacy, he argues in the article. When one person accepts the terms of a facial recognition technology, everyone’s collective privacy diminishes: the haystack shrinks, and as more individuals press “I Agree,” a process of elimination exposes even those who don’t.
Hartzog has dedicated his career to researching how surveillance and data practices affect individual privacy. As a legal scholar, he primarily focuses on proposing regulations that would better respond to people’s vulnerabilities with regard to technology.
This year, Hartzog earned the Excellence in Research and Creative Activity Award for the prolific body of work he produced in 2020, including several law review articles and a co-authored book titled Breached! Why Data Security Law Fails and How to Improve It.