While thousands of Taylor Swift fans were watching the singer at a concert earlier this year, Swift was also watching them.
Her security team used hidden facial recognition technology to scan the crowd for stalkers, employing a mode of surveillance that’s still in its infancy and prone to corruption, said Woodrow Hartzog, a Northeastern law professor who studies technology and privacy.
A kiosk located inside the concert venue for Swift’s show at the Rose Bowl in May played a highlight reel of her rehearsals, according to Rolling Stone.
Concealed within the kiosk was a camera equipped with facial recognition technology that scanned the faces of those watching the video and sent the information to a command post, where it was checked against a database of Swift’s known stalkers.
Using mass surveillance equipment this way may have been based on good intentions, Hartzog said. But it also presents a slippery slope.
“This technology does get co-opted for harmful purposes, particularly for marginalized and vulnerable populations,” he said.
How does it work?
There are several uses for facial recognition technology, Hartzog said, and some present more potential for misuse than others.
“Sometimes ‘facial recognition’ really means ‘facial detection,’” Hartzog said.
This is technology trained to discern the difference between a human face and a coffee mug, for example. It’s what the makers of autonomous cars use to teach the vehicles to avoid hitting a person, Hartzog said.
There’s also facial characterization, in which a computer is trained to make assumptions about the expression on a face, or whether it belongs to a man or a woman.
Then there’s technology that latches onto a person’s unique face and uses the information as authentication. This is the technology that enables iPhone users to unlock their phones with their face.
The “most pernicious” use of facial recognition, and the kind used at Swift’s concert, Hartzog said, is the identification of one face among many. It’s looking for specific people in a crowd of people, determining a person’s identity in real time.
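The one-to-many matching Hartzog describes, checking each scanned face against a watchlist in real time, can be sketched as a simple nearest-neighbor search. The sketch below is illustrative only: the watchlist entries, measurement vectors, and threshold are all hypothetical, and real systems use far richer learned representations.

```python
import math

# Illustrative "face prints": vectors of facial measurements. The
# watchlist plays the role of the stalker database at the command post.
WATCHLIST = {
    "suspect_1": [40.0, 28.3, 44.7, 20.0],
    "suspect_2": [36.5, 30.1, 41.2, 22.8],
}
MATCH_THRESHOLD = 2.0  # hypothetical tolerance for measurement noise

def identify(scanned_print):
    """Return the watchlist identity closest to the scanned face,
    or None if nothing falls within the match threshold."""
    best_name, best_dist = None, float("inf")
    for name, known_print in WATCHLIST.items():
        dist = math.dist(scanned_print, known_print)
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist <= MATCH_THRESHOLD else None

# A crowd of scanned faces: one near-match to suspect_1, one stranger.
print(identify([40.2, 28.1, 44.5, 19.8]))  # matches "suspect_1"
print(identify([55.0, 33.0, 50.0, 30.0]))  # no match: None
```

Run against every face a camera picks out of a crowd, this loop is the "mass real-time surveillance" scenario: the system's reach grows with the number of cameras and the size of the watchlist.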
“That’s one of the things I fear most,” Hartzog said. “Because if we get to that point, it means we have mass real-time surveillance.”
What does facial recognition technology record?
Facial recognition technology uses a series of measurements to create a sort of “face print,” similar to a thumbprint, Hartzog said. Sensors measure the distances between our eyes, nose, and lips, as well as specific contours of our faces that are unique to each person. So, it’s not a photo, but a series of measurements and data that, together, match only a single face.
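The measurement-based "face print" can be sketched in code: reduce a face to the pairwise distances between a few landmarks, then compare two prints by how far apart those measurement vectors are. The landmark points below are made up for illustration; production systems use many more measurements and learned embeddings, but the principle is the same.

```python
import math

def face_print(landmarks):
    """Build a 'face print': the pairwise distances between landmarks,
    given as a dict of (x, y) coordinates from a landmark detector."""
    points = list(landmarks.values())
    return [
        math.dist(points[i], points[j])
        for i in range(len(points))
        for j in range(i + 1, len(points))
    ]

def print_distance(print_a, print_b):
    """Euclidean distance between two face prints; smaller = more alike."""
    return math.dist(print_a, print_b)

# Hypothetical landmark coordinates for three face captures.
alice = face_print({"left_eye": (30, 40), "right_eye": (70, 40),
                    "nose": (50, 60), "mouth": (50, 80)})
alice_again = face_print({"left_eye": (31, 40), "right_eye": (69, 41),
                          "nose": (50, 61), "mouth": (49, 80)})
bob = face_print({"left_eye": (25, 38), "right_eye": (78, 39),
                  "nose": (52, 66), "mouth": (51, 90)})

# Two captures of the same face measure closer together than two
# different faces do.
print(print_distance(alice, alice_again) < print_distance(alice, bob))
```

This is why a face print is "not a photo": what gets stored and compared is a list of numbers that, together, match only a single face.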
Is that legal?
Yes, Hartzog said. Only three states in the United States (Washington, Texas, and Illinois) have passed laws governing the collection and use of biometric data. “Biometrics” is the umbrella term for unique biological measurements or calculations, such as facial information, fingerprints, or DNA.
“The default rule for this sort of surveillance in the U.S. is that all sorts of facial recognition and surveillance is allowed unless it’s expressly prohibited,” Hartzog said. “So, there’s a lot of freedom for people to experiment with these things.”
The steady, experimental rollout of this sort of technology “only adds to the urgency to get clear rules and moratoriums in place across America,” he added.
OK, but is it good?
There are two general benefits to the widespread use of facial recognition technology, Hartzog said. First, it makes things marginally more convenient.
“It’s easier to unlock our phones with our face than by having to put our thumb on them,” he said.
Second, it could make it possible to identify and find missing children, criminals on the run, or people in the process of committing a crime. Or, in Swift’s case, thwart stalkers before they act.
But to see the benefits in this second scenario, there would have to be “cameras everywhere, face print data that is widely shared and easily accessible,” Hartzog said.
“There are significant benefits to facial recognition technology that shouldn’t be dismissed outright,” he said. “But to realize them, we’d have to give up so much of our autonomy and have to submit ourselves to so much surveillance that at that point, we’ve already lost the surveillance debate.”