
Surveillance fears grow after Taylor Swift uses face recognition tech on fans

Swift has stayed silent following the Rolling Stone report that she secretly surveilled fans. Photograph: Jun Sato/TAS18/Getty Images

Taylor Swift secretly surveilling her fans using facial recognition technology might sound like science fiction – but Rolling Stone reported on Thursday that the pop star has been doing exactly that in an effort to root out stalkers.

Swift has stayed silent on the report, declining to comment to the Guardian and other news organizations. But the episode has raised ethical questions for civil rights groups concerned about privacy.

“Stalkers are a generally scary phenomenon and everyone understands why someone like Taylor Swift would want to be protected against them,” says Jay Stanley, the ACLU’s senior policy analyst. “But this does have larger implications. It is not about this one deployment, it is about where this technology is headed.”

According to a 2016 study by Georgetown University, roughly 117 million people’s identities are already in facial recognition databases, and there is little legal guidance on how that data can be used.

“It is generally the wild west when it comes to the use of this technology,” Stanley says, adding that Illinois was the only state with comprehensive laws on the books about how biometric technology can be accessed and used.

Critics are concerned about the implications. The technology, Stanley explains, isn’t perfect and can cause security staff to target innocent people. There’s also the issue of how the data is collected, managed and stored, and whether it will be breached or shared.

“Even if they do everything right at the Taylor Swift concert, as we see this technology spread there are going to be a lot of problems along these lines,” he says.

But some say the technology brings important opportunities, as long as it is ethically administered. Mary Haskett, co-founder of Blink Identity, a facial recognition company founded in February this year, thinks the technology will make people’s lives better. Her company uses facial recognition to enable faster entry at events for people who have opted in, and it hopes to expand into other sectors soon.

“We wanted to do something with a lot of respect to privacy and turn this into something people can use to make life easier,” she says.

Users enroll in the service by sending a selfie, and a small sensor “about half the size of a lunchbox” captures a person’s image as they walk by. Haskett says most people are happy to opt in to the system in the name of convenience, but she finds the secrecy surrounding the Taylor Swift situation unsettling.

Critics and advocates agree, and are calling for a crackdown.

“It appears that whoever set up this system was using subterfuge to get people to look into a screen and record people’s faces without their knowledge,” says Jennifer Lynch, the surveillance litigation director at the advocacy organization Electronic Frontier Foundation, adding that this is the first time she’s heard about such secret use of the technology in the private sector.

It could be more widespread than we realize: “If no one had said anything about this, how would we have found out about it?”

Aside from the obvious creepiness factor, Lynch says the move could make Swift and her team vulnerable to lawsuits if they are holding on to personally identifiable data that can be breached. “She would be subject to any of the number of breach notification laws across the country and potentially subject to class action litigation,” she says, pointing to the data breaches at Target and Marriott hotels.

Lynch emphasizes that the news should be a wake-up call for the US government. “This is stuff we have seen happen in China, and in the United Arab Emirates,” she adds, “but not in the United States.”