The comparison was drawn almost immediately.
When OpenAI demoed the upgraded voice assistant for its new GPT-4o model during its “Spring Update” event last week, many in attendance and many watching from home thought the assistant’s voice sounded strikingly similar to that of the female AI in the science fiction film “Her.”
In the movie, a middle-aged man played by Joaquin Phoenix falls in love with an artificial intelligence system named Samantha, voiced by Scarlett Johansson.
OpenAI’s chief technology officer at the time denied claims that the company deliberately designed the chatbot in the likeness of the fictional character. But shortly after the demo, OpenAI CEO Sam Altman tweeted the single word “her,” which many took as a direct reference to the 2013 film.
Among those who took exception to the assistant’s voice? Johansson herself.
The actress released a statement describing her dismay that the assistant’s voice sounded so similar to her own, especially given that Altman had reached out to her in September about voicing the assistant and had followed up again just two days before the demo event.
OpenAI has “paused” that voice option in the assistant and has denied claims that it is a copy of Johansson’s voice. The Washington Post reported that it reviewed documents and recordings showing the company did not, in fact, copy Johansson’s voice.
Johansson and her legal team are requesting that the company provide a detailed explanation of how the “Sky” assistant was made. The Verge, a technology news publication, also noted that the “Sky” voice was actually released in September, when the company added voice mode to ChatGPT, but did not attract widespread attention until last week’s demo.
“The voice of Sky is not Scarlett Johansson’s, and it was never intended to resemble hers,” Altman said in a statement. “We cast the voice actor behind Sky’s voice before any outreach to Ms. Johansson. Out of respect for Ms. Johansson, we have paused using Sky’s voice in our products. We are sorry to Ms. Johansson that we didn’t communicate better.”
David Simon, an associate professor at Northeastern’s School of Law, says the situation is far from black and white, but there are some legal levers Johansson and her team could potentially pull if the case escalates.
Simon says there are at least three potential claims Johansson could bring against OpenAI: a right of publicity claim, a trademark claim and a copyright claim.
The first one, a right of publicity claim, is based on state law and therefore differs from state to state, he says, but “generally speaking it gives people the right to commercially appropriate their image or likeness, which can include a person’s appearance or voice.”
“Scarlett Johansson would argue that OpenAI is commercially appropriating her likeness without consent. Even if OpenAI did not use Scarlett’s voice to train its model, there’s case law stating that trying to mimic someone’s distinctive and commercially valuable voice for financial advantage can be actionable,” he adds.
Two cases similar to Johansson’s succeeded on right of publicity claims, explains Alexandra Roberts, a Northeastern University law and media professor: Midler v. Ford Motor Co. and Waits v. Frito-Lay, Inc. In both cases, Ford and Frito-Lay had hired musicians to impersonate Bette Midler and Tom Waits for their commercials after both singers declined to participate in the ads.
Second, Johansson could make a trademark claim, which “protects individuals against the possibility that someone [will] confuse consumers by falsely suggesting that Johansson endorsed the OpenAI product,” Simon says.
“Scarlett Johansson might be able to say, ‘I have this distinctive voice. It’s associated with me. It has commercial value. It’s essentially like a trademark. By mimicking my voice you’re giving the impression that I’m endorsing the use of your product and consumers are confused about whether I’m doing that.’”
Simon says those first two claims would be the strongest, but Johansson could also potentially make a copyright claim, which “covers original works of authorship that are fixed in a tangible medium of expression, including sound recordings.
“If OpenAI used sound recordings to train its AI model and Scarlett Johansson owned the copyright to those sound recordings, then she might be able to sue them for copyright infringement,” he says.
He points to The New York Times lawsuit against OpenAI, which alleges that the company illegally used New York Times articles to train ChatGPT.
But whether Johansson has a case here will depend on many factors, he notes, and many questions remain to be answered.
“Are they trying to create a voice that is similar to Scarlett Johansson? Did they use Scarlett Johansson’s voice in doing that? Did they try to train AI to mimic Scarlett Johansson’s voice for commercial advantage? Did they seek a license from Johansson? Would people be confused about whether Scarlett Johansson endorsed the product?”
Roberts says she believes things will settle down and Johansson will probably not sue OpenAI since the company is no longer using the “Sky” voice.
“If they stopped using it, and they promised her they’re not going to use it, then she probably doesn’t have a case,” she says. “She probably doesn’t have anything to sue on anymore, and since it was just a demo, and it wasn’t a full release to the general public that offers the full range of services they plan to offer, it would be really hard for her to show any damages.”
Johansson approached the situation smartly in several ways, Roberts explains. By first asking for information and more transparency on the part of OpenAI, Johansson is aiming to get a better understanding of the situation without going into full lawsuit territory.
“When you send a letter that says, ‘Hey, I just want some information,’ as opposed to a letter that says, ‘You’re in big trouble, and I’m about to sue you,’ you avoid creating declaratory judgment jurisdiction. When you send a threatening letter, the recipient could say, ‘Oh she’s about to sue us. Let’s sue her first.’”
That could muddy the situation because then the recipient could be the one that sets the terms of what they will be suing for, Roberts explains.
“When they sue, they could say that we’re seeking a declaratory judgment of non-infringement and non-misappropriation, so she probably wants to avoid that because she wants to decide whether she sues and how she sues,” Roberts says.
Additionally, she has been able to broadcast the message that the voice attached to the assistant is not hers and that she is angry.
“That’s useful in dispelling any confusion, turning public sentiment against Altman and OpenAI and garnering some sympathy,” Roberts says. “I think more broadly it generates anger and investment in this bigger conversation that I think lawmakers, celebrities and entertainment industry players are working through, which is how much is OK, what’s allowed and what’s not in this new environment.”