What you need to know
- Scarlett Johansson says she was approached by OpenAI last year about using her voice for a ChatGPT voice assistant.
- Though Johansson didn't agree to the proposal, OpenAI shipped its GPT-4o model with a voice called "Sky" that sounds quite similar to Johansson's.
- After legal pressure, OpenAI removed Sky from GPT-4o and said that the voice was not based on Johansson's.
- Still, in trying to use a friendly and welcoming voice to make AI feel more comforting, OpenAI ended up doing the opposite.
OpenAI made waves last week when it announced GPT-4o, a multimodal AI model that may be the most advanced and futuristic one we've seen so far. It sounds like a human, can interact with users through vision and audio, and is intelligent. OpenAI ended up beating Google to the punch, and GPT-4o appears more advanced than Project Astra, which Google previewed at Google I/O 2024.
But one of the voices OpenAI chose for GPT-4o has been drawing attention online for all the wrong reasons. First, some users on social media pointed out that they thought the voice "Sky" was overly flirty and sultry to the point of being unsettling. Then, people began noticing the similarities between the voice of Sky and that of Scarlett Johansson, the award-winning actress. Now, it appears that may have been intentional.
To be clear, OpenAI denies that the voice of Sky was based on Johansson and even released a blog post explaining how the voices were chosen. However, Johansson put out a scathing statement telling the story of how OpenAI approached her about officially voicing GPT-4o, which she declined. After facing legal pressure from Johansson's attorneys, the company removed the Sky voice option from GPT-4o.
As distressing as this situation is, it's almost ironic. OpenAI CEO Sam Altman told Johansson that her voice, as the official voice of ChatGPT, would be comforting to users. And yet, by releasing a voice so similar to Johansson's without her permission, Altman and OpenAI ended up perfectly encapsulating everything that makes people uncomfortable about AI.
Did OpenAI steal Scarlett Johansson's voice?
Though OpenAI says that it sought out professional voice actors for GPT-4o and didn't look for someone who sounded like Johansson specifically, the evidence may tell a different story. It begins in September 2023, according to Johansson, when OpenAI's Altman reached out about hiring her as a voice actor for ChatGPT.
"He told me that he felt that by my voicing the system, I could bridge the gap between tech companies and creatives and help consumers to feel comfortable with the seismic shift concerning humans and AI," she said in a statement to NPR's Bobby Allyn. "He said he felt that my voice would be comforting to people."
Johansson ultimately decided not to go forward with voicing GPT-4o. However, it's easy to hear her resemblance in the Sky voice that ended up being demoed and shipped with the AI model. To say that Johansson was displeased with the result would be an understatement.
Statement from Scarlett Johansson on the OpenAI situation. Wow: pic.twitter.com/8ibMeLfqP8 (May 20, 2024)
"We believe that AI voices should not deliberately mimic a celebrity's distinctive voice. Sky's voice is not an imitation of Scarlett Johansson but belongs to a different professional actress using her own natural speaking voice," OpenAI said in a blog post.
The whole reason OpenAI wanted a voice like Johansson's, as Altman is said to have told her, is to make AI more comforting. People may be more scared of AI than they are excited about it. Those in creative industries, especially, are finding that AI is being used to automate writing, visual art, music, and other mediums. This isn't something unique to OpenAI; Apple recently came under fire and apologized for an advertisement that literally showed instruments being crushed to pieces and replaced with an iPad.
By using her likeness in a GPT-4o voice without her permission, whether intentionally or unintentionally, OpenAI ended up validating the discomfort associated with AI that it was desperately trying to address. Creatives, from actors and actresses to writers and photographers, are worried about being replaced by AI. The idea that OpenAI may have mimicked Johansson's voice for GPT-4o is exactly the kind of thing that worries and alarms people in creative industries.
"When I heard the released demo, I was shocked, angered and in disbelief that Mr. Altman would pursue a voice that sounded so eerily similar to mine that my closest friends and news outlets could not tell the difference," Johansson wrote, explaining that she asked OpenAI to show how it developed the Sky voice. "In a time when we are all grappling with deepfakes and the protection of our own likeness, our own work, our own identities, I believe these are questions that deserve absolute clarity."
We shouldn't want AI to sound this human
Aside from the unsettling idea that a company could rip off an actress' voice after failing to agree on a deal, there are other reasons we don't want AI voices to sound like Sky. All of OpenAI's GPT-4o voices, and especially Sky, sound very human-like. That's a problem, because people have a high level of trust in and familiarity with human voices. When you talk to a voice assistant like Siri or Alexa, it's clear that you're speaking to, for lack of a better word, a robot. After having a conversation with GPT-4o, that level of clarity won't always be in the back of your mind.
Right now, AI models have a problem: they confidently state their answers as fact even when they're blatantly wrong. People still end up believing AI responses are true despite the array of warnings that come along with them. As voices for AI models become more human-sounding, this problem will only get worse. It'll be easy for the average user of an AI tool to believe what's being said because of the welcoming human voice it uses.
In trying to make people more comfortable with the future of AI, OpenAI ended up making it feel more dystopian. We shouldn't want AI to sound as human as GPT-4o, and there are plenty of reasons why. It could foster an unwarranted level of trust between users and AI models, as well as put creatives like Johansson in a precarious position.