ElevenLabs, whose tech was used for the fake Biden robocalls, partners with AI-detection firm


In short: ElevenLabs, the startup that made headlines earlier this year when its AI tech was used to clone President Biden's voice in a series of robocalls, has partnered with an AI detection firm to combat deepfakes. It is an important move during an election year in which misinformation is proving a huge problem.

In January, a 39-second robocall went out to voters in New Hampshire telling them not to vote in the Democratic primary election, but to "save their votes" for the November presidential election. The voice giving this advice sounded almost exactly like Joe Biden. Voice-fraud detection firm Pindrop Security later analyzed the call, concluding that it was created using a text-to-speech engine made by ElevenLabs, which the company confirmed. The person who created the deepfake was suspended from ElevenLabs' service.

Now, Bloomberg reports that ElevenLabs has partnered with Reality Defender, a US-based firm that offers its deepfake detection services to governments, officials, and enterprises.

The partnership appears to be a mutually beneficial one. Reality Defender gets access to ElevenLabs' voice cloning data and models, allowing it to better detect fakes, while the AI firm can use Reality Defender's tools to help prevent its products from being misused. An ElevenLabs spokesperson said there was no financial element to the deal.

ElevenLabs prohibits the cloning of a person's voice without their permission. Its services are marketed as being able to bridge language gaps, restore voices to those who have lost them, and make digital interactions feel more human.

Deepfakes capable of producing highly realistic video and audio of real people are becoming increasingly advanced. Not only is the technology being used to place victims in porn, but it is increasingly being abused in the political arena – a huge concern during an election year.

The incident with Biden's voice led to the FCC's proposal to make the use of AI-generated voices in robocalls illegal under the Telephone Consumer Protection Act, or TCPA. The agency cited concerns that the technology was being used to confuse and deceive consumers by imitating the voices of "celebrities, political candidates, and close family members."

During his State of the Union address in March, Biden called for a ban on AI voice impersonation, though the extent of the proposed ban was unclear.

