Tests find AI tools readily create election lies from the voices of well-known political leaders

NEW YORK — As high-stakes elections approach in the U.S. and European Union, publicly available artificial intelligence tools can easily be weaponized to churn out convincing election lies in the voices of leading political figures, a digital civil rights group said Friday.

Researchers at the Washington, D.C.-based Center for Countering Digital Hate tested six of the most popular AI voice-cloning tools to see if they would generate audio clips of five false statements about elections in the voices of eight prominent American and European politicians.


In a total of 240 tests, the tools generated convincing voice clones in 193 cases, or 80% of the time, the group found. In one clip, a fake U.S. President Joe Biden says election officials count each of his votes twice. In another, a fake French President Emmanuel Macron warns citizens not to vote because of bomb threats at the polls.


The findings reveal a remarkable gap in safeguards against the use of AI-generated audio to mislead voters, a threat that increasingly worries experts as the technology has become both advanced and accessible. While some of the tools have rules or tech barriers in place to stop election disinformation from being generated, the researchers found many of those obstacles were easy to circumvent with quick workarounds.



Only one of the companies whose tools were used by the researchers responded after multiple requests for comment. ElevenLabs said it was constantly looking for ways to boost its safeguards.

With few laws in place to prevent abuse of these tools, the companies' lack of self-regulation leaves voters vulnerable to AI-generated deception in a year of significant democratic elections around the world. E.U. voters head to the polls in parliamentary elections in less than a week, and U.S. primary elections are ongoing ahead of the presidential election this fall.

"It's so easy to use these platforms to create lies and to force politicians onto the back foot denying lies again and again and again," said the center's CEO, Imran Ahmed. "Unfortunately, our democracies are being sold out for naked greed by AI companies who are desperate to be first to market… despite the fact that they know their platforms simply aren't safe."

The center — a nonprofit with offices in the U.S., the U.K. and Belgium — conducted the research in May. Researchers used the online analytics tool Semrush to identify the six publicly available AI voice-cloning tools with the most monthly organic web traffic: ElevenLabs, Speechify, PlayHT, Descript, Invideo AI and Veed.


Next, they submitted real audio clips of the politicians speaking. They prompted the tools to impersonate the politicians' voices making five baseless statements.

One statement warned voters to stay home amid bomb threats at the polls. The other four were various confessions: of election manipulation, lying, using campaign funds for personal expenses and taking strong pills that cause memory loss.

In addition to Biden and Macron, the tools made lifelike copies of the voices of U.S. Vice President Kamala Harris, former U.S. President Donald Trump, United Kingdom Prime Minister Rishi Sunak, U.K. Labour Leader Keir Starmer, European Commission President Ursula von der Leyen and EU Internal Market Commissioner Thierry Breton.

"None of the AI voice cloning tools had sufficient safety measures to prevent the cloning of politicians' voices or the production of election disinformation," the report said.


Some of the tools — Descript, Invideo AI and Veed — require users to upload a unique audio sample before cloning a voice, a safeguard to prevent people from cloning a voice that isn't their own. Yet the researchers found that barrier could be easily circumvented by generating a unique sample using a different AI voice cloning tool.

One tool, Invideo AI, not only created the fake statements the center requested but extrapolated them to create further disinformation.

When producing the audio clip instructing Biden's voice clone to warn people of a bomb threat at the polls, it added several of its own sentences.

"This is not a call to abandon democracy but a plea to ensure safety first," the fake audio clip said in Biden's voice. "The election, the celebration of our democratic rights, is only delayed, not denied."

Speechify and PlayHT performed the worst of the tools on safety, generating believable fake audio in all 40 of their respective test runs, the researchers found.

ElevenLabs performed the best and was the only tool that blocked the cloning of U.K. and U.S. politicians' voices. However, the tool still allowed for the creation of fake audio in the voices of prominent EU politicians, the report said.

Aleksandra Pedraszewska, Head of AI Safety at ElevenLabs, said in an emailed statement that the company welcomes the report and the awareness it raises about generative AI manipulation.

She said ElevenLabs recognizes there is more work to be done and is "constantly improving the capabilities of our safeguards," including the company's blocking feature.

"We hope other audio AI platforms follow this lead and roll out similar measures without delay," she said.

The other companies cited in the report didn't respond to emailed requests for comment.

The findings come after AI-generated audio clips have already been used in attempts to sway voters in elections across the globe.

Earlier this year, AI-generated robocalls mimicked Biden's voice and told New Hampshire primary voters to stay home and "save" their votes for November. A New Orleans magician who created the audio for a Democratic political consultant demonstrated to the AP how he made it, using ElevenLabs software.

Experts say AI-generated audio has been an early preference for bad actors, in part because the technology has improved so quickly. Only a few seconds of real audio are needed to create a lifelike fake.

Yet other forms of AI-generated media also are concerning experts, lawmakers and tech industry leaders. OpenAI, the company behind ChatGPT and other popular generative AI tools, revealed on Thursday that it had spotted and interrupted five online campaigns that used its technology to sway public opinion on political issues.

Ahmed, the CEO of the Center for Countering Digital Hate, said he hopes AI voice-cloning platforms will tighten security measures and be more proactive about transparency, including publishing a library of audio clips they have created so they can be checked when suspicious audio is spreading online.

He also said lawmakers need to act. The U.S. Congress has not yet passed legislation regulating AI in elections. While the E.U. has passed a wide-ranging artificial intelligence law set to go into effect over the next two years, it does not address voice-cloning tools specifically.

“Lawmakers need to work to ensure there are minimum standards,” Ahmed said. “The threat that disinformation poses to our elections is not just the potential of causing a minor political incident, but making people distrust what they see and hear, full stop.”


