
Intelligence chairman: US may be less prepared for election threats than it was 4 years ago

Quick Read

  • U.S. Senate Intelligence Committee Chairman Mark Warner warns that the U.S. may be less prepared for foreign disinformation threats in the 2024 election than it was in 2020, citing advanced disinformation tactics and the rise of artificial intelligence.
  • Warner highlighted several factors contributing to this increased vulnerability, including the improved disinformation methods of Russia and China, the willingness of domestic groups to spread disinformation, and the development of AI programs that can create realistic fake content.
  • Warner’s assessment contrasts with recent assurances from security officials that the U.S. has improved its ability to counter foreign interference.
  • Tech companies have rolled back measures against misinformation, with X (formerly Twitter) reducing content moderation and YouTube reversing its policy against debunked election claims.
  • Meta, while prohibiting content that interferes with election processes and labeling AI-made content, still allows political ads claiming the 2020 election was rigged, raising concerns about the effectiveness of its efforts.
  • The rise of AI-generated content, including deepfakes and misleading robocalls, poses new challenges for safeguarding the 2024 election, according to Warner.
  • Efforts by federal agencies to communicate with tech companies about disinformation have been complicated by legal and political debates over government surveillance and censorship.

The Associated Press has the story:

Newslooks- WASHINGTON (AP) —

With only five months before voters head to the polls, the U.S. may be more vulnerable to foreign disinformation aimed at influencing voters and undermining democracy than it was before the 2020 election, the leader of the Senate Intelligence Committee said.

Sen. Mark Warner, a Virginia Democrat, based his warning on several factors: improved disinformation tactics by Russia and China, the rise of domestic candidates and groups who are themselves willing to spread disinformation, and the arrival of artificial intelligence programs that allow the rapid creation of images, audio and video difficult to tell from the real thing.

In addition, tech companies have rolled back their efforts to protect users from misinformation even as the government’s own attempts to combat the problem have become mired in debates about surveillance and censorship.

As a result, the U.S. could face a greater threat of foreign disinformation ahead of the 2024 election than it did in the 2016 or 2020 presidential election cycles, Warner said.

“We may be less prepared 155 days out in 2024 than we were under President Trump (in 2020),” Warner told The Associated Press in an interview Monday.

FILE – Mark Warner, D-Va., chairman of the Senate Intelligence Committee, speaks during a hearing at the Capitol in Washington, March 8, 2023. Warner said Monday, June 3, 2024, the U.S. may be less prepared for the threat of foreign election disinformation ahead of this year’s election than it was four years ago. Warner based his assessment on the development of powerful new AI programs that make it easier than ever to generate deepfake audio and video that can fool voters. (AP Photo/Amanda Andrade-Rhoades, File)

Noting similar campaigns in 2016 and 2020, security officials, democracy activists and disinformation researchers have warned for years that Russia, China, Iran and domestic groups within the U.S. will use online platforms to spread false and polarizing content designed to influence the race between Trump, a Republican, and President Joe Biden, a Democrat.

Warner’s assessment of America’s vulnerability comes just weeks after top security officials told the Intelligence Committee that the U.S. has greatly improved its ability to combat foreign disinformation.

Several new challenges, however, will make safeguarding the 2024 election different than past cycles.

AI programs have already been used to generate misleading content, such as a robocall that mimicked the voice of Biden telling New Hampshire voters not to cast a ballot in that state’s primary. Deceptive deepfakes created with AI programs have also popped up ahead of elections in India, Mexico, Moldova, Slovakia and Bangladesh.

Attempts by federal agencies to communicate with tech companies about disinformation campaigns have been complicated by court cases and debates over the role of government in monitoring political discourse.

Tech platforms have largely moved away from aggressive policies prohibiting election misinformation. X, formerly Twitter, laid off most of its content moderators in favor of a hands-off approach that now allows neo-Nazi hate speech, Russian propaganda and disinformation.

Last year YouTube, owned by Google, reversed its policy prohibiting debunked election claims and now allows videos that argue the 2020 election was the result of widespread fraud.

Questions about China’s influence over TikTok prompted Congress to pass a law that would ban the popular app in the U.S. if its Beijing-based owner refuses to divest.

Meta, the owner of Facebook, WhatsApp and Instagram, prohibits information that interferes with election processes and regularly removes foreign influence operations when it identifies them. The platform also says it will label content made with AI. But the company is also allowing political advertisements that claim the 2020 election was rigged, which critics say undercuts its promises.

“I’m not sure that these companies, other than the press release, have done anything in a meaningful way,” Warner said.

Representatives from X and TikTok did not immediately respond to messages on Monday.
