
A document about the Foreign Malign Influence Center, a division of the Office of the Director of National Intelligence, is shown in June. Federal officials who track disinformation campaigns say they are issuing more warnings to political candidates, government officials and others targeted by foreign groups as America’s adversaries seek to influence the 2024 election. Jon Elswick/Associated Press
WASHINGTON — A Kremlin-backed Russian propaganda campaign that used artificial intelligence to spread disinformation online in the United States has been disrupted, the Justice Department said Tuesday.
U.S. officials described the internet operation as part of an ongoing effort to sow discord in the U.S. through the creation of fictitious social media profiles that purport to belong to authentic Americans but are actually designed to advance the aims of the Russian government, including by spreading disinformation about its war with Ukraine.
U.S. officials said the scheme was organized in 2022 after a senior editor at RT, a Russian state-funded media organization that has registered with the Justice Department as a foreign agent, helped develop technology for a so-called social media bot farm. The operation received the support and financial approval of the Kremlin, with an officer of Russia’s Federal Security Service – or FSB – leading a private intelligence organization that promoted disinformation on social media through a network of fake accounts.
The RT press office did not respond directly to a question about the allegations.
The disruption of the bot farm comes as U.S. officials have raised alarms about the potential for AI technology to affect this year’s elections, and amid ongoing concerns that foreign influence campaigns by adversaries could sway the opinions of unsuspecting voters. That happened during the 2016 presidential campaign, when Russians launched a vast but hidden social media trolling operation aimed in part at helping Republican Donald Trump defeat Democrat Hillary Clinton.
“Today’s actions represent a first in disrupting a Russian-sponsored Generative AI-enhanced social media bot farm,” FBI Director Christopher Wray said in a statement. “Russia intended to use this bot farm to disseminate AI-generated foreign disinformation, scaling their work with the assistance of AI to undermine our partners in Ukraine and influence geopolitical narratives favorable to the Russian government.”
Among the fake posts, according to the Justice Department, was a video posted by a purported resident of Minneapolis, Minnesota, that showed Russian President Vladimir Putin saying that areas of Ukraine, Poland and Lithuania were “gifts” to those countries from liberating Russian forces during World War II.
In another instance, the Justice Department said, someone posing as a U.S. constituent responded to a federal candidate’s social media posts about the war in Ukraine with a video of Putin justifying Russia’s actions.
As part of the disruption, the Justice Department seized two domain names and searched 968 accounts on X, the social media platform formerly known as Twitter.
According to a joint cybersecurity advisory released Tuesday by U.S., Dutch and Canadian authorities, the software was used to spread disinformation to countries including Poland, Germany, the Netherlands, Spain, Ukraine and Israel.
The advisory said that as of last June, the software – known as Meliorator – worked only on X, but that its functionality could probably be expanded to other social media networks.