Tesla CEO Elon Musk leaves the Senate bipartisan Artificial Intelligence Insight Forum on Capitol Hill on Sept. 13. Elizabeth Frantz/The Washington Post

In the weeks following the Oct. 7 Hamas attack on Israel, X user @breakingbaht criticized leftists, academics, and “minorities” for defending the militant group. But it wasn’t until the user spoke up on behalf of antisemites that he struck a viral chord with X owner Elon Musk.

The user blamed Jewish communities for bringing antisemitism upon themselves by supporting immigration to the United States, welcoming “hordes of minorities” who don’t like Jews, and promoting “hatred against whites.”

“You have said the actual truth,” Musk responded. Soon, @breakingbaht had gained several thousand new followers – and the antisemitic conspiracy theory that Jews are causing the replacement of White people was ricocheting across the internet once again.

Antisemitism has long festered online, but the Israel-Gaza war and the loosening of content moderation on X have propelled it to unprecedented levels, coinciding with a dramatic rise in real-world attacks on Jews, according to several monitoring organizations.

Since Oct. 7, antisemitic content has surged more than 900 percent on X, and there have been more than 1,000 real-world incidents of antisemitic attacks, vandalism, and harassment in America, according to the Anti-Defamation League – the highest number since the human rights group started counting. (That includes about 200 rallies the group deemed to be at least implicitly supporting Hamas.)

Factors that predate the Gaza war laid the groundwork for the current atmosphere, say experts and advocates: the feeling of empowerment some neo-Nazis felt during the Trump presidency, the decline of enforcement on tech platforms in the face of layoffs and Republican criticism, even the 11-day war between Israel and Hamas in 2021, which gave rise to harsh criticism of Israel’s actions and sustained antisemitism online.

But Musk plays a uniquely potent role in the drama, disinformation specialists say. His comments amplifying antisemitic tropes to his 163.5 million followers, his dramatic loosening of standards for what can be posted, and his boosting of voices that previously had been banned from the platform formerly known as Twitter all have made antisemitism more acceptable on what is still one of the nation’s most influential social media platforms.

Musk’s endorsement of comments alluding to the great replacement theory – a conspiracy theory espoused by neo-Nazi demonstrators in Charlottesville in 2017 and the gunmen who killed people inside synagogues in Pittsburgh in 2018 and Poway, Calif., in 2019 – brought condemnation from the White House and advertising cancellations from IBM, Apple, Comcast, and Disney, among others.

Late Friday, Musk was unrepentant: “Many of the largest advertisers are the greatest oppressors of your right to free speech,” he tweeted after word of the cancellations spread. He did not respond to an emailed request for comment.

Joan Donovan, a former research director at Harvard University’s Shorenstein Center who now teaches at Boston University, included Musk in what she described as “a strata of influencers . . . who feel very comfortable condemning Jewish people as a political critique.”

“In moments where there is a lot of concern, these right-wing influencers do go mask-off and say what they really feel,” she said.

The Israel-Gaza war also has given new life to prominent Holocaust deniers who have proclaimed on X, Telegram, and other platforms that the Hamas attacks that left more than 1,000 Israelis dead were “false flags.” The #Hitlerwasright hashtag, which surged during the 2021 war, has returned, with Memetica, a digital investigations firm, tallying 46,000 uses of the phrase on X since Oct. 7. Previously, the hashtag appeared fewer than 5,000 times per month.

The Center for Countering Digital Hate, a nonprofit focused on online extremism and disinformation, identified 200 posts that promoted antisemitism and other forms of hate speech amid the conflict. X allowed 196 of them to remain on the platform, the group said in a report.

Seventy-six of those posts amassed a collective 141 million views in 24 hours after an explosion at the al-Ahli hospital in Gaza City on Oct. 17. The majority of the posts appeared on X Premium accounts, a subscription service that grants a blue “verified” checkmark to anyone willing to pay a monthly fee. Previously, such status was available only to public figures, journalists, and elected officials.

“Elon Musk has shaped X into a social media universe that revolves around his beliefs and whims while still shaping politics and culture around the world. And he’s using it to spread the most disgusting lies that humans ever invented,” said Emerson Brooking, resident fellow at the Digital Forensic Research Lab of the Atlantic Council think tank and co-author of the 2018 book “LikeWar: The Weaponization of Social Media.”

ANTISEMITISM GOES MAINSTREAM

Hatred against Jews has long been a feature of the internet. Extremists were early adopters of social media platforms, using them to find like-minded people to share views that would be distasteful in other settings, Brooking said.

In the 2000s, lies spread by anonymous users on platforms such as 4chan and Usenet blamed Jews for the Sept. 11, 2001, attacks and for the 2008 financial crisis. But the most extreme antisemitism, such as Holocaust denial, remained largely confined to the fringe, said Oren Segal, vice president of the Center on Extremism at the ADL. Well-known Holocaust deniers had little access to mainstream news media.

By the 2010s, however, an internet subculture that repackaged antisemitism into something seemingly more palatable started to take shape – often on newer and less moderated platforms like Discord, 8chan, and Telegram, and also on mainstream services like Facebook and YouTube. Instead of swastikas, the currency became jokes, memes like Pepe the Frog, and euphemisms for White supremacy like “alt-right.” The election of former president Donald Trump galvanized this group; Richard B. Spencer, then president of the white supremacist National Policy Institute, made headlines by telling a meeting of supporters after Trump’s election victory, “Hail Trump! Hail our people! Hail victory!”

“Suddenly, racists and antisemites who had lived at the margins of society found that they had new legitimacy. And a rising generation of far-right Americans saw that it was okay to say and do hateful things because the president was doing them already,” Brooking said.

The 2017 Unite the Right rally in Charlottesville, organized on Facebook and the gaming platform Discord, marked the first time many Americans, watching on television and online, heard the slogan “Jews will not replace us,” chanted by a torch-carrying crowd seeking to prevent the removal of a statue of Confederate Gen. Robert E. Lee.

“We saw an inflection point where online expression had turned into bigger real-world organizing,” the ADL’s Segal said of the demonstration.

Trump did little to tamp down these ideas and often amplified them, occasionally retweeting antisemitic memes and famously saying “there were very fine people on both sides” of the Charlottesville rally, at which a neo-Nazi sympathizer drove his car into counterprotesters, killing a woman.

In an emailed statement, the Trump campaign denounced any effort to link the former president to antisemitism. “The real racists and anti-Semites are deranged Democrats and liberals who are marching in support of terrorist groups like Hamas and calling for the death of Israel,” the statement said. “There has been no bigger champion for Israel than President Trump, as evidenced by moving the U.S. embassy to Jerusalem, signing laws that curb anti-Semitism, and much more.”

The statement added, “For a media organization like The Washington Post to make such a ridiculous charge proves it has its own racism and anti-Semitism issues they must address before casting stones.”

The Trump years and those that followed also saw the rise of mass shooters steeped in antisemitic conspiracy theories. Shooters in Christchurch, New Zealand, in El Paso, in Buffalo, and at the Tree of Life synagogue in Pittsburgh cited the great replacement theory as their inspiration, and in some cases posted manifestos about it.

Amid the growing violence, tech platforms that had taken a tolerant approach to antisemitic posts cracked down. YouTube banned Holocaust denial in 2019 and Meta did so in 2020, after CEO Mark Zuckerberg had defended not prohibiting such content just two years earlier. Both companies expanded their hate speech policies to include white supremacist content in 2019.

Those actions sent antisemitism back to the fringes, and to newer services, such as Gab, that specifically catered to right-wing audiences. “What I can tell you is major accounts that were spreading antisemitism . . . were falling like dominoes,” said Heidi Beirich, co-founder of the Global Project Against Hate and Extremism. “They were quickly re-platforming themselves in places like Gab. But there they were more preaching to the choir as opposed to being able to radicalize random people.”

Then in 2022, Musk’s $44 billion purchase of Twitter closed.

THE RIPPLE EFFECT

Musk had been saying for months that one of the reasons he wanted to buy Twitter was to embrace “free speech” and relax the platform’s content moderation practices. Hours after he took over, anonymous trolls flooded the site with racist slurs.

The rise in bigotry on the platform prompted civil rights groups to pressure advertisers – sometimes successfully – to pause spending on Twitter. Last November, Musk extended an olive branch to those activists, pledging in a private meeting not to reinstate banned accounts until there was a clear process for doing so. That concession angered far-right influencers on the site, who accused him of being a traitor to their cause.

Later that month, Musk reinstated thousands of accounts – including Trump’s – that had been banned for threats, harassment, and misinformation. Since then, hateful rhetoric on the platform has increased, researchers said.

Musk invited back banned Hitler apologists, sent out his own antisemitic tweets to his followers, and promoted the work of great replacement theory backers including former Fox News host Tucker Carlson. Those actions demolished the previous bounds of acceptable speech, inviting more people to weigh in with wild theories and emotions about religious and ethnic minorities.

On Wednesday, Gab’s official X account shared a meme celebrating that Musk had affirmed “Jews are the ones pushing anti-White hatred” along with the caption, “We are so back.” (The X post, which has since been deleted, was liked 19,000 times and viewed 720,000 times.)

On Friday, several major companies announced that they were pulling advertising from X, including Apple, Lionsgate Entertainment, and Comcast, parent of NBCUniversal. In the first quarter of 2022, Apple was Twitter’s top advertiser, accounting for nearly $50 million in revenue. Media Matters, a nonprofit media watchdog, published a report showing that X has been placing ads for Apple, Bravo, IBM, Oracle, Amazon, and more next to pro-Nazi content. On Saturday, Musk threatened to sue Media Matters, accusing it of misrepresenting “the real experience on X.”

Some news publishers also have pulled out of the platform. NPR shut down its X account in April after Musk falsely labeled the nonprofit broadcaster “state-controlled media.” On Thursday, the journalist Casey Newton announced that he would be pulling Platformer, the independent tech news outlet he founded, from X and would no longer include posts on X in the Platformer newsletter.

“It’s the only way I know how to send the message that no one should be there, that this is not a place where you should be going to get news or to discuss news or to have a good time,” he told The Post. “It is just over. If you wouldn’t join Gab, Parler, or Truth Social, there’s no reason you should be on X. I think it’s time for journalists and publishers, in particular, to acknowledge the new reality and to get the heck off that website.”

Newton said that media companies, including The Post, that continue to pay to advertise on the site are funding Musk’s hate campaigns. “Publishers have to look themselves in the mirror and ask, why did they get into this business in the first place?” he said. “Didn’t it have something to do with speaking out against oppression and bigotry and standing up in the face of oppression?”

A Post spokesperson declined to comment.

Hateful rhetoric that appears on X ripples out to the whole internet, normalizing an unprecedented level of antisemitic hate, experts said. “Twitter is the most influential platform in shifting sentiments,” said Imran Ahmed, CEO of the Center for Countering Digital Hate. “[It] has always had an outsize influence in determining what takes start to be perceived as the vox populi.” Musk has sued the CCDH for defamation over its reports on X.

The international reach of big social platforms such as Instagram and TikTok also has served to highlight tensions. TikTok has come under fire for videos critical of Israel or supportive of Palestinians that carry the #freepalestine hashtag; TikTok data show that many of those arise from predominantly Muslim countries, such as Malaysia and Lebanon, where support for Palestinians has long been high.

Dozens of high-profile Jewish content creators issued an open letter to TikTok earlier this month, saying that the platform hadn’t done enough to counter hatred and abuse toward the Jewish community on the app. On Wednesday, many of those creators, along with prominent celebrities including Amy Schumer, Debra Messing, and Sacha Baron Cohen, met with representatives from the company to voice their concerns. The conversation was heated and intense, according to creators who attended.

“We recognize this is an incredibly difficult and fearful time for millions of people around the world and in our TikTok community,” TikTok said in a statement. “Our leadership has been meeting with creators, civil society, human rights experts, and stakeholders to listen to their experiences and feedback on how TikTok can remain a place for community, discovery, and sharing authentically.” Since Oct. 7, TikTok has removed more than 730,000 videos for hate speech, including content promoting antisemitism, the company said.

Content creator Montana Tucker, the granddaughter of Holocaust survivors who has more than 9 million followers on TikTok and 3 million on Instagram, attended the meeting with TikTok. She said she’s noticed a sharp uptick in antisemitism across all platforms and plans to stay on X for now.

“It’s happening on every single app, unfortunately,” she said. “All of these people, I’m sure they would love for us to hide and to not post and to not share . . . but we need to be more vocal. We need to be on these apps and we need to continue to share. I think it’s more of a reason I need to start posting more on [X].”

Outside of social media, white supremacists and neo-Nazis have continued to use lightly moderated messaging platforms such as Telegram and group-run websites to distribute hate messages and propaganda since the Israel-Gaza war began, according to the Counter Extremism Project, a nonprofit that tracks the groups. The Global Project Against Hate and Extremism found that antisemitic and anti-Muslim posts on 4chan, Gab, Odysee, and Bitchute increased 461 percent, from 618 to 3,466, between Oct. 6 and Oct. 8.

A researcher at the Institute for Strategic Dialogue, a London think tank that tracks hate and disinformation, said online extremists were having a “field day,” with far-right groups using Hamas propaganda to bolster antisemitic messages.

Russia’s sophisticated disinformation apparatus also has seized on the conflict. One of Russia’s widest ongoing campaigns, known as Doppelganger, promotes fake articles on clones of major media websites. Links to the pages are sent out rapidly by large networks of automated accounts on X and Facebook.

For the past year, most of these articles have been aimed at undermining Western support for Ukraine, Russia’s top priority. But not long after Oct. 7, some Doppelganger assets started promoting the idea that the United States cared far more about Israel and would stop sending Ukraine as much aid, according to Antibot4Navalny, a group of volunteers who track Russian disinformation on the internet.

More recently, the social media accounts amplified pictures of Stars of David spray-painted on buildings in Paris, according to the nonprofit EU DisinfoLab. That advanced multiple objectives, the organization said: It generated additional concern about possible increases in antisemitism in France. It likely encouraged antisemites to think they are greater in number. And above all, it focused attention on Israel, rather than Ukraine and Russia.

Benjamin Decker, founder of Memetica, said that a major portion of 4chan links to outside coverage of Israel and Hamas goes to articles from media sources in Iran, China, or Russia. “You can’t attribute it to these actors yet, but from the beginning, there have been cross-platform communities with a vested interest in stoking hate,” he said. “There is a highly digital far-right community who loves celebrating the deaths of Jews, and that dovetails with Hamas.”

“We’re in a really dangerous place,” the CCDH’s Ahmed said. “There’s no clearer link between disinformation, conspiracy theories, and real-world hate than there is with antisemitism.”

Will Oremus and Drew Harwell contributed to this report.
