Ten weeks into its existence, Meta’s Threads is trying desperately to avoid a fight about what content it allows on its site.

The logo for Meta’s new app Threads, right, appears next to that of its competitor, the platform formerly known as Twitter. Richard Drew/Associated Press

After Meta finally added the ability to search for keywords, users looking for posts on several significant topics – such as “COVID” – were surprised to turn up no results. Posts with that word existed, of course, but Meta is making them much harder to find (instead pointing users to the Centers for Disease Control and Prevention website). Rather than risk users finding something potentially dangerous or incorrect on Threads, its competitor to the platform formerly known as Twitter, Meta has decided it is better for them to see nothing at all.

Meta has acknowledged the blocking on Threads but declined to share its list of banned words. In addition to “coronavirus,” “vaccine” and “long COVID,” The Washington Post discovered that “sex,” “nude,” “gore” and “porn” were restricted. Following the publication of this column, Meta in a statement said: “The search functionality temporarily doesn’t provide results for keywords that may show potentially sensitive content. People will be able to search for keywords such as ‘COVID’ in future updates once we are confident in the quality of the results.”

It’s a flawed solution to the intractable problem of handling platform safety and managing politically charged content in the current climate. Contradictory court decisions and ever-persistent accusations of bias and inaction have made such caution inevitable. A little more than a year out from a presidential election, we’re arguably as far as we’ve ever been from answering the question of who gets to decide what is allowed on social media. So it’s no surprise that, having previously bent over backward to put in place policies to please (or maybe appease) everyone, Meta’s latest approach is to duck the conversation as best it can.

Consider the ruling issued by the 5th U.S. Circuit Court of Appeals late Friday. Three Republican judges upheld an earlier court’s view that the White House, government health officials and the FBI all likely violated the First Amendment through their badgering of social media companies to remove posts about COVID-19 and the 2020 election, effectively “commandeering their decision-making processes.”

Media reports described the ruling as plainly a “victory for conservatives,” which struck me as odd: I agreed with it, and I’m no conservative. Aggressive messages to Meta’s staff from the White House strategy director at the time, Rob Flaherty – such as “I want an answer on what happened here and I want it today” – surely crossed a line, infringing on Meta’s right to run its business as it sees fit.

But cutting off contact entirely would be irresponsible. Thus, the more sensible 5th Circuit ruling, much narrower in scope than the lower court’s order, made it legal again for the White House and other agencies to engage with networks on content moderation matters as long as requests didn’t contain “significant encouragement” to act. Fair enough.

At least, that’s my view. My line. Yours might be drawn somewhere else. You might feel that the hassling was justified if it meant preventing misinformation during a pandemic and that online platforms with enormous reach have a duty to listen to elected officials entrusted with protecting citizens in times of crisis. Your view might also be that government agencies, as we approach next year’s election, should be able to quickly step in to make sure the public has all the information it needs to cast an informed vote – especially because some social networks have a poor track record on protecting election integrity.

But your line might move depending on the source of the demand. How do you feel, for instance, about laws passed by Republicans in Texas and Florida that seek to force social networks to carry political speech they might otherwise choose to remove as policy violations? Supporters of this requirement, such as presidential candidate and Florida Gov. Ron DeSantis, say companies in the liberal haven of Silicon Valley can’t be trusted to fairly host the views of conservatives running for office.

Forcing social media companies to host material they don’t find acceptable under their own policies goes way too far. The ACLU agrees, and so does the 11th Circuit, which deemed the Florida law for the most part unconstitutional. Yet the 5th Circuit, the same court that now says the government can’t demand posts be taken down, previously upheld the Texas law that forces companies to keep posts up.

If you’re in the business of running a social network, this is an unworkable mess. That’s without even going into calls to reform Section 230, the law that, for now, gives platforms a layer of legal protection for the content they host.

Clarity may be some ways off. The Justice Department has requested the Supreme Court rule on the Florida and Texas laws, and the Biden administration can appeal its case regarding its communications with social networks (though it’s not yet clear whether it plans to do so).

Given all this, we can perhaps forgive Meta for seeking to minimize its exposure to uncertainty, though it is of course futile. “COVID censors are back” read a Fox News chyron this week during a segment about Meta. Blocking controversial words is in itself an unavoidably controversial act – it all depends on where you draw the line.
