What Elon Musk’s Twitter files don’t tell us

Conservatives have long accused Big Tech of being biased against them, without much evidence.

Now, the “Twitter files,” a trove of internal Twitter documents, are providing new ammo for these conservatives. Twitter’s new CEO, Elon Musk, has released the files to journalists Bari Weiss and Matt Taibbi, who, like him, are vocal critics of liberal “woke” culture.

This past week, Weiss and Taibbi shared details of some of the documents, along with their own analysis, in two long Twitter threads, and they plan to post more in the coming days. Their central accusation so far is that Twitter has long silenced conservative or contrarian voices, and they cite internal emails, Slack messages, and content moderation tools to show how Twitter limited the reach of popular right-wing accounts like those of Dan Bongino, Charlie Kirk, and Libs of TikTok.

But these claims and the internal documents lack crucial context.

We don’t have a full explanation, for example, of why Twitter limited the reach of these accounts, such as whether they were violating the platform’s rules on hate speech, health misinformation, or violent content. Without this information, we don’t know whether those rules were applied fairly. Twitter has long acknowledged that it sometimes downranks content that violates its rules instead of banning it outright. It’s a strategy that Musk himself has advocated for, arguing that people should have “freedom of speech, but not freedom of reach” on the platform.

And while Weiss has surfaced specific examples of Twitter limiting the reach of conservative accounts known for spreading hateful content about the LGBTQ+ community or pushing the “big lie” about the 2020 US presidential election, we don’t know whether Twitter did the same for far-left accounts that have also pushed boundaries, such as some former Occupy movement leaders who have complained about Twitter’s content moderation in the past.

Musk, Weiss, and Taibbi are also assuming these decisions were made with explicit political motivation. Historically, most Twitter employees, like workers across Big Tech, lean liberal, and Twitter’s conservative critics argue that this builds an inherent bias into the company’s content moderation decisions. But former Twitter employees Recode spoke with this week insisted that content moderation teams work in good faith to enforce Twitter’s policies, regardless of their personal politics. And research shows that Twitter’s recommendation algorithms actually amplify right-wing news more than left-wing news. What’s been shared so far in the Twitter files doesn’t offer clear proof that anyone at Twitter made decisions about specific accounts or tweets because of their political affiliation. We need more context and information to clarify what’s really going on here.

But to right-wing politicians, influencers, and their supporters, none of this nuance ultimately matters. Former President Donald Trump has used the files’ release to call for terminating parts of the US Constitution, Fox News host Tucker Carlson has said it’s proof that liberals are censoring conservatives online, and Rep. Marjorie Taylor Greene (R-GA) warned that “Oversight is coming.”

“We ALWAYS knew we were a target of the Twitter suppression machine. ALWAYS. Yet liberals insisted it was another ‘conspiracy theory,’” Bongino, a popular conservative commentator who Weiss’s reporting showed was seemingly barred from search results on Twitter at one point, tweeted on Thursday evening. “Tonight is vindication,” he wrote.

What the Twitter files do — and don’t — tell us

The first installment of the Twitter files, written by Taibbi, dissected Twitter’s controversial decision to block a New York Post story about Hunter Biden shortly before the 2020 US election. Twitter’s rationale at the time was that the story may have been based on hacked or fake materials; in the end, the materials appear to have come from a laptop Hunter Biden left at a repair shop, but their veracity and provenance were unclear when Twitter made its decision.

Taibbi’s breakdown of the internal debate at Twitter over whether to block the New York Post story was seen by some journalists as a “snoozefest,” because Twitter executives’ disagreement and regret about the decision, including from then-CEO Jack Dorsey, had already been reported. Nor do the new files reveal any clear political motive: the internal debate at the time focused on whether the story violated Twitter’s policies on hacked materials and the publishing of “personal and private” information.

The second installment of the Twitter files, by Weiss, shared previously unreported details about Twitter applying what it calls “visibility filtering” to certain conservative figures’ accounts, meaning that fewer people saw their tweets because Twitter appeared to take actions like hiding their names from search results, keeping their tweets out of trending topics, or downranking their tweets in people’s feeds. Weiss accused Twitter of “shadow banning” these accounts, but there’s dispute about what that term means.

Twitter defined shadow banning in a company blog post in 2018 as “deliberately making someone’s content undiscoverable to everyone except the person who posted it, unbeknownst to the original poster.”

One source who used to work in content moderation at Twitter told Recode that the examples Weiss reported on aren’t true shadow banning, because those tweets were still visible to other people.

There’s a lot of confusion around the many ways Twitter can demote people’s tweets without erasing them entirely. While Twitter has denied that it ever shadow banned users, it has never fully explained what “visibility filtering” means or which accounts it has been applied to. It’s easy to see how that could cause confusion and accusations of political manipulation. Still, for some former Twitter employees, the decision to demote accounts pushing hateful speech isn’t in itself controversial.
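To make that distinction concrete, here is a minimal sketch of the difference between the two concepts as Twitter has publicly described them. Every field and function name is hypothetical, invented purely for illustration; Twitter has never published how its actual systems are implemented.

```python
# Purely illustrative: hypothetical field and function names, not Twitter's real code.
from dataclasses import dataclass

@dataclass
class Account:
    handle: str
    excluded_from_search: bool = False   # account doesn't surface in search results
    excluded_from_trends: bool = False   # tweets can't appear in trending topics
    feed_rank_penalty: float = 0.0       # tweets pushed down in followers' ranked feeds
    shadow_banned: bool = False          # tweets visible only to their author

def is_shadow_banned(account: Account) -> bool:
    # Twitter's 2018 definition: content is undiscoverable to everyone
    # except the person who posted it.
    return account.shadow_banned

def is_visibility_filtered(account: Account) -> bool:
    # Softer limits: the tweets still exist and can be viewed, but they
    # surface less often (or not at all) in search, trends, or ranked feeds.
    return (
        account.excluded_from_search
        or account.excluded_from_trends
        or account.feed_rank_penalty > 0.0
    )

# An account can be visibility filtered without meeting the stricter
# shadow ban definition:
limited = Account("example_handle", excluded_from_search=True)
print(is_visibility_filtered(limited), is_shadow_banned(limited))  # True False
```

Under that framing, an account can have its reach limited without being shadow banned in the strict 2018 sense, which is part of why the two sides keep talking past each other.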

“I don’t see the scandal,” said another former Twitter employee, who spoke with Recode on the condition of anonymity because of fear of professional repercussions. The employee said that Libs of TikTok, an account that Weiss revealed had its reach limited by Twitter, is a “harmful” user that forced the company to restrict its visibility. The account has been blamed for inciting harassment of children’s hospitals, including bomb threats.

“Why wouldn’t you want to restrict amplification of an account like that?” the former employee said. “No one has a right to be amplified.”

But Twitter’s lack of transparency around why these accounts were limited opens the company to accusations that it overreached and showed political bias.

How Elon Musk is reacting

Musk says that Twitter is working on a feature that will show users if they’ve been shadow banned, the reason why, and how to appeal.
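Musk hasn’t said what that feature will look like. A bare-bones version might resemble the sketch below; every field name, reason string, and URL here is a placeholder assumption, not an announced design.

```python
# Hypothetical sketch of a "why is my reach limited?" notice.
# All field names and values are placeholder assumptions, not an announced Twitter feature.
import json

def moderation_notice(handle: str) -> str:
    notice = {
        "handle": handle,
        "active_limits": ["excluded_from_trends"],                   # which filters apply
        "reason": "repeated violations of the hateful conduct policy",
        "how_to_appeal": "https://example.com/appeal",                # placeholder URL
    }
    return json.dumps(notice, indent=2)

print(moderation_notice("example_handle"))
```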

Several sources Recode spoke with who currently or formerly work for major social media companies said that, historically, companies like Facebook or Twitter haven’t done this because it could make it easier for bad actors to game content moderation systems and evade rules.

But despite that risk, if Musk were to publicly reveal why users have been downranked, it might actually solve a bigger problem for Twitter: the perception that the company is secretly silencing conservative voices. What it might reveal instead is that running a well-functioning platform requires downranking harmful content, even when it’s posted by prominent conservative figures.

And sometimes it’s important to kick rule-breaking users off entirely, as Musk himself learned when Kanye West’s account was reinstated and West went on to repeatedly tweet antisemitic comments; Musk suspended the account again about a month later.

If we had more information about the full extent of accounts Twitter applies “visibility filtering” to and the rationale for why it does so, the Twitter files might provoke deeper conversations. If conservatives are the ones repeatedly breaking the rules around hateful content, does that mean they should be held to a different standard on the platform? Or should Twitter rewrite its rules around hate speech? So far, neither Musk nor his conservative supporters decrying the Twitter files seem to have an answer.
