So far, Elon Musk’s ‘Twitter Files’ amount to ‘a tempest in a teapot,’ expert says

by Tanner Stening
December 5, 2022

Photo illustration by Matthew Modoono/Northeastern University

There was a lot of buzz over the weekend about the so-called “Twitter Files,” a collection of communications released on Friday that revealed how the social media company, among other things, suppressed a story about Hunter Biden, President Joe Biden’s son, ahead of the 2020 presidential election.

The story, published by the New York Post, details most prominently the young Biden’s shady business dealings abroad, using files and communications that are alleged to have come from his personal laptop. Prominent news outlets have since confirmed the authenticity of some of the communications obtained by the Post, but the most damning allegations contained therein remain unproven (though, others would argue, not unworthy of investigation).

Left to right: Claudia Haupt, associate professor of law and political science at Northeastern, and John Wihbey, associate professor of media innovation and technology at Northeastern. Photos by Matthew Modoono/Northeastern University

Twitter’s response to the story surfacing on its platform then (in 2020) became a jumping-off point for the Twitter Files, which began to be published by independent journalist Matt Taibbi in a lengthy Twitter thread, with the social media platform’s new billionaire owner looking on in complicit amusement (Elon Musk provided Taibbi with the documents, with certain unspecified conditions attached).

The info dump fueled longstanding right-wing distrust of “big tech” by suggesting a nexus between pre-Musk Twitter and the Democratic Party. But on the left, it elicited a collective yawn.
That’s because the files so far reveal little more than how Twitter bungled its response to the Hunter Biden story, airing disagreements among Twitter employees about how the company initially handled it, which was to block the story and prevent users from sharing it, before correcting course.

But content moderation decisions aren’t always easy or straightforward, says John Wihbey, associate professor of media innovation and technology at Northeastern, especially given the implications of unchecked speech for democracy, as was seen during the Jan. 6 attack on the U.S. Capitol, and for vulnerable communities, such as those Twitter has helped provide a platform for.

“I’ve seen these decisions up close and they’re really hard,” says Wihbey, who worked as a consultant on Twitter’s misinformation team for a couple of years. “You have to weigh things like, is this some kind of ruse or scam? And the New York Post runs with quite crazy and badly sourced stories all of the time.”

It’s important to note, Wihbey says, that while the decision to suppress the story on its merits “was wrong-headed,” it was overturned by then-CEO Jack Dorsey. More evidence, he says, of just how messy content moderation is in practice.

“So it’s a bit of a tempest in the teapot,” he says.

In the process of publicizing the largely unredacted files, Musk and Taibbi released the names and email addresses of Twitter employees. The move, “which is essentially doxxing,” Wihbey says, goes against the very safety and trust that platforms like Twitter work to foster. “That is deeply troublesome for a whole variety of reasons,” he says.

The files also served as a rallying cry for “free speech absolutists,” such as Musk, who in the past has repeatedly knocked the platform for censorship. The evidence presented of pre-Musk Twitter’s alleged political bias in favor of the Democrats must translate into widespread suppression of conservative speech, the argument goes.
After all, research has shown that Republicans are “slightly more likely” than Democrats and independents to report having social media posts flagged or removed by the platforms.

On the one hand, it’s a tired argument, says Claudia Haupt, associate professor of law and political science at Northeastern, a “very common rhetorical move to make,” often by conservatives, to contrive a First Amendment problem “where there isn’t one.”

“Private companies are free to moderate speech according to their own community standards, terms of service and moderation guidelines,” Haupt says. “This is based on our current understanding of the First Amendment.”

On the other hand, Haupt wonders whether the courts might look to broaden their reading of the First Amendment to account for the integration of platforms like Twitter into the fabric of public discourse and debate.

“If most speech is actually taking place on a private platform, then maybe we ought to rethink how we interpret and apply the First Amendment in these spaces,” she says.

Until then, certain political factions and their associated media platforms may well continue to frame such decisions as censorship. After publishing the first chunk of documents, Musk promised a second installment in the coming days.

For media inquiries, please contact email@example.com.