A platform for violent hate speech has been implicated in three mass shootings. Should regulators step in?

Twenty minutes before the alleged shooter stormed into a Walmart in El Paso, Texas, and killed 22 people, he reportedly posted an anti-immigrant screed on 8chan, an online message forum that has been linked to violent extremism. The post marked the third time in six months that a mass shooter had published a violent message on the site before carrying out an act of terrorism.

Jessica Silbey is a professor of law at Northeastern who studies the intersection of law and social sciences. Photo by Adam Glanzman/Northeastern University

Government officials, pundits, and members of the public have been debating whether 8chan played a role in inciting the shooters to carry out mass violence. Some, including the original founder of 8chan, have called for the site to be shut down, while its current owner, Jim Watkins, has repeatedly refused to take it offline.

Northeastern professors argue that the federal government can’t take legal action to shut down 8chan, but, they say, internet intermediaries such as cybersecurity, domain, and search engine providers can minimize the spread of violent hate speech on 8chan by refusing service to the online message board.

“The answer isn’t to ban the speech, but we should try to minimize the network effects,” says Jessica Silbey, a professor of law at Northeastern who studies the intersection of law and social sciences. “8chan doesn’t want to censor anybody, but 8chan lives in an ecosystem where there are other internet actors who might want to prevent the spread of their stories. Changing the law isn’t the only answer. It’s a social choice that can be made.”

Kay Mathiesen is an associate professor of philosophy. Photo by Matthew Modoono/Northeastern University

The Monday after the shooting, Cloudflare, the security provider for 8chan, decided to terminate its services and take the website offline. The site remains down at the time of publishing. Matthew Prince, the chief executive officer of Cloudflare, justified his decision by calling the site a “cesspool of hate.”

But what makes 8chan so volatile? The site has become “an echo chamber for the vile and awful points of view” of its users, says Kay Mathiesen, an associate professor of philosophy at Northeastern who studies political philosophy and ethics.

“It’s a well-known theory in social psychology that when you’re in a group of like-minded people, the views start to get more extreme,” Mathiesen says. “Basically you’re creating a community where people with extreme points of view can feel validated, can feel like ‘if I do this violent act, this community is going to think I’m doing the right thing.’”

At the center of the conversation is Section 230 of the 1996 Communications Decency Act, which shields internet intermediaries from liability for content created by their users, and the question of whether the law needs to be revised. But even if the federal government stripped the immunity granted by the law, it would be hard to hold 8chan liable.

John Wihbey is an assistant professor of journalism and media innovation. Photo by Matthew Modoono/Northeastern University

Silbey says you would either have to prove a direct connection between an employee at 8chan and a post by its users, or have evidence that 8chan knew about harmful behavior and chose not to report it. Either would be hard to prove in a court of law, she says.

Another concern Silbey voices about mounting a legal challenge to 8chan is the potential for the erosion of First Amendment rights.

“Creating liability for speech alone is very hard,” Silbey says. “The First Amendment doesn’t allow the government to pass laws that restrict non-commercial speech unless doing so will prevent physical harm, so laws creating liability for hate speech are very narrowly tailored to only create liability if the hate speech is largely akin to a threat. There’s a lot of damage that could be done to the First Amendment if we start making more speech illegal.”

John Wihbey, an assistant professor of journalism at Northeastern who studies social networks, agrees, and says that the solution lies in service providers taking social and moral responsibility for content posted on the websites they serve.

“You’re not going to eliminate the threat of online hate speech, but companies can make a difference, just like how you can reduce gun violence by passing background checks and reducing assault weapons,” Wihbey says. “It’s the same thing. You can probably get some societal benefit by doing what Cloudflare and Google have done.”

For media inquiries, please contact media@northeastern.edu.