
Extremist communities continue to rely on YouTube for hosting, but most videos are viewed off-site, Northeastern research finds

Research finds that extremist and alternative content on YouTube is predominantly viewed off-platform via embedded clips. Photo by Alyssa Stone/Northeastern University

It’s easy to fall down the rabbit hole of the online video-sharing platform YouTube. After all, is there really a limit to cute pet videos?

But what about the platform’s more sinister side? After the 2016 U.S. presidential election, YouTube drew so much criticism for radicalizing users by recommending increasingly extremist and fringe content that it changed its recommendation algorithm.


Research four years later by Northeastern University computer scientist Christo Wilson found that, while extremist content remained on YouTube, it was subscriptions and external referrals, rather than the recommendation algorithm, that drove disaffected users to that content.

“We didn’t see this kind of ‘rabbit-holing effect,’” says Wilson, an associate professor at Khoury College of Computer Sciences at Northeastern. “There was in fact a lot of problematic content that was still on YouTube and still had a fairly significant audience of people who were watching it. It’s just that they haven’t been radicalized on YouTube itself.”

So if not on YouTube, where was this audience being radicalized? 

In new research presented at a conference on Tuesday, Wilson finds that extremist communities continue to rely on YouTube for video hosting — it’s just that off-site is where the “rabbit-holing” begins.

“If you’re already a political partisan and you’re going to websites with a particular leaning, that’s then leading you to YouTube channels and videos with the same kind of lean,” Wilson says. “If you started in a place where you’re being exposed to bad stuff, you end up in a place where you’re being exposed to more bad stuff.”

Northeastern University Associate Professor Christo Wilson found YouTube continues to host problematic videos, but most of those videos are seen on other websites via embedded clips. Photo by Matthew Modoono/Northeastern University

YouTube is an online video-sharing platform owned by Google. Following criticism for its role in hosting and elevating fringe conspiracy content, particularly through its recommendation algorithm, the platform changed that algorithm in 2019.

But the extremist content never fully disappeared. 

Much of it migrated.

“YouTube is not just YouTube itself — you can embed the videos into any website,” Wilson says. “This is the first study where we’re looking at all this stuff that happens off-platform.” 

Wilson studied more than 1,000 U.S. residents from three cohorts: demographically representative users, heavy YouTube users, and users with high racial resentment. He analyzed every YouTube video these participants encountered over a period of six months. All participants accessed the web via desktop rather than mobile devices.
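To picture what analyzing that browsing data involves, each video view can be labeled as on-platform or off-platform by checking whether the page it played on belongs to YouTube or to another site embedding the clip. The sketch below is a hypothetical illustration of that distinction, not the study's actual pipeline; the domain list and function names are assumptions.

```python
from urllib.parse import urlparse

# Hypothetical illustration only -- not the study's actual analysis code.
# A video view is assumed to come with the URL of the page where it played.
YOUTUBE_DOMAINS = {"youtube.com", "www.youtube.com", "m.youtube.com", "youtu.be"}

def classify_view(page_url: str) -> str:
    """Label a view as on-platform (YouTube itself) or off-platform (embedded elsewhere)."""
    host = urlparse(page_url).netloc.lower()
    return "on-platform" if host in YOUTUBE_DOMAINS else "off-platform"

print(classify_view("https://www.youtube.com/watch?v=VIDEO_ID"))  # on-platform
print(classify_view("https://example-blog.com/post-with-embed"))  # off-platform
```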

The research yielded several notable findings.

Wilson found that users saw more YouTube videos embedded on other websites than on YouTube itself.

He also found that politically right-leaning websites tend to embed more videos from “problematic” YouTube channels than centrist or left-leaning websites. Channels were considered problematic if they were classified as either alternative or extremist, using grades assigned by professional fact checkers or other academics. 

Wilson says an alternative channel would be, for example, Steven Crowder, a personality who interviews both mainstream scientists and vaccine deniers and is “sort of intellectually open.” Extremist channels, Wilson says, would be “openly hateful,” something like former Ku Klux Klan Grand Wizard David Duke’s old YouTube channel.

Most notably, users exposed to off-platform videos from problematic channels were significantly more likely to go on to watch on-platform videos from problematic channels.


“Your off-platform activity very quickly becomes on-platform activity,” Wilson says.

So, what can YouTube do? After all, Wilson admits, the platform can’t control what people do on other sites.

Wilson recommends stronger content-moderation policies. 

“YouTube can tell where videos are being embedded off platform,” Wilson notes. “If they see a particular channel being embedded in a website that is a known purveyor of misinformation, that channel should probably be scrutinized.”
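As a rough illustration of the signal Wilson is pointing to, a moderation pipeline could count how often each channel’s videos are embedded on domains that fact checkers have already flagged, and queue heavily embedded channels for human review. The snippet below is a hypothetical sketch under those assumptions; the records, domain list and threshold are invented for illustration and are not anything YouTube is known to use.

```python
from collections import Counter

# Hypothetical sketch of the heuristic described above; all data here is invented.
# Each record pairs the domain embedding a video with the channel that published it.
embed_records = [
    ("known-misinfo-site.example", "channel_A"),
    ("known-misinfo-site.example", "channel_A"),
    ("ordinary-blog.example", "channel_B"),
]
flagged_domains = {"known-misinfo-site.example"}  # e.g., sites rated untrustworthy by fact checkers

def channels_to_review(records, flagged, threshold=2):
    """Return channels embedded on flagged domains at least `threshold` times."""
    counts = Counter(channel for domain, channel in records if domain in flagged)
    return [channel for channel, n in counts.items() if n >= threshold]

print(channels_to_review(embed_records, flagged_domains))  # ['channel_A']
```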

Plus, Wilson notes that YouTube still hosts the videos, even if they appear on other sites. 

“They are aiding and abetting these fringe communities out there on the web by hosting videos for them,” Wilson says. “If they had stronger content-moderation policies, that would definitely help address this.”