A missed opportunity or the ‘first domino’? Northeastern experts differ on Australia’s youth social media ban

More than 1 million social media accounts of users under 16 are to be deactivated in Australia on Wednesday

Northeastern University experts differ on whether Australia’s social media ban for youth is the answer to addictive algorithms and exposure to harmful content. Photo by Anna Barclay/Getty Images

The wailing sound you may hear emanating from the southern hemisphere is the collective angst of Australian youth who lose access to TikTok and its peers this week, when their country enacts the world’s first social media ban for persons under 16.

The new regulation comes as many countries grapple with how to reduce adolescents’ exposure to addictive algorithms and harmful content online, which in some cases has led to bullying, criminal behavior and even suicide.

But social media experts are split on whether prohibiting social media is the answer. 

“To be very clear, I think that reducing children’s exposure to social media and the amount of time they spend on it is a good thing, and I think that using legislation and policy as a lever to achieve this is also a good thing,” said Rachel Rodgers, associate professor of psychology at Northeastern University. “But I think a ban is unhelpful.” 

Rodgers, who studies socio-cultural influences on body image and eating concerns, said that the ban could make social media more attractive to teens. 

“What bothers me most is the missed opportunity for passing legislation that would really have put Big Tech companies in the position of making drastic changes,” Rodgers said.

But John Wihbey, associate professor of media innovation and co-founder of Northeastern’s Internet Democracy Initiative, said Australia could be the “first domino” that could reshape global norms.

“Australia’s ban represents the kind of proactive ‘response principle’ we need – holding platforms responsible for systemic harms like the widespread endangerment of children, rather than continuing America’s laissez-faire approach that has left us with chaos,” Wihbey said. “Australia’s experiment could establish new international standards for protecting children online – showing that democratic societies can confront Big Tech’s power without sacrificing fundamental principles.”

More than 1 million social media accounts of users under 16 are to be deactivated in Australia on Wednesday.

It affects platforms including X, Facebook, Instagram, TikTok, Snapchat, YouTube and Reddit. The platforms have sole responsibility for enforcement and risk fines of up to 49.5 million Australian dollars (around $32 million) for serious or repeated breaches.

Many in the tech industry criticized the ban.

Paul Taske from NetChoice, a trade group representing several big tech companies, told the BBC that the ban was “blanket censorship.” 

Two Australian teens are challenging the ban in court, alleging it censors their right to free speech.

“We didn’t do anything wrong,” Noah Jones, one of the teens, told Sky News last week. “It’s social media companies that have the dangerous content.”

The platforms have said they will comply by using an array of age estimation and verification technologies, in addition to the user’s self-reported age, to deactivate youth accounts.

Rodgers called this method of enforcement “prohibitive,” arguing that it is problematic and “difficult to implement.”

“I worry that this will make it very attractive to young people to find ways to subvert this and to be on platforms anyway,” Rodgers said. That could lead to even greater exposure to social media harms, she added.

“If they are on platforms where they are no longer being recognized for the age that they actually are, then they no longer benefit from the protections that those platforms have in place for kids of that age, and therefore they’re actually being exposed to something that is likely more risky than they would have been previous to the ban,” Rodgers said. 

Moreover, the psychology professor said that it was a shame that the legislation essentially neglects the root of the problem: the proliferation of harmful content on social media and its addictive nature. 

“It’s not going to place social media companies under any pressure to further improve the experience for young people, it’s not going to push them to change anything for people who are 16 or over,” Rodgers said. “I think the most helpful thing we can do is make sure that young people are accessing content that is appropriate for them and to really focus on tailoring and scaffolding that experience in a way that involves guardians and trusted adults.”

Wihbey agreed with many of Rodgers’ points. He acknowledged some teenagers will circumvent the ban through false ages and virtual private networks and that age restrictions don’t address root causes like addictive algorithms and exploitative design. But he viewed the legislation as an important first step. 

“The goal isn’t perfection,” Wihbey said. “It’s bending the arc toward better outcomes by forcing platforms to invest in age-assurance technologies, to demonstrate duty of care, and to accept that hosting millions of children carries real responsibility, not just profit opportunities.”

“Bans work alongside broader reform, not against it,” Wihbey said.

He praised Australia’s actions. 

“Australia is demonstrating that platforms can and must be held accountable through regulation, catalyzing a vital global conversation about more comprehensive reforms that mandate platform duty of care, safe design settings, and algorithmic transparency,” Wihbey said. “Although I would rather see more ‘effects-based’ regulation that judges companies on specific outcomes for children, Australia is nevertheless taking important action where the U.S. has failed.”