Meta's new age-appropriate content controls can play a crucial role in minimizing mental health risks for teenagers, says a Northeastern University expert who serves on the company's youth safety advisory committee.
Rachel Rodgers, an associate professor of applied psychology at Northeastern who has served on the committee for the past two years, says the controls should foster better communication between young people and their parents about safe social media use and encourage the platforms to guide teens proactively in beneficial ways.
“It’s terribly important that they have put together these new policies to further restrict what young people are able to see on the platforms,” Rodgers says.
A number of mental health risks play out for younger users on social media, she says, related to social relationships, self-harm, loneliness and body image.
In a recent announcement, Meta revealed that it will now automatically place all teens into the most restrictive content control settings on Instagram and Facebook, restrict additional search terms on Instagram, and begin hiding more types of content for teens on both platforms, in line with expert guidance.
A young person can currently create and manage their own Meta account, including Facebook and Instagram accounts, at age 13, while children ages 10 to 12 can only use Meta's Oculus virtual reality headsets with a children's account managed by a parent or guardian. The company encourages users to report underage accounts on its platforms and is developing AI and other tools to find and remove underage accounts and to verify users' ages.
Meta's revised policies are organized by age block, Rodgers says.
“Research shows that adolescents do go through phases during the teenage years, and, therefore, there is some type of content that will be more appropriate for older teens than it would be for younger teens,” she says.
Social media differ from other media such as movies, magazines or TV, Rodgers says, because they are interactive and anybody can contribute to them, making users more susceptible to the content's influence. Children and teenagers are especially vulnerable on social media because they do not yet have the critical appraisal skills of adults and can miss nuance.
“You perceive people who you’re seeing on social media as being very proximal to you in a way that you might not interpret a model in a magazine as being somebody similar to you,” Rodgers says.
But many social media users, Rodgers says, are trying to gain something, whether popularity or an ego boost. That is why they often try to make their content more emotionally impactful so that it spreads widely.
Young people lack the ability to discern that and can be affected by the messaging they see. Her work, Rodgers says, shows that body image is one of the central concerns for teenagers on social media.
It is also challenging for adolescents to pick up on commercial intent. Research has shown, she says, that children start to understand around age 8 that they are seeing advertisements on television, for example, that someone is trying to sell them cereal. But on social media, the line between content with a clearly commercial intent and content without one is blurred.
“It’s more of a stretch to understand that their favorite sports character is wearing those sneakers for a particular reason,” she says.
Another concern Meta addresses in the updated policies is posts about self-harm. If somebody posts about such ideations, the content will be filtered out and nobody, not even the person's followers, will see it.
Rodgers hopes that if young people know that nobody will see this “call for help,” they will be prompted to seek help in other, more effective ways.
“I would also hope that platforms have other policies in place that allow them to be offering resources and other types of supports to young people when they are seeing that they’re posting that type of content,” she says.
Instagram, for example, already displays a message with alternative resources when somebody searches for self-harm-related terms.
Her research with parents, Rodgers says, indicates that social media use is a general concern for them: they don't know or understand what their teens are looking at on these platforms and worry that they might be seeing things that are not age-appropriate. Meta, for example, does not give parents access to their teens' accounts.
Rodgers hopes that content restrictions will generate conversations between teens and parents.
“We know that one of the important predictors of good outcomes in terms of using social media is parental involvement in their teen’s social media activities,” Rodgers says.
But parents should not be too controlling, she says, nor too laissez-faire, leaving their children complete freedom in how they use social media.
Parents can provide context, meaning and outside discussion around some of the things teens are doing on social media. That context is really important, Rodgers says, for young people to be able to process and understand how a post relates to them and their lives.
“Discussions with parents are really critical parts of that,” she says.
At the same time, her research has shown that teenagers are aware there are things on social media they might not want to see, whether because they would find them upsetting or because the content wouldn't be appropriate for them. Teens, she says, seem to be on board with the idea of content restrictions.
The changes in Meta's policies can also prompt everyone else to think about the ways they use social media, Rodgers says.
“In my research I certainly see that periods of vulnerability don’t stop at 18, particularly in terms of body image and disordered eating,” she says.
She is eager to see how far the changes in content controls go toward minimizing the mental health impacts experts currently see. Further adjustments might also be required, she says.
“A lot of what we’ve seen so far is reactive,” she says.
Social media companies have made changes only after realizing there was a problem.
“What I would love to see is more proactive activity,” she says. “Most youth use social media platforms and, therefore, I think it’s the perfect place for us to be disseminating content that actually proactively helps them build skills to show up better in the digital world and be useful to society.”