YouTube’s new ad policy is designed to protect kids. Will it drive them away instead?

by Molly Callahan, January 9, 2020

[Illustration by Hannah Moore/Northeastern University]

This week, YouTube announced changes to its advertising policies for children viewing videos on the site. The changes are designed to protect against the collection of children’s personal information, but may have unintended consequences, says Northeastern assistant professor Keith Smith: reducing the overall amount of content available for children and pushing them toward streaming sites that are less child-friendly.

[Photo caption: Keith Smith is an assistant professor of marketing in the D’Amore-McKim School of Business at Northeastern. Photo by Matthew Modoono/Northeastern University]

YouTube officials say the company will ban all targeted advertising on videos made for children, a move meant to address concerns that the media company may have violated the federal Children’s Online Privacy Protection Act. The act prohibits “unfair or deceptive acts or practices in connection with the collection, use, and/or disclosure of personal information from and about children on the internet.”

For children’s video creators who rely on ad revenue to make a living, the change could spell trouble, says Smith, who studies the way digital products and social media affect the bottom line for managers and marketers.

“Many of these content providers make a living on YouTube, and specifically, make a living through the revenue from targeted ads,” Smith says.

Targeted ads are advertisements directed toward specific audiences with certain personal traits. Those traits are inferred from a person’s online browsing habits, which tech companies collect as data and sell to advertisers.

“When you show an ad on the Disney Channel, or Nickelodeon, those ads are rarely targeted to any level of depth,” Smith says.

[Photo caption: YouTube is changing its ad policies to prevent the collection of children’s personal information. But the changes may have unintended consequences, says assistant professor Keith Smith. Photo by Ruby Wallau/Northeastern University]

In those cases, companies know that children are likely watching the shows, but not, say, their ages, genders, or which shows they’ve seen before: data that YouTube’s parent company, Google, does collect.

Targeted ads are “an influential way to target people who don’t find it creepy,” Smith says, because they’re so much more tailored to a person’s individual interests. “But for a child, who is sensitive to that kind of targeted appeal to their personality, [a targeted ad] is going to be much more persuasive.”

And because targeted ads reach people who are more likely to buy the advertised product or service (or at least more likely to click on the ad itself), they’re more valuable to advertisers than general ads, Smith says.

So, he says, people who make videos for smaller audiences (creators who have been able to compensate for their limited reach by bookending their videos with relatively high-revenue targeted ads) “are going to find themselves pushed out.”

Smith says the new policy may push those “fringe-level content creators” to other video-streaming sites, such as Twitch or Mixer, which were designed for people to livestream video gameplay.
“These platforms aren’t as conducive to child-based consumption, so you might end up with children looking for videos on a platform that is not infrequently focused on games that have [a mature] rating,” Smith says.