
Facebook ad algorithms may be harmful to a well-informed democratic society, Northeastern research scientist tells European Parliament

Piotr Sapiezynski, associate research scientist at Khoury College of Computer Sciences, traveled to Brussels to testify about Facebook ad delivery algorithms that make it more difficult and more expensive for political parties to reach the potential voters who don’t already agree with the advertised message. Photo by Matthew Modoono/Northeastern University

Northeastern University research scientist Piotr Sapiezynski recently told the European Parliament that Facebook’s ad delivery algorithms may be harmful both to political campaigns and to society at large.

Sapiezynski testified during a hearing on draft legislation concerning transparency and targeting of political advertising in Brussels. 

“Facebook’s algorithms make it more difficult and more expensive for political parties to reach the potential voters who, according to Facebook, don’t already agree with the advertised message,” said Sapiezynski, associate research scientist at the Cybersecurity and Privacy Institute at the Khoury College of Computer Sciences, during a public hearing of the European Parliament’s Committee on Internal Market and Consumer Protection on July 11. 

The proposed regulation on transparency and targeting of political advertising would be the first in the European Union to govern political advertising online, since current legislation covers only traditional media such as television and newspapers.

Piotr Sapiezynski, associate research scientist at Khoury College of Computer Sciences, believes that technology companies should be required to give users, journalists and researchers access to the information necessary to hold both the advertisers and the platforms accountable. Photo by Matthew Modoono/Northeastern University

The draft legislation provides for general transparency obligations of all actors involved in the financing, preparation, placement and dissemination of political advertising, both offline and online, and aims to protect individuals’ personal data by laying down rules on the use of ad targeting and amplification techniques. The proposal intends to increase transparency of political ads across the European Union before the upcoming European Parliament election in 2024.

Sapiezynski and his collaborators have been studying Facebook’s advertising platform for the last few years, he said, discovering that the platform’s ad algorithms demonstrate gender, race and other biases. Their findings about Facebook’s practices of collecting and using personal data without consent became a part of a $5 billion settlement with the U.S. Federal Trade Commission.

They have also identified that Facebook’s algorithms led to gender and race discrimination in the delivery of ads for employment and housing, for which the U.S. Department of Justice sued Facebook’s parent company, Meta Platforms Inc., in June in the first case challenging algorithmic discrimination under the Fair Housing Act.

“We are operating under the assumption that they are a for-profit company so they do whatever makes the most money,” Sapiezynski said. “It just so happens that optimizing for profit does not align with the goals of a healthy society.”

In their studies, Sapiezynski and his collaborators looked closely at the distinction between ad targeting and ad delivery. Facebook gives advertisers plenty of tools and options to analyze and select the target audiences who will see an ad by demographics, location, interests and so on, Sapiezynski said. Advertisers might engage in micro-targeting or discriminatory behavior in this phase; however, the researchers are more concerned about the harms that arise when Facebook makes certain algorithmic decisions in the second phase, ad delivery, even when the advertiser is well-intentioned or uses broad targeting criteria.

“The budget that the advertiser sets for a particular ad is typically not sufficient to reach every single person in the targeted audience,” Sapiezynski said. “The platform decides who among the targeted audience will actually see the ad.” 

The researchers have discovered that the delivery decision is not made at random. It is made by an algorithm optimized for profit based on all of the user data that Facebook has, Sapiezynski said. 
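The mechanism the researchers describe can be illustrated with a toy sketch. This is not Facebook’s actual algorithm, and every name and score below is invented; it only shows the basic shape of the finding: when the budget covers part of the targeted audience, the platform, not the advertiser, picks which part, ranking users by its own predicted relevance score.

```python
# Illustrative sketch only: a toy model of budget-constrained ad delivery.
# All user names and relevance scores are invented for illustration.

def deliver(audience, budget_impressions):
    """Decide who among the targeted audience actually sees the ad:
    rank users by the platform's predicted relevance and serve the top slice."""
    ranked = sorted(audience, key=lambda u: u["relevance"], reverse=True)
    return [u["id"] for u in ranked[:budget_impressions]]

audience = [
    {"id": "alice", "relevance": 0.9},  # predicted likely to agree with the ad
    {"id": "bob",   "relevance": 0.2},  # predicted likely to disagree
    {"id": "carol", "relevance": 0.7},
    {"id": "dave",  "relevance": 0.1},
]

# The budget covers only half of the targeted audience. The advertiser
# targeted all four users, but only the two the platform scores as most
# receptive are served.
print(deliver(audience, 2))  # ['alice', 'carol']
```

In this toy model the advertiser’s targeting choice is respected, yet the users predicted to disagree never see the ad, which is the delivery-stage skew the researchers measured.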

The more time a user spends on Facebook, the more ads they view, and the more money Meta makes. The ad delivery algorithms are designed to maximize the relevance of ads to users and avoid showing them upsetting or uninteresting ads in hopes that users will continue browsing and viewing additional ads. 

“They avoid showing uninteresting ads, or they charge more for showing them to cover the potentially lost income,” Sapiezynski said. 

In one of the studies that Sapiezynski and his colleagues conducted in the U.S. in 2019, they ran ads for then-presidential candidates Bernie Sanders and Donald Trump targeting Democratic and Republican audiences. Although in the first experiment the ads targeted the same general audiences, the ads for Bernie Sanders were delivered mostly to Democrats, whereas the ads for Donald Trump were shown mostly to Republicans, Sapiezynski said.

In further experiments, the researchers found that it was up to four times more expensive to show ads to targeted audiences that disagreed with the message and did not support the featured candidate.
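A rough arithmetic sketch shows how a cost gap like this can arise from relevance-weighted pricing alone. The numbers and the pricing rule below are assumptions chosen for illustration, not Facebook’s actual model: if the platform discounts impressions in proportion to predicted relevance, the effective price scales as the base price divided by the relevance score.

```python
# Illustrative sketch only: toy numbers showing how relevance-weighted
# pricing can make disagreeing audiences more expensive to reach.

def cost_per_impression(base_price, relevance):
    # Assumed pricing rule: the platform charges more to offset the income
    # it expects to lose by showing a less "relevant" ad.
    return base_price / relevance

aligned = cost_per_impression(1.0, 0.8)     # audience predicted to agree
misaligned = cost_per_impression(1.0, 0.2)  # audience predicted to disagree

# With these invented scores, reaching the disagreeing audience costs
# four times as much.
print(misaligned / aligned)  # 4.0
```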

“This way Facebook limits political advertisers’ ability to reach audiences that, in Facebook’s estimation, do not share those advertisers’ political views,” Sapiezynski said. “This is in stark contrast with advertising in traditional media, where reaching a voter costs the same, regardless of the identity of the political advertiser.”

People are more likely to see ads that reinforce their world view and less likely to be exposed to messages from other parties, even if those parties are actively trying to reach various Facebook users. Users who are algorithmically prevented from seeing different political ads and points of view will have limited sources of information when making their choices at the voting booth, Sapiezynski said.

“That optimization plays an important role in shaping which voters see which political messages, and it does not align with the goals of a well-informed, democratic society. If the legislation focuses only on limiting the ad targeting options, it will give advertising platforms even more power to make non-transparent delivery decisions, and it will not guarantee meaningful improvements in the experiences of actual users,” Sapiezynski said.

He called for legislation that explicitly recognizes the distinct roles that advertisers and platforms play and that forces technology companies to give users, journalists and researchers the information necessary to hold both the advertisers and the platforms accountable.

Facebook has already been required to publish all political ads in the Facebook Ad Library, Sapiezynski said; however, the library does not contain enough information to understand what is really happening.

“All we can see right now is what kind of messages political actors are advertising, but we don’t fully understand who they are targeting with these messages and we also don’t understand what Facebook does with those targeting options, how Facebook optimizes the delivery,” he said. “We only get a rough estimate of the distribution of who saw the ad by gender, age and by state.”

At the core of all this is the processing of personal data, Sapiezynski said, because Facebook’s algorithms operate using “millions” of sensitive characteristics that define or describe a user. Some proponents of the new EU transparency regulation are pushing to severely limit the ability of online platforms to process personal data. They propose allowing platforms and advertisers to use only a short list of features and prohibiting them from targeting users with any other characteristics.

“I am willing to believe that it [Facebook advertising algorithm] actually works well for products. The problem really comes in when you think about higher stakes, advertising for life opportunities, and so on,” Sapiezynski said. “And what we see is that Facebook is basically replicating the biases that we already know exist.”

