A Northeastern University team tested Facebook’s algorithm and found its delivery of advertisements is skewed by race and gender

Illustration by Hannah Moore

The research was troubling. It showed that the group of users to whom Facebook chooses to show an ad can be skewed along gender and racial lines, in potential violation of federal laws that prohibit discrimination in advertising for employment, housing, and credit.

A Northeastern team tested Facebook’s advertising system with a series of online advertisements. As the researchers tweaked the images, Facebook’s system delivered the ads predominantly to specific racial and gender groups.

Alan Mislove, associate professor of computer science. Photo by Matthew Modoono/Northeastern University

Other prominent online platforms, including Google and Twitter, use similar algorithms to target specific audiences. The pursuit of efficiency has contributed to the distortion, says the lead researcher, Alan Mislove, an associate dean, director, and associate professor at Northeastern’s Khoury College of Computer Sciences.

“I was surprised at the strength of the results in terms of how strong you see the skew in particular circumstances,” Mislove says. “I did not imagine that we would be able to run ads that are exactly the same, except for the image. And that, depending on the image we chose, one would go 90 percent to men, one would go 90 percent to women—that you would see that strong of a skew. In science, you hardly ever see something that obvious, right?”

Their six-month investigation, in cooperation with researchers at the University of Southern California and Upturn, a nonprofit tech company, was not aimed at advertisers with malicious intent. Instead, Mislove and his fellow Northeastern computer scientists, Muhammad Ali and Piotr Sapiezynski, were focused on Facebook’s platform for delivering advertisements.

Its mysterious algorithm, built on years of data vacuumed up from billions of user experiences online, is meant to steer ads toward the Facebook users who will find them most “relevant.”

“Increasingly, our lives are being mediated by large online algorithm-based platforms,” Mislove says. “Those companies have financial interests in trying to optimize certain things. And in many domains, that may be OK: You advertise shoes for women, we’re probably OK with the ad going to women.

“But there are other domains where that controlling of the flow of information could lead to significant negative outcomes for society at large. The most obvious of those are credit, housing, and employment, which, in addition to being very important domains, also have special legal protection.”

Ali and Sapiezynski created harmless, carefully controlled suites of ads that linked to real job and housing services. While the researchers couldn’t know exactly why the algorithm chose to deliver an ad to particular users, they could see aggregate statistics about the gender and location of the users who saw it. And by focusing the ads on North Carolina, which makes its voter records public, they could use those statistics to infer the racial distribution of the audience as well.
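That inference step can be illustrated with a brief sketch. The example below is not the researchers’ actual pipeline; the region names, impression counts, and voter-file shares are hypothetical. It simply shows how aggregate, per-region delivery statistics could be weighted by the racial composition of each region, taken from public voter records, to estimate the racial breakdown of an ad’s delivered audience.

```python
# Illustrative sketch (not the researchers' actual pipeline): estimate the
# racial breakdown of an ad's delivered audience by combining aggregate,
# per-region delivery statistics with the racial composition of each region
# drawn from public voter records. All figures below are hypothetical.

from collections import defaultdict

# Impressions the platform reports for each targeted region.
delivery_by_region = {
    "region_a": 1200,
    "region_b": 800,
    "region_c": 500,
}

# Racial composition of registered voters in each region (from public
# voter files); the values in each inner dict sum to 1.0.
race_share_by_region = {
    "region_a": {"white": 0.55, "black": 0.40, "other": 0.05},
    "region_b": {"white": 0.80, "black": 0.15, "other": 0.05},
    "region_c": {"white": 0.30, "black": 0.65, "other": 0.05},
}

def estimate_audience_breakdown(delivery, composition):
    """Weight each region's racial composition by the impressions it received."""
    totals = defaultdict(float)
    for region, impressions in delivery.items():
        for race, share in composition[region].items():
            totals[race] += impressions * share
    grand_total = sum(totals.values())
    return {race: count / grand_total for race, count in totals.items()}

if __name__ == "__main__":
    estimate = estimate_audience_breakdown(delivery_by_region, race_share_by_region)
    for race, share in sorted(estimate.items(), key=lambda kv: -kv[1]):
        print(f"{race}: {share:.1%}")
```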

“Could the platform lead to discriminatory ad delivery?” Mislove says. “It’s interesting because that may not be what the advertiser intended.”

The results were stark. By changing only the images, the researchers could watch Facebook zero in on specific audiences for the ads run in the experiment: 75 percent black for taxi-driver jobs, 85 percent female for supermarket-cashier jobs, and 72 percent white and 90 percent male for lumberjack jobs.

Their ads for homes for sale reached an audience that was 75 percent white, while their ads for rental units reached a more balanced audience. Mislove is careful to note that the results could differ for other housing ads, but the fact that the researchers saw significant differences across a small set of ads suggests that many other housing and employment ads are affected.

Mislove says he and his fellow researchers felt a sense of competitive urgency normally attributed to journalists on the verge of uncovering news for the public good.

“We were worried about getting scooped,” Mislove says. “My thought was that this is so obvious, somebody else is going to find this. So we need to work hard and try to get it out sooner rather than later.”

On March 28, when the Department of Housing and Urban Development charged Facebook with violations of the Fair Housing Act, Mislove knew he had to release the research. The findings were first reported by The Economist last week.

Even if Facebook were to stop optimizing the delivery of advertisements in the protected areas of credit, housing, and employment, the skew could persist because of the demands of the marketplace.

“All advertising is based on auctions all over the web, and I don’t know how you fix that without just saying we don’t have those kinds of ads,” Mislove says. “So that is very much an open problem.”
