Advertisers might not know it, but Facebook’s ad delivery service still discriminates by race and gender

Facebook chief executive officer Mark Zuckerberg arrives for a House Financial Services Committee hearing on Capitol Hill in Washington, Wednesday, Oct. 23, 2019, on Facebook’s impact on the financial services and housing sectors. AP Photo/Andrew Harnik

When Mark Zuckerberg testified before Congress this month, he was grilled on a whole host of topics. Members of Congress interrogated the Facebook CEO about the company's plans for a cryptocurrency, whether Facebook was violating antitrust laws, and its rules about political advertising.

Buried within the five hours of testimony was another important question: Does Facebook’s system for delivering advertisements reinforce discrimination on the basis of gender and race? 

The answer, it appears, is yes. Alan Mislove, an associate professor at Northeastern who studies online social networks, recently turned his attention to Facebook’s ad-delivery algorithm, and found some troubling results.  


Alan Mislove is an associate dean, director, and associate professor at Northeastern’s Khoury College of Computer Sciences. Photo by Matthew Modoono/Northeastern University

The group of users to whom Facebook chooses to show ads can be skewed along gender and racial lines, Mislove found, in potential violation of federal laws that prevent discrimination in ads for employment, credit, and housing.

For this research, Mislove and his colleagues tested Facebook's advertising system with a series of online advertisements. As the researchers tweaked the images, Facebook's system delivered the ads disproportionately to specific racial and gender groups.

The problem is not that advertisers are purposely excluding or targeting certain racial groups—although that was a problematic feature that Facebook eliminated this summer—but rather that the algorithm responsible for delivering those ads might be making choices that reinforce discriminatory stereotypes, Mislove says.

Mislove offered information about his findings to members of Congress before Zuckerberg’s hearing last week, and U.S. Rep. Maxine Waters, who chairs the House Financial Services Committee, made note of the issue in her opening statement.

Waters said that Facebook's ad delivery algorithms "were found to have a discriminatory impact when advertisers did not target their audience in discriminatory ways."

Facebook’s algorithms are shrouded in proprietary secrecy, but here’s what Mislove and other researchers do know: When someone buys an ad on Facebook, the advertiser can tailor to whom they want to advertise. According to Facebook, the advertiser can choose certain audiences based upon location, demographics, interests, behavior, and more. If you’re selling hospital scrubs, you might want surgeons to see your ad, for example. If you’re the maker of a video game, you might want people who play video games to see your ad. 
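To make that targeting step concrete, here is a minimal sketch in Python. The field names and structure are illustrative only; they are not drawn from Facebook's actual Marketing API.

```python
from dataclasses import dataclass, field

@dataclass
class Ad:
    """A hypothetical ad: what it promotes, what the advertiser bids,
    and the audience interests the advertiser targets."""
    name: str
    bid: float                               # e.g., dollars per 1,000 impressions
    targeted_interests: set = field(default_factory=set)

def is_eligible(user_interests: set, ad: Ad) -> bool:
    """An ad is eligible for a user if any targeted interest matches."""
    return bool(user_interests & ad.targeted_interests)

# The scrubs seller targets surgeons; the game studio targets gamers.
scrubs_ad = Ad("hospital scrubs", bid=4.00, targeted_interests={"surgery"})
game_ad = Ad("video game", bid=6.00, targeted_interests={"video games"})
```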


Then comes the sticky part: among all the advertisers who bid on you, Facebook has to decide which ads to show you. If you're a surgeon who plays video games, at least two different advertisers might be vying for your attention.

Mislove says this choice used to be based purely on price: if the video game maker paid more for ads, the video game ad won out.
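Continuing the sketch above, a purely price-based choice amounts to picking the highest bidder among the eligible ads. Again, this is a simplified stand-in, not Facebook's real auction:

```python
def pick_ad_by_price(user_interests: set, ads: list) -> Ad:
    """Price-only delivery: among eligible ads, the highest bid wins."""
    eligible = [ad for ad in ads if is_eligible(user_interests, ad)]
    return max(eligible, key=lambda ad: ad.bid, default=None)

# A surgeon who plays video games is eligible for both ads;
# the game studio's higher bid wins.
winner = pick_ad_by_price({"surgery", "video games"}, [scrubs_ad, game_ad])
print(winner.name)  # -> "video game"
```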

But more recently, Facebook has also factored estimates of relevance into the decision, Mislove says.

“Facebook wants to try to show you ads that it thinks will be relevant to you, but we don’t know how [the company] calculates relevance,” he says. 
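A common design in ad systems, and one consistent with Mislove's description, is to rank eligible ads by bid multiplied by an estimated relevance score. In the continued sketch below, the relevance model is just a stand-in number between 0 and 1, since the real calculation is undisclosed:

```python
def pick_ad_by_score(user_interests: set, ads: list, relevance) -> Ad:
    """Bid-times-relevance delivery: among eligible ads, the highest
    combined score wins. `relevance` stands in for Facebook's
    undisclosed model, mapping (user, ad) to a score in [0, 1]."""
    eligible = [ad for ad in ads if is_eligible(user_interests, ad)]
    return max(eligible,
               key=lambda ad: ad.bid * relevance(user_interests, ad),
               default=None)

# Toy relevance model: if it scores the scrubs ad 0.9 for our surgeon-gamer
# and the game ad 0.5, the lower bid now wins (4.00 * 0.9 > 6.00 * 0.5).
toy_relevance = lambda user, ad: {"hospital scrubs": 0.9, "video game": 0.5}[ad.name]
winner = pick_ad_by_score({"surgery", "video games"}, [scrubs_ad, game_ad], toy_relevance)
print(winner.name)  # -> "hospital scrubs"
```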

It’s possible, Mislove says, that this calculation “propagates the discrimination” already present in certain domains.

For example: Certain surgical specialties are overwhelmingly male, according to research from the American Medical Association. If someone is advertising a job posting for a surgeon, Facebook’s algorithm might decide that since most surgeons are men, the ad is most relevant to men, and deliver it only to Facebook users who are men. 
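To see how that propagation could work, suppose, with purely hypothetical numbers, that the relevance model was fit to engagement data in which most people who clicked on surgeon job ads were men. The learned scores would then reproduce that skew in who sees the ad at all:

```python
# Purely hypothetical relevance scores for a surgeon job ad, as a model
# might learn them from historically skewed engagement data.
learned_relevance = {"man": 0.9, "woman": 0.2}

def delivery_score(user_gender: str, bid: float) -> float:
    """Same ad, same bid; the learned 'relevance' term alone decides
    who is more likely to be shown it."""
    return bid * learned_relevance[user_gender]

print(delivery_score("man", bid=5.00))    # 4.5
print(delivery_score("woman", bid=5.00))  # 1.0
```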

And for ads about housing, credit, and employment opportunities, such discrimination by gender or race is illegal in the U.S.

Mislove doesn't think that Zuckerberg or Facebook is actively trying to discriminate among racial and gender groups; rather, he believes that an algorithm built to optimize engagement will do just that, no matter the cost.

“The broader problem here is that the algorithm is set up to do something, although we don’t know what that is,” he says. “But the way that the algorithm is going about its task could be enhancing existing biases.” 

For media inquiries, please contact Jessica Hair at j.hair@northeastern.edu or 617-373-5718.