Facebook’s job ad algorithm still has gender biases, according to researchers at the University of Southern California. The new study comes after repeated efforts by Facebook to decrease gender disparities.

USC researcher Aleksandra Korolova

Using a newly developed methodology, the USC researchers show that the ad-delivery bias can’t be explained away by differences in recipients’ qualifications. Rather, the skew is based on gender, which could make Facebook’s algorithm illegal.

The study examined programmatic job ads from pairs of comparable employers whose workforces skew toward different genders.

First it looked at ads for delivery workers by Domino’s Pizza and Instacart: Domino’s employs more men than women, while Instacart employs more women than men.

The researchers found that, in line with the male-skewed staffing at Domino’s, its job ads through Facebook tended to reach more men than women. Instacart ads, meanwhile, tended to be seen by more women than men.

The same phenomenon affected job ads for higher-paying positions, the study found. To show that, the researchers compared ad campaigns for tech workers at Netflix, whose staff skews more female, against those for Nvidia, whose staff skews more male. Netflix job ads tended to reach women, while Nvidia’s reached more men.

Because the ads were for nearly identical jobs, the authors concluded, the differences in whom they reached must come down to differences in gender, a protected class. “We confirm skew by gender in ad delivery on Facebook, and show that it cannot be justified by differences in qualifications,” they wrote.
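The paper’s own analysis is more involved, but the basic statistical idea can be illustrated with a simple sketch: given delivery counts for two otherwise-identical job ads, a two-proportion test checks whether the difference in the share of women reached is larger than chance alone would allow. The function and all numbers below are hypothetical and for illustration only; they are not the study’s code or data.

```python
# Minimal sketch (not the study's actual method): a two-sided
# two-proportion z-test on the share of women reached by two paired ads.
from math import sqrt, erfc

def gender_skew_ztest(women_a, total_a, women_b, total_b):
    """Return (z, p) for the difference in the share of women
    reached by ad A versus ad B."""
    p_a = women_a / total_a
    p_b = women_b / total_b
    # Pooled proportion under the null hypothesis of no skew
    pooled = (women_a + women_b) / (total_a + total_b)
    se = sqrt(pooled * (1 - pooled) * (1 / total_a + 1 / total_b))
    z = (p_a - p_b) / se
    p_value = erfc(abs(z) / sqrt(2))  # two-sided normal approximation
    return z, p_value

# Hypothetical delivery counts for two otherwise-identical delivery-job ads
z, p = gender_skew_ztest(women_a=4300, total_a=10000,
                         women_b=5600, total_b=10000)
print(f"z = {z:.2f}, p = {p:.3g}")
```

With counts of that size, even a gap of a few percentage points in who sees the two ads produces a vanishingly small p-value, which is why a consistent skew across many paired campaigns is hard to attribute to chance.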

The authors applied the same research methods to job ads on LinkedIn and found no such skew.

The charges against Facebook aren’t new. In 2018, a group of job seekers, with the support of the American Civil Liberties Union, filed charges alleging that tech employers were using Facebook’s ad-targeting tools to get job ads in front of male job seekers, in violation of federal anti-discrimination laws.

And a study last spring, which USC researchers also participated in, found that Facebook’s algorithm could discriminate even when the employer placing the ad wanted to reach a balanced audience. That study accused Facebook of bias along racial as well as gender lines.

At the time, Facebook announced, “We stand against discrimination in any form. We’ve announced important changes to our ad targeting tools and know that this is only a first step.”

In the wake of the new study, the company released a similar statement:

“Our system takes into account many signals to try and serve people ads they will be most interested in, but we understand the concerns raised in the report. We’ve taken meaningful steps to address issues of discrimination in ads and have teams working on ads fairness today.” 

However, it’s been more than two years since the ACLU charges were filed, raising questions about how effective the company’s efforts have been.

Aleksandra Korolova, a former Google researcher and a co-author of the new study, told the Wall Street Journal that she was surprised Facebook hadn’t addressed the skewed distribution of job ads. “They’ve known about this for years, and it’s an important question for society,” she said.
