After Lawsuits, Facebook Announces Changes To Alleged Discriminatory Ad Targeting

After years of criticism and multiple lawsuits alleging Facebook engaged in discrimination by allowing advertisers to select which users could see their ads, the social media giant announced changes to its ad platform as well as a settlement worth millions.

On Tuesday, Sheryl Sandberg, Facebook’s Chief Operating Officer, announced on the company’s website that the changes “will better protect people on Facebook.”

“One of our top priorities is protecting people from discrimination on Facebook. Today, we’re announcing changes in how we manage housing, employment, and credit ads on our platform,” Sandberg said.

Facebook had been providing online advertisers with tools to customize which users could view their ads based on their likes as well as their backgrounds.

In a federal suit filed last March, the National Fair Housing Alliance and others accused Facebook of providing an option for “advertisers to exclude families with children and women from receiving advertisements, as well as users with interests based on disability and national origin. Then Facebook approves and permits advertisers to publish these ads in a discriminatory manner without consumers ever knowing they have been excluded.”

As part of Tuesday’s announcement, Facebook officials said they settled with the NFHA and other groups for about $5 million.

Facebook officials said “we can do better” and announced other ad changes would include:

  • Anyone who wants to run housing, employment or credit ads will no longer be allowed to target by age, gender, or zip code
  • Advertisers offering housing, employment, and credit opportunities will have a much smaller set of targeting categories to use in their campaigns overall
  • We’re building a tool so you can search for and view all current housing ads in the US targeted to different places across the country, regardless of whether the ads are shown to you

The settlement comes years after a 2016 ProPublica report, which found advertisers could also screen out who could see their ads by using filters referred to as “Ethnic Affinities.”

For its investigation, ProPublica purchased a house-hunting advertisement and was able to “exclude anyone with an ‘affinity’ for African-American, Asian-American or Hispanic people,” according to the report.

Facebook promised to make changes following that report.

But a subsequent ProPublica investigation in 2017 still found “a significant lapse” in how Facebook monitored advertisers.

Last August, the Department of Housing and Urban Development accused Facebook of engaging in housing discrimination by allowing landlords and home sellers to block certain prospective buyers or tenants from seeing online ads based on race, sex, religion and other characteristics.

Announcing the complaint, Anna María Farías, HUD assistant secretary for fair housing and equal opportunity, said in a statement, “The Fair Housing Act prohibits housing discrimination including those who might limit or deny housing options with a click of a mouse.”

She added: “When Facebook uses the vast amount of personal data it collects to help advertisers to discriminate, it’s the same as slamming the door in someone’s face.”

Days after the HUD complaint, Facebook announced it was updating its ad-targeting tools and “removing over 5,000 targeting options to help prevent misuse.”

Copyright 2019 NPR. To see more, visit https://www.npr.org.