On January 9, 2023, the U.S. Justice Department (“DOJ”) released further details of its discrimination settlement with Meta Platforms, Inc., formerly known as Facebook, Inc., which resolved claims brought in the U.S. District Court for the Southern District of New York that the company’s housing ad delivery system violated the Fair Housing Act (“FHAct”).
Last year, Meta entered into a settlement agreement with the DOJ over claims that its ad delivery algorithms violated the FHAct. The DOJ’s lawsuit grew out of a 2018 investigation by the U.S. Department of Housing and Urban Development (“HUD”), which alleged that the platform allowed housing advertisers to use Facebook-created categories based on FHAct-protected characteristics to target or exclude the audience receiving housing ads.
Facebook offered a “Special Ad Audience” tool that used protected categories to identify additional users with characteristics similar to an advertiser’s ideal audience, effectively allowing targeting on the basis of race, religion, sex, and similar characteristics. Advertisers could also exclude users from seeing their housing ads based on protected characteristics. Finally, Facebook’s own internal ad delivery algorithms (which are inaccessible to both advertisers and users) used FHAct-protected characteristics to determine which users would see an advertisement.
As part of the settlement, Meta agreed to make significant changes to its housing ad delivery methods, including discontinuing the Special Ad Audience tool. The company also agreed to develop a new system for housing ads to address disparities between advertisers’ targeted audiences and the group of Facebook users to whom the ads are actually delivered.
On January 9, 2023, Meta unveiled its new system for delivering housing ads, the “Variance Reduction System” (“VRS”), designed to address algorithmic discrimination. All housing advertisements on Facebook will now run through the VRS, which will be expanded later this year to cover credit and employment ads.
Meta states that the VRS combats discrimination in two ways. First, advertisers can no longer target or exclude users based on personal characteristics such as age, zip code, and gender. Second, the system aims to mitigate bias within the company’s internal delivery algorithms by measuring, and then shrinking, the gap between the demographics of the users who actually see a housing ad and the demographics of the audience eligible to see it, based on age, gender, and estimated race or ethnicity. Meta will regularly review the tool to confirm that delivered audiences stay within these tolerances. Meta has also agreed to the DOJ’s follow-up compliance targets, under which the company will remain subject to court oversight and regular reviews until June 27, 2026.
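Meta has not published the VRS implementation, but the core idea, comparing the demographic makeup of an ad’s eligible audience against the makeup of the users actually shown the ad, can be illustrated with a short sketch. Everything below (the function names, the demographic labels, the sample audiences, and the 5% tolerance) is a hypothetical illustration under our own assumptions, not Meta’s code or the settlement’s actual compliance metrics.

```python
from collections import Counter

def demographic_shares(user_groups):
    """Return each demographic group's share of an audience.

    `user_groups` is a list of group labels (e.g., age bands or
    estimated race/ethnicity buckets), one entry per user.
    Hypothetical input format for illustration only.
    """
    counts = Counter(user_groups)
    total = sum(counts.values())
    return {group: n / total for group, n in counts.items()}

def delivery_gaps(eligible, delivered):
    """Per-group difference between the delivered audience's shares
    and the eligible audience's shares. A variance-reduction system
    would adjust future delivery to shrink these gaps."""
    e = demographic_shares(eligible)
    d = demographic_shares(delivered)
    groups = set(e) | set(d)
    return {g: d.get(g, 0.0) - e.get(g, 0.0) for g in groups}

# Illustrative, made-up audiences: users eligible to see a housing ad
# vs. users the delivery algorithm actually showed it to.
eligible  = ["A"] * 50 + ["B"] * 30 + ["C"] * 20
delivered = ["A"] * 70 + ["B"] * 20 + ["C"] * 10

for group, gap in sorted(delivery_gaps(eligible, delivered).items()):
    flag = "  <-- exceeds tolerance" if abs(gap) > 0.05 else ""
    print(f"group {group}: share gap {gap:+.2f}{flag}")
```

A production system would act on these measurements rather than merely report them, for example by periodically rebalancing delivery until flagged gaps fall within the agreed tolerance; the threshold used here is purely illustrative.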
“This development marks a pivotal step in the Justice Department’s efforts to hold Meta accountable for unlawful algorithmic bias and discriminatory ad delivery on its platforms,” said Assistant Attorney General Kristen Clarke of the Justice Department’s Civil Rights Division. “The Justice Department will continue to hold Meta accountable by ensuring the Variance Reduction System addresses and eliminates discriminatory delivery of advertisements on its platforms. Federal monitoring of Meta should send a strong signal to other tech companies that they too will be held accountable for failing to address algorithmic discrimination that runs afoul of our civil rights laws.”