To settle a Justice Department complaint accusing it of allowing landlords to engage in discriminatory advertising on its Facebook platform, Meta has agreed to make adjustments to its housing advertisement system.
The Justice Department announced the settlement on Tuesday, describing the deal as “groundbreaking” and the case as the first to challenge algorithmic discrimination under the Fair Housing Act.
According to Assistant Attorney General Kristen Clarke of the Justice Department’s Civil Rights Division, “This settlement is historic, marking the first time that Meta has agreed to terminate one of its algorithmic targeting tools and modify its delivery algorithms for housing ads in response to a civil rights lawsuit.”
The settlement was reached as a result of a lawsuit filed in 2019 against Meta, formerly known as Facebook, which claimed that its advertising system allowed landlords to target and deliver housing advertisements to some users of the social media platform while excluding others. The agreement still needs to be approved by the court.
The complaint made public on Tuesday alleges that Meta enabled and encouraged advertisers to target housing advertisements based on characteristics such as race, colour, religion, sex, disability, national origin, and familial status, in violation of the Fair Housing Act.
According to federal prosecutors, Meta used an ad-targeting tool that employed algorithms to locate Facebook users who resemble the groups chosen by an advertiser, as well as an ad delivery system that utilised algorithms that relied on federally protected characteristics to help determine which segment of an advertiser’s target audience would receive a housing ad.
According to U.S. Attorney Damian Williams for the Southern District of New York, “a company has violated the Fair Housing Act when it develops and deploys technology that denies users of housing opportunities based entirely or in part on protected characteristics, just as when companies engage in discriminatory advertising using more conventional advertising methods.”
Under the agreement, Meta will stop using the biased algorithm by the end of the year. It also has until December to develop a new system for housing advertisements, subject to review by the United States, that addresses disparities by race, ethnicity, and sex.
If the United States determines that the new system does not adequately address the discriminatory disparities, the settlement will be terminated and the Justice Department will proceed with litigation.
Additionally, Meta must pay a civil fine of $115,054, which is the maximum amount allowed by the Fair Housing Act.
In a blog post published on Tuesday, the social media giant stated that while the case focuses on housing advertising, it plans to extend the new system to employment and credit advertising as well.
The statement read, “Discrimination in housing, employment, and credit is a pervasive issue in the United States with a long history. We are committed to enhancing opportunities for underrepresented people in these spaces and others.”