Facebook will stop using an advertising tool in settlement with US government

Facebook will change its algorithms to prevent discriminatory housing advertising, and its parent company will submit to court oversight, to settle a lawsuit filed Tuesday by the US Department of Justice.

In a statement, US government officials said Meta, formerly known as Facebook, had reached an agreement to settle the lawsuit filed the same day in federal court in Manhattan.

Under the terms of the settlement, Facebook will stop using a housing advertising tool that, the Justice Department said, employed a discriminatory algorithm to find users who “look like” other users based on characteristics protected under fair housing law. By December 31, Facebook must stop using the tool, formerly called “Lookalike Audience,” which relies on an algorithm that the US says discriminates on the basis of race, gender and other characteristics.

Facebook will also develop a new system over the next six months to address racial and other disparities caused by the use of personalization algorithms in its housing ad delivery system, the department said.

According to the release, this was the Justice Department’s first case challenging algorithmic discrimination under the Fair Housing Act. Facebook’s ad targeting and delivery system will now be subject to Justice Department approval and court oversight.

US attorney Damian Williams called the lawsuit “groundbreaking”. Assistant Attorney General Kristen Clarke called it “historic”.

Facebook spokeswoman Ashley Settle said in an email that the company is building “a novel machine learning method within our ads system that will change the way housing ads are delivered to people in the United States across different demographics”.

She said the company would expand its new method to employment and credit-related ads in the United States. “We are thrilled to be pioneering this effort,” Settle added in an email.

Williams said Facebook’s technology had in the past violated the Fair Housing Act online “just as when companies engage in discriminatory advertising using more traditional advertising methods.”

Clarke said that “companies like Meta have a responsibility to ensure that their algorithmic tools are not used in a discriminatory way.”

The announcement comes after Facebook had already agreed, in March 2019, to overhaul its ad targeting systems to prevent discrimination in housing, credit and employment ads as part of a legal settlement with a group including the American Civil Liberties Union, the National Fair Housing Alliance and others.

The changes announced then were designed so that advertisers who wanted to run real estate, job or credit ads would no longer be allowed to target people by age, gender or ZIP code.

The Justice Department said on Tuesday that the 2019 changes reduced the potentially discriminatory targeting options available to advertisers, but did not address other issues, including Facebook’s discriminatory delivery of housing ads through machine-learning algorithms.
