In 2019, Facebook was sued for housing discrimination because its machine learning advertising algorithm functioned “just like an advertiser who intentionally targets or excludes users based on their protected class.”
Meta has now settled the lawsuit by agreeing to scrap the algorithm:
Under the settlement, Meta will stop using an advertising tool for housing ads (known as the “Special Ad Audience” tool) which, according to the complaint, relies on a discriminatory algorithm to find users who “look like” other users based on FHA-protected characteristics. Meta also will develop a new system over the next six months to address racial and other disparities caused by its use of personalization algorithms in its ad delivery system for housing ads. If the United States concludes that the new system adequately addresses the discriminatory delivery of housing ads, then Meta will implement the system, which will be subject to Department of Justice approval and court oversight. If the United States concludes that the new system is insufficient to address algorithmic discrimination in the delivery of housing ads, then the settlement agreement will be terminated.
United States Attorney Resolves Groundbreaking Suit Against Meta Platforms, Inc., Formerly Known As Facebook, To Address Discriminatory Advertising For Housing
Government lawyers will need to approve Meta’s new algorithm, and Meta was fined $115,054, “the maximum penalty available under the Fair Housing Act.”
The DOJ’s press release states: “This settlement marks the first time that Meta will be subject to court oversight for its ad targeting and delivery system.”
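The complaint’s description of a tool that finds users who “look like” other users is, in general terms, a lookalike-audience system: take a seed set of people (say, an advertiser’s existing customers) and rank everyone else by similarity in a feature space. The sketch below is purely illustrative, not Meta’s actual Special Ad Audience implementation; the feature names and the cosine-similarity ranking are my own assumptions, chosen to show how such a system can exclude people by protected class even when no protected attribute appears in the data.

# Illustrative sketch of a generic lookalike-audience selector.
# NOT Meta's actual implementation; features and method are hypothetical.
import numpy as np

def lookalike_audience(seed_users, candidate_users, k):
    """Return indices of the k candidates most similar (by cosine
    similarity) to the centroid of the seed users' feature vectors."""
    centroid = seed_users.mean(axis=0)
    norms = np.linalg.norm(candidate_users, axis=1) * np.linalg.norm(centroid)
    sims = candidate_users @ centroid / np.where(norms == 0, 1, norms)
    return np.argsort(sims)[::-1][:k]

# Hypothetical normalized features: [age, income, zip_cluster_a, zip_cluster_b].
# Note there is no protected-class column at all.
seeds = np.array([[0.50, 0.90, 1, 0],
                  [0.60, 0.85, 1, 0]])
candidates = np.array([[0.55, 0.88, 1, 0],   # same neighborhood cluster as seeds
                       [0.55, 0.88, 0, 1]])  # identical except neighborhood
print(lookalike_audience(seeds, candidates, k=1))  # -> [0]

The two candidates are identical on every feature except neighborhood cluster, yet only the one sharing the seeds’ cluster is selected. Because features like neighborhood can correlate strongly with race, a similarity ranking of this general kind can reproduce exclusion by protected class without ever being told anyone’s protected class, which is consistent with the complaint’s claim that the tool found lookalikes “based on FHA-protected characteristics.”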