USA: DoJ announces $115,054 settlement with Meta for behavioural advertising violations
The Department of Justice ('DoJ') announced, on 21 June 2022, that it had reached a settlement with Meta Platforms Inc., formerly known as Facebook Inc., including a civil penalty of $115,054, to resolve allegations that Meta violated the Fair Housing Act of 1968 ('FHA') through its use of algorithms to determine which Facebook users receive housing advertisements.
Background to the settlement
In particular, the DoJ stated that it filed a lawsuit in the U.S. District Court for the Southern District of New York after the Office of Fair Housing and Equal Opportunity at the Department of Housing and Urban Development referred an administrative complaint to the DoJ alleging that Meta had violated the FHA, codified at 42 U.S.C. §3601 et seq., through its targeting options and delivery processes for housing advertisements on Meta's platforms.
Allegations in the complaint
More specifically, the DoJ filed the lawsuit alleging that:
- Meta enabled and encouraged advertisers to target their housing advertisements by relying on race, colour, religion, sex, disability, familial status, and national origin to decide which Facebook users would be eligible and ineligible to receive housing advertisements;
- Meta created an advertising targeting tool known as 'Lookalike Audience', now known as 'Special Ad Audience', which uses a machine-learning algorithm to find Facebook users who share similarities with a group of individuals selected by an advertiser using several options provided by Facebook. In doing so, Facebook allowed its algorithm to consider FHA-protected characteristics, including race, religion, and sex, when finding Facebook users who 'look like' the advertiser's source audience and who were therefore eligible to receive housing advertisements (a deliberately simplified sketch of this kind of similarity-based selection follows below); and
- Meta's advertisement delivery system used machine-learning algorithms that rely in part on FHA-protected characteristics to help determine which subset of an advertiser's targeted audience would receive a housing advertisement.
Thus, the DoJ further alleged in the complaint that Meta used the above aspects of its advertising system to target and deliver housing-related advertisements to some Facebook users while excluding other users based on FHA-protected characteristics.
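To illustrate the mechanism alleged above, the following Python sketch is a hypothetical toy example, not Meta's actual system: it ranks candidate users by their average cosine similarity to an advertiser's source audience and shows how including a protected attribute in the feature vector can tip which users are selected as a 'lookalike' audience. All names, features, and values are invented for illustration.

```python
# Toy sketch only: nearest-neighbour "lookalike" selection over user feature
# vectors. Hypothetical features and users; not Meta's actual system. The point
# is that if a protected attribute sits in the feature vector, the users judged
# "most similar" to the source audience can be skewed on that attribute.
from dataclasses import dataclass
import math

@dataclass
class User:
    user_id: str
    features: list[float]  # e.g. interest signals plus (here) a protected attribute

def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def lookalike_audience(source: list[User], candidates: list[User], k: int) -> list[User]:
    """Pick the k candidates most similar, on average, to the source audience."""
    def avg_similarity(candidate: User) -> float:
        return sum(cosine_similarity(candidate.features, s.features) for s in source) / len(source)
    return sorted(candidates, key=avg_similarity, reverse=True)[:k]

# Hypothetical feature layout: [interest_home_decor, interest_sports, protected_attribute]
source_audience = [
    User("s1", [0.9, 0.1, 1.0]),
    User("s2", [0.8, 0.2, 1.0]),
]
candidate_pool = [
    User("c1", [0.85, 0.15, 1.0]),  # similar interests, same protected group as source
    User("c2", [0.85, 0.15, 0.0]),  # similar interests, different protected group
    User("c3", [0.10, 0.90, 1.0]),  # different interests, same protected group
]

selected = lookalike_audience(source_audience, candidate_pool, k=1)
print([u.user_id for u in selected])  # prints ['c1']: the protected attribute tips the ranking
```

In this toy setup the two candidates with near-identical interests are separated only by the protected attribute, which is the kind of effect the complaint alleges the real algorithm was permitted to produce at scale.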
Outcomes
As a result, the DoJ stated that it arrived at the following settlement with Meta:
- By 31 December 2022, Meta must stop using the 'Special Ad Audience' advertising tool for housing advertisements.
- Meta has until December 2022 to develop a new system for housing advertisements that addresses disparities for race, ethnicity, and sex between advertisers' targeted audiences and the group of Facebook users to whom Facebook's personalisation algorithms deliver the advertisements (a simple way of quantifying such a disparity is sketched after this list). If the US concludes that the new system sufficiently addresses the discriminatory disparities that Meta's algorithms introduce, Meta must fully implement the new system by 31 December 2022.
- If the US concludes that Meta's changes to its advertisement delivery system do not adequately address the discriminatory disparities, the settlement agreement will terminate and the US will litigate its case against Meta in federal court.
- The parties will select an independent, third-party reviewer to investigate and verify, on an ongoing basis, whether the new system is meeting the compliance standards agreed by the parties; under the agreement, Meta must provide the reviewer with any information necessary to verify compliance with those standards.
- Meta will not provide any targeting options for housing advertisers that directly describe or relate to FHA-protected characteristics and, under the agreement, it must notify the US if it intends to add any targeting options.
- Meta must pay the US a civil penalty of $115,054, the maximum penalty available under the FHA.
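For context on the disparity the new system is meant to address, the sketch below shows one simple, hypothetical way to measure a gap between the demographic make-up of an advertiser's targeted audience and that of the audience that actually received the ad. The metric and group labels are illustrative assumptions, not the compliance standard agreed by the parties or applied by the independent reviewer.

```python
# Illustrative sketch only: quantify the gap between the demographic shares of a
# targeted audience and a delivered audience. Hypothetical metric and groups.
from collections import Counter

def group_shares(audience: list[str]) -> dict[str, float]:
    """Fraction of the audience falling into each demographic group."""
    counts = Counter(audience)
    total = len(audience)
    return {group: n / total for group, n in counts.items()}

def max_disparity(targeted: list[str], delivered: list[str]) -> float:
    """Largest absolute gap, across groups, between targeted and delivered shares."""
    t, d = group_shares(targeted), group_shares(delivered)
    groups = set(t) | set(d)
    return max(abs(t.get(g, 0.0) - d.get(g, 0.0)) for g in groups)

# Hypothetical example: the advertiser targeted a 50/50 split, but delivery skewed 80/20.
targeted_audience = ["group_a"] * 50 + ["group_b"] * 50
delivered_audience = ["group_a"] * 80 + ["group_b"] * 20
print(f"max disparity: {max_disparity(targeted_audience, delivered_audience):.2f}")  # 0.30
```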
You can read the press release here and the settlement here.