SAN FRANCISCO — Meta on Tuesday agreed to change its ad-targeting technology and pay a $115,054 fine in a settlement with the Justice Department over allegations that the company engaged in housing discrimination by letting advertisers restrict who could see ads on the platform based on their race, gender and ZIP code.
Under the terms of the agreement, Meta, the company formerly known as Facebook, said it would change its technology and use a new computerized method aimed at periodically verifying that the audiences who are targeted and eligible to receive housing ads are actually seeing those ads. The new method, which Meta calls a “variance reduction system,” uses machine learning to ensure advertisers serve housing-related ads to specific protected groups of people.
Meta also said it would no longer use a feature called “Special Ad Audiences,” a tool it developed to help advertisers expand the groups of people their ads would reach. The company said the tool was an early attempt to combat prejudice and that its new methods were more effective.
“We will occasionally take a snapshot of marketers’ audiences, see who they’re targeting, and remove as much variance as possible from that audience,” Roy L. Austin, Meta’s vice president of civil rights and associate general counsel, said in an interview. He called it “a significant technological advance for using machine learning to deliver personalized ads.”
Facebook, which became a business behemoth by collecting its users’ data and allowing advertisers to target ads based on an audience’s characteristics, has faced complaints for years that some of these practices are biased and discriminatory. The company’s advertising systems have allowed marketers to choose who saw their ads based on thousands of different characteristics, which also allowed those advertisers to exclude people who fall under a number of protected categories.
While Tuesday’s settlement relates to housing ads, Meta also plans to apply its new system to verify the targeting of employment and credit-related ads. The company has previously faced criticism for allowing bias against women in job ads and for excluding certain groups of people from seeing credit card ads.
“As a result of this landmark lawsuit, Meta will – for the first time – amend its ad delivery system to address algorithmic discrimination,” Damian Williams, a U.S. attorney, said in a statement. “But unless Meta can demonstrate that it has modified its delivery system sufficiently to avoid algorithmic bias, this office will proceed with the litigation.”
The problem of biased ad targeting has been discussed particularly in the case of housing ads. In 2018, Ben Carson, then secretary of the Department of Housing and Urban Development, announced a formal complaint against Facebook, accusing the company of having advertising systems that “unlawfully discriminate” based on categories such as race, religion and disability. Facebook’s potential for ad discrimination was also exposed in a 2016 investigation by ProPublica, which showed that the company made it easy for marketers to exclude certain ethnic groups for advertising purposes.
In 2019, HUD sued Facebook for engaging in housing discrimination and violating the Fair Housing Act. The agency said Facebook’s systems didn’t serve ads to “a diverse audience,” even if an advertiser wanted the ad to be widely seen.
“Facebook discriminates against people based on who they are and where they live,” Mr. Carson said at the time. “Using a computer to restrict a person’s housing options can be as discriminatory as slamming a door in someone’s face.”
The HUD lawsuit came amid a broader push from civil rights groups claiming that the vast and complicated advertising systems underlying some of the internet’s largest platforms harbor inherent biases, and that tech companies like Meta, Google and others should do more to combat those biases.
The area of study known as “algorithmic fairness” is an important topic for computer scientists in the field of artificial intelligence. Leading researchers, including former Google scientists such as Timnit Gebru and Margaret Mitchell, have sounded the alarm about such biases for years.
In the years since, Facebook has cut back on the types of categories marketers can choose from when buying housing ads, reducing the number to hundreds and eliminating options to target by race, age and ZIP code.
Meta’s new system, which is still in development, will periodically check who is being served housing, employment and credit ads, and make sure those audiences match the people marketers intended to reach. If the ads being served start to skew heavily toward, say, white men in their 20s, the new system should in theory recognize this and shift delivery toward a broader, more representative audience.
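Meta has not published the details of its variance reduction system, but the basic idea the article describes can be sketched in a few lines: compare the demographic mix of the audience actually served an ad with the mix of the eligible audience, measure the gap, and boost delivery weights for under-served groups. Everything in this sketch — the group names, the total-variation metric, and the reweighting rule — is a hypothetical illustration, not Meta’s implementation.

```python
# Hypothetical sketch of a variance-reduction check (not Meta's actual code).

def demographic_shares(counts):
    """Convert raw counts per demographic group into fractional shares."""
    total = sum(counts.values())
    return {group: n / total for group, n in counts.items()}

def variance_gap(eligible, served):
    """Total variation distance between the eligible and served audience mixes."""
    groups = set(eligible) | set(served)
    return 0.5 * sum(abs(eligible.get(g, 0.0) - served.get(g, 0.0)) for g in groups)

def adjusted_weights(eligible, served):
    """Boost delivery weights for groups that are being under-served."""
    return {g: eligible[g] / max(served.get(g, 1e-9), 1e-9) for g in eligible}

# Eligible audience for a housing ad vs. who has actually seen it so far.
eligible = demographic_shares({"group_a": 500, "group_b": 300, "group_c": 200})
served = demographic_shares({"group_a": 800, "group_b": 150, "group_c": 50})

gap = variance_gap(eligible, served)          # how far delivery has drifted
weights = adjusted_weights(eligible, served)  # nudge future delivery back
```

A real system would also need to estimate demographics without collecting sensitive data directly, handle small audiences, and rebalance continuously rather than in one pass — none of which this toy version attempts.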
Meta said it will work with HUD in the coming months to integrate the technology into Meta’s ad targeting systems and agreed to a third-party audit of the new system’s effectiveness.
The penalty Meta pays in the settlement is the maximum available under the Fair Housing Act, the Justice Department said.