SAN FRANCISCO — Meta agreed on Tuesday to change its ad-targeting technology and pay a $115,054 penalty to settle Justice Department claims that the company engaged in housing discrimination by letting advertisers restrict who could see ads on the platform based on their race, gender and ZIP code.
Under the agreement, Meta, the company formerly known as Facebook, said it would change its technology and use a new computer-assisted method designed to periodically check whether the audiences who are targeted and eligible to receive housing ads are actually seeing them. The new approach, called a “variance reduction system,” relies on machine learning to ensure that advertisers are delivering housing-related ads to specific protected classes of people.
Meta also said it would no longer use a feature called “Special Ad Audiences,” a tool it had developed to help advertisers expand the reach of their ads. The company said the tool was an early effort to fight bias and that its new approach would be more effective.
“We’ll take occasional snapshots of marketers’ audiences, understand who they target, and eliminate as many disparities in those audiences as possible,” Roy L. Austin, Meta’s vice president of civil rights and deputy general counsel, said in an interview. He called it “a major technological advance in how machine learning can be used to deliver personalized ads.”
Facebook, which became a business juggernaut by collecting user data and letting advertisers target ads based on demographics, has faced complaints for years that some of its practices are biased and discriminatory. The company’s advertising system allows marketers to choose who sees their ads by using thousands of different characteristics, which also lets those advertisers exclude people who fall into many protected categories.
While Tuesday’s settlement involved housing ads, Meta said it also planned to apply its new system to check the targeting of employment and credit-related ads. The company has previously faced pushback for allowing bias against women in job ads and for excluding certain groups of people from seeing credit card ads.
“As a result of this groundbreaking lawsuit, Meta will, for the first time, change its ad delivery system to address algorithmic discrimination,” U.S. Attorney Damian Williams said in a statement. “But if Meta fails to demonstrate that it has sufficiently altered its delivery system to prevent algorithmic bias, this office will proceed with the litigation.”
Biased ad targeting has been especially controversial in housing advertising. In 2018, Ben Carson, then the Secretary of Housing and Urban Development, announced a formal complaint against Facebook, accusing the company of running an ad system that “unlawfully discriminates” based on categories such as race, religion and disability. Facebook’s potential for ad discrimination was also revealed in a 2016 investigation published by ProPublica, which showed that the company made it easy for marketers to exclude specific racial groups from seeing ads.
In 2019, HUD sued Facebook for engaging in housing discrimination and violating the Fair Housing Act. The agency said Facebook’s systems did not deliver ads to “a diverse audience,” even when advertisers wanted their ads to be seen broadly.
“Facebook is discriminating against people based on who they are and where they live,” Carson said at the time. “Using a computer to limit one’s housing options can be as discriminatory as closing the door in someone’s face.”
The HUD lawsuit came amid a broader push from civil rights groups arguing that the vast and complicated ad systems that underpin some of the largest internet platforms are inherently biased, and that tech companies like Meta, Google and others should do more to root out those biases.
The area of study known as “algorithmic fairness” has been a significant topic of interest among computer scientists working in artificial intelligence. Leading researchers, including former Google scientists such as Timnit Gebru and Margaret Mitchell, have sounded the alarm on such biases for years.
In the years since, Facebook has limited the types of categories marketers can choose from when buying housing ads, reducing the number to hundreds and eliminating targeting options based on race, age and zip code.
Meta’s new system, which is still in development, will occasionally check who is being served housing, employment and credit ads, and make sure those audiences match up with the people marketers want to target. If the ads being served begin to skew heavily toward white men in their 20s, for example, the new system would theoretically recognize this and shift the ads to be served more equitably among a broader and more diverse audience.
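Meta has not published the internals of its variance reduction system, so as a rough illustration only, the periodic check described above can be sketched in a few lines of Python. All names here are hypothetical, and real ad-delivery systems are far more complex; the sketch only shows the core idea of measuring skew between the eligible audience and the delivered audience, then nudging delivery weights against it.

```python
from collections import Counter

def delivery_skew(eligible, delivered):
    """Compare the demographic mix of an ad's eligible audience with the
    mix of people who actually saw it. Returns per-group disparity:
    positive means the group was over-served relative to eligibility."""
    def shares(groups):
        counts = Counter(groups)
        total = sum(counts.values())
        return {g: c / total for g, c in counts.items()}

    target = shares(eligible)
    actual = shares(delivered)
    return {g: actual.get(g, 0.0) - share for g, share in target.items()}

def rebalance_weights(disparity, step=0.5):
    """Nudge per-group delivery weights against the measured skew --
    the basic idea behind a variance-reduction-style adjustment.
    Over-served groups get a weight below 1.0, under-served above."""
    return {g: max(0.0, 1.0 - step * d) for g, d in disparity.items()}

# Hypothetical snapshot: the eligible audience is 50/50 across two
# groups, but deliveries skewed 80/20 toward group "a".
eligible = ["a"] * 50 + ["b"] * 50
delivered = ["a"] * 80 + ["b"] * 20
skew = delivery_skew(eligible, delivered)       # a over-served by 0.3
weights = rebalance_weights(skew)               # a damped, b boosted
```

Run periodically over snapshots of delivered ads, an adjustment like this would gradually pull the delivered audience back toward the eligible one, which is the equitable-delivery behavior the article describes.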
Meta said it would work with HUD in the coming months to integrate the technology into Meta’s ad targeting system, and agreed to a third-party audit of the new system’s effectiveness.
The Justice Department said the penalty Meta is paying in the settlement is the largest ever levied under the Fair Housing Act.