Facebook Parent Meta Settles Lawsuit Alleging Discriminatory Housing Ads
Facebook parent company Meta on Tuesday reached a settlement with the US Department of Justice over a lawsuit alleging the social network allowed landlords and home sellers to run housing ads that excluded people based on race, sex, religion and other characteristics.
As part of the settlement, Meta said it would stop using an ad tool known as Special Ad Audiences for housing, employment and credit ads. The tool uses an algorithm so advertisers can target ads at users who share similarities with groups of people the advertisers select. That tool partly relies on characteristics such as race and sex that are protected by the Fair Housing Act, allowing advertisers to exclude certain people from seeing housing ads, the DOJ said. The company has until Dec. 31 to sunset the tool and will have to develop a new housing ads system by then that addresses potentially discriminatory advertising.
The settlement also shows how Meta is responding to allegations that its ads could be used to discriminate. For years, Meta has faced claims that advertisers could abuse Facebook ads to exclude people from housing, employment opportunities or even financial services.
Kristen Clarke, assistant attorney general of the Justice Department’s Civil Rights Division, called the settlement “historic,” saying it was the first time Meta agreed to scrap one of its algorithmic ad targeting tools.
“As technology rapidly evolves, companies like Meta have a responsibility to ensure their algorithmic tools are not used in a discriminatory manner,” Clarke said in a statement.
The lawsuit also outlines the amount of data Meta collects about Facebook users as part of its ads business. Users share their sex, relationship status, the accounts they follow and even their website activity outside of the social network. Creating an avatar on the site or uploading a profile photo also gives Facebook more data about a user’s physical appearance.
Meta said in a blog post that advertisers already face restrictions when it comes to targeting, including on using age, gender or ZIP code, but the company is trying to make improvements.
“Discrimination in housing, employment and credit is a deep-rooted problem with a long history in the US, and we are committed to broadening opportunities for marginalized communities in these spaces and others,” Roy L. Austin Jr., Meta’s vice president of civil rights and deputy general counsel, said in the blog post.
The company cited other changes it’s made over the years to address this problem, such as barring the use of gender or age targeting, and requiring that location targeting have a minimum 15-mile radius for housing, employment and credit advertisers.
Meta also agreed to pay $115,054 to the US, the maximum penalty under the Fair Housing Act. The social media giant earned $39.4 billion in profit last year on $117.9 billion in revenue.
The social network has faced complaints about discriminatory ads before. In 2016, ProPublica reported that Facebook allowed advertisers to place housing ads that excluded users by race, which is illegal under federal law. The company then pulled a tool that allowed advertisers to exclude users from seeing housing, employment and credit ads based on their “ethnic affinity.”