Meta and the US Government settle lawsuit regarding discrimination in housing ads


The US government and Meta have reached an agreement to settle a lawsuit over the tendency of Facebook’s algorithm to deliver housing advertisements in a discriminatory way. Just a week earlier, Meta was hit with multiple lawsuits from across America alleging that its app is addictive.

According to the press release, the company gave advertisers the option to prevent their housing ads from being shown to people belonging to certain protected groups. As part of the settlement, Meta has agreed to change its ad-delivery algorithm and pay $115,000. The first case brought against the company over this issue dates back to 2019, when it made an initial attempt to address the problem.

The Department of Justice (DOJ) says this is the first time it has handled a case of algorithmic violations under the Fair Housing Act. Back in 2019, Facebook was accused of unlawfully discriminating on the basis of color, race, religion, sex, and disability by restricting who could view housing ads. Ashley Settle, a Facebook spokesperson, described the company's plan as

“building a novel machine learning method within our ads system that will change the way housing ads are delivered to people residing in the U.S. across different demographic groups.”

Under the terms of the settlement, Meta must immediately stop using an advertising tool for housing ads that relies on a discriminatory algorithm, including its ‘Lookalike Audience’ tool, which draws on characteristics such as race and sex. The company is also required to develop a new system that addresses the racial and other disparities caused by its personalization algorithms in ad delivery.

Meta is now working with the US Department of Housing and Urban Development (HUD) to ensure that its machine-learning technology targets housing ads only at the people who are actually eligible to see them. The company will monitor factors such as race, gender, and ethnicity to measure how far the audience an ad is actually shown to deviates from the eligible, targeted audience, as sketched below.
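The settlement does not spell out how that deviation will be measured, but the general idea of comparing demographic distributions can be illustrated with a small sketch. The group labels, sample numbers, and the `demographic_skew` helper below are hypothetical, not part of Meta's actual system; they simply show one way to quantify how far a delivered audience drifts from an eligible one.

```python
from collections import Counter

def demographic_skew(eligible, delivered):
    """Hypothetical helper: compare the demographic make-up of the audience
    eligible to see a housing ad with the audience it was actually shown to.

    Both arguments are lists of group labels. Returns the total variation
    distance between the two distributions: 0.0 means identical shares,
    1.0 means no overlap at all.
    """
    eligible_counts = Counter(eligible)
    delivered_counts = Counter(delivered)
    groups = set(eligible_counts) | set(delivered_counts)

    def share(counts, group):
        total = sum(counts.values())
        return counts[group] / total if total else 0.0

    # Total variation distance: half the sum of absolute share differences.
    return 0.5 * sum(
        abs(share(eligible_counts, g) - share(delivered_counts, g))
        for g in groups
    )

# Illustrative, made-up data: the delivered audience under-represents group "B".
eligible = ["A"] * 500 + ["B"] * 500
delivered = ["A"] * 700 + ["B"] * 300
print(f"skew = {demographic_skew(eligible, delivered):.2f}")  # skew = 0.20
```

In practice, a system like this would compare many demographic dimensions at once and feed the measured gap back into ad delivery to shrink it over time.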

Meta has also committed to being transparent about its systems and its progress on the new algorithm. Under the settlement, the company must demonstrate by the end of 2022 that the algorithm does not produce discriminatory outcomes and works as intended. If the government approves it, an independent third party will be responsible for investigating and verifying the system on an ongoing basis.
