Facebook will make big changes to settle a U.S. Department of Justice lawsuit accusing the platform of allowing advertisers to discriminate against people with disabilities and other groups.

The Justice Department said this week that it reached an agreement with Facebook’s parent company, Meta, to resolve allegations that the company’s system for offering housing ads violates the Fair Housing Act.

In its lawsuit, the government contended that Facebook’s algorithm — which determines who sees a particular ad — wrongly allowed and encouraged advertisers to target housing ads to users based on disability status and other characteristics that are protected by federal housing law.


The Fair Housing Act prohibits housing discrimination based on race, color, national origin, religion, sex, disability or familial status.

Under the agreement, Meta will stop using a tool known as the “Special Ad Audience” by the end of this year. The Justice Department said the tool uses a machine learning algorithm to find users who “look like” certain users that an advertiser selects. The company will also create a new system by the close of the year to address disparities in its housing ads, and it will no longer offer housing advertisers any targeting options that relate to characteristics protected under the Fair Housing Act.

In addition, the company will pay a civil penalty of $115,054, the maximum amount allowed under the law.

“As technology rapidly evolves, companies like Meta have a responsibility to ensure their algorithmic tools are not used in a discriminatory manner,” said Assistant Attorney General Kristen Clarke of the Justice Department’s Civil Rights Division. “This settlement is historic, marking the first time that Meta has agreed to terminate one of its algorithmic targeting tools and modify its delivery algorithms for housing ads in response to a civil rights lawsuit. The Justice Department is committed to holding Meta and other technology companies accountable when they abuse algorithms in ways that unlawfully harm marginalized communities.”

The settlement agreement comes more than three years after the Department of Housing and Urban Development charged Facebook with violating the Fair Housing Act. When the company chose to have the charge heard in federal court, the housing agency referred the matter to the Justice Department.

Facebook said that the changes it is making will extend beyond housing ads.

“Notably, as part of this settlement, we will be building a novel machine learning method within our ads system that will change the way housing ads are delivered to people residing in the U.S. across different demographic groups. While HUD raised concerns about personalized housing ads specifically, we also plan to use this method for ads related to employment and credit in the U.S.,” Ashley Settle, a spokesperson for the company, wrote in an email. “This type of work is unprecedented in the advertising industry and represents a significant technological advancement for how machine learning is used to deliver personalized ads. We are excited to pioneer this effort.”