Mary Louis’ excitement about moving into a Massachusetts apartment in the spring of 2021 turned to dismay when Louis, a Black woman, received an email saying a “third-party service” had denied her tenancy.
The third-party service included an algorithm that scores rental applicants, and it became the subject of a class-action lawsuit, led by Louis, alleging that the algorithm discriminated on the basis of race and income.
In one of the first settlements of its kind, a federal judge on Wednesday approved an agreement requiring the company that developed the algorithm to pay more than $2.2 million and roll back parts of its screening product that the lawsuit alleged were discriminatory.
The settlement does not include any admission of fault by SafeRent Solutions, which said in a statement that while it “continues to believe the SRS scores comply with all applicable laws, litigation is time-consuming and expensive.”
While these lawsuits may be relatively new, the use of algorithms and artificial intelligence programs to screen and score Americans is not. For years, AI has been quietly helping to make consequential decisions for U.S. residents.
When a person submits a job application, applies for a mortgage, or even seeks certain medical care, there is a chance an AI system or algorithm is scoring or assessing them, as one did Louis. Yet these AI systems are largely unregulated, even though some have been found to discriminate.
“Management companies and landlords need to know that they’re now on notice, that these systems they assume are reliable and good are going to be challenged,” said Todd Kaplan, one of Louis’ attorneys.
The complaint alleged that SafeRent’s algorithm did not account for the benefits of housing vouchers, an important detail in a renter’s ability to pay the monthly bill, and therefore discriminated against low-income applicants who qualified for the subsidy.
The lawsuit also accused SafeRent’s algorithm of relying too heavily on credit information, arguing that credit scores do not give a complete picture of an applicant’s ability to pay rent on time, and that the algorithm unfairly penalized applicants with housing vouchers who are Black or Hispanic, in part because those groups have lower median credit scores, a disparity attributable to historical inequities.
Christine Webber, one of the plaintiffs’ attorneys, argued that even when algorithms and AI are not programmed to discriminate, the data they use and how they weight it can have “the same effect as if you told it to discriminate intentionally.”
When Louis’ application was denied, she tried to appeal the decision, sending references from two landlords to show she had paid her rent early or on time for 16 years, even though she did not have a strong credit history.
Louis, who was using a housing voucher, was panicking: she had already given notice to her previous landlord, and she was responsible for looking after her granddaughter.
The response from the management company, which used SafeRent’s screening service, read: “We do not accept appeals and cannot override the outcome of the tenant screening.”
Louis felt defeated; the algorithm didn’t know her, she said.
“It’s all based on numbers. You don’t get any personal empathy from them,” Louis said. “You can’t beat the system. The system will always beat us.”
State legislatures have proposed aggressive regulation of these types of AI systems, but the proposals have not garnered enough support. That means lawsuits like Louis’ are beginning to lay the groundwork for AI liability.
SafeRent’s attorneys argued in a motion to dismiss that the company should not be liable for discrimination because it did not make the final decision on whether to accept or reject a tenant. The service screened applicants, scored them and submitted a report, but left it to landlords and management companies to accept or deny tenancies.
Louis’ lawyers, along with the U.S. Department of Justice, which filed a statement of interest in the case, argued that SafeRent’s algorithm still played a gatekeeping role in access to housing and that the company could be held liable. The judge denied SafeRent’s motion to dismiss on those grounds.
The settlement stipulates that SafeRent cannot include its scoring feature on tenant screening reports in certain cases, including when applicants are using housing vouchers. It also requires that any new screening score SafeRent plans to use be validated by a third party agreed to by the plaintiffs.
Louis’ son eventually found an apartment for her on Facebook Marketplace, and she has since moved in, though it was $200 more expensive and in a less desirable neighborhood.
“I’m not optimistic that I’ll catch a break, but I have to keep going, that’s all,” Louis said. “There are too many people relying on me.”
___
Jesse Bedayn is a corps member for the Associated Press/Report for America Statehouse News Initiative. Report for America is a nonprofit national service program that places journalists in local newsrooms to report on undercovered issues.