Curated by THEOUTPOST
On Thu, 21 Nov, 8:02 AM UTC
7 Sources
[1]
AI discrimination lawsuit reaches $2.2 million settlement
Mary Louis's excitement to move into an apartment in Massachusetts in the spring of 2021 turned to dismay when Louis, a Black woman, received an email saying that a "third-party service" had denied her tenancy. That third-party service included an algorithm designed to score rental applicants, which became the subject of a class action lawsuit, with Louis at the helm, alleging that the algorithm discriminated on the basis of race and income.

A federal judge approved a settlement in the lawsuit, one of the first of its kind, on Wednesday, with the company behind the algorithm agreeing to pay over $2.2 million and roll back certain parts of its screening products that the lawsuit alleged were discriminatory. The settlement does not include any admission of fault by the company, SafeRent Solutions, which said in a statement that while it "continues to believe the SRS Scores comply with all applicable laws, litigation is time-consuming and expensive."

While such lawsuits might be relatively new, the use of algorithms or artificial intelligence programs to screen or score Americans isn't. For years, AI has been furtively helping make consequential decisions for U.S. residents. When a person submits a job application, applies for a home loan or even seeks certain medical care, there's a chance that an AI system or algorithm is scoring or assessing them like it did Louis. Those AI systems, however, are largely unregulated, even though some have been found to discriminate.

"Management companies and landlords need to know that they're now on notice, that these systems that they are assuming are reliable and good are going to be challenged," said Todd Kaplan, one of Louis's attorneys.

The lawsuit alleged SafeRent's algorithm didn't take into account the benefits of housing vouchers, which the plaintiffs said was an important factor in a renter's ability to pay the monthly bill, and that it therefore discriminated against low-income applicants who qualified for the aid. The suit also accused SafeRent's algorithm of relying too much on credit information. The plaintiffs argued that credit information fails to give a full picture of an applicant's ability to pay rent on time, and that it unfairly dings applicants with housing vouchers who are Black and Hispanic, partly because they have lower median credit scores, a gap attributable to historical inequities.

Christine Webber, one of the plaintiffs' attorneys, said that even when an algorithm or AI is not programmed to discriminate, the data it uses or weights could have "the same effect as if you told it to discriminate intentionally."

When Louis's application was denied, she tried appealing the decision, sending two landlords' references to show she'd paid rent early or on time for 16 years, even though she didn't have a strong credit history. Louis, who had a housing voucher, was scrambling, having already given notice to her previous landlord that she was moving out, and she was charged with taking care of her granddaughter. The response from the management company, which used SafeRent's screening service, read, "We do not accept appeals and cannot override the outcome of the Tenant Screening."

Louis felt defeated; the algorithm didn't know her, she said. "Everything is based on numbers. You don't get the individual empathy from them," said Louis. "There is no beating the system. The system is always going to beat us."

While state lawmakers have proposed aggressive regulations for these types of AI systems, the proposals have largely failed to get enough support.
That means lawsuits like Louis's are starting to lay the groundwork for AI accountability.

SafeRent's defense attorneys argued in a motion to dismiss that the company shouldn't be held liable for discrimination because SafeRent wasn't making the final decision on whether to accept or deny a tenant. The service would screen applicants, score them and submit a report, but leave it to landlords or management companies to accept or deny a tenant. Louis's attorneys, along with the U.S. Department of Justice, which submitted a statement of interest in the case, argued that SafeRent's algorithm could be held accountable because it still plays a role in access to housing. The judge denied SafeRent's motion to dismiss on those counts.

The settlement stipulates that SafeRent can't include its score feature on its tenant screening reports in certain cases, including if the applicant is using a housing voucher. It also requires that if SafeRent develops another screening score it plans to use, it must be validated by a third party that the plaintiffs agree to.

Louis's son found an affordable apartment for her on Facebook Marketplace that she has since moved into, though it was $200 more expensive and in a less desirable area. "I'm not optimistic that I'm going to catch a break, but I have to keep on keeping, that's it," said Louis. "I have too many people who rely on me."

Jesse Bedayn is a corps member for the Associated Press/Report for America Statehouse News Initiative. Report for America is a nonprofit national service program that places journalists in local newsrooms to report on undercovered issues.
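Webber's point about data and weighting can be made concrete with a small simulation. The sketch below is purely illustrative: the scoring formula, cutoff, and credit-score distributions are invented for this article, not drawn from SafeRent's actual model. It shows how a scorer that never sees race can still approve one group far less often when it leans heavily on a feature, credit score, whose distribution differs across groups.

```python
import random

random.seed(0)

def tenant_score(credit_score: float, on_time_years: int) -> float:
    """Invented screening score that weights credit history heavily."""
    return 0.8 * (credit_score / 850) + 0.2 * (min(on_time_years, 20) / 20)

def approval_rate(mean_credit: float, cutoff: float = 0.65, n: int = 10_000) -> float:
    """Fraction of a simulated applicant group clearing the score cutoff."""
    approved = 0
    for _ in range(n):
        credit = random.gauss(mean_credit, 60)  # spread around the group's mean
        years = random.randint(0, 20)           # years of on-time rent payments
        if tenant_score(credit, years) >= cutoff:
            approved += 1
    return approved / n

# The two groups differ only in median credit score -- the gap the suit
# attributed to historical inequities -- not in rent-payment history.
rate_a = approval_rate(mean_credit=710)
rate_b = approval_rate(mean_credit=640)

print(f"Group A approved: {rate_a:.1%}")
print(f"Group B approved: {rate_b:.1%}")
# The "four-fifths rule": a selection-rate ratio below 0.8 is a common
# first screen for adverse impact in the U.S.
print(f"Selection-rate ratio: {rate_b / rate_a:.2f}")
```

Under these made-up numbers, the group with the lower median credit score clears the cutoff much less often, and the ratio of approval rates falls below the four-fifths threshold often used as a first screen for adverse impact, even though race never appears as an input.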
[2]
Class action lawsuit on AI-related discrimination reaches final settlement
(This source carries the same Associated Press report as source [1].)
[3]
Renter scoring firm agrees to pay $2.2 million to settle case accusing its algorithm of discriminating on race and income
(This source carries the same Associated Press report as source [1].)
[4]
Class action lawsuit on AI-related discrimination reaches final settlement
(This source carries the same Associated Press report as source [1].)
[5]
Class action lawsuit on AI-related discrimination reaches final settlement
(This source carries the same Associated Press report as source [1].)
[6]
Class Action Lawsuit on AI-Related Discrimination Reaches Final Settlement
(This source carries the same Associated Press report as source [1].)
[7]
SafeRent Settles $2.3M Discrimination Lawsuit Over Alleged AI Screening Bias Against Low-Income Renters
SafeRent Solutions, an AI-powered tenant screening tool, has reached a settlement to resolve a class action lawsuit filed in Massachusetts.

What Happened: On Wednesday, U.S. District Judge Angel Kelley granted final approval for a settlement of about $2.3 million. The lawsuit accused SafeRent's algorithm of assigning disproportionately lower scores to Black and Hispanic tenants, as well as to those using housing vouchers. Tenants with housing vouchers, who are more likely to be low-income, were reportedly more likely to be denied housing based on their AI scores.

As part of the settlement, SafeRent will stop using AI-generated scores to evaluate applicants who use housing vouchers. The company will also cease providing any recommendations on whether landlords should accept or deny applicants with vouchers. SafeRent spokesperson Yazmin Lopez told The Verge, "It became increasingly clear that defending the SRS Score in this case would divert time and resources SafeRent can better use to serve its core mission of giving housing providers the tools they need to screen applicants."

Why It Matters: SafeRent joins other property management platforms facing legal challenges over algorithmic practices, including RealPage, an American property management software company currently under investigation by the Department of Justice for alleged rent-inflating practices, the report noted. SafeRent, backed by IA Capital Group, closed its latest funding round, a Series C, on Sept. 1, 2001, according to Crunchbase. The company faces competition from other firms in the tenant screening and property management space, with notable alternatives including Home Buyer Louisiana, PointCentral, and Ivan AI.
A class action lawsuit against SafeRent Solutions, alleging racial and income-based discrimination in its AI rental screening algorithm, reaches a $2.2 million settlement, marking a significant step in AI accountability.
In a landmark case highlighting the potential pitfalls of AI-driven decision-making, SafeRent Solutions has agreed to a $2.2 million settlement in a class action lawsuit alleging discrimination in its rental screening algorithm. The case, led by Mary Louis, a Black woman from Massachusetts, accused the algorithm of discriminating on the basis of race and income [1][2][3].
The lawsuit, one of the first of its kind, alleged that SafeRent's algorithm failed to adequately consider housing vouchers and relied too heavily on credit information, potentially discriminating against low-income applicants and people of color [1][2]. This case underscores the growing concern about the use of AI and algorithms in making consequential decisions for U.S. residents, particularly in areas such as housing, employment, and healthcare [4].
While SafeRent Solutions agreed to pay over $2.2 million, the settlement does not include any admission of fault. The company stated that it "continues to believe the SRS Scores comply with all applicable laws," but acknowledged that "litigation is time-consuming and expensive" [1][2][3]. As part of the settlement, SafeRent has agreed to roll back certain parts of its screening products and cannot include its score feature on tenant screening reports in cases where the applicant is using a housing voucher [1][4].
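For a sense of what that carve-out means in practice, here is a minimal sketch of how such a rule could be enforced in a screening pipeline. Everything in it, the field names, types, and cutoff, is hypothetical and not SafeRent's actual system or API; it simply shows the shape of the stipulation: suppress both the score and any accept/deny recommendation when an applicant uses a voucher.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Applicant:
    applicant_id: str
    uses_housing_voucher: bool   # hypothetical field name, not SafeRent's schema

@dataclass
class ScreeningReport:
    applicant_id: str
    score: Optional[float]        # None = score suppressed from the report
    recommendation: Optional[str] # None = no accept/deny recommendation given

def build_report(applicant: Applicant, raw_score: float) -> ScreeningReport:
    """Apply the settlement-style rule: voucher holders receive a report
    with no score and no accept/deny recommendation, leaving the decision
    to the landlord's own review of the underlying record."""
    if applicant.uses_housing_voucher:
        return ScreeningReport(applicant.applicant_id, score=None, recommendation=None)
    recommendation = "accept" if raw_score >= 0.65 else "review"  # invented cutoff
    return ScreeningReport(applicant.applicant_id, raw_score, recommendation)

# A voucher holder's report carries no score, regardless of how low it was:
print(build_report(Applicant("applicant-001", uses_housing_voucher=True), 0.58))
```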
Mary Louis's experience highlights the personal toll of algorithmic decision-making. Despite a 16-year history of timely rent payments, Louis was denied tenancy based on the algorithm's assessment. Her attempt to appeal the decision was met with a rigid response: "We do not accept appeals and cannot override the outcome of the Tenant Screening" [1][2][3].
SafeRent's defense initially argued that the company shouldn't be held liable for discrimination because it wasn't making the final tenancy decisions. However, Louis's attorneys, supported by the U.S. Department of Justice, successfully argued that SafeRent's algorithm could be held accountable due to its role in housing access [1][4][5].
This settlement may set a precedent for future cases involving AI-related discrimination. Todd Kaplan, one of Louis's attorneys, stated, "Management companies and landlords need to know that they're now on notice, that these systems that they are assuming are reliable and good are going to be challenged" [1][2][3]. The case highlights the need for more robust regulation and oversight of AI systems, especially in critical areas like housing and employment [4][5].
While some state lawmakers have proposed regulations for AI systems, many of these proposals have failed to gain sufficient support. As a result, lawsuits like Louis's are beginning to lay the groundwork for AI accountability in the absence of comprehensive legislation [1][4][5]. This case serves as a wake-up call for companies using AI in decision-making processes, emphasizing the importance of fairness, transparency, and accountability in algorithm design and implementation.