
Court Blocks Rule That Would Okay Algorithmic Housing Decisions, Limit Discrimination Claims

A federal judge in Massachusetts has blocked a Department of Housing and Urban Development (HUD) regulation that would have made it significantly harder to sue landlords and lenders for housing discrimination under the Fair Housing Act. The rule created a defense to any disparate impact claim in which a "predictive analysis" tool was used to make a housing decision, so long as the tool "accurately assessed risk" or was not "overly restrictive on a protected class." The court found that the regulation would "run the risk of effectively neutering disparate impact liability under the Fair Housing Act." In 2019, EPIC and others warned HUD that sanctioning the use of algorithms in housing decisions would exacerbate discrimination unless the agency imposed transparency, accountability, and data protection requirements. The Alliance for Housing Justice called the rule "a vague, ambiguous exemption for predictive models that appears to confuse the concepts of disparate impact and intentional discrimination." EPIC has long called for greater accountability in the use of automated decision-making systems, including adoption of the Universal Guidelines for Artificial Intelligence and requirements for algorithmic transparency.
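To make the underlying statistical concept concrete: a disparate impact claim turns on whether a facially neutral practice, such as an algorithmic screening tool, produces significantly less favorable outcomes for a protected group. The sketch below, in Python with entirely hypothetical data, shows one common way auditors quantify this, by comparing each group's approval rate to the most-favored group's rate. The 80% ("four-fifths") threshold comes from EEOC employment guidance and is used here only as an illustrative heuristic, not as the legal standard under the Fair Housing Act or the blocked rule.

```python
from collections import defaultdict

def adverse_impact_ratios(decisions):
    """decisions: iterable of (group, approved) pairs, approved a bool.

    Returns {group: approval_rate / highest_group_approval_rate},
    the "adverse impact ratio" used in many fairness audits.
    """
    totals = defaultdict(int)
    approvals = defaultdict(int)
    for group, approved in decisions:
        totals[group] += 1
        if approved:
            approvals[group] += 1
    rates = {g: approvals[g] / totals[g] for g in totals}
    best = max(rates.values())
    return {g: rate / best for g, rate in rates.items()}

# Hypothetical example: a screening model approves 60% of group A
# applicants but only 30% of group B applicants.
sample = ([("A", True)] * 60 + [("A", False)] * 40
          + [("B", True)] * 30 + [("B", False)] * 70)

for group, ratio in adverse_impact_ratios(sample).items():
    flag = "potential disparate impact" if ratio < 0.8 else "ok"
    print(f"group {group}: ratio {ratio:.2f} ({flag})")
```

In this hypothetical, group B's ratio is 0.50, well under the 0.8 heuristic, which is the kind of statistical disparity a plaintiff might point to; the blocked rule would have let a defendant rebut such a showing by asserting the model "accurately assessed risk," regardless of the disparity itself.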

