
ABA Urges Withdrawal of Algorithmic Safe Harbor Rule for Disparate Impact Claims in Housing

In September 2020, the Department of Housing and Urban Development released a final rule creating a defense to discrimination claims under the Fair Housing Act where “predictive analysis” tools are not “overly restrictive on a protected class” or where they “accurately assessed risk.” Shortly afterward, a federal judge in Massachusetts blocked the rule, warning that it would “run the risk of effectively neutering disparate impact liability under the Fair Housing Act.” Today, American Bar Association President Patricia Lee Refo urged the agency to “act immediately to withdraw the 2020 FHA Rule and to adopt new guidance and a new rule to ensure the danger of algorithmic bias is adequately tackled.” When the rule was first announced, EPIC and several others warned HUD that granting such a safe harbor for the use of algorithms in housing, without imposing transparency, accountability, or data protection requirements, would exacerbate harms to individuals subject to discrimination. EPIC has long called for greater accountability in the use of automated decision-making systems, including adoption of the Universal Guidelines for Artificial Intelligence (UGAI) and requirements for algorithmic transparency.
