New Housing Regulation Limits Disparate Impact Housing Claims Based on Algorithms

Individuals alleging that a landlord discriminated against them by using a tenant-screening algorithm will face a higher burden of proof under a new rule that went into effect last Thursday. The rule creates a defense to a discrimination claim under the Fair Housing Act where the “predictive analysis” tools used were not “overly restrictive on a protected class” or where they “accurately assessed risk.” Last October, EPIC and several others warned the Department of Housing and Urban Development that providing such a safe harbor for the use of algorithms in housing, without imposing transparency, accountability, or data protection requirements, would exacerbate harms to individuals subject to discrimination. The agency did modify its rule following comments from EPIC and others, removing a complete defense based on the use of an “industry standard” algorithm or on a showing that the algorithm was not the “actual cause” of the disparate impact. But the final rule simply replaces the word “algorithm” with “predictive analysis” and retains the vague “overly restrictive” and “accurate assessment” standards. The Alliance for Housing Justice called the rule “a vague, ambiguous exemption for predictive models that appears to confuse the concepts of disparate impact and intentional discrimination.” EPIC has called for greater accountability in the use of automated decision-making systems, including adoption of the Universal Guidelines for Artificial Intelligence (UGAI) and requirements for algorithmic transparency.

