Can We Train Technology to Comply with Fair Housing Laws?

Artificial Intelligence (“AI”) has become a driving force in our world today, even in its early stages, providing both companies and consumers with powerful tools for managing, analyzing, and acting on large amounts of data. Search engines, music apps such as Spotify, YouTube, advertising platforms, cell phones, and even your car use algorithms constantly to infer things such as your safety, music preferences, political affiliation, brand loyalty, daily steps, and much more. While the incorporation of AI brings numerous benefits, it also brings serious pitfalls. One of these is the potential for unintentional illegal discrimination. The White House describes it this way: “Algorithmic discrimination occurs when automated systems contribute to unjustified different treatment or impacts disfavoring people based on their race, color, ethnicity, sex (including pregnancy, childbirth, and related medical conditions, gender identity, intersex status, and sexual orientation), religion, age, national origin, disability, veteran status, genetic information, or any other classification protected by law. Depending on the specific circumstances, such algorithmic discrimination may violate legal protections.”

The Fair Housing Act of 1968 was enacted to protect individuals against the kinds of discrimination listed above, yet algorithms, directed marketing systems, AI, and other technologies are leaving some people out of the loop. Even when there is no intent to discriminate, these systems are not currently designed to evaluate data in a way that guards against fair housing violations. In response, many platforms now offer far fewer targeting options for real estate marketing than for other ad categories. This helps, but it does not solve the problem of discrimination in housing marketing, absorption, access, and the other areas key to meeting the goals of the Fair Housing Act.
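To make the idea of unintentional disparate impact concrete, here is a minimal sketch of the "four-fifths rule," a screening heuristic long used in U.S. employment-discrimination analysis and often borrowed for auditing automated systems. The data, group names, and rates below are entirely synthetic assumptions for illustration; this is not an implementation of any platform's actual auditing method.

```python
# Hypothetical illustration: the "four-fifths rule" screening heuristic
# applied to synthetic ad-delivery data. A group is flagged when its
# selection rate falls below 80% of the highest group's rate.

def selection_rate(shown, total):
    """Fraction of a group's members who were shown a housing ad."""
    return shown / total

def four_fifths_check(rates):
    """Return True per group if its rate is at least 80% of the best rate."""
    benchmark = max(rates.values())
    return {group: rate / benchmark >= 0.8 for group, rate in rates.items()}

# Synthetic example: how often a hypothetical ad system showed a listing
# to each made-up audience segment.
rates = {
    "group_a": selection_rate(450, 1000),  # 45% saw the ad
    "group_b": selection_rate(300, 1000),  # 30% saw the ad
}

result = four_fifths_check(rates)
# group_b's rate (0.30) is only about 67% of group_a's (0.45), below the
# four-fifths threshold, so it would be flagged for closer review.
```

The point is not that this simple check is sufficient; it is that even an ad system with no discriminatory intent can produce delivery gaps like the one above, which is exactly the kind of outcome fair housing compliance must look for.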

To identify, and hopefully address, these gaps, the House Financial Services Committee has asked the Government Accountability Office to conduct an impact study analyzing how the use of artificial intelligence and property technology affects the housing market, especially concerning fairness and affordability. Additionally, in April, the Consumer Financial Protection Bureau (CFPB) and three other federal agencies released a joint statement declaring that automated systems are “not an excuse for lawbreaking behavior,” adding that they would enforce not only civil rights and consumer protection laws but also fair competition laws related to these technologies.

Those of us in real estate must be mindful of fair housing in all our marketing and business endeavors, including those driven by technology.