What Do You Do When an Algorithm Discriminates Based on Race? In the Trump Administration You Protect the Algorithm
The New York Times’ Emily Badger wrote an article on the new, subtle method of housing discrimination: algorithms. In her recent Upshot piece, “Who’s to Blame When Algorithms Discriminate?”, she describes how bankers and real estate agents use algorithms to reinforce segregated housing patterns and deny African Americans an equal opportunity to get decent housing. The way HUD pushed back against such practices in the past was to develop rules that made it harder to claim innocence when “disparate impact” occurred. She writes:
Federal law prohibits not just outright discrimination, but also certain policies and decisions that have a “disparate impact” on groups protected by civil rights laws. It may be illegal, in other words, to design a rental app that has the effect of excluding minorities, even if no one meant to discriminate against them…
Housing discrimination today is largely a matter of such cases: ones where there is no racist actor, no paper trail of intent to discriminate, but where troubling disparities emerge between different classes of people…
“People don’t just say the things they used to say,” said Myron Orfield, a law professor at the University of Minnesota who directs the Institute on Metropolitan Opportunity there.
But some statistical patterns speak just as loudly.
“A black household that makes $167,000 is less likely to qualify for a prime loan than a white household that makes $40,000,” Mr. Orfield said, citing analysis of public mortgage data by the institute. “That looks funny. What the banks say in these cases is, ‘It’s the credit histories, and our models explain the differences.’ But you can’t look at those models. They’re proprietary.”
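The disparity Orfield describes is exactly what “disparate impact” analysis tries to quantify. Here is a minimal sketch of one common approach, comparing group approval rates against the 80% “four-fifths” benchmark borrowed from EEOC guidance; the numbers, function names, and threshold are my illustration, not anything from the article:

```python
# Hypothetical disparate-impact check on loan approval outcomes.
# All counts below are made up for illustration.

def approval_rate(approved, applied):
    """Share of applicants in a group who were approved."""
    return approved / applied

def disparate_impact_ratio(protected_rate, reference_rate):
    """Ratio of the protected group's approval rate to the reference group's."""
    return protected_rate / reference_rate

# Hypothetical data: (approved, total applications) per group
black_rate = approval_rate(45, 100)   # 45% approved
white_rate = approval_rate(75, 100)   # 75% approved

ratio = disparate_impact_ratio(black_rate, white_rate)  # 0.45 / 0.75 = 0.6
flagged = ratio < 0.8  # fails the four-fifths benchmark -> potential disparate impact
print(f"ratio={ratio:.2f}, flagged={flagged}")
```

The point of the sketch is that no intent needs to be shown: the ratio is computed purely from outcomes, which is also why access to the underlying (proprietary) models matters so much in litigation.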
The Obama administration wrote rules that placed the onus of proving non-discrimination on the lender or landlord. Unsurprisingly, the Trump administration is taking a different view:
The Department of Housing and Urban Development published a proposed rule on Monday significantly raising the bar for housing discrimination claims that rely on such evidence…
By raising the bar for such claims, the new rule would make it harder to hold banks accountable if their underwriting algorithms repeatedly deny mortgages to seemingly qualified black families, or if city zoning laws that make no mention of race still have the effect of racially segregating neighborhoods.
Fair housing advocates see these new rules as onerous, undercutting both the Fair Housing Act of 1968 and the guidelines that have been in place since then.
“The problem that we have is that more and more, industry players are relying on artificial intelligence,” said Lisa Rice, the president of the National Fair Housing Alliance. “They’re relying on machine learning. They’re relying on algorithmic-based systems for more and more of the functions of the housing market.”
Online ads for rental housing are targeted in ways that mean, for example, that African-American would-be tenants may never see them. Decisions are based on credit scores that perceive families who use payday lenders — who are more likely to be African-American — as having no credit at all.
“We’re just learning what the impacts are of these things,” said Greta Byrum, co-director of the Digital Equity Laboratory at the New School. “That’s why we’re seeing this battle to set policy precedent. HUD I think is trying to get ahead of what everyone is seeing on the horizon now as a big fight to set policy around algorithms.”
In the end the losers are the children whose parents want to move into a neighborhood or community where schools are better and services are more robust… but whose parents may never see ads for houses in those neighborhoods because of their online profile and the algorithms built on it… while banks and landlords wash their hands of the problem by claiming: “It’s the credit histories, and our models explain the differences.” But, as Orfield notes, you can’t look at those models; they’re proprietary. They may be proprietary… but they are also racist if they produce disparate treatment, and they should be thrown out if that is the case. We can’t claim to be a fair and just society where everyone has an equal opportunity if we let proprietary software deny access to good housing, good schools, and good neighborhoods. But from the Trump administration’s perspective, this is not a software bug… it’s a software feature.