How Big Data and Credit Law Could Reinforce Inequality

By: Staff
Posted: January 26, 2015

Is big data leveling the playing field in the modern-day consumer credit market, or is it doing more harm than good?

This was one of the questions think-tanks, policymakers and community leaders grappled with during the first conference on Data & Civil Rights last October.

Hosted by the Data & Society Research Institute, the Leadership Conference on Civil and Human Rights and New America Foundation’s Open Technology Institute, the conference explored emerging civil rights issues connected to the rise of big data and the complex algorithms used in our everyday lives.

To provide attendees a comprehensive introduction to the civil rights concerns around credit scoring, Alex Rosenblat and danah boyd, of Data & Society, Rob Randhava and Corrine Yu, of The Leadership Conference on Civil and Human Rights, and Seeta Peña Gangadharan, of the New America Foundation’s Open Technology Institute, co-authored “Data & Civil Rights: Consumer Finance Primer.”

A brief history of consumer credit law

The 1960s and ’70s produced several key building-blocks of the consumer finance market as we know it today:

  • 1962 — Fair Isaac (more commonly known as FICO) developed a 12-point mathematical tool for then-massive retailer Montgomery Ward to predict the risk of its credit applicants.
  • 1968 — Title VIII of the Civil Rights Act of 1968 forbade institutions from using some protected class information (such as religion) when evaluating consumer mortgage applicants.
  • 1970 — Congress passed the Fair Credit Reporting Act, which required credit bureaus to maintain accurate and relevant files on consumers and to follow stringent rules in how they reported and organized that information.

However, “Institutions designed automated credit scores in the 1970s to achieve full compliance with anti-discrimination laws, rather than to achieve fairness,” the paper says. The authors also point out that, prior to this automation, credit scoring relied on some questionable metrics such as how well-kept an applicant kept his or her yard, or their “effeminate gestures.”

Thanks to consumer finance reform over the years, credit reporting bureaus can no longer rely on protected information like race, sex, religion, ethnic origin or anything else not directly related to financial activity (except for age, so long as people over 62 aren’t given a negative risk value).

Reinforcing a history of prejudice

However, just because everyone’s playing by the same set of rules on paper doesn’t mean consumers can’t still be victims of circumstance.

Many consumers don’t realize it, but they might actually receive unique credit offers based on the aggregate credit profile of their surrounding neighborhood — which raises concerns if a majority of the neighborhood’s residents share a protected class characteristic.

“In credit-scoring, unfavorable credit scores accurately and strongly correlate with protected statuses,” the paper says. “However, they reflect and reinforce the history of prejudice that is encoded in the data they measure, and they can unfairly result in a disparate impact against protected classes.”


Your address will affect the types of credit offers that are tailored to you, Rosenblat said. These are likely to be based on your Zip+4 aggregate score, rather than your specific credit score.

This practice is known as “steering.” While your individual credit score might qualify you for a prime-rate mortgage, the offers tailored to you based on your aggregate score can be sub-prime, which carry higher costs.
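The mechanics of steering can be sketched in a few lines: even when an applicant’s own score would qualify for a prime rate, an offer engine keyed to the neighborhood’s aggregate score serves a sub-prime product instead. (The scores, cutoff, and function names below are invented for illustration, not industry figures.)

```python
# Hypothetical sketch of offer "steering": the offer engine keys on the
# neighborhood's aggregate Zip+4 score, not the applicant's own score.

PRIME_CUTOFF = 700  # illustrative threshold only

def offer_by_individual(individual_score: int) -> str:
    """What the applicant would qualify for on their own record."""
    return "prime" if individual_score >= PRIME_CUTOFF else "subprime"

def offer_by_aggregate(zip4_aggregate_score: int) -> str:
    """The offer actually presented, driven by the neighborhood aggregate."""
    return "prime" if zip4_aggregate_score >= PRIME_CUTOFF else "subprime"

# Person A: strong individual credit, but a lower-scoring neighborhood.
individual_score, zip4_score = 760, 640
print(offer_by_individual(individual_score))  # prime
print(offer_by_aggregate(zip4_score))         # subprime -- what they are shown
```

The gap between the two function results is the steering effect the paper describes: the applicant qualifies for prime on paper but sees sub-prime offers.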

She used a simple example: person A and person B both earn $100,000 a year. However, person A lives in a neighborhood that is predominantly lower income, while person B lives in a neighborhood where the average homeowner earns $100,000.

This can have the effect of steering homeowners into unfavorable mortgages based on where they live, which has a disparate impact on communities of color.

For example, “Black families making $100,000 typically live in the kinds of neighborhoods inhabited by white families making $30,000,” according to research by New York University sociologist Patrick Sharkey.

“If you’re going to exclude someone’s race, which is a protected class characteristic, as to whether or not they’re eligible to receive credit, but use neighborhoods instead and happen to know that neighborhoods correlate very strongly to race,” she said, “then you’re still effectively using race in a way that’s acceptable, legal, but unfair.

“If you receive offers based on your neighbors and income, then the first five options in front of you are probably going to relate to the aggregate score of the people you live around,” she continued.

This in and of itself isn’t illegal, so long as people with the same data characteristics, retrieved and processed the same way, receive the same rates.
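Rosenblat’s proxy point can be illustrated with a small synthetic simulation: a pricing rule that never sees a protected attribute, only neighborhood, still produces group-correlated rates when neighborhood and group membership are strongly correlated. (Every number here is invented; the 90% overlap and the 4%/7% rates are purely illustrative.)

```python
import random

random.seed(0)

def make_person():
    """Synthetic resident: group is never shown to the pricing rule,
    but neighborhood is, and neighborhood correlates with group (90%)."""
    group = random.choice("AB")
    neighborhood = 1 if (group == "A") == (random.random() < 0.9) else 2
    return group, neighborhood

def rate(neighborhood):
    # Pricing uses ONLY the neighborhood -- no protected attribute.
    return 0.04 if neighborhood == 1 else 0.07

people = [make_person() for _ in range(10_000)]
avg = {g: sum(rate(n) for gg, n in people if gg == g) /
          sum(1 for gg, _ in people if gg == g)
       for g in "AB"}
print(avg)  # group A averages near 4%, group B near 7%
```

Even though the rule is facially neutral, the average rate offered to the two groups diverges — the disparate impact the paper warns about.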

While person A would likely receive better lending terms if he or she shopped around, Rosenblat notes that people are more likely to select whatever is presented to them, following the idea of “fuzzy nudges.”

This applies to all sorts of consumer products, not just mortgages.

“How willing are you to explore whether you’re eligible for better offers?” she asks. “If you click and apply for a product presented to you in an advertisement, you might not go look for a better offer … You might not know that if you had a different IP address [the location from which you access the internet], you would get a better offer.”

For example, the paper notes that an audit revealed consumers may see different advertised prices from office-supply chain Staples based on their physical location.

The future of consumer finance and credit scoring

Another problem facing large segments of the population relates to data-poor environments with limited access to credit.

“If they need a loan, they may have to use payday loan services, which are notoriously predatory,” the paper says. “In a data-centric system, the absence of a credit history can penalize a low-income consumer as much as a negative credit history.”

This is where new techniques, such as alternative credit scoring and peer-to-peer lending, may help democratize access to credit in underbanked populations.

However, Rosenblat points out that alternative scoring systems won’t totally resolve civil rights concerns.

“The risk there is that only the credit bureaus are regulated because your credit score affects your ability to bank and obtain loans,” she said. “If you’re having alternative scores created, collecting data from your social media activity for example, they’re not obliged to follow the same legislation.”

To help protect yourself against unfair credit practices, be an informed citizen and find out your credit score today.