663: Credit Score

A Thousand Things to Talk About

Big personal disclosure time. In my adult life, I’ve literally been through the full range of credit scores, as measured in the United States. I’ve had scores as low as just over 370 and as high as the 790s. The scale generally goes from 300 to 800. There’s a very long personal story there, but it has given me a very intimate view of the system. And it’s a system that impacts a lot more than you might think — from the ability to get a job to the cost of a cell phone plan. Things far beyond just the cost of the credit that financial institutions extend to you.

As an article in The Street reported in 2015, quote:

Greg Lull, head of consumer insights with Credit Karma, says the difference between a good and an excellent credit score over a lifetime is a whopping $60,000. The difference between poor and excellent is over $200,000.

Originally, the credit scoring system used in the US, Canada, South Africa, Norway, India, and a number of other countries was built as an attempt to reduce bias in financial decisions by stripping a person’s history of maintaining payment and credit down to quote-unquote pure numbers.

Those numbers can be biased, after all. This was acknowledged by the City of New York in 2015 when they voted to outlaw employers running credit checks on potential job applicants. As The Nation put it, quote:

The rationale behind the ban is simple: it’s unfair and useless to use a person’s credit history, which is often inaccurate or misleading, when assessing their job qualifications. When corporations use massive data screenings to hire and fire en masse, credit checks can drastically narrow an applicant pool and subsequently be held as a cudgel over desperate job seekers and compel them to expose private background information. There’s nothing meritocratic about this practice. But it is racially biased, and very cruel to the poor.

The challenge is, no algorithm is naturally free of bias. After all, it runs on numbers, and those numbers are shaped by the person programming the system and the type of input it gets. For an example of this, look no further than the multiple “social credit” systems being built in China. Quoting from a Wired article about the practice:

Private projects, such as Sesame Credit, hoover up all sorts of data on its 400 million customers, from how much time they spend playing video games (that’s bad) to whether they’re a parent (that’s good).

In other words, the biases and assumptions of those creating the system impact the outcomes. And whether you agree with it or not, most countries have some kind of system that scores you in some way — officially, or as part of a private company. So do you know where and how you rank?