The glossy new Apple credit card lost some of its sheen recently, as its credit decisions were found to be inadvertently sexist.
David Heinemeier Hansson, a Danish tech entrepreneur known for creating the Ruby on Rails framework, tweeted that he had received a far higher credit limit than his wife, despite the couple providing identical information: they share a bank account, income and address.
The only difference between the applications was the applicant's gender, yet it produced a completely different credit limit. Apple co-founder Steve Wozniak reported the same experience.
Apple developed the card with Goldman Sachs, which is responsible for the credit decisions. Consumers are offered credit terms based on "black box" machine learning algorithms, which process large quantities of data to find correlations and produce a credit decision, but cannot show their workings.
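The mechanism is worth spelling out: even a model that never sees gender as an input can reproduce gendered outcomes if one of its inputs happens to correlate with gender. The following is a minimal, entirely hypothetical sketch of that effect; the feature names, weights and distributions are invented for illustration and have nothing to do with Goldman's actual model.

```python
import random

random.seed(0)

# Hypothetical toy simulation: the scoring function below never sees
# gender, but a proxy feature correlated with gender leaks it back in.
def make_applicant(gender):
    income = 100_000  # identical finances, as in the Hanssons' case
    # Assumed proxy: some spending-pattern feature whose distribution
    # happens to skew by gender in the training population.
    proxy = random.gauss(30 if gender == "F" else 10, 5)
    return {"income": income, "proxy": proxy, "gender": gender}

def credit_limit(applicant):
    # Stand-in for the "black box": a fixed linear score over
    # non-gender inputs only.
    return max(0, applicant["income"] * 0.25 - applicant["proxy"] * 500)

applicants = [make_applicant(g) for g in ("F", "M") for _ in range(1000)]
avg = {g: sum(credit_limit(a) for a in applicants if a["gender"] == g) / 1000
       for g in ("F", "M")}
gap = avg["M"] / avg["F"]  # men end up with roughly double the limit
```

Despite gender never appearing in the scoring function, the average limits diverge sharply by gender, which is why "we don't use gender" is not, on its own, a defence.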
Explicitly using gender (or other protected characteristics such as race) in credit decisions is illegal, and the New York Department of Financial Services is now investigating Goldman Sachs.
Insurance companies have held off deploying "black box" machine learning models in core pricing and rate-setting for precisely these reasons: in consumer-facing lines of business, the regulatory and reputational risk outweighs the potential gains in underwriting profit. Given Goldman and Apple's recent experience, a rush to change this seems unlikely.
While Goldman said that it doesn't make underwriting decisions based on gender, Hansson said the opaque methodology behind the card's credit decisions amounts to sexism. "My wife and I filed joint tax returns, live in a community-property state, and have been married for a long time," Hansson tweeted, along with a screenshot showing a $57 spending limit. "Yet Apple's black box algorithm thinks I deserve 20x the credit limit she does."