The Apple Card Is Sexist. Blaming the Algorithm Is Proof.
November 12, 2019
(Bloomberg Opinion) -- Hyped as the biggest credit-card innovation in 50 years, the Apple Card is starting to look more like something from the 1960s and 1970s: Women are allegedly being granted a fraction of their spouses’ borrowing limits. It’s another troubling example of the deficiencies of machine learning.
Just months after its launch, New York regulators say they’re investigating Goldman Sachs Group Inc., the bank behind the card, and the algorithm it uses to determine creditworthiness. Goldman denies any discrimination, but that hasn’t stopped Apple Inc. co-founder Steve Wozniak from calling for the U.S. government to get involved. “We don’t have transparency on how these companies set these things up and operate,” he told Bloomberg News.
The investigation and Wozniak’s comments came in response to a Twitter broadside from the tech entrepreneur David Heinemeier Hansson, in which he said the Apple Card gave him a credit limit 20 times bigger than his wife’s, despite her superior credit score and their jointly filed tax returns. Wozniak says he has been given 10 times the limit granted to his wife.
The bone of contention here is what Apple’s customer service representatives called, in Hansson’s telling, “the algorithm.” When he asked why his wife was being treated differently, he was told the algorithm was accountable.
Yet blaming the algorithm — while saying an exception would be made for Hansson’s wife and her credit assessment adjusted, as Apple did — seems a tacit admission that said algorithm is flawed. At the very least, it raises questions about just how “accountable” these systems are. Customers don’t know the details of how the Apple-Goldman creditworthiness computations work, how dependent they are on artificial intelligence (or, more precisely, machine learning), what inputs they use, or even how much of the technology is proprietary to the two companies.
If the system is indeed making such egregious decisions, should it be used at all? At least when a human errs or shows bias, there’s a more straightforward route to correcting it: a company can interrogate a person about how they arrived at an individual decision. That’s usually not possible with machine learning. Instead, you have to examine the data that fed the algorithm and see whether it introduced biases.
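To make that concrete, here is a minimal sketch of the kind of disparity check an outside auditor might run on a lender’s decisions. The column names and figures are hypothetical, and the pandas workflow is an assumption for illustration; real fair-lending audits control for far more variables.

```python
import pandas as pd

# Hypothetical audit data: one row per applicant, with the model's
# assigned credit limit and a protected attribute for comparison.
decisions = pd.DataFrame({
    "applicant_gender": ["F", "M", "F", "M", "F", "M"],
    "credit_score":     [820, 790, 745, 750, 810, 805],
    "credit_limit":     [3000, 25000, 2500, 20000, 4000, 30000],
})

# Compare outcomes across groups. A large gap in limits between
# applicants with similar credit scores is a red flag worth probing,
# though not by itself proof of discrimination.
by_group = decisions.groupby("applicant_gender")["credit_limit"].median()
print(by_group)

# Adverse-impact ratio: the lower group's median limit divided by the
# higher group's. A value far below 1.0 suggests the model, or the
# data it was trained on, merits closer scrutiny.
ratio = by_group.min() / by_group.max()
print(f"Adverse-impact ratio: {ratio:.2f}")
```

A check like this can flag a disparity, but it can’t say why the model produced it, which is precisely the explanatory gap regulators are now probing.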
Of course, bias in artificial intelligence is not unique to the Apple Card. It has reared its head in the criminal justice system, the employment market, health care, facial recognition, app recommendations and beyond. In each case, understanding what prompted the prejudices is essential to fixing them. And in each case, that’s easier said than done. John Giannandrea, Apple’s AI chief, said in 2017, while still at Google, that biased data is the greatest danger posed by machine learning.
This isn’t the first consumer finance misstep for Goldman. The Wall Street firm may be at the cutting edge of finance, but its foray into consumer lending has been mired in rookie mistakes. Its consumer lending arm, Marcus, reportedly started without a team of debt collectors, leading to early losses on delinquent borrowers.
Apple’s chief executive officer, Tim Cook, meanwhile, has hinted that he’s seeking a partner to bring the Apple Card to Europe. The Hansson and Wozniak episodes show that would be quite a gamble. The European Union’s General Data Protection Regulation, in force since last year, gives consumers a “right to explanation” of automated decisions, exactly what Hansson was demanding. A failure to provide a satisfactory one can bring financial penalties. And as this episode shows, such explanations aren’t easily extracted from machine-learning systems.