Apple co-founder Wozniak echoes sexist Apple Card allegations

Consumers claim Apple credit limits are ten times larger for men


An algorithm used by Goldman Sachs to determine the credit limits for Apple Card customers will be probed by US regulators for alleged discrimination towards women.

Apple co-founder Steve Wozniak has backed viral claims made on Twitter by David Heinemeier Hansson, who alleged that his wife's Apple Card credit limit was 20 times lower than his own, despite her having a better credit score.

Hansson's scathing Twitter tirade spanned multiple tweets, adding that even when his wife paid her "ridiculously low" limit in full and in advance, the card wouldn't let her spend on it until the next billing period.

Wozniak, who helped found Apple alongside Steve Jobs in 1976, substantiated Hansson's claims, replying to his original tweet with his own similar story that his credit limit was ten times higher than his wife's.

IT Pro contacted Goldman Sachs but had not received a reply at the time of publication.


An Apple spokesperson confirmed to IT Pro that the company has little input in the running of the card, saying "all credit decisioning and operational management of the card is done by the bank".

When Hansson raised the issue with Apple's customer service, representatives said they didn't know why the limits were different, telling him "it's just the algorithm". Following those discussions, his wife's limit was raised to match his own.

"The department will be conducting an investigation to determine whether New York law was violated and ensure all consumers are treated equally regardless of sex," said the New York Department of Financial Services in a statement to Bloomberg.

"Any algorithm that intentionally or not results in discriminatory treatment of women or any other protected class violates New York law."

The Apple Card is Goldman Sachs' first credit card and accompanies its push into more consumer-targeted products, such as personal loans and savings accounts.


"Our credit decisions are based on a customer's creditworthiness and not on factors like gender, race, age, sexual orientation or any other basis prohibited by law," said the investment bank to Bloomberg.

It's currently unclear whether the algorithm used to determine credit limits was developed by Apple or Goldman Sachs, though whichever company built it, it's unlikely to be used with credit cards from other providers. It's also unclear whether Apple knowingly approved the algorithm's use on its product.

Calls to regulate the use of AI and algorithms have been heard for years, and this isn't the first time automated decision-making tools have faced allegations of discrimination.

A history of bias

Algorithms are widely used in the recruitment industry to quickly filter out applicants, sparing employers the lengthy process of manually reviewing CVs and covering letters. Such technology has faced heavy scrutiny, particularly in the UK, where specific cases have drawn complaints.

In March 2019, the UK government launched an investigation into possible bias in algorithms. One example cited was the Harm Assessment Risk Tool used in Durham to predict the likelihood of criminals reoffending, which was found to unfairly target people based on their income.


Candidates who make it through to the interview stage may also be met with a camera that analyses their facial expressions and voice to determine whether they would be right for the job, an implementation that has been condemned by privacy advocates.


The Home Office is also the subject of a legal challenge over its "secretive" algorithm used to make important immigration decisions. The software is said to judge applicants unfairly, relying on simple criteria such as age or nationality, where a human review might weigh their merits more fairly.

