Apple co-founder Wozniak echoes sexist Apple Card allegations

Consumers claim Apple credit limits are ten times larger for men

An algorithm used by Goldman Sachs to determine the credit limits for Apple Card customers will be probed by US regulators for alleged discrimination towards women.

Apple co-founder Steve Wozniak backed claims made by David Heinemeier Hansson in a Twitter thread that ultimately went viral, alleging that Hansson's wife's Apple Card credit limit was 20 times lower than his own, despite her having a better credit score.

Hansson's scathing Twitter tirade spanned multiple threads, adding that even when his wife paid her "ridiculously low" limit in full and in advance, the card wouldn't permit her to spend on it until the next billing period.

Wozniak, who helped found Apple alongside Steve Jobs in 1976, substantiated Hansson's claims, replying to his original tweet with his own similar story that his credit limit was ten times higher than his wife's.

IT Pro contacted Goldman Sachs, but the bank had not replied at the time of publication.

An Apple spokesperson confirmed to IT Pro that the company has little input in the running of the card, saying "all credit decisioning and operational management of the card is done by the bank".

When Hansson raised the issue with Apple's customer service, representatives reportedly said they didn't know why the limits were different, only that "it's just the algorithm". Following these discussions, his wife's limit was raised to match his own.

"The department will be conducting an investigation to determine whether New York law was violated and ensure all consumers are treated equally regardless of sex," said the New York Department of Financial Services in a statement to Bloomberg.

"Any algorithm that intentionally or not results in discriminatory treatment of women or any other protected class violates New York law."

The Apple Card is Goldman Sachs' first credit card and accompanies its push into consumer-targeted products such as personal loans and savings accounts.

"Our credit decisions are based on a customer's creditworthiness and not on factors like gender, race, age, sexual orientation or any other basis prohibited by law," said the investment bank to Bloomberg.

It's currently unclear whether the algorithm used to determine credit limits was developed by Apple or Goldman Sachs. Whichever company developed it, the algorithm is unlikely to be used with credit cards from other providers. It's also unclear whether Apple knowingly approved its use on the product.

Calls to regulate the use of AI and algorithms have been made for years, and this isn't the first time automated decision-making tools have faced allegations of discrimination.

A history of bias

Algorithms are widely used in the recruitment industry to quickly filter out applicants, sparing employers the lengthy process of manually reviewing CVs and covering letters. Such technology has come under heavy scrutiny, particularly in the UK, where specific cases have drawn complaints.

In March 2019, the UK government launched an investigation into possible bias in algorithms. One example cited was the Harm Assessment Risk Tool used in Durham to predict the likelihood of criminals reoffending, which was found to unfairly target people based on their income.

Candidates who make it through to the interview stage may also be met with a camera that analyses their facial expressions and voice to determine whether they are right for the job, an approach that has been condemned by privacy advocates.

The Home Office is also the subject of a legal investigation into the "secretive" algorithm it uses to make important immigration decisions. The software is said to judge applicants unfairly by relying on simple criteria such as age or nationality, where a human review might weigh their individual merits more fairly.

