The Facebook news feed experiment: A despicable breach of user trust

Davey Winder vents his fury over Facebook's controversial news feed experiments

My IT Pro musings usually reflect my opinion about something that happened during the previous week, not a couple of years ago.

So why have I been driven to respond to Facebook conducting a social experiment on unwitting users back in 2012? Well, like most people, I only became aware of it towards the end of last week.

It also stirs up some very real ethical concerns, not to mention privacy issues, and there's a lesson here that needs to be understood and implemented by any enterprise that handles customer data or provides a web service.

The June 17th edition of the Proceedings of the National Academy of Sciences of the United States of America (PNAS) included the paper "Experimental evidence of massive-scale emotional contagion through social networks".

The paper's 'significance' callout tells you all you need to know. It reads: "We show, via a massive (N = 689,003) experiment on Facebook, that emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness."

The Facebook researchers call this 'emotional contagion' whereas I'd describe it as a despicable breach of user trust for no good reason.

Yep, you read that right. Facebook happily manipulated the emotions of nearly 700,000 users in order to see if happy or sad emotional states could be passed on in a contagious fashion.

Whether or not this kind of experiment is valid or necessary is beside the point. What's worth noting is that none of the 689,003 people who participated knew they were being experimented upon, nor did they give their explicit consent.

This incredible invasion of privacy (and, I would argue, breach of ethical boundaries) took place over the course of a week in 2012. During that time, news feeds for the people involved were biased towards either happy or sad postings, and it was noted whether their responses became generally more positive or negative as a result.

Who cares if user emotions can be spread, infection-like, through social networks, other than advertisers, politicians and others with a vested interest in manipulating how we feel? Slippery slope is a phrase that springs to mind, but it's not even that which has got my goat the most. It's Facebook's response to being caught red-handed dipping into the ethically questionable honeypot.

That response was, essentially, that every user signs up to the Terms of Service when joining Facebook, including the Data Use Policy, which states Facebook can use your information "for internal operations, including troubleshooting, data analysis, testing, research and service improvement."

So, it's not illegal as such, but it's morally and ethically corrupt, if you ask me. I couldn't care less that all the analysis was done by computer, meaning no human saw the posts in question. That defence just stinks of hand-washing, and avoids the most important question: just how big do you have to be in order to not care about your customers?

This is where it gets a bit more complex, especially as I'm trying to compare this with the enterprise world, where a 'free service' is something of a rarity. I suppose you could argue that, like advertising, it's a cost-benefit thing.

The customer chooses a no-cost option; the provider benefits by throwing adverts at them. However, there's a difference between marketing material, which is pretty much a given, and conducting large-scale experiments, which is not.

More to the point, assuming that customers won't care is a big mistake. Worse, thinking that you have the moral right to do anything your legally binding terms and conditions allow is an even bigger one.

I've seen a statement from Facebook, issued when it was questioned over the incident, claiming it does research to 'improve our services' and make content more relevant. This is, frankly, twaddle.

Is Facebook seriously suggesting that, because sad content prompts negative comments from users, only happy posts should be allowed? Of course not; it's stumbling around in the dark trying to defend the indefensible.

A data scientist employed at Facebook, Adam D.I. Kramer, pretty much admitted as much in a post where he said: "In hindsight, the research benefits of the paper may not have justified all of this anxiety."

So here's the lesson to learn, folks: critical mass does not mean you can walk all over user privacy.

Even if your legal nerd says it's all above board, that does not make it right. User privacy is an expectation as much as it is legalese in an agreement nobody reads before signing. Facebook users would not expect their data to be used like this, and any organisation that respects its customer base should understand that.

Ignore my advice at your peril. Even if you are Facebook.
