The Facebook news feed experiment: A despicable breach of user trust

Davey Winder vents his fury over Facebook's controversial news feed experiments

My IT Pro musings usually reflect my opinion about something that happened during the previous week, not a couple of years ago.

So why have I been driven to respond to Facebook conducting a social experiment with unknowing users back in 2012? Well, like most people, I only became aware of it towards the end of last week.

It also stirs up some very real ethical concerns, not to mention privacy issues, and there's a lesson here that needs to be understood and implemented by any enterprise that handles customer data or provides a web service.

The June 17th edition of the Proceedings of the National Academy of Sciences of the United States of America (PNAS) included a paper entitled "Experimental evidence of massive-scale emotional contagion through social networks".

The paper's 'significance' statement tells you all you need to know. It reads: "We show, via a massive (N = 689,003) experiment on Facebook, that emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness."

Yep, you read that right. Facebook happily manipulated the emotions of nearly 700,000 users in order to see if happy or sad emotional states could be passed on in a contagious fashion.

Whether or not this kind of experiment is valid or necessary is beside the point. What's worth noting is that none of the 689,003 people who took part knew they were being experimented upon, nor did they give their explicit consent.

This incredible invasion of privacy, and I would argue breach of ethical boundaries, took place over the course of a week in 2012. During that time, the news feeds of the people involved were biased towards either happy or sad postings, and their own subsequent posts were monitored to see whether they became generally more positive or negative as a result.
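For anyone wondering how that works mechanically, here's a minimal sketch assuming a crude word-count classifier. The actual study used the LIWC text analysis tool to label posts; everything below, from the toy word lists to the function names, is purely illustrative and not Facebook's code.

    # Hypothetical illustration only -- not the study's actual code.
    import random

    POSITIVE = {"happy", "great", "love", "wonderful"}
    NEGATIVE = {"sad", "awful", "hate", "terrible"}

    def tone(post):
        """Crudely label a post positive, negative or neutral by word counts."""
        words = post.lower().split()
        pos = sum(w in POSITIVE for w in words)
        neg = sum(w in NEGATIVE for w in words)
        if pos > neg:
            return "positive"
        if neg > pos:
            return "negative"
        return "neutral"

    def biased_feed(posts, suppress, rate=0.5):
        """Randomly drop posts of the given tone, biasing what a user sees."""
        return [p for p in posts
                if tone(p) != suppress or random.random() >= rate]

    def emotional_response(user_posts):
        """Average tone of the user's own later posts: +1 positive, -1 negative."""
        scores = {"positive": 1, "negative": -1, "neutral": 0}
        return sum(scores[tone(p)] for p in user_posts) / max(len(user_posts), 1)

Compare the average emotional_response of users shown a positivity-suppressed feed against that of a control group and you have, in essence, the whole experiment.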

The Facebook researchers call this 'emotional contagion' whereas I'd describe it as a despicable breach of user trust for no good reason.

Who cares if user emotions can be spread, infection-like, through social networks, other than advertisers, politicians and others with a vested interest in manipulating how we feel? Slippery slope is a phrase that springs to mind, but it's not even that which has got my goat the most. It's Facebook's response to being caught red-handed, dipping into the ethically questionable honeypot.

That response was, essentially, that every user signs up to the Terms of Service when joining Facebook, including the Data Use Policy, which states Facebook can use your information "for internal operations, including troubleshooting, data analysis, testing, research and service improvement."

So, it's not illegal as such, but it's morally and ethically corrupt if you ask me. I couldn't care less that all the analysis was done by computer, meaning no human saw the posts in question; that just stinks of hand-washing, and it avoids the most important question: just how big do you have to be before you stop caring about your customers?

This is where it gets a bit more complex, especially as I'm trying to compare this with the enterprise world, where a 'free service' is something of a rarity. I suppose you could argue that, like advertising, it's a cost/benefit thing.

The customer chooses a no-cost option; the provider benefits by throwing adverts at them. However, there's a difference between serving up marketing material, which is pretty much a given, and conducting large-scale experiments, which is not.

More to the point, assuming that customers won't care is a big mistake. Worse, thinking that you have the moral right to do anything your legally binding terms and conditions allow is an even bigger one.

I've seen a statement from Facebook, issued when it was questioned over the incident, claiming it does research to 'improve our services' and make content more relevant. This is, frankly, twaddle.

Is Facebook seriously suggesting that because sad content prompts negative comments from users, only happy posts should be allowed? Of course not; it's stumbling around in the dark, trying to defend the indefensible.

Facebook data scientist Adam D.I. Kramer pretty much admitted as much in a post, in which he said: "In hindsight, the research benefits of the paper may not have justified all of this anxiety."

So here's the lesson to learn, folks: critical mass does not mean you can walk all over user privacy.

Even if your legal nerd says it's all above board, that does not make it right. User privacy is an expectation as much as it is legalese in an agreement nobody reads before signing. Facebook users would not expect their data to be used like this, and any organisation that respects its customer base should understand that.

Ignore my advice at your peril. Even if you are Facebook.
