Make social media firms liable for harm to children, MPs say


Companies like Facebook and Snapchat must be subject to a legally enforceable 'duty of care' when it comes to mitigating the harms and risks to children's mental health associated with their platforms.

An influential committee of MPs has outlined the risks children and teenagers face online, on platforms such as Snapchat and Instagram, concluding that regulations are needed to compel firms to share data with researchers and to align standards across the industry.

Firms have so far been unwilling to share data with researchers who could rigorously and scientifically establish the emerging risks to young people, according to the report by Parliament's Science and Technology Committee (STC).

These risks may range from damage to sleep patterns and self-esteem to bullying, grooming, unsolicited 'sexting' and child sexual exploitation.

Meanwhile, with no proper regulatory mechanism and only a "patchwork" of legislation in place, different platforms may abide by different rules, while some are exempt altogether.

YouTube and other video-sharing services, as well as social media platforms and search engines such as Google and Bing, are not subject to any specific regulations.

The committee's chair, Norman Lamb MP, lamented the lack of any substantial body of research on how social media affects younger users and said the government had a vital role to play.

"More worryingly, social media companies - who have a clear responsibility towards particularly young users - seem to be in no rush to share vital data with academics that could help tackle the very real harms our young people face in the virtual world," he said.

"We understand their eagerness to protect the privacy of users but sharing data with bona fide researchers is the only way society can truly start to understand the impact, both positive and negative, that social media is having on the modern world.

"During our inquiry, we heard that social media companies had openly refused to share data with researchers who are keen to examine patterns of use and their effects. This is not good enough."

Self-regulation has not worked, the report found, and the need for an independent statutory regulator to audit social media companies is greater than ever.

The STC report was published just days after the father of a young girl directly blamed Instagram, owned by Facebook, for her death.

The case has led to calls for the government to regulate the content circulated on social media more strictly, particularly after a search of the social media accounts of Molly Russell, 14, found material alluding to self-harm and suicide.

The committee's report, published today, recommends that the boundaries of the law be made clear so that illegal and harmful content can be identified and removed easily.

Its authors recommend that the government ask the Law Commission to produce clear recommendations on how to reform existing laws and draft new regulations to govern this emerging phenomenon effectively.

Such regulation could be modelled on Germany's, the report suggested, under which potentially illegal content reported on a platform must be removed by the company within 24 hours.

A new regulator should also be appointed by October this year to assess and maintain any new regime, with the designated organisation tasked with producing explanatory guidance and upholding a code of conduct. This should be reinforced with a "strong sanctions regime", including the capacity to determine the personal liability of company directors.

Keumars Afifi-Sabet
Features Editor

Keumars Afifi-Sabet is a writer and editor who specialises in the public sector, cyber security, and cloud computing. He first joined ITPro as a staff writer in April 2018 and eventually became its Features Editor. Although a regular contributor to other tech sites in the past, these days you will find Keumars on LiveScience, where he runs its Technology section.