Tech firms lobby EU over harmful content liability

Companies are desperate to avoid sanctions for failing to remove content, but admit some oversight is needed

Lobbyists working on behalf of the biggest tech companies have been writing to European Union (EU) lawmakers to ensure they cannot be held liable for harmful content hosted on their platforms.

With the European Commission in the midst of drawing up a Digital Services Act, tech firms such as Google and Facebook have been working to water down the legislation to avoid prospective sanctions, according to the Financial Times.

Self-regulation should continue, according to a letter sent from Edima, an organisation established to lobby on behalf of Silicon Valley’s biggest companies. There should, however, also be a “new approach”, which could involve an additional degree of oversight.


In the absence of any existing legislation, the EU has given social media companies the scope to regulate themselves, with platforms trusted to remove extremist or illegal content of their own accord.

These companies would argue that, despite gaps and flaws in their processes, their self-regulatory systems have improved in the last few years. 

The European Commission published research in February 2019 showing tech giants have become better at removing content like hate speech and child sexual abuse.

The lobbyists have urged lawmakers not to introduce an element of liability in any new legislation drawn up, as this could lead to punishments for companies that have proactively tried to unearth illegal content.

New rules could establish “a perverse incentive whereby companies are discouraged from taking action before being made aware of the existence of illegal content, for fear of incurring additional liability”, according to Edima's director-general, Siada El Ramly.

Member states and lawmakers, however, have moved towards a consensus that the existing model of self-regulation is flawed, and that tech companies can be neither trusted nor sufficiently incentivised to remove harmful content without proper oversight.

The majority of reports from the EU in the last few years have pointed towards a much more aggressive regulatory stance on removing harmful content.

Stretching back years, lawmakers have been urging social media companies to do more, and despite some progress, the lack of significant improvement has led to harsher briefings.

In March 2018, for example, the European Commission suggested limiting the statutory deadline for tech giants to remove illegal content to just one hour.
