
Tech firms lobby EU over harmful content liability

Companies are desperate to avoid sanctions for failing to remove harmful content, but admit some oversight is needed


Lobbyists working on behalf of the biggest tech companies have been writing to European Union (EU) lawmakers to ensure they cannot be held liable for harmful content hosted on their platforms.

With the European Commission in the midst of drawing up a Digital Services Act, tech firms such as Google and Facebook have been working to water down the legislation to avoid prospective sanctions, according to the Financial Times.

Self-regulation should continue, according to a letter sent from Edima, an organisation established to lobby on behalf of Silicon Valley’s biggest companies. There should, however, also be a “new approach”, which could involve an additional degree of oversight.


In the absence of any existing legislation, the EU has given social media companies scope to regulate themselves, with platforms trusted to remove extremist or illegal content of their own accord.

These companies argue that, despite gaps and flaws in their processes, their self-regulatory systems have improved over the last few years.

The European Commission published research in February 2019 showing tech giants have become better at removing content like hate speech and child sexual abuse.

The lobbyists have urged lawmakers not to introduce an element of liability in any new legislation drawn up, as this could lead to punishments for companies that have proactively tried to unearth illegal content.

New rules could establish “a perverse incentive whereby companies are discouraged from taking action before being made aware of the existence of illegal content, for fear of incurring additional liability”, according to Edima's director-general, Siada El Ramly.

Member states and lawmakers, however, have moved towards a consensus that the existing model of self-regulation is flawed, and that tech companies can be neither trusted nor sufficiently incentivised to remove harmful content without oversight.

The majority of reports from the EU in the last few years have pointed towards a much more aggressive regulatory landscape where removing harmful content is concerned.

Lawmakers have been urging social media companies to do more for years, and despite some progress, the lack of significant improvement has led to increasingly harsh proposals.

In March 2018, for example, the European Commission suggested limiting the statutory deadline for tech giants to remove illegal content to just one hour.

