Tech firms lobby EU over harmful content liability

Companies are desperate to avoid sanctions for failing to remove harmful content, but admit some oversight is needed

Lobbyists working on behalf of the biggest tech companies have been writing to European Union (EU) lawmakers to ensure the firms cannot be held liable for harmful content hosted on their platforms.

With the European Commission in the midst of drawing up a Digital Services Act, tech firms such as Google and Facebook have been working to water down the legislation to avoid prospective sanctions, according to the Financial Times.

Self-regulation should continue, according to a letter sent from Edima, an organisation established to lobby on behalf of Silicon Valley’s biggest companies. There should, however, also be a “new approach”, which could involve an additional degree of oversight.


In the absence of any existing legislation, the EU has given social media companies the scope to regulate themselves, with platforms trusted to remove extremist or illegal content of their own accord.

These companies would argue that, despite gaps and flaws in their processes, their self-regulatory systems have improved in the last few years. 

The European Commission published research in February 2019 showing tech giants have become better at removing content such as hate speech and child sexual abuse material.

The lobbyists have urged lawmakers not to introduce an element of liability in any new legislation drawn up, as this could lead to punishments for companies that have proactively tried to unearth illegal content.

New rules could establish “a perverse incentive whereby companies are discouraged from taking action before being made aware of the existence of illegal content, for fear of incurring additional liability”, according to Edima's director-general, Siada El Ramly.

Member states and lawmakers, however, have moved towards a consensus that the existing model of self-regulation is flawed, and that tech companies cannot be trusted, or sufficiently incentivised, to remove harmful content without proper oversight.

Most reports from the EU in the last few years have pointed towards a much more aggressive regulatory landscape where removing harmful content is concerned.

Lawmakers have been urging social media companies to do more for years, and despite some progress, the perceived lack of significant improvement has led to increasingly tough proposals.

In March 2018, for example, the European Commission suggested limiting the statutory deadline for tech giants to remove illegal content to just one hour.

