What is the 'right to be forgotten'?

Everything you need to know about the EU's data-removal ruling

Ironically, Mario Costeja González will always be remembered. After a six-year legal battle with Google, the Spaniard helped pave the way for a clause in the General Data Protection Regulation (GDPR) known as 'the right to be forgotten'.

This principle, also known as the right to erasure, was introduced as part of the GDPR in 2018. It allows data subjects to request that an organisation delete the data it holds about them, provided certain conditions are met.

González demanded that outdated information about his house being repossessed be removed from search listings. Both the Spanish courts and the European Court of Justice (ECJ) examined his case and agreed the information should be removed from the public domain.

This proved to be a landmark moment, with numerous cases following suit and Google even establishing its own procedures for receiving and judging right to be forgotten requests.

Under GDPR, businesses must comply with the provision and act as soon as a right to be forgotten request is made. The punishment for failing to do so is arguably just as famous as GDPR itself: fines can reach 4% of a company's annual worldwide turnover, or €20 million, whichever is higher.
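The "whichever is higher" rule can be sketched as a one-line calculation. This is a minimal illustration of the statutory cap only; the turnover figure in the example is hypothetical.

```python
def max_gdpr_fine(annual_turnover_eur: float) -> float:
    """Cap for the most serious GDPR infringements: 4% of annual
    worldwide turnover or EUR 20 million, whichever is higher."""
    return max(0.04 * annual_turnover_eur, 20_000_000)

# A hypothetical company with EUR 1bn turnover: 4% (EUR 40m) exceeds the floor.
print(max_gdpr_fine(1_000_000_000))
```

For smaller companies, the €20 million floor dominates; 4% of turnover only takes over once annual turnover exceeds €500 million.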

But the 'right to be forgotten' is not considered in isolation and must be balanced against other rights, such as that of freedom of expression, the ECJ said in its ruling, with information that's considered to be in the public interest unlikely to be removed on request.

When an individual files a request about their own search results, Google and other search engines must consider removing links to any information that is inaccurate, inadequate, irrelevant, or excessive. With Google, the 'right to be forgotten' can be exercised via an online request form.

Right to be forgotten: GDPR

The right to be forgotten ruling was based on the EU's 1995 Data Protection Directive, which stated in Article 12 that people can ask for their personal data to be deleted once it's no longer necessary. The ruling outlined when and how search engines like Google must honour such a request.

However, the General Data Protection Regulation (GDPR), which applied to all EU member states (and all organisations using EU citizens' personal data) from 25 May 2018, has taken the reins from the old 1995 Directive. Intended to update privacy and data protection rules for the digital age, GDPR also updates the definition of the right to be forgotten.

In Article 17, the GDPR legislation considers the right to be forgotten in the context of organisations collecting and processing people's personal data. It retains the 1995 Directive's intent to allow people to request that their data be deleted when it's no longer relevant, but expands this right to give people more control over who can access and use their personal data.

Under GDPR then, an EU citizen has the right to demand an organisation erases their personal data if:

  • the data is no longer necessary for the purpose it was collected for;
  • the person withdraws their consent for their data to be used (and the organisation has no other legal basis for processing it);
  • the person objects to their data being processed for marketing purposes, or their rights override the organisation's legitimate interests in processing it (for instance, where the data concerns a child);
  • the data was unlawfully processed;
  • erasure is necessary to comply with a legal obligation;
  • the data belongs to a child and was collected in connection with the offer of "information society services".

In all these cases, the organisation must delete the data "without undue delay" - i.e. as soon as possible. If the organisation has made the data public, it must take "reasonable steps, including technical measures" to inform any other organisation processing that data that the data subject has asked for it to be removed. 

However, organisations don't have to honour these requests if they need the data to comply with a legal obligation, to exercise the right to freedom of expression or freedom of information, to serve the public interest, or to establish, exercise or defend legal claims.
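The grounds and exemptions above amount to a two-part decision rule: at least one ground must apply, and no exemption may. The sketch below models that logic only; all field names are invented for illustration, and a real Article 17 assessment requires legal judgment, not boolean flags.

```python
from dataclasses import dataclass

@dataclass
class ErasureRequest:
    """Illustrative flags for an Article 17 assessment (names invented)."""
    # Grounds for erasure (Article 17(1))
    no_longer_necessary: bool = False
    consent_withdrawn_no_other_basis: bool = False
    objection_overrides_interests: bool = False
    unlawfully_processed: bool = False
    erasure_legally_required: bool = False
    childs_data_for_online_services: bool = False
    # Exemptions (Article 17(3))
    needed_for_legal_obligation: bool = False
    needed_for_free_expression: bool = False
    retention_in_public_interest: bool = False
    needed_for_legal_claims: bool = False

def must_erase(req: ErasureRequest) -> bool:
    """An organisation must erase if any ground applies and no exemption does."""
    grounds = any([
        req.no_longer_necessary,
        req.consent_withdrawn_no_other_basis,
        req.objection_overrides_interests,
        req.unlawfully_processed,
        req.erasure_legally_required,
        req.childs_data_for_online_services,
    ])
    exempt = any([
        req.needed_for_legal_obligation,
        req.needed_for_free_expression,
        req.retention_in_public_interest,
        req.needed_for_legal_claims,
    ])
    return grounds and not exempt
```

For example, unlawfully processed data must normally be erased, but the same request fails if retaining the data is in the public interest.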

Right to be forgotten: UK

The UK is set to leave the EU on 31 January, but all of the data protection principles outlined in GDPR will still apply to UK data processing, including the right to erasure. This is for two reasons: firstly, a UK-specific version of GDPR has already been enshrined in UK law as the Data Protection Act (DPA) 2018, and secondly, its key tenets will be preserved under the European Union (Withdrawal) Act 2018. In other words, all GDPR rights will continue to exist in the UK for the foreseeable future.

How citizens use the provision will vary from case to case. Data subjects could, for example, ask social media companies like Twitter to delete posts they published earlier in their lives that could hinder them personally or professionally as adults.

There are two prominent UK cases in which individuals invoked their right to be forgotten, heard on 27 February and 13 March 2018. The claimants were identified at the time only as NT1 and NT2. Both are men, and both cases involved challenges against Google.

They had requested that Google remove links to articles listing their previous convictions for crimes committed in the workplace, arguing that these reports had damaged their personal relationships and hindered their professional reputations. NT2's court filing even asserted that he had faced attempted blackmail and had been threatened in public.

Google initially refused to comply with their requests, arguing the information was in the public interest. In April 2018, the High Court ruled in favour of NT2's request, finding that his conviction was no longer relevant to his business dealings. By contrast, the High Court ruled against NT1, holding that his conviction for false accounting remained relevant to those doing business with him.

What will be removed?

To qualify for removal, the information must be deemed "irrelevant, outdated, or otherwise inappropriate", and the request must be accompanied by a digital copy of the applicant's official identification. Failure to remove links that meet the EU court ruling's definition can result in fines.

Who is regulating the right to be forgotten?

How Google handles complaints and requests to remove information from its search results is overseen by a task force of European privacy watchdogs known as the Article 29 Working Party.

Following the flood of requests received by Google, Professor Luciano Floridi, the person tasked with determining how Google can comply with the recent EU court ruling, said in 2014: "People would be screaming if a powerful company suddenly decided what information could be seen by what people, when and where. That is the consequence of this decision. A private company now has to decide what is in the public interest."

Google, currently responsible for almost 90% of web searches in Europe, faces the unenviable task of balancing its duty to comply with its users' "right to be forgotten" and preserving its reputation as the go-to source for online information and content.

Peter Barron, Google's director of communications for Europe, said: "The European Court of Justice ruling was not something that we wanted, but it is now the law in Europe, and we are obliged to comply with that law. We are aiming to deal with it as responsibly as possible... It's a very big process, it's a learning process, we are listening to the feedback and we are working our way through that."

All applications must verify that the links in question relate specifically to the applicant, unless the applicant has the legal authority to act on someone else's behalf, in which case this must be proven.

Landmark cases to date

Google vs CNIL

In September 2019, the European Court of Justice ruled that Google is not required to apply the right to be forgotten globally, and that only search results within Europe should qualify.

The case, which began as a dispute between the tech giant and French data regulator CNIL, initially saw Google ordered to delete any results that included damaging or false information relating to an individual. Google introduced a geoblocking feature that prevented the search results from appearing; however, this was only applied in Europe, prompting a challenge from the regulator.

Google argued that applying such removals beyond Europe could allow rogue governments to hide criminal activity, such as human rights abuses.

German double murder

In November 2019, a German man who had been convicted of two counts of murder in 1982 won a case in Germany's highest court to have his surname removed from articles referencing the initial charges and subsequent criminal proceedings.

In this instance, the court agreed that his right to privacy outweighed any public interest or press freedom considerations, and that Google needed to comply with his removal request. Publications holding archives of the articles were also required to remove them.

Is it about privacy or censorship?

The request form launched by Google after the ruling received 12,000 entries from across Europe within 24 hours, at one point receiving up to 20 requests a minute. This grew to 41,000 requests within the first four days.

Feeding fears about the potential consequences of the ruling, almost a third of the requests related to accusations of fraud, while a further 12% were linked to child pornography arrests and 20% to other serious or violent crimes.

Of these first 12,000 entries, around 1,500 were said to come from people residing in the UK, among them an ex-politician, a paedophile and a GP.

By December 2014, the number of requests received by Google had grown to around 175,000 from all 28 EU countries, with 65,000 of the links relating to the UK. As of 22 January 2018, Google had complied with 43.3% of requests to remove links.

Before the form was made available, most removal requests to Google were coming from Germany and Spain, with the UK, Italy, and France making up the rest of the top five.

Many are concerned that the ability for users to request the removal of information from search results could see the system abused for nefarious purposes.

However, lawyers have assured those worried that politicians, celebrities, and criminals are unlikely to benefit from the ruling, as Google retains the right to reject applications seeking the removal of information deemed in the public interest.

It should also be noted that, while links to the objectionable information will be removed, the information will not actually be deleted from the web.

Following on from comments regarding the ruling, Baroness Prashar, chair of the Lords Home Affairs EU Sub-Committee, said: "[We] do not believe that individuals should have the right to have links to accurate and lawfully available information about them removed, simply because they do not like what is said."

