Last week, the Court of Justice of the European Union (CJEU) made a significant ruling in a data protection case - but given the focus in the UK on the activities of our own Supreme Court, it’s not surprising if you missed it.

The case concerned a dispute between the French data protection authority (CNIL) and Google over the extent to which Google had to delist search results from its various global search sites. The court ruled in Google's favour.


The case

This all stems from a case back in 2014 when a man in Spain, Señor Costeja, asked Google to delist links relating to his bankruptcy arrangements. This information had to be made public under Spanish law, and Google argued that it merely harvested publicly available information, indexed it and made it available to people using its search services.

I always thought Señor Costeja’s problems emanated from Spanish bankruptcy law rather than Google’s search services, but the courts didn’t agree and Google lost that case - setting a precedent that allowed individual complainants and data protection authorities to require Google (and other search engines) to delist certain search results.


This has always been a difficult area

Google’s argument that it merely indexes and presents publicly available information is a simple but compelling one. But one of the first ‘right to be forgotten’ cases at the ICO showed that the argument is not so simple in practice. It concerned a man whose (fairly unusual) name, when Googled, returned pages and pages of results saying he was a teacher found guilty of offences against children. It was only around page 30 or so of the results that you found out that his conviction had been quashed and that he was in fact innocent. So I can see why the prominence of search results can affect people in a dramatic and unfair way, and the ICO and Google generally agreed that certain search results should be delisted.


The need for delisting is there, but how to achieve it?

Despite individuals, regulators and the search engine providers all accepting the principle that some results should be delisted, there was always considerable disagreement – between regulators, and between regulators and Google – about the territorial scope of delisting. This is not a straightforward matter. Google, like the other search engines, has various national versions – google.com, google.co.uk, google.fr and so on. However, the content a user sees – in terms of search results and ads – depends not just on which version of Google is accessed, but also on where it is accessed from.

Research we did at the ICO suggested that if you access google.com.mx from a UK IP address you see much the same content as you would find on google.co.uk, but if you route your traffic through a Mexican proxy you will see Mexico-centric content. So this raises the question: if an EU data protection authority orders the delisting of a search result, from which versions does it have to be delisted, and for users accessing them from where?
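For the technically curious, here is a minimal sketch of how that kind of comparison might be done. This is not the ICO's actual methodology: the proxy address is a placeholder, Google's consent pages and bot detection may get in the way in practice, and scraping its results pages may breach its terms of service. It simply illustrates the point that the same domain can serve different content depending on the IP address it is accessed from.

```python
# Illustrative sketch: fetch the same Google Mexico search page directly
# (geolocated by your own, e.g. UK, IP address) and via a proxy with a
# Mexican IP address, then crudely compare the two responses.
import requests

QUERY = "data protection"
URL = "https://www.google.com.mx/search"
HEADERS = {"User-Agent": "Mozilla/5.0"}  # requests without a user agent are often rejected


def fetch(query, proxies=None):
    """Fetch the results page, optionally routed through a proxy."""
    resp = requests.get(URL, params={"q": query}, headers=HEADERS,
                        proxies=proxies, timeout=10)
    resp.raise_for_status()
    return resp.text


# Direct request: Google geolocates you by your own IP address.
direct_html = fetch(QUERY)

# Same request routed through a hypothetical Mexican proxy (placeholder address).
mx_proxy = {"https": "http://mx-proxy.example:8080"}
proxied_html = fetch(QUERY, proxies=mx_proxy)

# Identical pages would suggest the domain alone determines localisation;
# different pages suggest the originating IP address matters too.
print("Pages identical:", direct_html == proxied_html)
```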

The ICO always took the view that search results had to be delisted from every version of Google accessed from within the EU. The French authority saw it differently, and one of its officials once told me they expected delisting to take place ‘as far as the moon’. The CNIL lost its case at the CJEU, and the judgement seems to support the ICO position, although it refers to ‘versions of its search engine corresponding to all the Member States’. (But the ‘correspondence’ is not as clear cut as the CJEU believes – as explained above.)


What does this mean in practice?

Essentially, this means that individuals cannot expect EU data protection regulators or EU courts to secure the delisting of search results about them globally. The right to be forgotten is therefore territorially limited (unless you have very deep pockets and can employ an Online Reputation Management company). It also means that if you are determined enough to find something out about someone, it’s easy to set up a proxy to access ‘uncensored’ non-EU search results.

It also raises questions over the territorial scope of the GDPR itself. In theory, the GDPR applies to any organisation anywhere that processes the personal information of people in the EU in order to offer them goods or services or to monitor their behaviour. But how does this work in the context of the recent CJEU judgement? It suggests that EU data protection law only truly applies within the EU. Confusion over the territorial scope of EU law has not been helped by the even more recent CJEU Facebook judgement, which appears to say that content whose publication is illegal has to be taken down globally, even in jurisdictions where it is not illegal. Both cases highlight that there is a greater need than ever for international consensus over the governance of the internet.