It’s a little over three years since the European Court of Justice held in Google Spain v AEPD and Mario Costeja González that in some circumstances Google must honor requests to deindex search hits based on a person’s name. In data protection terms, the court ruled that Google is a data controller, and as such subject to the EU’s data protection laws. At the time, the decision outraged many American internet experts, who saw (and still see) it as an unacceptable abrogation of free speech. European privacy advocates were more likely to see it as a court balancing two fundamental rights – the right to privacy and the right to freedom of expression – against each other and finding a compromise. Some also argued that the decision redressed the imbalance of power between individuals and a large corporation, and asked in puzzlement: isn’t self-reinvention an American tradition?
In these cases, the main search engines – which in Europe means, almost uniquely, Google, with its 91% share of the search engine market – sit right in the crosshairs. (As Bas van der Beld writes at Search Engine Land, Europe really needs competition.) Because of that level of dominance, what Google’s algorithm chooses to display, and in what order, profoundly affects not only businesses but individuals.
Once the court ruled that Google met the legal definition of a data controller, to a non-lawyer the rest appears to follow. That key decision is nonetheless controversial: the court’s own advocate-general, Niilo Jääskinen, had advised otherwise, a recommendation the court chose to ignore. However, part of the advocate-general’s argument rested on the fact that the right to be forgotten was not, at the time, part of the law. As of May 25, 2018, it will be.
It is unhelpful to talk about this in black-and-white terms as censorship. Unlike the UK’s current relentless pursuit of restricting access to various types of material, the court did not rule that the underlying material should be removed, blocked, or made subject to other access restrictions. Nor is there any prohibition on having the pages in question pop up in response to other searches – that is, searches not on the person’s name. It’s also unhelpful to paint the situation as one that helps wealthy criminals and corrupt politicians hide evidence of their misdeeds: Google has rejected nearly 60% of the requests it has received, on grounds that include the public interest. That said, transparency will continue to be crucial to ensuring that the system isn’t abused in that way.
After the inevitable teething problems while Google struggled with a rush of pent-up demand, things went somewhat quiet, although the right to be forgotten did get some airplay as an element of the General Data Protection Regulation, which was passed last year. This year, however, the French data protection watchdog, the Commission Nationale de l’Informatique et des Libertés (CNIL), kicked the issue back into the rotation. Until now, Google has removed these hits – 43.2% of the requests it has received, its transparency report shows – only from the search results seen by visitors based in the EU; CNIL told Google they must be removed from all its sites worldwide. Google has appealed the ruling, and this week it was announced that the case will be heard, as Ars Technica’s Kelly Fiveash writes, by the European Court of Justice, which is expected to decide whether such results should be delisted globally, on a country-by-country basis, or across the EU.
Each of these options poses problems. Deindexing search results country-by-country, or even across the EU, is easily circumvented. Deindexing them globally raises the question of jurisdictional boundaries, previously seen most notably in the area of surveillance legislation and access to data on foreign servers. Like the issues of who gets to see whose data on distant servers, the question of how companies obey deindexing rulings is just one of the long series of demarcation disputes that will extend through most of our lifetimes as governments squabble over how far across the world their authority extends.
A second big issue – which Jääskinen raised in his opinion – is devolving responsibility for deciding what to remove into the hands of private companies. The problems with this prospect also feature in the UK discussions about getting social media companies to “do more” to remove extremist material, hate speech, and other undesirable content. Obviously, the alternative, in which the government makes these decisions, is even worse. Jääskinen also suggested that the volume of requests would become unmanageable. Google’s transparency report, linked above, shows otherwise: a spike at the beginning, followed by a dramatic drop-off and a relatively flat trajectory thereafter. Experience over the last three years, in other words, has not borne out that fear.
Costeja was a messy and controversial decision. The ECJ’s decision to hear this case gives it a chance to review and revise its thinking. However, it will not be able to solve the fundamental problems: the power struggle between global data services and national governments and the direct clash between European fundamental privacy rights and the US’s First Amendment. Most likely, it will contain something to offend everyone.
Illustrations: European Court of Justice (Cédric Puisney).
Wendy M. Grossman is the 2013 winner of the Enigma Award. Her Web site has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series. Stories about the border wars between cyberspace and real life are posted occasionally during the week at the net.wars Pinboard – or follow on Twitter.