New technologies continue to tear down barriers and bring people closer together than ever before, but society in response is struggling to grapple with the social ramifications of evolving privacy expectations. That includes how we respond to the unexpected release of information about the private lives of our fellow citizens, particularly when it reveals behaviors that may be socially frowned upon but not legally punishable. The same challenges are also reflected in many current policy debates. Unfortunately, some of the efforts being undertaken in the name of privacy are dangerous and ill-conceived.
In May of 2014, the Court of Justice of the European Union agreed with a Spanish man who complained that it was an infringement of his privacy for old stories that contained references to his home once being repossessed to still show up when his name was Googled. It declared a right of citizens to petition search engines to remove links to stories containing "irrelevant" or "outdated" information about themselves.
Over the next 12 months, Google received more than 250,000 such requests, approving just over 40 percent and effectively burying almost a million pages in obscurity. The sites still exist, but without inclusion in search engines they might as well not. The protections carved out by the EU Court of Justice have since come to be referred to as a "right to be forgotten."
Note that there is no requirement that the information requested for delisting necessarily be inaccurate or misleading. Instead, search engines are being forced into acting, at considerable expense, as judges over what truths the public deserves to know, undermining their primary value as aggregators of information and denying the public access to potentially useful information.
But European authorities also don't necessarily trust Google's judgment. In Britain, there were 183 complaints to the nation's Information Commissioner's Office (ICO) about requests that Google denied during the 12-month period. The ICO agreed with Google's decision in most circumstances, but said that Google "hasn't got it quite right" in 48 instances and asked for those cases to be reviewed again. Some Google agreed to reverse, but the ICO threatened to use "enforcement powers" to strong-arm Google into removing the rest if it refused to do so voluntarily. That effectively puts government bureaucrats in charge of deciding what information is of public value, and to which of it the people should be allowed access.
The practical challenges of building a 1984-style internet memory hole border on the comically absurd. When Google informed a British paper that it had been forced, at the perpetrator's request, to remove links to a story about a shoplifting incident, the paper understandably found the censorship itself to be newsworthy and wrote about it, thereby undermining the entire point. The ICO responded by demanding Google remove links to the new articles, too. How can this cycle of censorship and subsequent coverage end? The answer, apparently, is memory holes all the way down.
It's no coincidence that this problem has emerged in Europe, where free speech is not valued as highly as it is in the United States. Unfortunately, it may not remain there. Agitators are already calling for similar suppression efforts within the U.S., and the French last week declared that the conjured "right to be forgotten" must be applied by Google globally, not just within the European Union, threatening significant fines if Google fails to comply.
Of course, Google can, and should, tell France to go pound sand and decline to offer its services to the nation. But that would mean denying itself future revenue. Moreover, the danger is that other nations will follow suit, or that governments will band together and form an information-restricting cartel that leaves Google and any other global search engine with no choice but to implement the regime globally, essentially denying all users across the world access to any information that even one bureaucratic commission considers "outdated" or "irrelevant."
There are a great many faults to be found in this sort of recognition and enforcement of a "right to be forgotten."
For one, it expands the idea of what constitutes a "right" beyond its breaking point. True rights are not only grounded in our natural inheritances as human beings, but make no demands upon others beyond non-infringement. The right to life requires only that we not be killed; the right to speech, only that we not be silenced. The so-called "right to be forgotten," however, requires that some entity filter information, judge its worth, and then deny others access to it based upon that determination. As tech executive Andrew McLaughlin correctly observed, it would be more honestly called a "right to force others to forget," which points to the second issue: long-established rights are being eviscerated in the name of enforcing the new "right to be forgotten."
Privacy as an idea does not encompass what proponents of these policies claim. It strains credulity to assert that individuals have some inherent say in how information about them is used. In certain circumstances, such as when information is given to another party, like a website, in exchange for a service, individuals certainly have the right to condition that exchange upon certain restrictions. But while that implicates the idea of privacy, it is actually a contract right.
Privacy only exists in environments that are actually private. Descriptions of public actions belong not to the subjects of those actions, but to those doing the describing. One can no more demand that such information not be shared once public than one can demand, while walking down the street, that others turn away so as to avoid being seen.
But the issues go beyond the theory of rights and privacy. The EU's attempt to institutionalize internet amnesia would, if widely adopted, result in a variety of practical, negative outcomes.
Although intended to benefit regular people, whose actions are least likely to be of public interest, any implementation system is likely to favor those with the power and resources to make the most use of it. Furthermore, even if the tool is not outright captured by politicians and the powerful (those about whom the public has the most need for information), there's no way to predict what information might be of use in the future.
Today's nobody could be tomorrow's candidate for office, yet alongside the "right to be forgotten," the public has no corresponding "right for that which was forgotten to be remembered if later it is found to be of public benefit." The idea that anyone can possibly judge the future value of any information, in other words, is yet another "fatal conceit." Far better to let all information flow and the chips fall where they may.
It’s understandable that many are considering the privacy implications of modern technology. We don’t yet fully know the extent to which the internet and other developments have fundamentally altered social interactions. We know more about one another than ever before, and that poses challenges when we hold on to old ways of thinking. But the solutions to those challenges will come in the form of adapting social and cultural norms, not policies that foolishly seek to turn the clock back on the information revolution.
Photo by BartekSzewczyk / Getty Images
Brian Garst is an advocate for economic and individual liberty. He works as Director of Policy and Communications at the Center for Freedom & Prosperity, a free market think-tank dedicated to preserving tax competition. His writings have been published in major domestic and international papers, and he is a regular contributor for Cayman Financial Review. He also blogs at BrianGarst.com and you can find him on Twitter @BrianGarst.