cross posted at the Digital Media Law Project and Blog Law Online

In 2012, a bevy of internet companies and web sites waged a successful campaign against bills in Congress -- the PROTECT IP Act and Stop Online Piracy Act (SOPA) -- meant to combat copyright piracy. In the face of this opposition, the proposals were dropped (although their legacy survives). One of the major claims by the opponents was that the bills would "break the Internet" by requiring the disabling of URLs and removal of online links to sites that include unauthorized uses of copyrighted materials (although not all agreed with this assessment).


Now, the European Court of Justice has issued a decision (summary) that could require search engines to remove links to online information about individuals that is "no longer necessary in the light of the purposes for which they were collected or processed." The court's decision does not discuss how the removal of these links should be accomplished.

The court's decision stemmed from a case brought by Spanish citizen Mario Costeja González, seeking removal from a newspaper's web site of images of pages from January and March 1998 that included announcements for a real estate auction stemming from attachment proceedings for the recovery of social security debts owed by Costeja González. He complained to Spain's Agencia Española de Protección de Datos (Spanish Data Protection Agency; AEPD) (Spanish site; English resources), seeking removal of the information from the paper's website and from Google's search results.

AEPD held that the newspaper need not remove the material, since it published it under a legal directive. But it upheld the complaint against Google, saying that Costeja González had the right to shield the information from public view via the search engine. Google appealed to Spain's Audiencia Nacional (National High Court). That court sought an advisory opinion from the European Court of Justice -- the highest court in the European Union -- regarding the applicability of EU privacy laws to the case.

European law embodies a concept of privacy that is in many ways alien to American law, and would be unconstitutional under our First Amendment. This includes a right to bar or recover for publication of true but "private" information that is readily available publicly, and a right to shield dated information, often referred to as a "right to be forgotten."

The question before the court was whether the EU directive embodying these notions (Directive 95/46) applied to Google. This, in turn, depended on whether Google could be considered a content provider. The court held that it was, even though the information that Google collects and displays in its search results is already published online by someone else. Since Google is a content provider, the court held, it is obliged to follow the privacy directive.

Inasmuch as the activity of a search engine is therefore liable to affect significantly, and additionally compared with that of the publishers of websites, the fundamental rights to privacy and to the protection of personal data, the operator of the search engine as the person determining the purposes and means of that activity must ensure, within the framework of its responsibilities, powers and capabilities, that the activity meets the requirements of Directive 95/46 in order that the guarantees laid down by the directive may have full effect and that effective and complete protection of data subjects, in particular of their right to privacy, may actually be achieved.
Google Spain SL v. Agencia Española de Protección de Datos (AEPD), Case C‑131/12 (E.C.R. May 13, 2014), para. 38.


The court also ruled that Google was subject to Spain's jurisdiction, including its law applying Directive 95/46, because of Google's web site directed at the country (www.google.es). The court rejected Google's argument that it was Google, Inc., in the United States that performed the indexing and search functions at issue, rather than the Spanish subsidiary: "Since that display of results is accompanied, on the same page, by the display of advertising linked to the search terms, it is clear that the processing of personal data in question is carried out in the context of the commercial and advertising activity of the controller’s establishment on the territory of a Member State, in this instance Spanish territory." Id., para. 57.

After finding that the directive applies to Google, the court held that the search engine could be ordered to remove links to the objectionable material from search results for Costeja González's name.

[I]n order to comply with the rights laid down in those provisions and in so far as the conditions laid down by those provisions are in fact satisfied, the operator of a search engine is obliged to remove from the list of results displayed following a search made on the basis of a person’s name links to web pages, published by third parties and containing information relating to that person, also in a case where that name or information is not erased beforehand or simultaneously from those web pages, and even, as the case may be, when its publication in itself on those pages is lawful.

Id., para. 88.

The irony -- expressed in the last sentence above -- is that the court also observed that the newspaper that posted the notices of the auctions in the first place could not be required to remove those postings because they were published "solely for journalistic purposes," which is included within "the right to receive and impart information" guaranteed in Article 10 of the European Convention for the Protection of Human Rights and Fundamental Freedoms, and referenced in the directive.

The court justified the different treatment of the newspaper and Google by stating that "first, the legitimate interests justifying the processing may be different [for the newspaper and the search engine] and, second, the consequences of the processing for the data subject, and in particular for his private life, are not necessarily the same." Id., para. 86.

Indeed, since the inclusion in the list of results, displayed following a search made on the basis of a person’s name, of a web page and of the information contained on it relating to that person makes access to that information appreciably easier for any internet user making a search in respect of the person concerned and may play a decisive role in the dissemination of that information, it is liable to constitute a more significant interference with the data subject’s fundamental right to privacy than the publication on the web page.

Id., para. 87 (emphasis added).

The court acknowledges that these "rights override ..., not only the economic interest of the operator of the search engine but also the interest of the general public in finding that information upon a search relating to the data subject’s name." Id., para. 97. The court adds that this may not be true in the case of a prominent person in public life, which may mean that the public interest in disclosure would outweigh that person's right to privacy.

But since Costeja González is not a public person, he may request removal of the articles from Google's search results even though he cannot request removal of the same articles from the newspaper's website. Again, this would not be the case under United States law, regardless of the plaintiff's status as a private or public figure. In the U.S., privacy law generally does not provide a remedy for the dissemination of true information that is already publicly available; moreover, the compelled removal of such information would raise serious issues under the First Amendment as prior restraints on speech.

The case now returns to the Spanish Audiencia Nacional for a specific decision in Costeja González's case, which can be appealed to Spain's Supreme Court. But the EU court's decision is binding on member states of the European Union, and could lead to more efforts by Europeans to have embarrassing or other material removed from web search results, even when the original site containing the material has no obligation to remove it.

Google has stated that it is "analys[ing] the implications" of the ECJ ruling, but requiring the modification of search results in response to what will likely be a flood of complaints from residents of EU countries puts Google in the difficult (if not impossible) position of either managing these complaints at significant cost or taking a blunderbuss approach to removal of content. And would such results persist outside the EU? Content filtering by country is not a new concept, but this ruling has the potential to create a dramatically different Internet in Europe.

There could also be a significant impact on the news organizations that, as the ECJ acknowledges, have the right to publish this information. The ECJ, in its ad hoc balancing of interests, seems blind to the fact that news organizations depend on search engines and other online intermediaries in order to reach their audiences. Allowing the subjects of news coverage to use these intermediaries as a choke point because the intermediaries are not themselves journalists threatens the primary benefit of the Internet -- namely, the networked dissemination of information.

Operating on the Internet has always posed challenges in complying with the laws of multiple countries. Search engines in particular have had problems in the past dealing with Great Britain's privacy laws, France's laws against Nazi memorabilia, and China's web restrictions. But this ruling by the European Court of Justice might just be the straw that breaks the camel's -- or the Internet's -- back.

Unlike the United States, China places severe limits on what internet users in that country can access. So it may not be a big surprise that the Chinese search engine Baidu Inc. places restrictions on its search results, so that Chinese users cannot see in the results the sites that they are not able to access. (It was just these sorts of restrictions, imposed by the government, that led Google to pull back its operations in the country.) The limits also apply to users who use the baidu.com site in the United States.

Last week, a federal judge in New York dismissed a lawsuit brought against Baidu by several Chinese dissidents and activists who live in New York, alleging that Baidu's restrictions constituted a violation of their civil rights, namely their free speech rights under the First Amendment. (They also alleged racial discrimination, and denial of rights to equal public accommodations.) The judge's dismissal was based on the principle that Baidu's own First Amendment rights include the right to exclude certain sites from its search results.

But what of the plaintiffs' First Amendment rights? If access to their writings is blocked because Baidu won't show their material in its search results, isn't that a violation of their free speech rights?

No, it isn't. And because it's a frequently misunderstood issue, it's important to explain why.

The First Amendment states that "Congress shall make no law ... abridging the freedom of speech, or of the press ... ." Note that it mentions Congress, the federal body charged with creating laws. Court decisions have extended this to also cover other rule-making entities of the federal government, such as executive and administrative agencies, and (through the 14th Amendment) to also cover state and local governments.

But these are all governmental entities. In short, the First Amendment applies only to government restrictions on free speech and the press. And because the Constitution applies only to the United States, it cannot be applied to the Chinese government (which was initially named as a defendant in the case, then dropped).

Private entities -- corporations and individuals not acting on behalf of a government entity -- can generally restrict speech as they wish. And they often do: think of the things you can't say to your boss without getting fired as a result. You cannot be fired or denied something made available by a private entity to the public, however, on the basis of your race or religion, and, in many places, on the basis of your gender or sexual orientation.

Or, more to the point, a search engine can decide to display or not display certain results. Even Google does this in a way, by using an algorithm that emphasizes some results over others, based on the information it has about the user (perhaps raising privacy issues that are an issue for another day). And Baidu can choose to not display results it -- or the Chinese government -- sees as subversive.

People often -- intentionally or not -- confuse this issue. For example, when A&E temporarily suspended Duck Dynasty star Phil Robertson after he made a statement that homosexuality was a sin, there was an outcry that the network had violated his First Amendment rights. But as others -- even Fox News' Steve Doocy -- pointed out, A&E's suspension of Robertson was not a First Amendment issue at all. He had -- and has -- the right to say whatever he wants. But he does not have the right to appear on A&E.

The same is true of the plaintiffs in the case against Baidu. Baidu's decision not to display the plaintiffs' writings in the search engine's results does not stop them from saying or posting things about the Chinese government. They still have their First Amendment right to speak, and Baidu's actions do not stop them from doing so. But there is no First Amendment right for them to be heard, or listed in a search engine's results.

 

This is a year in which the Georgia Legislature is paying a lot of attention to the definition of digital over-sharing.

The liberating nature of the internet - the opportunity to instantly share and send photos, video and other content - can connect extended families, accelerate scientific advancement and keep us informed and entertained. Lots and lots of good stuff.

But it also has given birth to digital “revenge porn,” to mug shot websites that demand fees for deletion and to any number of schemes that can hold good people hostage to a misstep in their past that they thought they had already paid for.

Earnest Georgia legislators are trying to address these issues. The challenge in each instance is how to keep the malevolent types at bay – and to protect individual privacy – without undermining First Amendment protections and open government laws in the process.

Here’s a look at some of what’s in play:

HB 845, HB 150 – Mug Shot Mania

The challenge here is how to contain the malevolent element: internet entrepreneurs who want, in essence, to monetize humiliation. A number of individuals have discovered a way to make money from aggregating mug shots and publishing them in print books and on websites. The photos may be from an individual’s distant past, the charges may have been dropped or the records sealed, but the mug shot lives on. And publishers have often charged from $30 to more than $400 to remove an arrest booking photo from a site.

One of the “fathers” of this internet business was a guy named Craig Robert Wiggen, who was looking for a new business opportunity after serving three years in federal prison for a scheme to steal credit card numbers, according to news articles. Today, there are estimated to be more than 80 of these sites.

HB 845 follows on last year’s HB 150, which was signed into law. HB 150 defined individuals who were entitled, because charges had been dropped or through other circumstances, to demand that a website take their mug shot down, at no cost, within 30 days of notification. HB 845 goes another step and focuses on the law enforcement side, to instruct agencies when they must not release a booking photograph.

The danger here is that booking photos are and long have been considered public records. Restrictions on their use raise First Amendment issues and impinge on the rights of editors to determine what is newsworthy. One recent example: the public got its first look at the Navy Yard gunman through a booking photo from his 2010 arrest.

HB 845 is written with the intent that law enforcement cannot release a booking photo to anyone who publishes booking photos to a website and charges a fee to remove them. And it requires that the person making the request submit a statement affirming that the photo will not be used on such sites. HB 150 includes a media exemption that presumably would apply here so that law enforcement could release booking photos to the media. But the language is, to say the least, convoluted, and vulnerable to constitutional challenge. And a sheriff’s department already reluctant to make public records available could easily use it to make access more difficult.

SB 365 – Expunging Criminal Records

This bill seems particularly targeted towards “consumer reporting agencies” that aggregate publicly available data to inform such things as your credit score, and whose data ends up being used in employers' criminal background checks, among other things.

It attempts to extend the provisions of HB 1176 into the marketplace. That law, which expanded provisions for criminal records to be sealed, went into effect last July. The list includes, among others,  cases where felony convictions were reversed on appeal or when minor drug offenders successfully completed their probation.

There are certain situations in which record restriction is not allowed at all, including convictions for serious violent felonies and criminal acts such as child molestation, prostitution, sexual battery, theft and also traffic offenses like DUI, vehicular homicide or fleeing the scene of an accident.

(Meanwhile, among other things, there are also procedures that would enable people who were involuntarily put in a psychiatric hospital and ended up on a list that restricted them from buying guns to secure a hearing to get their gun-buying rights restored.)

Proponents argue that in the frictionless world of online data aggregation, old records of arrests or adolescent missteps get picked up and exist forever to haunt the employment prospects of good people.

The risk with expungement is that it creates a bureaucratic vehicle for those who can work the system, and have the resources to do so, to hide past sins that may actually be a good indicator of future behavior. Political candidates could be a case in point (just sayin’). It creates another hurdle for the public to access what had been public records. And it is fundamentally at odds with the First Amendment public right to know, the founders’ idea that in an open marketplace of information and ideas, truth will win out.

As attorney Margaret Love wrote in a 2003 law journal article, the policy underlying expungement of records “requires a certain willingness to ‘rewrite history’ that is hard to square with a legal system founded on the search for truth.”

The current bill, which passed the Senate, would create a new code section requiring that “consumer reporting agencies,” on a monthly basis, must update any criminal history information they have obtained to delete any records that have been restricted.

Is that different from requiring, say, newspapers to cleanse their archives of any stories about events that the courts later authorized for expungement?

HB 838 – Revenge Porn

This is a bill, passed by the House, that would make “intimate harassment” illegal. Most everyone agrees the concept of “revenge porn” – where users upload nude pictures or videos of their exes – is ugly. And there are many cases where reputations have been tarnished and employment prospects affected.

But in the nearly infinite internet, there are logistical and constitutional perils to policing this. Should the woman who exposed disgraced politician Anthony Weiner be subject to prosecution? Should there be an exception for nonconsensual nude photos that carry a public interest? A court in Florida ruled that Gawker Media, when it published bits of a Hulk Hogan sex tape, was protected by the First Amendment because Hogan was a public figure who had been making money off his public persona for years. There is also Section 230 of the Communications Decency Act, which protects sites like Facebook from having to police every bit of content uploaded by users. That protection would logically extend to sites that accept content that was submitted primarily to punish an ex, even though it might be hurtful.

Public concern about individual online privacy is a powerful force. The challenge is to make sure that in addressing those concerns we do not trade away the public’s larger First Amendment rights to access information held by government and then decide for ourselves what to make of it. In short, how much control should the government have over the public’s right to public records? And how much authority do we want to give government to legislate private online activity?