Judgement of the German Federal Court of Justice on Google's autocomplete function
In May 2013, the German Federal Court of Justice held that the predictions generated by the autocomplete function of Google's web search engine can violate the right of personality.[1] The right of personality ensures that a person's (or even a company's[2]) personality (reputation) is respected and can be freely developed.[3] Only the individual shall, in principle, decide how he/she wants to present himself/herself to third parties and the public.[4]
A stock corporation, which sold food supplements and cosmetics online, and its chairman filed an action for an injunction and financial compensation against Google based on a violation of their right of personality.[6] Google runs a web search engine under the domain "www.google.de" (among others), which allows Internet users to search for information online and access third party content through a list of search results.
In 2009, Google implemented a so-called "autocomplete" function which, while the user types a search term into the search mask, displays word combinations in a separate window as predictions for the user's search. These predictions are based on an algorithm that evaluates how often other users have searched for specific terms. When users typed the full name of the chairman into the search engine in May 2010, the autocomplete function showed the predictions "Betrug" (fraud) and "Scientology". The claimants stated that the chairman had no connection to Scientology and was not under investigation for fraud. Furthermore, they argued that no search result showed a connection between the chairman and fraud or Scientology. They therefore regarded these predictions as a violation of their right of personality.
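The mechanism the court described – predictions ranked by how often other users have searched for a term – can be illustrated with a minimal, hypothetical sketch. Google's actual algorithm is proprietary and far more complex; the names and the simple frequency-counting approach below are assumptions for illustration only:

```python
from collections import Counter

def build_suggestions(query_log, max_suggestions=3):
    """Build a naive prefix -> completions index from a log of past queries.

    Illustrative sketch only: it approximates the court's description of
    autocomplete (predictions ranked by search frequency) with plain
    frequency counting over a query log.
    """
    counts = Counter(query_log)
    index = {}
    for query, freq in counts.items():
        # Register the full query under every prefix of itself.
        for i in range(1, len(query) + 1):
            index.setdefault(query[:i], []).append((freq, query))
    # Keep only the most frequent completions for each prefix.
    return {
        prefix: [q for _, q in sorted(entries, reverse=True)[:max_suggestions]]
        for prefix, entries in index.items()
    }

# Hypothetical query log: the most frequent completion is suggested first,
# regardless of whether the implied association is true.
log = ["john doe fraud", "john doe fraud", "john doe scientology", "jane roe"]
suggestions = build_suggestions(log)
print(suggestions["john doe"])
```

The sketch also illustrates the legal problem: the index reflects only what users typed, not whether the implied factual association is accurate.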
The Regional Court Cologne decided in favour of Google and dismissed the case as unfounded.[7] The Higher Regional Court Cologne upheld this judgement.[8] The claimants filed an appeal to the German Federal Court of Justice.
The German Federal Court of Justice set aside the judgement of the Higher Regional Court Cologne and referred the case back to this court.[10]
The Federal Court of Justice held that
the predictions ("Betrug"/"Scientology") expressed the existence of a factual connection between the chairman and these negatively connoted terms and violated the right of personality[11] (the Higher Regional Court Cologne had taken a different view previously and had held that the predictions only expressed that other users typed in these word combinations for their search or that the terms could be found in linked third party content)
the claimants' right of personality outweighed Google's freedom of expression[12] and commercial freedom[13] in the balancing exercise, because false statements do not have to be accepted
the violation was directly attributable to Google because it designed the software, exploited the users' search behaviour, and suggested the predictions to the users
the national implementation[14] of the provisions of the Electronic Commerce Directive,[15] which grant intermediaries (access, caching, and host providers) immunity from liability to a certain extent,[16] was not applicable in this case because the predictions were not third party content that Google merely made accessible or presented, but Google's own content
the basis for a liability of the search engine provider is not the fact that he developed and used the software because these actions are protected by the provider's commercial freedom[17]
the liability can only be based on the fact that the provider did not take the necessary precautions to prevent the violation of a right of personality as part of a so-called "Stoererhaftung" (the "Stoererhaftung" (interferer's liability) is a liability of a person (the "Stoerer") who is not a perpetrator or participant himself, but contributed willingly and adequately causally to the infringement of a protected legal interest in any way and requires a breach of a reasonable duty of care[18])
the search engine provider has, in principle, no obligation to monitor the predictions generated by its software in advance and is only responsible once he becomes aware of a violation through the predictions
if the provider is notified of a violation by the victim he is also required to prevent future violations.[19]
In April 2014, the Higher Regional Court Cologne then decided in favour of the claimants insofar as they objected to the additional term "Scientology", which Google initially refused to remove.[20] Financial compensation was not awarded because Google removed the entry later (about one and a half years after the objection) and thereby limited the infringement.[21] Because Google removed the additional term "Betrug" (fraud) immediately after the claimants' first objection, this part of the claim was unfounded.[22]
Criticism
Some legal scholars argued that the judgement established a reasonable balance between the protection of the right of personality (by Google's obligation to remove and prevent infringing predictions after a notice), Google's interest to still provide the autocomplete function (without the need to monitor all predictions) and the Internet user's interest to make use of the search's improvement.[23]
The court's holding that the search engine provider has no obligation to monitor the predictions generated by its software in advance, and is responsible only once he becomes aware of a violation through the predictions, corresponds with the court's previous judgements[24] on the "Stoererhaftung" (interferer's liability) of a host provider for content that third parties posted on the host provider's website.[25] However, because those previous judgements concerned liability for third party content, others argued that basing the autocomplete judgement on Google being an interferer ("Störer") within the "Stoererhaftung" – and not a perpetrator – contradicts the court's statement that the predictions have to be seen as Google's own content.[26]
Moreover, the judgement raises the question of how a trade-off between Google's freedom of expression and commercial freedom and another person's right of personality would be resolved in other scenarios.[27] Depending on the specific circumstances, it could be more complicated to assess whether a prediction is false, or even true but still less worthy of protection than the right of personality (e.g. in a case in which an investigation for a crime – like fraud – has already started, or in which a person is actually the victim of a crime).[28]
Another open issue is to what extent Google is capable of legally evaluating and processing notifications by alleged victims of an infringement.[29] The current legal situation could be an incentive for Google simply to remove a prediction after a complaint in order to avoid any liability.[30]
Background information
This judgement was not the only time a possible defamation by Google's autocomplete function was discussed in a courtroom. In Germany, Bettina Wulff, the wife of the former President of the Federal Republic of Germany, Christian Wulff, filed an action for an injunction against Google at the Regional Court Hamburg regarding 43 predictions, based on a violation of her right of personality.[31] The word combinations included the words "Escort" (escort) and "Prostituierte" (prostitute).[32] However, in January 2015, Google deleted these predictions and the parties settled the lawsuit.[33] By taking legal action against Google, Bettina Wulff probably also caused a so-called "Streisand effect", because many people learned of the predictions for the first time through the media attention the lawsuit created.[34]
In France, in 2010, the Superior Court of Paris ordered Google to cease suggesting certain predictions, including "rapist", "satanist", "rape", and "prison", to Internet users who searched for a man's name.[35] The man, convicted of "corruption of a minor" at the time, was still appealing his conviction.[36] In Italy, a businessman filed a defamation suit because of the terms "truffatore" (conman) and "truffa" (fraud) that the autocomplete function added to his name.[37] The Milan court ordered Google to remove these predictions in 2011.[38] Furthermore, in 2012, the Supreme Court of Victoria in Melbourne, Australia, held Google liable for defamation for wrongly linking a private person to crimes of which he was in fact a victim and awarded $200,000 in damages.[39][40] Moreover, in 2013, the Tokyo District Court in Japan also ordered Google to modify its predictions and pay 300,000 yen ($3,100) in damages to a man who was linked to crimes he did not commit.[41]
However, Google's autocomplete function has not only been the subject of defamation suits. In another case, French human rights organisations (including SOS Racisme) sued Google for adding the word "juif" (Jewish) to the names of celebrities within its predictions.[42] The human rights organisations argued that Google provided "ethnic files" by suggesting these predictions, which is forbidden in France.[43] The parties settled in 2012 without revealing the details of the settlement.[44]
Today, Google provides an online form that allows Internet users to report an (allegedly) infringing prediction within the autocomplete function.[45]
Relevance
The relevance of this judgement goes beyond the mere autocomplete function because it can be seen as a precedent on the question of whether algorithms can make defamatory statements.[46] With artificial intelligence[47] and robots becoming increasingly widespread in society, future scenarios in which liability for their actions has to be discussed seem likely.[48]
References
^German Federal Court of Justice, VI ZR 269/12 of 14.5.2013. The German courts derived this right from art. 2(1) in conjunction with art. 1(1) of the Basic Law for the Federal Republic of Germany (German constitution) and it is also applicable in private law (as "another right" in the sense of sec. 823(1) of the German Civil Code). Retrieved 25.2.2016.
^German Federal Court of Justice, VI ZR 286/93 of 8.2.1994; German Federal Court of Justice, VI ZR 276/99 of 30.5.2000. Retrieved 25.2.2016.
^The German Federal Constitutional Court, BvR 536/72 of 5.6.1973. Retrieved 25.2.2016.
^The German Federal Constitutional Court, BvR 536/72 of 5.6.1973. Retrieved 25.2.2016.
^German Federal Court of Justice, VI ZR 269/12 of 14.5.2013, [1-3]. Retrieved 25.2.2016.
^Seema Ghatnekar, "Injury By Algorithm: A Look Into Google's Liability For Defamatory Autocompleted Search Suggestions" Loyola of Los Angeles Entertainment Law Review Vol. 33 (2013) 182; Meg Leta Ambrose and Ben M. Ambrose, "When Robots Lie: A Comparison of Auto-Defamation Law" Institute of Electrical and Electronics Engineers Workshop on Advanced Robotics and its Social Impacts (2014) 58 Retrieved 25.2.2016.
^Meg Leta Ambrose and Ben M. Ambrose, "When Robots Lie: A Comparison of Auto-Defamation Law" Institute of Electrical and Electronics Engineers Workshop on Advanced Robotics and its Social Impacts (2014) 58; Adrian Schneider, "11-4-2014 OLG Köln: Die Autocomplete-Entscheidung im Detail" Telemedicus (2014). Retrieved 25.2.2016.
^See for example the "natural-language generation platform" "Quill" which is capable of creating a "convincingly written news story" if structured data is used as input: Alex Hern and Dan Milmo, "Thinking machines: the skilled jobs that could be taken over by robots" The Guardian Online (2015). Retrieved 25.2.2016.