Google's libel defenders miss the point

Bloggers are rushing to Google's defense, saying the French court that convicted the company of libel is clueless and out of touch with technology. But in fact it's Google's defenders who are out of touch.

Google was found guilty by a French court because of the search engine's automated search suggestions. Google plans to appeal the ruling, a company spokeswoman told my colleague Peter Sayer on Monday.

The Superior Court of Paris ordered Google and its CEO Eric Schmidt to pay €5,000, or $6,721.50 at current exchange rates, to a plaintiff identified as "Mr. X," and stop suggesting additional search terms to people searching on Mr. X's name.

The lawsuit revolves around Google's Autocomplete service. When people type the first few letters of a search request, Google suggests words to round out the search. For example, someone typing "France" will see the suggestions "france world cup," "france 2," "france news," and so forth.

Mr. X had been charged with raping a 17-year-old girl. He was found guilty of a lesser charge, "corruption of a minor," two years ago, fined €15,000 and sentenced to a year in prison, although that was changed on appeal to a €50,000 fine and a suspended sentence of three years in prison.

Anyone searching on Mr. X's given name and the first few letters of his family name would get search suggestions including "rape," "rapist," "satanist," "sentenced," and "prison," according to court records.

Bloggers are jumping to Google's defense, and wagging a scolding finger at the French court. Business Insider's Nick Saint called the decision "outrageous": "This is absurd. Google's suggestions are determined algorithmically, and a quick look at the suggestions in the image included here should make it clear that Google isn't endorsing any claims implied by those suggestions."

When you type a few characters into Google's search box, Google simply suggests search terms based on what other people who typed those characters into its search box were looking for. Type the letters "micro" in the search box, and the first suggestion that comes up is "Microsoft," not because Google loves Microsoft -- Google doesn't, they're deadly competitors -- but because "Microsoft" is what most people who type the letters "micro" are looking for.
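The mechanism described above can be sketched in a few lines of code. This is a hypothetical, highly simplified illustration of frequency-based prefix matching, not Google's actual implementation; the query log and function names are invented for the example.

```python
from collections import Counter

# Invented query log for illustration only -- in reality this would be
# aggregated from the searches of millions of users.
past_queries = [
    "microsoft", "microsoft", "microsoft", "microphone",
    "microscope", "microsoft office", "microwave",
]

def suggest(prefix, query_log, n=3):
    """Return the n most frequent past queries that start with prefix."""
    counts = Counter(q for q in query_log if q.startswith(prefix))
    return [query for query, _ in counts.most_common(n)]

print(suggest("micro", past_queries))
```

The point of the sketch is that the suggestions fall out of aggregate user behavior: "microsoft" ranks first simply because it appears most often in the log, with no editorial judgment anywhere in the loop. The court's question is whether designing such a loop, and deploying it without safeguards, still carries responsibility for what it emits.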

The normally insightful Techdirt agreed the decision was a bad one, saying that French courts are "once again confused about Google Suggest."

"At some point, you hope that courts and politicians will understand the basics of how technology works, but it seems like we may be waiting a long, long time," Techdirt writes.

But actually the French court understands what's going on. Google raised just those issues in its defense, and the court disagreed. "The court ruling took issue with that line of argument, stating that 'algorithms or software begin in the human mind before they are implemented,' and noting that Google presented no actual evidence that its search suggestions were generated solely from previous related searches, without human intervention," according to Computerworld.

Google's search engine is a machine, and its defense simply would not stand for the manufacturer of any other type of machine. If I build a car with faulty brakes, and you drive it into a tree and die, I can't then wash my hands of blame, claiming that it was the car that did it, but not me.

Of course, that analogy isn't quite apt, is it? Because in the case of the car, the brakes were faulty. But Google's search engine was working exactly as it was designed. The analogy is more like: I make a chainsaw, and you accidentally cut off your leg with it and sue me. Should I be held liable? Depends on whether the chainsaw was built to appropriate safety standards.

Is it appropriate for Google to build a search engine that automatically generates results with no intervention to be sure those results aren't libelous, defamatory, or otherwise harmful?

This is a problem that goes beyond people accused of crimes. Many companies are unhappy with the results that come up when you search on industry terms. If you make hats, and you're not on the first page of results for the word "hats," then you're dissatisfied with Google. Does that make Google wrong? Does it matter if your hats are, in fact, better and more popular than those of companies ranked higher?

And consider the "Dreaded Google Suggest 'Scam.'" For many companies, the word "scam" is one of the Google Autocomplete suggestions that comes up when you search on their company names. This is not because the companies are dirty, but simply because searchers want to find out whether there are any scams associated with the company, writes Andy Beal on Marketing Pilgrim. Searchers "add 'scam' to the company’s name, just to make sure. That then creates a self-fulfilling Google prophecy, with Google Suggest showing 'scam' and creating a reputation nightmare that doesn’t actually exist."

The French lawsuit is a test of whether Google should be held liable for the automated operations of its search engine, which depends on the aggregate decisions made by all the users of the Web. I'm not sure whether the French court was correct; indeed, as a journalist, I'm sympathetic to Google's arguments that search results are free expression, which should be a universal moral right. But this principle needs to be tested, and the court was not foolish to rule as it did.

Mitch Wagner is a freelance technology journalist and social media strategist.

Copyright © 2010 IDG Communications, Inc.
