Should search engines have a conscience?

The recent appearance of a racist image of First Lady Michelle Obama during a search on Google's search engine raises an interesting question: Should search engines have a conscience? Obviously, search engines like Google, Bing, and Yahoo!, which rely on highly complex algorithms to determine search results, have no intentional bias or inclinations that influence their searches. To suggest otherwise would be to anthropomorphize what is simply an immense and complicated set of computer code.

But that code is a creation of its individual developers and teams of developers, each of whom has a conscience. And it is also a product of the company for which they work. A company can't have a conscience, you might think. To the contrary, every company, because it is composed of individuals, has a conscience (it's called a corporate culture). We need look no further than Google itself, whose conscience is expressed in its motto: "Don't be evil." Given that powerful expression of the company's conscience, it seems reasonable to assume that Google would be offended by the racist image of Ms. Obama. And, to a degree, it's up to the developers and the company to make a deliberate decision about whether and how to exercise their conscience in the formulation of the search-engine algorithms. In the case of Google, a spokesman told CNN that "We have a bias toward free expression. ...That means that some ugly things will show up." Despite this assertion, the image of Ms. Obama was removed by Google, though apparently not, according to the company, because it was distasteful, but rather because the page might have had malware that put viewers' computers at risk.

This argument for freedom of expression - and the implicit acceptance that, in the case of search engines, such freedom means results both noble and base - doesn't mean that search engines aren't influenced by the very human attitudes, beliefs, and emotions of their developers. Developers, just like the rest of us, aren't immune to the influence of unconscious perceptions, such as reactions to offensive material, on the coding of those algorithms. Strong emotional reactions play a powerful role in attention, information processing, and decision making at both the conscious and unconscious levels, all processes that are ever present in the creation of search-engine algorithms. Though this implicit bias may be mitigated to some degree as the algorithms progress through the team of developers, it will likely not be eradicated, particularly if team members have similar sensibilities.

So we return to the question I started with: Should a search engine - by which I really mean its developers - exhibit a conscience? When anyone with a conscience sees offensive material on the Web, their initial reaction may be to remove that material. But we must remember that consciences vary based on upbringing, ethnicity, religion, politics, and a host of other influences. One person's disgust is another person's titillation. And one person's conscience is another person's censorship. We have seen this dichotomy played out in China and Iran. At least in America, if it's not illegal or hurtful, we as individuals should have the right to exercise our consciences as we see fit. In the case of search engines, if we find something objectionable during the course of a search, we can act on our conscience by clicking Close or perhaps by reporting it to the search-engine monitors.

The Google spokesman told CNN that, "We're always working to improve our algorithm to provide more relevant search results. We do not make editorial decisions based on our politics or anyone else's." I would suggest otherwise, though certainly not deliberately. Developers shouldn't fool themselves into thinking that they are dispassionately creating unbiased programs, such as search engines, that are purely the expression of endgame optimization. To the contrary, just as individual artists and writers, or teams of architects, project themselves into their work, so, assuredly, do developers and the companies for which they work. That doesn't make them bad developers. It just makes them human.

Copyright © 2009 IDG Communications, Inc.
