It has often been said that humans can make mistakes, but to truly wreak havoc, technology is necessary. Although true, a recent awkward situation with U.K. retailer Marks & Spencer — that the chain blamed on technology — really has the smell of not only a human, but a mischievous one at that.
What happened on the M&S site defies easy explanation without using words that Computerworld won't allow, but the screenshots in this story from The Standard make it all clear. The page displayed brightly colored letters, presumably to be used as holiday decorations. When someone searched for red letters (the most popular holiday color) and asked to display them by price (the most popular query limiter), the letters approximately spelled out an old Anglo-Saxon verb, commonly considered vulgar and not printable on this website, with the direct object "me" following on another line. The suggestion seemed to be that the shopper should engage in intimacy with ... well, I don't know, the retailer, I guess.
The chain's explanation? "This was due to the algorithms used to display products on our website. It was quickly spotted and corrected," the story quoted an unidentified M&S spokeswoman saying. The story then added, "It is understood products are not uploaded manually to the website, but that the page is populated directly from a database by a computer."
Wait a second. M&S is actually trying to say that software accidentally arranged these letters in this way?
Of course, saying that it "was due to the algorithms used to display products on our website" could be technically correct. What is not being addressed is whether a programmer deliberately coded the system to do this, for the programmer's amusement. If the chain is suggesting that this was just a software fluke, it has a credibility issue.
Let's assume that this was an employee or contractor prank, presumably from someone who wasn't worried about losing this gig. (Note to M&S: Have you opted to not renew any programming contracts lately, perhaps with a third party? That's where I'd start looking.) This raises a separate issue. Why wasn't this caught before it went live?
The answer is that the perpetrator was clever. The naughty message would be displayed only when someone performed a common search intersection: the most popular color combined with the most popular sort criterion. What supervisor is going to try multiple display combinations for a retail site with such a huge number of SKUs? This was a wonderful way to dodge approval checkers while still being seen by a huge number of shoppers.
But — and I grant you that this is highly unlikely — what if this really was an unintentional software glitch? If so, that presents a different challenge. On the one hand, a human would spot that message — and understand its amorous implications — immediately, whereas software typically wouldn't. But how many variations of every product display should a human check? Given that these were letters, it probably justified special effort. If you sell letters, you might want to add an extra layer of due diligence before pages go live. If nothing else, this M&S situation is going to give a lot of bored programmers some day-brightening ideas.
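That extra layer of due diligence doesn't have to be manual. Here is a minimal sketch of the idea: enumerate every filter-and-sort combination a shopper could request, render the letters in that order, and flag any listing that spells out a blocklisted word. The catalog, the `audit` function, and the blocklist entries are all hypothetical stand-ins, not M&S's actual data model; the example uses an innocuous word in place of the real one.

```python
from itertools import product

# Hypothetical catalog of letter ornaments: (letter, color, price).
# Prices are contrived so that sorting red letters by ascending
# price spells out the blocklisted word.
CATALOG = [
    ("D", "red", 3.00),
    ("U", "red", 4.00),
    ("C", "red", 5.00),
    ("K", "red", 6.00),
    ("S", "gold", 2.00),
]

# Stand-in entry for words a retailer would not want spelled out.
BLOCKLIST = {"DUCK"}

# The sort orders the storefront offers shoppers.
SORT_KEYS = {
    "price_asc": lambda item: item[2],
    "price_desc": lambda item: -item[2],
    "alpha": lambda item: item[0],
}

def audit(catalog, blocklist):
    """Render every color-filter x sort-order combination and flag
    any listing whose letters spell a blocklisted word."""
    colors = {color for _, color, _ in catalog}
    flagged = []
    for color, sort_name in product(sorted(colors), SORT_KEYS):
        listing = sorted(
            (item for item in catalog if item[1] == color),
            key=SORT_KEYS[sort_name],
        )
        spelled = "".join(letter for letter, _, _ in listing)
        if any(word in spelled for word in blocklist):
            flagged.append((color, sort_name, spelled))
    return flagged

print(audit(CATALOG, BLOCKLIST))  # flags ('red', 'price_asc', 'DUCK')
```

The combination count here is tiny (colors times sort orders), which is exactly why a machine can exhaustively check what no supervisor would ever click through by hand.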
This article is published as part of the IDG Contributor Network.