Kudos to my Computerworld colleagues who chose to ignore that survey 'showing' Internet Explorer users have lower IQs than users of other browsers. That includes a shout-out to Gregg Keizer, who covers browser issues, as well as our crack news editors. And sorry, but I've got to say I'm feeling a bit smug about concluding that the study was flawed - even if my suspicions turned out to be the wrong ones.
I'll admit it didn't occur to me that the whole thing was made up. But the way they *said* they conducted the survey didn't sound like a reliable random sample. It just didn't look like statistically valid conclusions could be drawn from their numbers; the claimed methodology seemed shaky.
In fact, I almost posted a snide comment on one report about the study, saying that perhaps "stupid" was less about which browser you use and more about failing to apply critical thought when evaluating data. I held off, deciding I didn't want to call anyone stupid -- even if I felt strongly that they should be a whole lot better schooled in data analysis before jumping to conclusions.
The bogus press release said the data came from more than 100,000 free online IQ tests given to people who were searching for such tests. But that hardly seemed like a statistically valid random sample -- there could be good reasons why more intelligent but less technically sophisticated IE users weren't trolling online to take a free IQ test. Not to mention the problem of trusting people to report their correct age (kids could have claimed to be over 16) -- or whether the results of those free online IQ tests were even accurate.
And, frankly, I found the press release headline -- "Is Internet Explorer For the Dumb?" -- offensive.
I did mention my problems with the study's "methodology" in a Google+ comment thread. Naturally, someone responded by implying that my real issue was probably that I'm an Internet Explorer user. (If you're wondering, my default browser at home is Chrome and at work it's Firefox -- but many intelligent people I know use IE.)
As it turns out, my concerns weren't the true problem with the survey. Critical thought needs to be applied first to whether the numbers are real at all -- before worrying about what conclusions can be drawn from them.
So, kudos to the BBC for digging into the study -- although it would have been more impressive had it done so before reporting the results. (Apparently, readers commenting on its initial story raised the concerns that led to the unmasking of the hoax.)
The real lesson here ends up having nothing to do with the comparative intelligence of browser users. What this "study" shows is how quick people are to believe what they want to believe -- whether because it fits their technology stereotypes or feeds their eagerness to share a juicy tidbit online. Not to mention how desperately we as a society -- and as a journalism profession -- need critical thought, statistical skills and skepticism.
Sharon Machlis is online managing editor at Computerworld. Her e-mail address is email@example.com. You can follow her on Twitter @sharon000, on Facebook, on Google+ or by subscribing to her RSS feeds: articles | blogs.