"You lot aren't programmers. You're just coders."
His name was Hugh and he was an IBM salesman. That pithy quote, his first name and his shiny double-breasted suit are about all I can remember of him. I was fresh-faced and two months or so into a job at IBM, a firm I thought embodied the pinnacle of computing prowess. In the middle of a PL/I course, the bosses had decided to inject some "reality" by having a senior member of the company's sales force deliver a lecture. He was good, I'll give him that. A full four years before Alec Baldwin's great performance as the hard-nosed "motivator" Blake in the film Glengarry Glen Ross, Hugh was there laying it on the line to some spotty kids straight out of school. Unlike Blake, Hugh had no need to ask me derisively what I drove: I was 17 and didn't have a license.
"What's this?" he said, holding up a picture of what I thought was a freezer. Blank looks from everyone in the room. None of us had ever seen an AS/400 before. We didn't know it, but this nondescript hunk of metal was about to swell Hugh's pockets with gargantuan commissions for the next 10 years. A point he was all too keen on reminding us of -- when he wasn't dissing our profession, that is:
"Computers have so much memory, you can do whatever you like. There's no skill to it anymore."
In those days, 10 megs of RAM was considered opulent, but he was right. The generation before me was probably the last for whom memory management in application development was as crucial as one's algorithms. Using a full integer to hold a Boolean value wasn't just lazy; on machines that tight, enough of them could blow your stack. Keeping your code tight and lean was a source of pride as well as a practical necessity, and it had the desirable side effect of enforcing discipline and keeping the size of your executables down too.
Over the past 15 years, though, we've seen Hugh's maxim taken to ridiculous levels. I'm talking about bloatware. It's basically software with two main features: a whopping, disk-munching footprint, and a ratio of features to features-actually-used-by-normal-people that's well over 2:1. It probably passes your typical end user by -- until the crashes and hangs come, that is. Code gorged with unneeded options equals more scope for bugs, which equals more problems: It's the bane of all our lives, and programmers the world over, including ex-coders like me, put up with it.
Sometimes the bloat is forced upon us: iTunes plus resources comes to a hefty 80MB on disk, but unless you inhabit the shady world of cracked iPhones, you can't speak properly to your device without it. Meanwhile, unasked-for fatty deposits -- Genius feature, anyone? -- build up year-on-year, clogging up our computer arteries under the dubious guise of "essential software updates."
It was software guru Peter van der Linden who said the only people to get rich out of the Internet were disk manufacturers. He had a point. As memory and disk sizes increased, it became gradually more and more old-fashioned and a little bit passé to worry about your RAM and disk usage: tight and lean became spread and preen, and as application developers expanded to fill the space left by hardware's incredible progress, the rot set in with the systems programmers too. An unquestioning rest of the world mostly went along with it. I recall a nifty server redundancy application sold by Novell some years ago that came on a single floppy, manual included. A disgruntled customer called: How could the software be worth the price charged, seeing as it came on just one 3.5-in. disk?
Part of the problem is the programming technology. Twenty years ago, you wrote a program, metaphorically speaking, close to the metal. C programmers could manipulate data with great dexterity most of the time, and for those really tricky bits, you could always drop into assembler and delve right into the hardware. With today's Web-based applications and fancy coding libraries -- themselves prone to bloat -- you're writing several levels removed from the machine, and the opportunity to develop tight code, even if you wanted to, just ain't there. The whizz kids in the software houses of Bangalore could probably render a 3-D Flash image in their sleep but wouldn't know a JNE instruction if it came up and bit them.
Don't get me wrong, I'm not anti-progress. There's no way Windows 7 could be as lean as 3.11 and still take advantage of the fabulous advances in hardware we've seen over the last decade. But if you're telling me a glorified MP3 player has to be 80 megs in size, then I say enough!
I'm beginning to sense I'm not alone
Google announced a new operating system called Chrome OS just a few weeks ago. There were a lot of headlines about how the company was setting the stage for a battle with Microsoft, and much speculation about when this operating system would be ready, but what caught my eye was the bit about how it would start up and get you onto the Web in a few seconds. Nice, I thought, but I'll believe it when I see it. Since then, I've had a bit of a rethink. Over the past few weeks, it's become apparent that there's a confluence of events, and it's been building for quite some time. Fans of the film Highlander might want to call it a Quickening, but the cut here is all about lard-laden software and nothing to do with Sean Connery's head.
It started off with people within the geek community grumbling about boot-up times. But at some point in the past few years, nontechnical users -- the vast majority of the world's computer owners -- started to notice too, and what was an imperceptible malaise among us techies at some point morphed into general dissatisfaction within the civilian population. I can't remember the first time I heard my mum say, "It takes so long to boot up," but I wish I'd noted it as the exact moment when supine acceptance of poor performance became an out-and-out complaint. Google noticed the grumbling last year when it released the Chrome browser (42 megs on disk from a fresh install, incidentally; must do better), which promised a more lightweight approach, but the signs are all around us.
Take Apple's iPhone apps. There's no doubt the company's encouragement of development for the device has added to its popularity, but a side-effect of this little software cottage industry has been to subtly reintroduce a concept that has been lost for over a decade: buying software to do a specific job. Now, I don't want to over-egg the pudding here -- the top-selling app in December last year was a game that lets you shoot tanks. But whether you want to call it "install-on-demand" or "apps-on-tap," the future here is modular, lean and functionally targeted. You want to find out what's on when at your cinema? You download an app called Cinema Times to do it. Simple, eh? What you don't do is download Apple Entertainment Finder V7.2: New York Edition with Ballet Add-on and Service Pack 3. The age of bloat is withering. When I want to know where opera's playing, I'll download iVerdi for $1.99, and not before.
User habits and expectations are changing. The name of the game now is instant, or at least minimally delayed, gratification. We can see it too in e-mail habits. With social-networking sites, I don't have to start up my e-mail client, find an old message from the person I want to communicate with, copy and paste their address from there into a new e-mail and then start writing. I just hit "send a message" in my almost-certain-to-be-already-running browser, and it's there. I'd say in the last year I've transferred 10% of my e-mail traffic to Facebook. Privacy issues aside, as our workspace moves from desktop-bound, heavyweight apps to browsers and handheld devices, it's a number that's only going to rise. Even the hardware manufacturers have noticed. You can get motherboards now with nifty pre-boot features that let you get onto the Web using a mini-Linux stack, so that you're messaging your pals before Windows even boots up. I can only imagine the palpitations when Steve Ballmer heard of that.
Of course, if you're one of those people with an Intel Core i7 chip and an Internet connection so fast your local college borrows bandwidth, then this whole article will generate a giant "Huh?" Lucky you, but that's not what the world is running. People want to spend $600 on a laptop -- they expect it to work and, increasingly, will not stand for software that fulfills the programmer's desire to show off instead of meeting their requirements.
The big corporations need to respond faster. It doesn't matter if your software is an all-singing, all-dancing cure for the world's ills; customers now know that anything that comes on more than one DVD or takes more than 50 seconds to download is not necessarily going to be healthy for their systems. The forthcoming battle is a mouth-watering prospect. If Chrome OS ends up booting in under five seconds, then I want Windows 8 to do it in less than three, although I suspect Windows 7 is the last release we'll ever see of this monolithic behemoth of bloat. People don't want Windows anymore; they want light and breezy Mediterranean-style shutters they can push open with a finger. If Redmond won't deliver, then someone else will.
Hugh's generous pay may well have become a humungous pension by now, but as he watches from his villa in Barbados, let's show him that programming as a discipline is returning and disrobing itself Clark Kent-style of any negative connotations it may once have had. Let's bring back the cycle-shavers, those geeks who used to agonize for hours over particular instructions depending on how many clock cycles they burned. Let's get back in the gym and re-emerge leaner, fitter and ready to take on these rightly dissatisfied users. The online world's population is multiplying, and the natives are restless.
Paul Coletti is a London-based IT consultant and blogger who has worked in IT for 16 years at IBM, Novell Inc. and other companies. The opinions expressed in this column are his alone and in no way reflect the views of Novell or any other entity.