Why we're hard-wired to ignore Moore's Law

Moore's Law rarely influences technology decisions beyond the realm of chip vendors

When Gordon Moore made his prediction in a 1965 issue of Electronics magazine that the number of transistors on a chip would double every year (eventually revised by Moore to two years and later by Intel Corp. to 18 months), it was just a "lucky guess" based on a few points of data, he recalled in a 2006 interview. But the idea, which has grown to encompass ever cheaper, ever smaller, ever more powerful components, has so captivated the IT industry that you can't attend a technology conference without seeing at least one PowerPoint presentation displaying the Moore's Law graph.
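
Why does the doubling period matter? Because the difference compounds. As a rough, purely illustrative sketch (the starting count of 1,000 transistors and the five-year horizon are arbitrary assumptions, not figures from Moore), a few lines of Python show how far apart the one-year, 18-month and two-year versions of the law land:

    # Illustrative sketch: project a component count under different assumed
    # doubling periods -- Moore's original one-year and revised two-year
    # figures, plus the 18-month figure often attributed to Intel.
    def projected_count(base_count, years, doubling_period_years):
        """Projected count after `years` of steady doubling."""
        return base_count * 2 ** (years / doubling_period_years)

    for period in (1.0, 1.5, 2.0):
        count = projected_count(base_count=1_000, years=5, doubling_period_years=period)
        print(f"doubling every {period} years: ~{count:,.0f} transistors after 5 years")

After five years, the one-year assumption yields roughly 32,000 transistors from a starting count of 1,000, the two-year assumption fewer than 6,000.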

By virtue of its ubiquity, you might think that Moore's Law actually influences technology decisions beyond the realm of chip vendors. But the truth is, few enterprise IT shops appear to apply it to their planning. Could this be a mistake? If you know that hardware is bound to get smaller, cheaper and faster, can you somehow turn that into a competitive advantage for your company? And if you ignore it, does that carry an untold cost?

"Almost never do people look at processor power or storage capabilities and cost trade-offs and decide, 'What does this mean to us in three to five years?'" says Thomas Moran, systems analyst principal at consulting firm Alion Science and Technology Corp. in Annapolis Junction, Md. "How does that impact our technology refresh cycle? How does it impact training and staffing?"

Moran advises state and federal agencies on risk management and disaster recovery. He believes that by applying the predictability of Moore's Law to their planning, they could better anticipate when it is time to move to newer technologies that would be less expensive and provide better performance. As an example, he recalls a government office that had decided to maintain legacy mainframe operations in multiple data centers. As a result, he says, operational and maintenance costs have mushroomed. "They're hostage to something that has defied Moore's Law," Moran says.

In such situations, Moran says, decision-makers forget to look at the broader picture. "It's not just that you've got more CPU cycles or storage -- it's that [Moore's Law] has enabled disciplines in other areas that impact you directly." The question his clients are constantly asking is, "What should I invest in?" Pointing to the mainframe decision, he concludes that "even safe bets often end up being problematic."

The hassle of migration

James Damoulakis, chief technology officer at GlassHouse Technologies Inc., which provides data center infrastructure services, isn't surprised when companies ignore the realities of Moore's Law, since its implications can be both a help and a hindrance to planning.

On the plus side are the technology advances that have resulted from the increasing density of transistors. Green computing initiatives and server virtualization are two examples, he says. "If you can get the same or more computing capabilities within a smaller footprint, there's even greater incentive now to move to new things," Damoulakis says.

The pain comes from keeping up. "Data migration can be a challenge," says Damoulakis. "And [that] impacts all sorts of aspects of day-to-day operations."

Although he considers most enterprise IT people savvy from a technical standpoint, he finds them less advanced in understanding how to scope out the transition process. "They know 8Gbit/sec. Fibre Channel is coming along and 10Gbit/sec. Ethernet, and what the density of drive capacities is, for example, in the storage realm," Damoulakis says. "The challenge is being in a position to react and quickly transition from the old to the new when they're ready to do it."

Damoulakis says he believes that most people who ignore Moore's Law do so because its rewards -- at least in terms of storage -- seem invisible. While in theory total storage costs should come down as a benefit of the law in action, in practice costs stay the same or even rise in most environments, he says. The reason: Data-retention rates are outstripping Moore's Law. "Because you can keep more data, you are," Damoulakis points out.
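
Damoulakis's point is easy to see with rough numbers. If, say, cost per gigabyte halves every 18 months while the data a company retains doubles every year, the total storage bill still climbs. The rates in the short Python sketch below are assumptions chosen for illustration, not figures from GlassHouse:

    # Assumed rates for illustration only: cost per GB halves every 18 months,
    # while retained data doubles every 12 months. Retention outruns the
    # cost decline, so total spend keeps rising.
    def total_storage_spend(year, starting_gb=1_000, starting_cost_per_gb=1.0):
        data_gb = starting_gb * 2 ** (year / 1.0)                  # data doubles yearly
        cost_per_gb = starting_cost_per_gb * 0.5 ** (year / 1.5)   # cost halves every 18 months
        return data_gb * cost_per_gb

    for year in range(6):
        print(f"year {year}: total spend ${total_storage_spend(year):,.0f}")

Under those assumptions, spend grows by roughly a quarter each year even as the unit price of storage falls.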

The key issue, Damoulakis says, is maintaining a balance between data center components. And that's hardly the job of enterprise IT purchasers; it belongs in the realm of the technology makers. When memory was expensive, systems were designed to make efficient use of it. When memory costs came down, the focus on efficiency dissipated. "You could almost argue that the drive toward server virtualization is because of excess computing power that exists and trying to find ways to leverage it more effectively," he says.

Through it all, Moore's Law has remained a steady guidance system foretelling the arrival of new technical capabilities -- whatever those might be. "There have been breakthroughs in so many points of time," Damoulakis says. "I think there's something kind of miraculous in all of it."

Building up, not in

Gregory Wong, an analyst at Forward Insights, which monitors semiconductor memory, agrees. "Every time we say there's a wall we're going to hit, it seems like [manufacturers] are able to rise up to the challenge and overcome those barriers," he says. As an example, Wong points to flash memory, which will one day use components as small as 32 nanometers. He wonders if that will be the wall. "Some people are saying it'll extend to 22 nanometers. Even if it goes to 22, then somebody will say, 'When we go to 15nm, we'll hit a wall.'"

The technical obstacles have to do with lithography -- the process of writing the circuits onto the wafer. "As these dimensions get smaller and smaller, it becomes harder and harder to make a transistor," Wong explains. The components become less reliable. New ways of building chips have begun to come to the forefront, including multicore chips that perform symmetric multiprocessing and memory arrays that build up in layers instead of inward.

That may matter to chip vendors and systems integrators, Wong points out, but not to enterprise users. "If I'm an IT manager, I'm going to make plans for my IT department based on the strategy or needs of the company. 'I don't have enough space in the facility. Maybe I should move to blade servers, which is more space-efficient.'"

Doug Mechaber, a computer consultant in Southern California, fits into that category. He says that rather than any direct application of Moore's Law at the companies where he has worked, budget and immediate growth plans dictate purchase decisions for components such as servers. "Sometimes this results in underpowered servers, sometimes overpowered," Mechaber says.

He prefers to plan for "do-overs," by choosing servers that allow for expansion of memory and CPUs. "I find the lifetime of most servers purchased this way is two 'Moore times' -- about three years," Mechaber says.
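
The arithmetic behind "two Moore times" is simple: at the commonly cited 18-month doubling period, two doublings take three years, over which raw capability roughly quadruples. A minimal sketch of that calculation (the 18-month figure is the assumption):

    # Two "Moore times" at an assumed 18-month doubling period.
    doubling_period_years = 1.5
    server_lifetime_years = 2 * doubling_period_years                     # two doublings = 3 years
    growth_factor = 2 ** (server_lifetime_years / doubling_period_years)  # capability multiple
    print(f"{server_lifetime_years} years, {growth_factor:.0f}x capability")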

We're hard-wired to be linear thinkers

Futurist, inventor and author Ray Kurzweil claims that Moore's Law will always be a difficult concept for technology users to comprehend. That's because while it addresses the exponential growth of technology, we humans tend to be linear thinkers.

"Our intelligence is hard-wired to be linear because that served our needs as our brains evolved, when we were walking through the savanna 10,000 years ago," he explains. "We saw an animal coming at us out of corner of our eye, and we'd make a prediction of where that animal would be in 20 seconds. That served our needs quite well. We have a linear predictor built into our brain -- that's our intuition."

Even scientists, Kurzweil says, rely on predictive intuition, which follows a linear path. "They have an idea of what's been accomplished in the past year," he says. "And then they think about a task: 'Well, that's about 10 times as hard. It'll take about 10 years.' That's the intuition." As a result, predictions tend to be overly pessimistic or conservative, according to Kurzweil.
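
How wide does the gap between linear intuition and exponential reality get? A minimal sketch, assuming an 18-month doubling period, compares a straight-line extrapolation of the first year's progress with the actual exponential curve:

    # Linear intuition vs. exponential growth, assuming an 18-month doubling.
    DOUBLING_YEARS = 1.5

    def exponential(year):
        return 2 ** (year / DOUBLING_YEARS)

    first_year_gain = exponential(1) - exponential(0)    # progress observed in year one

    for year in (1, 5, 10):
        linear_guess = 1 + first_year_gain * year         # straight-line extrapolation
        actual = exponential(year)
        print(f"year {year:2d}: linear guess {linear_guess:5.1f}x vs. actual {actual:7.1f}x")

By year 10 the straight-line guess is under 7x while the exponential curve has passed 100x -- the kind of miss Kurzweil describes.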

Even if we were better at exponential thinking, says technology advisor and SanDiego.com CEO Mark Burgess, Moore's Law is a lousy way to handle any kind of planning. "Applying Moore's Law as a planning tool in IT is a little like comparing aging to gathering wisdom," he says. "Because technology changes, [it] doesn't mean the rest of the systems and people around them can, will, should or want to change." The fastest way to slow down an office, he believes, is to upgrade it.

His advice: Forget linear growth; forget exponential growth; forget Moore's Law. Pattern decision-making after the ascending spiral model of history, "where we cover the same ground ... with small changes that move us forward." When new technologies hit, he suggests, "make sure you get 'one' as soon as anyone says they had success with it so you can start the process of figuring out where it fits."

Dian Schaffhauser is a writer who covers technology and business for a number of print and online publications. Contact her at dian@dischaffhauser.com.

Copyright © 2008 IDG Communications, Inc.
