When Gordon Moore predicted in a 1965 issue of Electronics magazine that the number of transistors on a chip would double every year (a figure Moore later revised to every two years, and Intel later revised again to every 18 months), it was just a "lucky guess" based on a few points of data, he recalled in a 2006 interview.
But the idea, which has grown to encompass ever cheaper, ever smaller, ever more powerful components, has so captivated the IT industry that you can't attend a technology conference without seeing at least one PowerPoint presentation displaying the Moore's Law graph.
Given its ubiquity, you might assume that Moore's Law influences technology decisions well beyond the realm of chip vendors. In truth, few enterprise IT shops appear to apply it to their planning. Could this be a mistake? If you know that hardware is bound to get smaller, cheaper and faster, can you somehow turn that to competitive advantage for your company? And if you ignore it, is there a hidden cost?
"Almost never do people look at processor power or storage capabilities and cost trade-offs and decide, 'What does this mean to us in three to five years?'" says Thomas Moran, systems analyst principal at consulting firm Alion Science and Technology in Annapolis Junction, Md. "How does that impact our technology refresh cycle? How does it impact training and staffing?"