A few years ago, the idea of a grid – a kind of computing utility, possibly geographically distributed, that provided digital power on demand – was a hot topic. One of the top researchers in this area was (and still is) Ian Foster, and he's been meditating on the relationship between grids and their more recent avatar, cloud computing:
So is “cloud computing” just a new name for grid? In information technology, where technology scales by an order of magnitude, and in the process reinvents itself, every five years, there is no straightforward answer to such questions.
Yes: the vision is the same—to reduce the cost of computing, increase reliability, and increase flexibility by transforming computers from something that we buy and operate ourselves to something that is operated by a third party.
But no: things are different now than they were 10 years ago. We have a new need to analyze massive data, thus motivating greatly increased demand for computing. Having realized the benefits of moving from mainframes to commodity clusters, we find that those clusters are darn expensive to operate. We have low-cost virtualization. And, above all, we have multiple billions of dollars being spent by the likes of Amazon, Google, and Microsoft to create real commercial grids containing hundreds of thousands of computers. The prospect of needing only a credit card to get on-demand access to 100,000+ computers in tens of data centers distributed throughout the world—resources that can be applied to problems with massive, potentially distributed data—is exciting! So we’re operating at a different scale, and operating at these new, more massive scales can demand fundamentally different approaches to tackling problems. It also enables—indeed is often only applicable to—entirely new problems.
Here are his thoughts on what the future holds:
I will hazard a few predictions, based on my belief that the economics of computing will look more and more like those of energy. Neither the energy nor the computing grids of tomorrow will look like yesterday’s electric power grid. Both will move towards a mix of microproduction and large utilities, with increasing numbers of small-scale producers (wind, solar, biomass, etc., for energy; for computing, local clusters and embedded processors—in shoes and walls?) co-existing with large-scale regional producers, and load being distributed among them dynamically. Yes, I know that computing isn’t really like electricity, but I do believe that we will nevertheless see parallel evolution, driven by similar forces.
Getting the right mix between computing power drawn from the cloud via regional utility services and local microproduction – whether by the company itself or by its close neighbours – is likely to become an important task for the IT department. This is probably just as well, since the progressive “cloudification” of hardware and the commoditisation of software – not least by open source – will in any case make many of the traditional roles of the IT department obsolete.