About Author
Glyn Moody

Glyn Moody's look at all levels of the enterprise open source stack. The blog will look at the organisations that are embracing open source, old and new alike (start-ups welcome), and the communities of users and developers that have formed around them (or not, as the case may be).

GNU/Linux *Does* Scale – and How


As everyone knows, GNU/Linux grew up as a project to create a completely free alternative to Unix. Key parts were written by Richard Stallman while living the archetypal hacker's life at and around MIT, and by Linus Torvalds – in his bedroom. Against that background, it's no wonder that one of Microsoft's approaches to attacking GNU/Linux has been to dismiss it on technical grounds: after all, such a rag-bag of code written by long-haired hippies and near-teenagers could hardly be compared with the product of decades of serious, top-down planning by some of the best coding professionals money can buy, could it?

And thus was born the “Linux does not scale” meme – the idea that, yes, this stuff is free, but you get what you pay for: code that no enterprise could take seriously. Unfortunately for that narrative, GNU/Linux is not only able to scale rather well, but able to do it in perhaps the most demanding of environments – that of supercomputing.

Ten years ago, GNU/Linux had 10% of that market, according to the Top500 Supercomputers site, with Unix holding a pretty solid 85%. Five years ago, those numbers had nearly switched, with GNU/Linux holding 63%, and Unix 31%, and Windows running in splendid isolation on just one machine. A year ago, Windows had managed to crank that up by a massive 400% – to five machines; meanwhile, GNU/Linux was on 88% and Unix down to 4%.

Now we have the latest figures, and they're pretty stunning. Windows is *still* on just 1% of the top 500 systems; Unix is still on 4%; but GNU/Linux has now soared to 91%. Even I doubted there was much room for improvement here, so this further gain in market share is a truly stellar result.

Microsoft's dismal showing is also pretty extraordinary when you consider the resources that it has poured into this sector with its Windows Server HPC 2008 platform (great name, BTW, people...). You can judge the desperation of the team by the case study they offer on the site – one going back to 2008:

To meet the new and expanding needs of its academic and industrial users, the National Center for Supercomputing Applications (NCSA) at the University of Illinois must support the platforms with which those users are familiar, which means offering more than just Linux-based high-performance computing (HPC) resources. The NCSA achieved that goal by deploying Windows HPC Server 2008 on the center's 1,200-node HPC cluster. Windows HPC Server 2008, with its familiar tools and interfaces, helps to make HPC more accessible to mainstream users.

In other words, unable to compete on price or performance, the only thing that Microsoft can fall back on in this sector is the fact that some customers are locked into “platforms with which those users are familiar” – Windows.

All in all, then, the world of supercomputing shows not only that GNU/Linux *can* scale, but that it does so rather better than Windows, as proved by the crushing margin it holds over Microsoft's unloved offering in this market.

Follow me @glynmoody on Twitter or
