"Through 2013, management tools, processes, expertise, and services (not functionality) will remain the key limiting factor in user adoption of IT virtualisation." That's the conclusion of consulting firm Saugatuck Technology in its recently published report, 'The Many Faces of Virtualisation: Understanding a New IT Reality'.

Part of that finding meshes with what CIO found in our recent survey of nearly 300 IT leaders on virtualisation in the enterprise. Management tools and a lack of staff expertise in virtualisation weigh heavily on your minds right now, according to our survey results.

But Saugatuck leaves one gnarly problem out. Internal politics will also remain a top challenge for enterprises implementing virtualisation. IT leaders and their staffs say it's tough to get IT gurus to cede physical turf, play nicely together and collaborate in new ways.

Here's a look at some other interesting stats from the Saugatuck report regarding virtualisation, and my take on them:

Virtualisation spreads its reach

By 2010, at least 30% of non-desktop IT infrastructure will be virtualised (up from less than 5% in 2007). Server and application virtualisation lead the way, Saugatuck predicts.

My take: This figure would be more interesting if it also included desktops, the infrastructure piece that still costs so many IT groups too much money and time.

A big three scenario?

On the marketing side, Saugatuck predicts that through 2010, a trio of heavyweights, Cisco, VMware and Citrix/XenSource, will dominate IT virtualisation deployments.

My take: Note who's missing here. Oracle. Sun. Microsoft.

When the chips are down

Given the impact that server and desktop virtualisation will have on server and PC deployments, chipmakers and hardware vendors will need to find new ways to appeal to IT. Already AMD is stressing chip functionality designed with virtualisation in mind; look for more work and marketing from chipmakers along these lines, Saugatuck predicts.

My take: Note what this will force chip and hardware vendors to do: Prove their promises with benchmarkable performance results. But benchmarks in this kind of environment will be tricky and perhaps less than useful to IT departments, since server virtualisation will vary wildly from enterprise to enterprise in terms of base hardware, hypervisor, number of VMs on a machine, types of applications running, etc.
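For readers wondering what "chip functionality designed with virtualisation in mind" looks like in practice: Intel's VT-x and AMD's AMD-V extensions show up as CPU flags on the server itself. Here's a minimal Python sketch, assuming a Linux host, that checks for those flags; treat it as an illustration of the concept rather than a capacity-planning tool.

    # Minimal sketch (assumes Linux): look for the CPU flags that signal
    # hardware virtualisation support -- "vmx" for Intel VT-x, "svm" for AMD-V.
    def has_hw_virtualisation(cpuinfo_path="/proc/cpuinfo"):
        with open(cpuinfo_path) as f:
            for line in f:
                if line.startswith("flags"):
                    flags = set(line.split(":", 1)[1].split())
                    return bool(flags & {"vmx", "svm"})
        return False

    if __name__ == "__main__":
        print("Hardware virtualisation extensions present:", has_hw_virtualisation())

Whether those extensions are present says nothing about how well a given hypervisor and workload mix will perform on the box, which is exactly why the benchmarking question above gets tricky.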

Open source stays a player

How will open source figure in the virtualised world? Through 2010, look for open source to play a role mostly via XenSource products and tools for management of virtualised infrastructures, Saugatuck says.

My take: Open source tools for virtualisation management? Maybe. Many CIOs have told me that they want to stick with their existing security and management players. But I'd be interested to see if clever open source tools from outside Citrix/Xen make a name for themselves. Given the timing, and how hungry users seem for management tools, this could be a great opportunity for someone.

A big budget role

"Through 2010, server virtualisation will have the single largest impact on budgets for IT hardware and support. The second largest impact will be network virtualisation," Saugatuck says.

My take: How could server virtualisation not have a huge impact on your IT budget? By the way, "network virtualisation" here refers to virtual private networks.