To make matters worse, they have also had to deal with archaic software licensing models that recent technology changes have made unmanageable.
However, it has not been all bad news. Some IT sourcing managers have had discretionary projects to commission and have found that these provided unprecedented negotiating leverage in 2009, as vendors of all sizes struggled to win new deals. This leverage may continue into 2010, or fade if business picks up, so buyers should use it now to fix dangerous gaps in their current agreements.
Many IT sourcing managers have found to their cost in 2009 that the most dangerous software licensing model is hardware-based pricing, including metrics such as ‘server’ and ‘processor’, because:
Rampant data inflation makes it a gravy train for vendors.
IT managers used to accept that they had to buy more software capacity as they added processing power, but restricted budgets have driven buyers to question this assumption. Enterprises need more computing capacity each year, even if the number of users is flat or shrinking, and this has provided steady licence revenue for software publishers. The latter have kept the ‘Processor’ or ‘CPU’ name but surreptitiously changed their policies to include multipliers based on the quantity and speed of each chip’s cores, to ensure that they, not the customer, benefit most from improvements in processor technology.
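The effect of those per-core multipliers can be sketched with some simple arithmetic. The core factors and server configurations below are invented for illustration only; they are not any vendor's published table.

```python
import math

# Hypothetical licence units required per core, by chip family.
# These factors are assumptions for this sketch, not real vendor policy.
CORE_FACTOR = {
    "legacy_single_core": 1.0,
    "modern_multicore": 0.5,
}

def licences_required(sockets: int, cores_per_socket: int, family: str) -> int:
    """Total 'Processor' licence units = sockets * cores * per-core factor,
    rounded up to a whole licence."""
    return math.ceil(sockets * cores_per_socket * CORE_FACTOR[family])

# An older server: 2 sockets, 1 core each -> 2 licences.
old_server = licences_required(2, 1, "legacy_single_core")

# Its like-for-like replacement: 2 sockets, 8 cores each at a 0.5 factor
# -> 8 licences, a fourfold increase even though the user count is unchanged.
new_server = licences_required(2, 8, "modern_multicore")

print(old_server, new_server)  # 2 8
```

The point of the sketch is that even a "discounted" per-core factor still lets the licence count grow with Moore's law rather than with actual usage.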
It cannot be applied fairly to virtualised data centres.
IT sourcing managers cannot safely ignore this game-changing technology. Forrester continues to see enterprises facing multi-million-dollar bills because a compliance audit has found them to be significantly under-licensed. The software company enforces the letter of the contract, which still limits deployment in terms of physical assets.
This can mean the customer has to buy licences for all its servers, even if only a subset is supporting that vendor's products at any one point in time.
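A short sketch shows why physical-asset clauses bite in a virtualised cluster. The host names, core counts, and per-core pricing below are hypothetical, chosen only to illustrate the gap an audit can expose.

```python
# In a virtualised cluster, VMs can migrate to any host, so a strict
# physical-asset clause treats every host as a potential deployment target.
cluster_hosts = ["esx01", "esx02", "esx03", "esx04"]

# Snapshot at audit time: the product is actually running on one host.
hosts_running_product_now = {"esx01"}

cores_per_host = 16      # assumed hardware configuration
licences_per_core = 1    # assumed licensing metric

# What the customer expected to pay for: only where the product runs today.
expected = len(hosts_running_product_now) * cores_per_host * licences_per_core

# What the letter of the contract demands: every host the workload *could*
# migrate to must be licensed, i.e. the whole cluster.
contract_requires = len(cluster_hosts) * cores_per_host * licences_per_core

print(expected, contract_requires)  # 16 64
```

Under these assumed figures the audit finding is a fourfold shortfall, which is how seemingly compliant enterprises end up with the large unexpected bills described above.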