Government departments are, among other things, rationalising software architectures to reduce costs. They should be aware that IT cost savings can be outweighed by business capability rigidities.
I recently had a conversation about a government agency that is pursuing a software asset management strategy built on rationalising around three platforms (I am using the term loosely here for vendor, brand, product, suite of products, etc.):
- One platform for middleware, including things like integration and application development;
- One for back-office, ERP-type applications;
- One for collaborative tools.
That reminded me of the UK's HMRC, which a couple of years ago embarked on what it called the "thirteen machines" strategy. In that case the objective was less ambitious (standardising on thirteen software assets rather than three), but the concept was very similar. And since HMRC is much bigger than the agency I am referring to here, thirteen makes sense.
Does such a strategy make sense from an IT asset performance optimisation standpoint? Of course it does, particularly in a sector like government, which is plagued by fragmented legacy architectures.
Fewer software contracts to deal with mean more focused supplier relationship management skills, which are likely to translate into more efficient and effective purchasing processes and stronger bargaining power. Fewer platforms also mean that implementation skills (development, testing, deployment, integration, etc.) and operational skills are more focused, and thus more efficient and effective.
It is also likely that governance of the IT demand and supply dynamic becomes less erratic, as requests for developing new business capabilities or integrating existing ones will be contained within the boundaries of those specific platforms. Similarly, sourcing some software management to external suppliers will be more easily structured through proper application development and maintenance contracts, outsourcing, or cloud, rather than frantically body-shopping resources to develop, test, integrate, and secure the software flavour of the day.
But is all that glitters gold? Of course not. In the long term, such a software asset rationalisation strategy, if pursued blindly, may not be sustainable. It is certainly necessary for the few government departments that spend 10-15% or more of their budget on ICT (HMRC was a case in point when it started on the journey), because that means they have ample room to exert tighter control over their software asset inventory. But only a handful of government departments and agencies spend that much on ICT.
For the majority of government agencies, ICT accounts for 5% or less of total expenditure. Most importantly, with third-platform technologies (cloud, mobile, social, big data), ICT is having an increasingly big impact as an enabler and a trigger of business innovation: offering omni-channel experiences to citizens; enforcing better fiscal stewardship, for example through big data and analytics programs that fight fraud and reduce financial risk; and nudging citizen behaviour towards improved quality of life through gamification.
This is a world of doing NEW with less. It is a world where government CIOs can become the stewards of more agile tools that make business transactional and information capabilities more effective, more efficient, less risky and more innovative. In this context, tools must first of all support business capabilities, not be treated as stand-alone items that the CIO rationalises, replaces, or upgrades based purely on technical and IT cost considerations. Some variety of software tools, in some cases even different products and skills supporting similar business capabilities, will help. And even from a purely tactical standpoint, betting everything on a few tools is risky.
For starters, public procurement rules mandate retendering contracts at some point, and if a completely new software platform is brought in, migration (data migration, application interface migration, re-skilling, etc.) will be very costly. Pursuing software asset consolidation too rigidly is also risky because its costs can be very high: consider the need to rationalise a maze of different contracts, technical and business use case configurations, and software versions, even of the same product.
Finally, depending on fewer software vendors and service partners can backfire, particularly if they are large suppliers: a single government client may represent only a small share of a vendor's revenues, leaving limited room for the special requirements that deviate from price lists and standard configurations, yet it is precisely those requirements that can support innovative business capabilities. Government procurement centralisation programs are making an effort to fend off some of this risk by dealing with suppliers as one big client, but even there, too much standardisation is always around the corner.
Of course I am not suggesting that CIOs should place their trust in a chaotic architecture. Striking the balance between efficient IT architectures and flexibility in supporting business capabilities will be difficult. For example, it might be wise to rely on one consolidated platform for 80% of the core requirements supporting a specific set of business capabilities, and use a different platform (perhaps open source, or cloud-based) for the remaining use cases. Nurturing new skills in strategy and architecture, portfolio and program management, and sourcing management will also be necessary.
That balancing act is the price to pay in the new world of doing NEW with less, because government CIOs who take software asset consolidation strategies too far are bound to lose their grip on ICT, as end users start to buy SaaS, bring their own apps, and launch big data and analytics initiatives that quickly turn into shadow IT beyond the reach of the CIO's office.
Posted by Massimiliano Claps