Virtual modelling: A new way of making IT decisions

When it comes to re-engineering IT environments to save money or achieve best practice, a trial-and-error approach can be both complicated and costly

For several years now, business technology consultancy ImprovIT has been using a new concept: Virtual Modelling. This business tool uses ‘what if?’ scenarios to simulate real-world outcomes and identify optimum cost/quality balances, IT migration strategies and best sourcing options, without chopping and changing or disrupting ongoing operations.

The challenge

CIOs today are caught between a rock and a hard place: having to slash IT costs while retaining productivity and service quality, often due to government mandate. Of course, cost-cutting pressures are nothing new, and for many there is little blood left in the stone. The question now is: “How and where can we make further reductions without knee-capping the entire operation?”

There are plenty of cautionary tales about organisations axing staff and abandoning efficiency-enabling technology projects, only to discover that their actions have mortally wounded deliverables and reputation. The result: a panicked and costly rehiring and/or repurchasing exercise to redress the balance.

Finding the cost/quality balance

Wouldn’t it be great if you could work out the exact cost/productivity balance without the expense and disruption of making changes on a trial-and-error basis?

Virtual modelling creates scenarios based on real, current and accurate data mined from your own ICT operation, and uses them to predict real-world outcomes without impacting current operations.

However, it can only do this with available KPI data; if that data doesn’t already exist, it must be generated via benchmarking studies. As Lord Kelvin, the 19th-century physicist, once said: ‘If you cannot measure it, you cannot improve it.’

Measure it first

Once created, this baseline data makes it possible to compare performance against other public service (and commercial) entities of similar size and complexity, in terms of value for money, quality of service, best practice and competitive pricing.

Digging a bit deeper, you can also find out where your organisation stands in relation to best practice standards for staffing (quality and quantity), process complexity, outsourcers (scope & service levels) and IT governance.  

All of this information is then used to create ‘what if’ scenarios, typically dealing with areas such as: Cost/Price, Volumes, Staffing, Quality & Service Levels, Service Scope, Complexity, Project Efficiency and Process Maturity.  

Provided the model has been populated with well-researched data, the outcomes offer highly accurate indicators that can be used to make decisions about outsourcing, staffing, process re-engineering, cloud migration and more.
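To make the idea concrete, here is a minimal sketch (in Python) of how a ‘what if?’ scenario might be expressed in code. The parameter names, cost function and weightings are illustrative assumptions for this article, not ImprovIT’s actual model:

    from dataclasses import dataclass, replace

    @dataclass(frozen=True)
    class Scenario:
        staffing_fte: float    # full-time equivalents on the service
        service_level: float   # 0.0-1.0 quality/availability index
        complexity: float      # 1.0 = baseline; higher values add cost
        volume: float          # workload relative to baseline (1.0)

    def annual_cost(s: Scenario, cost_per_fte: float = 65_000.0) -> float:
        # Toy cost function: staff cost scaled by complexity and volume,
        # plus a premium for pushing service levels above 95%.
        service_premium = 1.0 + 2.0 * max(0.0, s.service_level - 0.95)
        return s.staffing_fte * cost_per_fte * s.complexity * s.volume * service_premium

    baseline = Scenario(staffing_fte=40, service_level=0.97, complexity=1.2, volume=1.0)
    what_if = replace(baseline, service_level=0.99, complexity=1.0)

    print(f"baseline cost: {annual_cost(baseline):,.0f}")
    print(f"what-if cost:  {annual_cost(what_if):,.0f}")

Even a toy model like this shows the core point: change one segment (here, raising service levels while reducing complexity) and the cost consequences for the whole picture fall out automatically, with no disruption to live operations.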

Building up the model

Below is ImprovIT’s own modelling system, designed to pinpoint the impact of one (or several) parameters upon all the others. For example: if I change ‘Service Quality’ (SLAs) and/or ‘Service Scope’, what effect will this have on ‘Cost’?

Or: if I reduce ‘Complexity’, what effect will this have on ‘Processes’? It also shows the changing balances of the whole picture when one or more parameters are altered. For example: if I want to increase ‘Volumes’ or ‘Service Quality’, what changes do I need to make to all the other segments, and how will this impact the enterprise as a whole?
[Figure: ImprovIT’s service/cost/complexity model]

So, to find the Goldilocks balance between IT cost and service quality, let’s start by feeding staffing metrics into the simulation model, given the high impact of staffing on cost. But this isn’t just a straightforward set of numbers: the model also has to allow for a range of ‘soft’ factors, such as varying levels of knowledge, skill sets and the specialist expertise that can make an individual or team difficult to replace.
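One simple way to picture those soft factors is to score each role for skill depth and rarity of expertise. The 1-5 scoring scheme and figures below are invented for illustration, not any real methodology:

    staff = [
        # (role, headcount, skill_depth 1-5, specialist_expertise 1-5)
        ("service desk", 12, 2, 1),
        ("network ops",   6, 4, 3),
        ("legacy COBOL",  2, 4, 5),
    ]

    def replacement_risk(skill_depth: int, expertise: int) -> float:
        # Deeper skills and rarer expertise make a role harder to replace.
        return (skill_depth * expertise) / 25.0  # normalised to 0..1

    for role, headcount, skill_depth, expertise in staff:
        risk = replacement_risk(skill_depth, expertise)
        print(f"{role:<13} headcount={headcount:>2}  replacement_risk={risk:.2f}")

On numbers like these, the small legacy team scores far higher replacement risk than the much larger service desk, which is exactly the kind of imbalance a headcount-only model would miss.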

Next, let’s look at complexity, typically the biggest contributor to an IT department’s spend after staffing. This covers everything from security and data confidentiality to high-availability requirements, legacy system integration and the number of nodes in the enterprise network.

Rule of thumb: the greater the complexity, the higher the cost. A virtual modelling analysis determines where simplifications can be made without jeopardising mission-critical services. Once it is established that these changes are advisable, modelling can also provide an accurate estimate of cost, timelines and the impact on staffing and service levels.
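Under that rule of thumb, a back-of-the-envelope sketch might look like this; the complexity indices and spend figure are invented for illustration:

    baseline_spend = 4_000_000.0   # annual IT spend (invented figure)
    complexity_now = 1.35          # current complexity index
    complexity_after = 1.20        # index after, say, retiring a legacy integration

    # If cost scales linearly with the complexity index, the saving is:
    saving = baseline_spend * (1 - complexity_after / complexity_now)
    print(f"estimated annual saving: {saving:,.0f}")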

Then there is the question of outsourcing. Will it save money? What services should be outsourced? And if we are to outsource, what kind of service: a traditional provider or a cloud-based service? And what business model: IaaS, SaaS or PaaS? Data fed into a simulation model can provide an accurate estimate of the likely ROI and TCO, with timescales, for each option.
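A crude version of that comparison is cumulative TCO over a fixed timescale. All option names and figures in this sketch are invented assumptions, not benchmark data:

    options = {
        # name: (one-off migration cost, annual running cost) -- invented figures
        "in-house":    (0,       900_000),
        "traditional": (250_000, 780_000),
        "IaaS":        (180_000, 700_000),
        "SaaS":        (120_000, 650_000),
    }

    years = 5
    for name, (migration, annual) in options.items():
        tco = migration + annual * years
        print(f"{name:<12} {years}-year TCO: {tco:,.0f}")

A real model would of course add discounting, transition risk and service-level differences, but even this shape of calculation shows why the cheapest annual rate is not automatically the cheapest option once migration costs and timescales are included.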

Process maturity also impacts the cost/performance balance. There are industry standards that provide best practice guidelines, such as ITIL (the IT Infrastructure Library), ‘Agile’ and ‘Lean’ (a production practice that aims to reduce resource expenditure to the minimum required to deliver value to the customer).

Comparisons with these guidelines can indicate where improvements can be made, but virtual modelling can determine what they will cost and whether they are worth the disruption to operations. It’s also worth noting that achieving process maturity is rarely a quick win: it takes time and requires clear, unequivocal goals and plans led from the top.

G-Cloud migration

Public sector IT projects have a chequered history, and many ICT departments are wrestling with decisions about whether, when and how to migrate to the cloud, and how to optimise resources on an ever-diminishing budget. Using Virtual Modelling to run scenarios on all the available options provides new decision-making tools that help to identify the best roadmap ahead while avoiding wrong turns and dead-ends.

Posted by Robert Saxby, Consulting Director, ImprovIT

