The business world is frustrated. Before computers, the accounting department was the hub of every company, recording transactions, updating ledgers and providing statements of financial condition. The computer brought much needed automation to mundane, repetitive processes, such as payroll, bookkeeping and order taking, dramatically improving productivity.
Over the past 20 years, however, the global economy has boomed and the information stored within IT systems has become increasingly complex and difficult to manage. Data scattered across a myriad of systems, each using wildly different methods, drove the IT community to look for a solution beyond traditional reporting. Today, companies have spent (and continue to spend) vast sums on the supposed aspirin for this data headache: business intelligence (BI).
Business managers increasingly need to make faster, more strategic decisions based on the plethora of information delivered to them, so prioritising and organising data has become critical for many businesses. Further, the business insight that was merely useful during economic boom times can yield the competitive advantage needed to survive a worldwide recession.
With such heavy investment - Gartner predicts the BI market will hit $7.7 billion in 2012 - users' expectations are justifiably high. According to Gartner, organisations implementing BI principally expect it to speed up and improve organisational decision making, enabling them to respond quickly to user needs for data and to better align with, and track against, corporate strategy and objectives.
This comes in addition to the cost savings and efficiency improvements anticipated with most IT solutions. However, an Economist Intelligence Unit report revealed that a staggering 72 per cent of senior business and IT leaders said their data was still inconsistent across departments, while 40 per cent said their workers made poor decisions because of it. It's little wonder, then, that they are so frustrated.
So why has BI failed so spectacularly to live up to the hype? For this, we need to look at the lifecycle of a typical company. As a company grows, it begins to form separate departments dedicated to particular parts of the business, often significantly increasing the business’ complexity and creating silos of data - and information.
Separate business units may appear, usually operating autonomously - each making its own decisions and each with its own IT department. At some stage, information from these units is often brought back together to get a 'corporate' view - or the company is centralised entirely - but the mass of disparate information sources leads to major reporting problems.
Even basic reporting, on a weekly or monthly basis, becomes a struggle. The problem does not lie with the BI tools themselves, but rather with what's behind them: the outdated processes and systems used to marshal the information and serve it up to the BI tools for delivery to end users.
The simple fact of the matter is that these environments rely on IT tools designed to help build the environment efficiently - data modelling tools, ETL and data quality products - each working on its own small part of the data. Having a toolbox is great, but trying to use a hammer, screwdriver, saw, level and pliers to make a car run is a highly inefficient proposition.
With today’s level of complexity, IT needs new methodologies and technologies, not more tools. A tool used to build a system is not the same as a technology designed to manage change, the one constant in business. The pace of change is a major factor in the failure of BI.
Global organisations must respond swiftly to competitive, market and economic forces, adapting their working patterns, methodologies and products on an almost daily basis, and this constant disruption is something BI cannot handle. In a typical large enterprise, products are launched and retired, packaging sizes increase or decrease, and sales team regions and even national borders change.
By the time the BI team can adapt the BI tools and infrastructure to reflect a particular change, further changes have taken place, and most end users have reverted to their old friend, the spreadsheet, in an attempt to create meaningful business information.
As a result, the IT department is unable to give the business the information it needs in a timely manner, so it should come as little surprise that the business has minimal faith in BI and is often reluctant to give its buy-in to future projects.
The human element has also contributed to the disappointment in BI: traditional processes are highly manual and labour-intensive. Multiple steps are needed before BI tools can produce their reports, requiring a wide range of skill sets and processes spanning analysts, architects, ETL developers, BI developers and administrators.
With so many parties, and the ongoing disconnect between IT and business executives, precious time is wasted on the inevitable errors, leading to both development and testing delays.
So what can these organisations do to stop the rot of poor decisions and actually use the wealth of data to their advantage? Instead of examining what’s wrong with their BI tools, organisations need to take a step back and look at what's behind their BI tools. With a solid foundation in place that can deal with change and human error, BI tools will be able to process the information fed into them with significantly less effort.
Business managers need tools that help them do their jobs, not tools they have to work around to get the job done. Business and IT need to work in unison - through a joint business modelling process, for example - to design their requirements and reflect the way the business is run in a way that both IT and the business understand.
Companies need to identify areas where the highly manual information management process can be automated - such as time-consuming data integration work - so that their limited IT resources can be more strategically deployed.
Conversely, they need to recognise areas, such as data governance, where human intervention is not only desirable but necessary, and deploy software such as master data management to help streamline the workflow and processes.
In executing these steps, companies will essentially be deploying an information engine to power their BI - automatically feeding information to end users through their BI tools, making those users more productive and reducing internal costs.
The main benefit of this approach is the creation of a dynamic environment that can instantly adapt to change and deliver more accurate, consistent and reliable information faster to the BI tools.
This in turn leads to better analytical performance, and thus better decision making and improved growth, savings and opportunities for the organisation. It can also cut down the number of players involved in BI development, reducing both time to delivery and the scope for human error.
Particularly in a time of economic turbulence, when the price organisations pay for ill-judged decisions based on poor data can be catastrophic, it has never been more important to ensure BI is working for you, not against you.
Companies can no longer afford to take chances with inaccurate, out-of-date data from their BI tools. But with an "information engine" feeding information to end users through their BI tools, C-level executives will never again have to wait months for answers to fundamental questions about business performance.
Bill Hewitt is Chief Executive Officer at business intelligence vendor Kalido