About two years ago, CareFirst BlueCross BlueShield implemented a self-service business intelligence platform to aggregate and analyse vast amounts of data from multiple repositories scattered throughout the company.
The technology from QlikTech was brought in as a supplement to a project management product from CA Technologies. So far, it has saved CareFirst $10 million in project costs and helped the health insurer reduce the number of outside contractors it uses by 25 percent.
Activities that used to take up to 18 months are now accomplished in less than two days. Moreover, the project management office no longer has to depend on its centralised analytics team to run BI reports.
Organisations like Maryland-based CareFirst are at the forefront of what analysts say is a dramatic transformation in business intelligence and data analytics practices at many companies.
Consulting firm PricewaterhouseCoopers (PwC) calls it the "new analytics." Unlike previous BI and data analytics models that depend on centralised, top-down data collection, reporting and analysis, the new wave is all about giving access and tools directly to line-of-business users, who benefit the most from BI reporting and data analytics, PwC said in a report released Tuesday.
"[The] new analytics taps the expertise of the broad business ecosystem to address the lack of responsiveness from central analytics units," PwC noted in its report. "The challenge for centralized analytics was to respond to business needs when the business units themselves weren't sure what findings they wanted or clues they were seeking." The new analytics wave "does that by giving access and tools to those who act on the findings."
What's behind the new analytics
Two trends are driving the transformation. One is the data explosion caused by cloud computing, mobile computing and social media. Inexpensive hardware, memory and storage technologies have made it easy for companies to collect large, varied and fast-growing data sets. Many are now looking to see if they can gain business benefit from examining and analysing all that data.
The other trend is the increasing availability of tools that allow companies to more easily aggregate and analyse large data sets. Many of the tools are designed for handling big data and incorporate capabilities such as in-memory databases, NoSQL support, data visualisation, associative searches, and natural language processing, all of which allow companies to analyse data more quickly and easily than before.
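One of those capabilities, associative search, works differently from a fixed query: the user selects a value in one field, and the tool shows which values in every other field are associated with that selection and which are excluded by it. A minimal sketch in plain Python, using invented records and field names purely for illustration:

```python
# Hypothetical records standing in for rows pulled from a data repository.
records = [
    {"department": "Claims", "vendor": "Acme", "status": "active"},
    {"department": "Claims", "vendor": "Beta", "status": "closed"},
    {"department": "IT", "vendor": "Beta", "status": "active"},
    {"department": "Finance", "vendor": "Gamma", "status": "active"},
]

def associate(data, field, value):
    """For a selection on one field, return the values in every other
    field that co-occur with it (associated) and those that do not
    (excluded)."""
    selected = [r for r in data if r[field] == value]
    result = {}
    for other in data[0]:
        if other == field:
            continue
        associated = {r[other] for r in selected}
        excluded = {r[other] for r in data} - associated
        result[other] = {"associated": associated, "excluded": excluded}
    return result

# Selecting department == "Claims" immediately reveals which vendors
# and statuses are linked to it, without writing a query upfront.
print(associate(records, "department", "Claims"))
```

The point of the pattern is that no question has to be formulated in advance; every selection re-partitions the rest of the data set.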
With the self-service QlikView business intelligence technology, for instance, CareFirst gets real-time visibility into projects and resources in a fraction of the time and with a fraction of the effort a traditional BI approach would require, said Carol Church, director of the project management office at CareFirst.
The technology allows CareFirst to pull in data from multiple data repositories, mash it together in a fast in-memory database and run all sorts of analyses on it at much faster speeds than previously possible.
With traditional analytic processing tools, analysts have to first develop a set of questions and then wait for IT to aggregate the relevant data, cleanse it and build paths between different data elements to enable analysis, said Church, who is responsible for managing 120 to 140 projects every year.
QlikView, on the other hand, lets analysts freely compare data elements and look for associations between them on the fly, on an ad hoc basis, she said.
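The workflow Church describes, pulling data from several repositories into memory and exploring it ad hoc, can be sketched in miniature. The following is a hypothetical illustration in Python, with pandas standing in for a dedicated in-memory engine; the tables, column names and figures are invented, not CareFirst's data.

```python
import pandas as pd

# Stand-ins for two separate project repositories; in practice these
# would be extracted from different systems of record.
projects = pd.DataFrame({
    "project_id": [1, 2, 3, 4],
    "department": ["Claims", "Claims", "IT", "Finance"],
    "budget": [250_000, 400_000, 1_200_000, 300_000],
})
staffing = pd.DataFrame({
    "project_id": [1, 2, 3, 4],
    "contractors": [2, 5, 12, 1],
})

# Mash the sources together in memory -- no upfront warehouse build.
merged = projects.merge(staffing, on="project_id")

# Ad hoc association: which departments pair large budgets with heavy
# contractor use? The question did not have to be anticipated.
by_dept = merged.groupby("department")[["budget", "contractors"]].sum()
print(by_dept)
```

The contrast with the traditional flow is that the join and the aggregation are both decided at the moment of analysis, not during a requirements phase.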
Another organisation that is taking advantage of similar capabilities is The CementBloc, a New York-based company that helps large drug companies fine-tune and optimise their communications and marketing strategies. The company uses Tibco's Spotfire analytics platform to integrate and analyse data from multiple information sources.
What does your enterprise need to change?
"With traditional BI tools, you have to know what you are going to predict," said Ira Haimowitz, executive vice president of intelligence and analytics at CementBloc. "You need to know what you are going to predict by customer segment or by geography, map that out to a program, and then generate queries and reports."
Spotfire's in-memory database technology and its search and data visualisation capabilities eliminate such requirements. The technology has allowed CementBloc to explore big and diverse data sets at will and find relationships between data elements it didn't know existed, he said.
QlikTech and Tibco are not the only vendors offering BI, data visualisation and data analytics tools. Over the past few years, many vendors, including Birst, Tableau, Datameer and Splunk, have joined traditional enterprise players such as IBM, Teradata and SAS in delivering capabilities that are driving new BI applications.
The tools offer enterprises "more and more ways to capture, move around, scrub and analyse data," said Bill Abbott, principal of applied analytics at PwC. Some companies are applying these tools to integrate, extract and analyse existing data sets. Many others are using them on top of brand-new data infrastructures based on big data technologies such as Hadoop, Abbott said.
"Twenty years ago, there was this heavy emphasis on requirements-gathering because you wanted to precalculate all the answers," said Anthony Deighton, CTO at QlikTech. "You needed to work with users upfront to get a feel for all the questions they would likely ask. It led to a service-heavy implementation model for BI projects," he said.
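Deighton's contrast between precalculating answers and computing them on demand can be shown in a small sketch in plain Python. The sales records and field names here are invented for illustration:

```python
from collections import defaultdict

# Hypothetical raw data held in memory.
sales = [
    {"region": "East", "product": "A", "revenue": 100},
    {"region": "East", "product": "B", "revenue": 150},
    {"region": "West", "product": "A", "revenue": 200},
    {"region": "West", "product": "B", "revenue": 50},
]

# Old model: only the questions gathered upfront are answerable,
# because their answers were precalculated into fixed aggregates.
precomputed_by_region = defaultdict(int)
for row in sales:
    precomputed_by_region[row["region"]] += row["revenue"]
# Any other slice of the data means a new request to IT.

# New model: keep the raw data in memory and aggregate along whatever
# dimension the user happens to ask about, on the fly.
def ad_hoc_total(data, dimension):
    totals = defaultdict(int)
    for row in data:
        totals[row[dimension]] += row["revenue"]
    return dict(totals)

print(ad_hoc_total(sales, "product"))  # an unanticipated question
```

The requirements-gathering Deighton describes corresponds to deciding, in advance, which `precomputed_by_region`-style tables to build; the newer tools make that decision unnecessary by keeping the grain of the data available at query time.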
New analytics "is about detecting opportunities and threats you hadn't anticipated, or finding people you didn't know existed who could be your next customers," PwC noted in its report. "It's about learning what's really important, rather than what you thought was important. It's about identifying, committing, and following through on what your enterprise must change most."