For an online retailer, data velocity means interacting with a shopper in a relevant and engaging manner in real time - tying her experiences together based on past interactions, purchase history, social links, and other data - and doing so within seconds of her clicking onto the website.
For an airline or an oil rig operator, it means the sooner an equipment malfunction or imminent shutdown can be detected through fast data responses and rapid analysis, the quicker it can be repaired and made productive again - potentially saving hundreds of thousands of dollars a day in downtime.
For a business looking to more effectively and efficiently meet customer and supplier demands, it means creating a more responsive, resilient supply chain through a combination of historical data and new intelligence.
It’s becoming increasingly important for organisations to match the speed of their actions to the speed of their opportunities. In a digital world marked by such established practices as mobility, cloud, virtualization and big data, data velocity is viewed in the Accenture Technology Vision 2013 report as one of the next key technology trends helping organisations improve their competitiveness, operations and business results.
Notwithstanding its ability to mine enormous amounts of data for precious insights, a business stands to miss opportunities if too much time elapses between acquiring data and being able to act on it.
A surge of new technologies today allows for the acceleration of the data cycle from insight to action, enhancing the company’s ability to maximise data velocity. Appliances that rely on solid-state drives (SSDs) to speed data input and output now offer huge speed increases over conventional storage. To squeeze out even faster performance, in-memory techniques sharply minimise input and output by moving data sets into main memory rather than onto (and off of) any form of disk drive.
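To make the idea concrete, here is a minimal sketch (in Python, with a hypothetical customer data set) of why keeping data in main memory cuts query time: the "disk-style" path re-parses serialized records on every lookup, while the in-memory path loads the data once into a hash map and answers from RAM.

```python
import csv
import io
import time

# Hypothetical data set: (customer_id, score) pairs. In practice this
# would live in a database or warehouse, not an in-memory string.
rows = [("c%04d" % i, i * 3 % 97) for i in range(100_000)]

# "Disk-style" access: re-read and re-parse the serialized data
# on every query, paying the I/O-shaped cost each time.
buf = io.StringIO()
csv.writer(buf).writerows(rows)
serialized = buf.getvalue()

def query_serialized(cust):
    for rec in csv.reader(io.StringIO(serialized)):
        if rec[0] == cust:
            return int(rec[1])

# In-memory technique: load once into main memory, then every query
# is a constant-time hash lookup with no parsing at all.
index = dict(rows)

def query_in_memory(cust):
    return index[cust]

t0 = time.perf_counter(); slow = query_serialized("c9999")
t1 = time.perf_counter(); fast = query_in_memory("c9999")
t2 = time.perf_counter()
assert slow == fast
print("speedup ~%dx" % ((t1 - t0) / max(t2 - t1, 1e-9)))
```

The same trade-off drives real in-memory platforms: the data fits the access pattern (a hash map here, columnar structures in a warehouse), so queries stop paying serialization and I/O costs.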
According to Gartner, the adoption of in-memory computing will increase 300 percent by 2015 due to decreases in memory costs. In-memory computing and insert-only databases enable transactional and analytical processing to be unified, while in-memory data warehousing offers the promise of real-time computing. This gives business leaders the ability to ask ad hoc questions of the production transaction database and get back answers in seconds.
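The insert-only idea mentioned above can be sketched in a few lines. In this illustration (SQLite, with a hypothetical orders table), state changes are recorded as new versioned rows rather than updates, so the same table serves both the transactional write path and ad hoc analytical reads over each order's latest version.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE orders (
        order_id  INTEGER,
        status    TEXT,
        amount    REAL,
        version   INTEGER   -- monotonically increasing per order
    )
""")

# Transactional side: every state change is a new row.
# Nothing is ever UPDATEd or DELETEd (insert-only).
events = [
    (1, "placed",    120.0, 1),
    (2, "placed",     80.0, 1),
    (1, "shipped",   120.0, 2),
    (2, "cancelled",   0.0, 2),
]
conn.executemany("INSERT INTO orders VALUES (?, ?, ?, ?)", events)

# Analytical side: an ad hoc question runs against the same table,
# selecting each order's most recent version.
latest = conn.execute("""
    SELECT o.order_id, o.status, o.amount
    FROM orders o
    JOIN (SELECT order_id, MAX(version) AS v
          FROM orders GROUP BY order_id) m
      ON o.order_id = m.order_id AND o.version = m.v
    ORDER BY o.order_id
""").fetchall()
print(latest)  # [(1, 'shipped', 120.0), (2, 'cancelled', 0.0)]
```

Because history is never overwritten, the same store also answers "as of" questions (what did order 2 look like at version 1?) without a separate audit log - one reason insert-only designs pair naturally with unified transactional and analytical processing.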
Meanwhile, newer low-cost analytical packages decrease the time needed for problem-driven exploration. These tools - mostly open source - greatly facilitate the iterative querying of data, accelerating users’ ability to home in on the right questions to reveal the greatest insight. Even new big data technologies designed to handle large volumes of unstructured data - largely batch technologies - are being adapted to work in real time, or close to it.
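That iterative loop - broad question, surprising answer, narrower question - is easy to illustrate. The sketch below (plain Python, hypothetical clickstream records) starts with a coarse aggregate, notices an anomaly, and drills into the segment that raised it.

```python
from collections import Counter

# Hypothetical clickstream records: (region, product, converted)
events = [
    ("EU", "shoes", False), ("EU", "shoes", False), ("EU", "bags", True),
    ("US", "shoes", True),  ("US", "bags", True),  ("US", "bags", False),
    ("EU", "shoes", False), ("US", "shoes", True),
]

# Query 1 (broad): conversions per region.
by_region = Counter(region for region, _, ok in events if ok)
print(by_region)  # EU converts far less often than US - drill in

# Query 2 (narrower): which EU products fail to convert?
eu_misses = Counter(
    product for region, product, ok in events
    if region == "EU" and not ok
)
print(eu_misses.most_common(1))  # [('shoes', 3)] - the question to pursue
```

Each round-trip here takes milliseconds; the point of fast, cheap analytical tooling is that an analyst can run dozens of such refinements in a sitting instead of queueing overnight batch jobs.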
Speed costs money, however, which means that wholesale changes won’t happen overnight. This makes it critical for IT groups to continue to rely on non-real-time data, blending together fast and slow to solve problems in a cost-effective manner. This reliance on “hybrid insight” places a premium on changes in skill sets as well as in architecture.
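A minimal sketch of that hybrid pattern, with hypothetical names and numbers: a cheap batch aggregate (refreshed overnight) is blended with a small real-time delta covering only the events since the last snapshot, so the fast path stays small and affordable.

```python
# Slow path: product totals computed by last night's batch job.
batch_totals = {"widgets": 10_400, "gadgets": 7_250}

# Fast path: only the events that arrived since the batch snapshot.
realtime_events = [("widgets", 12), ("widgets", 5), ("gadgets", 9)]

def current_total(product):
    """Blend the cheap batch figure with the small live delta."""
    delta = sum(n for p, n in realtime_events if p == product)
    return batch_totals.get(product, 0) + delta

print(current_total("widgets"))  # 10417 = 10400 (batch) + 17 (live)
```

The design choice is economic as much as technical: the expensive real-time machinery only ever holds a few hours of data, while the bulk of the history stays on cheap batch infrastructure - the "blending fast and slow" the paragraph above describes.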
High-performing businesses today understand the competitive advantage in “time to insight” and have begun to invest not only in the tools that can help accelerate their data cycles, but also in the skills and capabilities that drive this “need for speed.”
Going forward, it will no longer be about the size of the data. What really matters will be the ability to improve your rate of response. It will be about matching your data velocity to the pace at which your business processes can act on it.
Posted by Paul Daugherty, Chief Technology Officer, Accenture