In-memory technologies move databases to real time

Last week, application-performance monitoring service provider New Relic launched an offering that allows customers to mine its operational data for business intelligence.

The new beta offering, called Insights, has been a big hit, according to the company. The service has been fielding queries that, on average, consult 60 billion metrics a minute, with an average response time of 58 milliseconds.

In-memory database technology has been key to this new line of business, said New Relic CEO Lew Cirne. In-memory databases are relational databases that run entirely within the working memory, or RAM, of a single server or a set of servers.
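
By way of a minimal illustration (and not a description of New Relic's own system), the sketch below uses Python's built-in sqlite3 module with its ":memory:" target to create a relational database that lives entirely in RAM; the metrics table and its sample rows are made up for the example.

```python
import sqlite3

# ":memory:" tells SQLite to keep the entire relational database in RAM;
# nothing touches disk, and the data disappears when the connection closes,
# which is the basic trade-off of purely in-memory storage.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE metrics (name TEXT, value REAL, recorded_at TEXT)")
conn.executemany(
    "INSERT INTO metrics VALUES (?, ?, ?)",
    [
        ("response_time_ms", 58.0, "2014-04-01T12:00:00Z"),
        ("error_rate", 0.002, "2014-04-01T12:00:00Z"),
    ],
)

# Queries are answered straight from RAM, with no disk seek in the hot path.
for row in conn.execute("SELECT name, value FROM metrics ORDER BY name"):
    print(row)
```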

Cirne expects that the service, once live, could be used in all sorts of ways, such as for customer service, security and targeted marketing. As soon as a user has a question about some aspect of operation, the service can return detailed metrics on that topic, drawing from data that has been captured just seconds before.

New Relic built the database from scratch and assembled a large cluster that can muscle through terabytes of data to arrive at an answer quickly. In-memory technology allows the service to respond within milliseconds, even when finding the answer involves combing through large sets of machine-generated data.

Once a boutique item for well-funded, fast-trading financial firms, in-memory database systems are starting to become more widely used, thanks to the falling cost of server memory and the demands of customers who have come to expect speedy Internet services, such as Amazon's.

"Customers can transform their business by taking advantage of this technology," said Eron Kelly, Microsoft general Manager of SQL server marketing.

Microsoft has released SQL Server 2014, which has in-memory capabilities built in, to manufacturing. Big-data software company Pivotal has also released the first full commercial version of its in-memory database for Hadoop, GemFire XD.

Microsoft's and Pivotal's new offerings join an increasing number of databases with in-memory capabilities, including IBM's BLU Acceleration for DB2, SAP HANA, VoltDB's eponymous database and Oracle TimesTen, among others.

Add to this list a growing number of caching tools, such as Redis and Memcached, that allow organisations to keep much of their relational database content in memory. This approach is favored by Facebook, for instance, which uses MySQL to store user data but relies on Memcached to get material to users quickly.
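
To make the caching approach concrete, here is a minimal sketch of the cache-aside pattern using the redis-py client; the local Redis connection, the user key scheme, the fetch_user_from_mysql placeholder and the 60-second expiry are illustrative assumptions, not a description of Facebook's actual setup.

```python
import json
import redis

# Assumes a Redis server running on localhost; the authoritative copy of
# the data still lives in the disk-backed relational database.
cache = redis.Redis(host="localhost", port=6379)

def fetch_user_from_mysql(user_id):
    # Placeholder for a real relational query via a MySQL driver.
    return {"id": user_id, "name": "example"}

def get_user(user_id):
    key = f"user:{user_id}"
    cached = cache.get(key)                  # served from RAM when present
    if cached is not None:
        return json.loads(cached)
    user = fetch_user_from_mysql(user_id)    # fall back to the database on a miss
    cache.set(key, json.dumps(user), ex=60)  # keep a copy in memory for 60 seconds
    return user
```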

Traditionally, an enterprise database would be stored on disk, because it would be far too large to fit in memory. Also, storing data on nonvolatile disk helps ensure that the material is captured for posterity, even if the power to the storage is cut off. With volatile RAM, if power is interrupted, all of its contents are lost.

These assumptions are being challenged, however. Online transactional databases, in particular, are being moved to main memory.

"If your data does not fit in main memory now, wait a year or two and it probably will," said Michael Stonebraker, a pioneer in database development who is chief technology officer at VoltDB, the company overseeing his latest database of the same name. "An increasing fraction of the general database market will be main-memory deployable over time."

Stonebraker's opinion is echoed by others in the business.
