

In-memory technology speeds up data analytics

John Moore | June 27, 2013
Long the purview of financial firms looking for an edge as they make lightning-fast transactions, in-memory technology is starting to catch the attention of many firms that conduct real-time analysis.

What's new, nearly a quarter century later, is the sharp rise in interest now surrounding in-memory technology. "The last two years have been the biggest change for us. In-memory has become really hot and...in terms of applications, it has just exploded," says Kognitio's Gaskell.

Financial services and telecommunication firms had been Kognitio's bread and butter, but now in-memory demand is surfacing in markets such as retail, he notes.

Terracotta's Allen says he has seen interest in in-memory technology across financial services, logistics, ecommerce, government and healthcare, among other sectors. "That lightbulb is going off everywhere. People are saying, 'How do I leverage this?'" he says.

As demand grows, the number of vendors offering in-memory technology has also increased. In May, for example, Teradata introduced its Intelligent Memory offering, which it says lets customers exploit memory through capabilities built into the company's data warehouses.

"There's no need for a separate appliance," said Alan Greenspan, a spokesman for Teradata. The technology tracks the temperate of data, he adds, moving hot, frequently used data into memory.

Processing, Indexing Challenges With In-memory Technology
In-memory databases have the potential to produce dramatic results when organizations need to crunch a lot of data in short order. However, the field is not without a few wrinkles. Misconceptions regarding the nature of in-memory technology are among the issues.

Industry executives say an in-memory deployment calls for more than dumping data in memory. They say the data management software must be designed to work with memory.

"It's not just about putting all the data into memory," said Chris Hallenbeck, vice president of data warehouse solutions and the HANA platform at SAP. "It's about rewriting the entire database from the ground up to use memory as its primary method of storage, as opposed to disk." (The SAP HANA real-time platform includes an in-memory database.)

Another issue: The speed of in-memory technology places heavier demands on processors. As a consequence, organizations must parallelize the code that accesses the data and deploy load balancing across the cluster, AdJuggler's Lindquist says. "Load balancing becomes a critical piece of your ability to take advantage of the in-memory database."

AdJuggler has created a pull-based load-balancing system using commodity hardware and software developed in-house. Each instance of AdJuggler's transaction processing engine pulls work from the load-balancing component, completes the task and then goes back for more work, Lindquist says. The system brings up more instances if additional capacity is needed.
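
AdJuggler's software is proprietary, but the pull model Lindquist describes maps naturally onto a shared work queue: each engine instance takes a task when it is ready, processes it, and comes back for more, so faster or less-loaded instances automatically end up doing more of the work. A minimal sketch, with invented task and worker names:

    import queue, threading

    tasks = queue.Queue()          # the load-balancing component: a shared queue
    results = queue.Queue()

    def engine_instance(worker_id):
        """One processing-engine instance: pull a task, do it, repeat."""
        while True:
            task = tasks.get()
            if task is None:       # sentinel: no more work
                tasks.task_done()
                return
            results.put((worker_id, task, task * task))  # stand-in for real work
            tasks.task_done()

    NUM_INSTANCES = 4              # "bring up more instances" = raise this number
    workers = [threading.Thread(target=engine_instance, args=(i,))
               for i in range(NUM_INSTANCES)]
    for w in workers:
        w.start()

    for task in range(100):
        tasks.put(task)
    for _ in workers:
        tasks.put(None)            # one sentinel per instance
    tasks.join()
    for w in workers:
        w.join()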

Organizations with in-memory products must also take care when it comes to database indexes. Businesses using a traditional database can afford to devote a large amount of disk space to indexes, but in-memory databases call for greater precision.
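
The trade-off can be made concrete with a rough back-of-the-envelope calculation: every secondary index held in RAM competes with the data itself for the same memory budget. The sizes below are illustrative assumptions, not measurements from any particular product.

    ROWS = 100_000_000
    BYTES_PER_ROW = 120           # assumed average row size
    BYTES_PER_INDEX_ENTRY = 40    # assumed key + pointer + structure overhead

    data_gb = ROWS * BYTES_PER_ROW / 1e9
    index_gb = ROWS * BYTES_PER_INDEX_ENTRY / 1e9

    print(f"base data: {data_gb:.0f} GB")
    for n_indexes in (1, 3, 6):
        total = data_gb + n_indexes * index_gb
        print(f"{n_indexes} indexes: {total:.0f} GB of RAM")
    # On disk, six extra indexes are cheap; in memory they can
    # triple the footprint, so each index has to earn its keep.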
