In-memory technology speeds up data analytics

John Moore | June 27, 2013
Long the purview of financial firms looking for an edge as they make lightning-fast transactions, in-memory technology is starting to catch the attention of many firms that conduct real-time analysis.

"If you're using the in-memory store like a database-with searches-you have to index for performance," Lindquist says. "You have to be more precise with it, because RAM is more expensive and limited."

The volatile nature of RAM presents another issue for in-memory adopters. Should a system fail, the data must be reloaded. This can prove time-consuming.

At USPS, Houston and Atkins say data protection is one of the greatest challenges of using in-memory databases. USPS currently performs all its heavy processing in memory, then feeds the relevant results back to a relational database. The Postal Service also maintains a checkpoint file of the transactions running in memory, so some limited recovery may be performed should an outage occur.
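The pattern the officials describe can be sketched in a few lines of Python. This is an illustrative toy, not the Postal Service's actual system: each transaction is appended to a durable checkpoint log before it mutates RAM, so the log can be replayed to rebuild state after an outage.

    import json
    import os

    class InMemoryStore:
        def __init__(self, checkpoint_path="checkpoint.log"):
            self.data = {}  # the volatile, in-RAM state
            self.checkpoint_path = checkpoint_path

        def apply(self, key, value):
            # Log the transaction durably before touching RAM, so a crash
            # between the two steps loses nothing already acknowledged.
            with open(self.checkpoint_path, "a") as f:
                f.write(json.dumps({"key": key, "value": value}) + "\n")
                f.flush()
                os.fsync(f.fileno())
            self.data[key] = value

        def recover(self):
            # Replay the checkpoint log to rebuild in-memory state.
            # At multi-terabyte scale, this replay is the slow step.
            self.data.clear()
            if os.path.exists(self.checkpoint_path):
                with open(self.checkpoint_path) as f:
                    for line in f:
                        record = json.loads(line)
                        self.data[record["key"]] = record["value"]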

"We have reasonable assurances that the most important data to us is protected," the officials say.

The task of recovering an in-memory system from the checkpoint file, however, takes some doing.

"As you can imagine, reading back in 16TB can take considerable time from traditional storage media," Houston and Atkins note, referring to the size of the Postal Service's in-memory data store. "To address this issue, we are currently exploring adding flash card technology closer to processing in hopes of changing our reload time from hours to minutes."
