In-memory technologies move databases to real time

Joab Jackson | March 25, 2014
Last week, application-performance monitoring provider New Relic launched an offering that lets customers mine the operational data it collects for business intelligence.

"In the way SQL Server does memory, it eliminates latching. So the user does not see any delays as they access the system," Kelly said. (Currently, Edgenet is going through a Chapter 11 bankruptcy protection, so perhaps the use of dynamic pricing will help the company regain its footing with the regional competitors).

Pivotal's newly launched GemFire XD extends in-memory databases to big data through integration with the Apache Hadoop data-processing platform.

"Gemfire is essentially a SQL database in-memory that can pull data from the Hadoop File System [HDFS], or persist data down into HDFS," said Michael Cucchi, Pivotal senior director of product marketing.

GemFire can ingest large amounts of data extremely quickly, even when the data is coming from multiple sources, Cucchi said.

"If you have to handle a 1,000 requests a second, you insert the in memory layer of Gemfire and it effectively offloads the real-time requirements for the application," Cucchi said.

This technology could be used, for instance, by a wireless telecommunications provider to ensure that all the calls it handles at any moment are routed through the best network path available at that time, Cucchi said.
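Stripped of vendor specifics, the pattern Cucchi describes is a read-through, write-through memory layer: requests are served from RAM, and the slower store (HDFS, in GemFire's case) is touched only on a miss or when a change must be persisted down. The sketch below is a minimal, hypothetical illustration of that pattern; InMemoryLayer, backing_load and backing_save are invented names, not GemFire's actual API.

```python
from typing import Callable, Dict, Optional

class InMemoryLayer:
    """Read-through / write-through layer: serve requests from RAM and
    touch the slower backing store (standing in for HDFS) only on a
    miss or when a change is persisted down. Illustrative sketch only."""

    def __init__(self,
                 backing_load: Callable[[str], Optional[str]],
                 backing_save: Callable[[str, str], None]) -> None:
        self._cache: Dict[str, str] = {}
        self._backing_load = backing_load   # hypothetical read from HDFS
        self._backing_save = backing_save   # hypothetical write to HDFS

    def get(self, key: str) -> Optional[str]:
        if key in self._cache:              # hot path: pure memory, no disk I/O
            return self._cache[key]
        value = self._backing_load(key)     # cold path: fall through to the store
        if value is not None:
            self._cache[key] = value        # keep it in RAM for the next request
        return value

    def put(self, key: str, value: str) -> None:
        self._cache[key] = value            # readers see the change immediately
        self._backing_save(key, value)      # change is persisted down to the store
```

Once the working set is resident, every repeat request is answered from the in-memory dictionary, which is what "offloading the real-time requirements" amounts to in practice.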

Not everyone is convinced that specialized in-memory databases are needed. Many organizations have instead taken the approach of running an in-memory caching layer in front of the database to serve the most frequently requested data.

"People have been doing in-memory databases for years, where you have terabytes of RAM across many machines, all fronting MySQL" said Bryan Cantrill, software engineer at the Joyent cloud service. Cantrill points to the growing popularity of Memcache, which is being widely used as a front end for relational databases.

What should an organization do if the amount of data it has on hand exceeds the available working memory?

Many in-memory technologies, such as Microsoft's SQL Server 2014 and IBM's BLU Acceleration, are actually hybrids, in that they can still store some data on disk and keep only the most frequently consulted data in working memory.
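One way to picture the hybrid approach is a bounded, recency-ordered hot set in RAM, with everything else demoted to disk. The toy sketch below (all names invented; a real engine is far more sophisticated) uses a least-recently-used policy to decide which rows stay in memory:

```python
from collections import OrderedDict

class HybridStore:
    """Toy hybrid store: the most recently used rows live in RAM,
    everything else sits in a slower 'disk' tier (a dict here)."""

    def __init__(self, hot_capacity: int = 1000):
        self.hot = OrderedDict()   # in-memory tier, ordered by recency
        self.cold = {}             # stand-in for the on-disk tier
        self.hot_capacity = hot_capacity

    def get(self, key):
        if key in self.hot:
            self.hot.move_to_end(key)          # refresh recency
            return self.hot[key]
        value = self.cold.pop(key)             # slower path: "read from disk"
        self._promote(key, value)
        return value

    def put(self, key, value):
        self.cold.pop(key, None)               # keep a single authoritative copy
        self._promote(key, value)

    def _promote(self, key, value):
        self.hot[key] = value
        self.hot.move_to_end(key)
        if len(self.hot) > self.hot_capacity:  # evict the coldest row to disk
            old_key, old_value = self.hot.popitem(last=False)
            self.cold[old_key] = old_value
```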

Like other in-memory systems, SQL Server 2014 has a few tricks to preserve data should the power go out and the contents of volatile RAM disappear. The in-memory tables themselves are never written back to disk, but every change is recorded in a log on disk. Should the power go out, the database can reconstruct the lost data by replaying the logs.
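In outline, that is write-ahead logging: harden each change to a log on disk before applying it in memory, then replay the log on restart. A minimal, hypothetical sketch of the idea (the file name and record format are invented; SQL Server's actual log and checkpoint machinery is far more involved):

```python
import json
import os

LOG_PATH = "changes.log"  # hypothetical on-disk log file

class DurableInMemoryTable:
    """Write-ahead-log sketch: the table lives only in RAM, but every
    change is appended to a log on disk, so the contents can be rebuilt
    after a crash by replaying the log."""

    def __init__(self):
        self.rows = {}
        if os.path.exists(LOG_PATH):       # recovery: replay the log
            with open(LOG_PATH) as log:
                for line in log:
                    change = json.loads(line)
                    self.rows[change["key"]] = change["value"]

    def put(self, key, value):
        with open(LOG_PATH, "a") as log:   # 1. record the change first
            log.write(json.dumps({"key": key, "value": value}) + "\n")
            log.flush()
            os.fsync(log.fileno())         # 2. force it onto stable storage
        self.rows[key] = value             # 3. only then apply it in memory
```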

SQL Server 2014 comes with diagnostic tools that can examine a user's databases and suggest which tables should be moved to memory, based on how often they are consulted. "Just move the tables that are hot into memory. That will allow you to get big performance gains without having to buy new hardware," Kelly said.
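In SQL Server 2014, moving a hot table into memory means creating a new table with the documented MEMORY_OPTIMIZED option and migrating the data into it; existing tables cannot simply be altered in place. A sketch of the DDL, issued here through the pyodbc driver; the connection string, table and column names are invented for illustration:

```python
import pyodbc

# Hypothetical connection details.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 11 for SQL Server};SERVER=dbhost;"
    "DATABASE=Pricing;Trusted_Connection=yes;",
    autocommit=True)

# SQL Server 2014 memory-optimized tables require a nonclustered hash
# (or range) index; DURABILITY = SCHEMA_AND_DATA makes the contents
# recoverable via the transaction log. (The database must already have
# a MEMORY_OPTIMIZED_DATA filegroup.)
conn.execute("""
    CREATE TABLE dbo.HotPrices (
        Sku   NVARCHAR(50) NOT NULL
              PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1000000),
        Price MONEY       NOT NULL
    ) WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);
""")
```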

 
