Why in-memory computing is going mainstream

In most computer systems, random access memory (RAM) plays a comparable role to the ‘working memory’ of the human brain: it is a temporary store for a small amount of data that can be retrieved extremely quickly.

But as the cost of dynamic RAM (DRAM) chips has fallen, and their performance characteristics have improved, the amount of data that can be viably stored ‘in memory’ has grown.

So much so, in fact, that the kind of large datasets that would once have sat in a disk-based data warehouse can now be placed in memory. From there, data can be retrieved at lightning speed. Putting this into practice to improve business IT systems is known as “in-memory computing”.

To date, in-memory computing has been applied primarily to analytical systems. Moving an analytical database into memory removes the I/O bottleneck associated with physical disks.

This means organisations can perform complex analyses in “real time”, and allows users to slice and dice large data sets with the kind of agility usually associated with desktop spreadsheet software.
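The basic pattern behind this is straightforward: load the dataset into RAM once, then answer every subsequent question from the in-memory copy rather than from disk. The sketch below is purely illustrative – it is not drawn from QlikView, Spotfire, HANA or any other product mentioned here – but it shows why ad-hoc ‘slice and dice’ queries become near-instantaneous once the disk is out of the loop.

```python
# Illustrative only: once the data sits in RAM, each ad-hoc query is a pure
# in-memory operation with no disk I/O on the critical path.
import pandas as pd

# In practice this would be loaded from a warehouse or from files; the sample
# rows here are invented purely for the example.
sales = pd.DataFrame({
    "region":  ["EMEA", "EMEA", "APAC", "APAC", "AMER"],
    "product": ["shoes", "bands", "shoes", "shoes", "bands"],
    "revenue": [120.0, 45.0, 80.0, 60.0, 30.0],
})

# Users can keep asking new questions of the in-memory copy without having
# defined them in advance.
by_region = sales.groupby("region")["revenue"].sum()
by_product_and_region = sales.pivot_table(
    index="region", columns="product", values="revenue", aggfunc="sum"
)
print(by_region)
print(by_product_and_region)
```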

Now, though, in-memory computing is being proposed as a platform for transactional systems such as ERP applications. In particular, SAP, whose HANA in-memory database platform sits at the heart of its innovation strategy, claims that in-memory computing will both improve the performance of business applications and allow more sophisticated analytics within them.

The case for in-memory computing appears to be winning converts. Analyst company Gartner says that in 2012, 10% of large and medium-sized organisations had adopted in-memory computing in some capacity. By 2015, that figure will have more than tripled to 35%.

“The in-memory computing market is going to grow massively over the next two years,” says Massimo Pezzini, vice president and analyst at Gartner. “There are more than 50 vendors and we are finding more of them every day.”

In-memory business intelligence

The phrase ‘in-memory computing’ rose to prominence a few years ago in the context of business intelligence systems such as QlikView, from Swedish supplier QlikTech, and Spotfire, from Tibco, which load data into memory before analysis.

Boris Evelson, an analyst at Forrester, says that the concept of ‘in-memory computing’ has gained traction in the last two years as businesses have acknowledged the value of these systems.

“In traditional BI, the database underneath is the limiting factor because it’s on mechanical disks, which are slow,” he says. “Loading your dataset in-memory gives you instantaneous responses.”

That allows the kind of flexibility and responsiveness that users know from desktop spreadsheet software, but for large corporate data sets, Evelson says.

“These new analytical in-memory applications can do everything Excel can, except that they can operate on much larger data sets and they’re much more visual and intuitive,” he says. “They are like Excel on steroids.”

This flexibility allows users to conduct complex analyses on a whim, meaning they can look for patterns in the data without necessarily knowing what they are looking for.

“Before in-memory computing, it was almost as if you had to be a seer and have a vision of the future,” he says. “Obviously, nobody has that, so you had to do your best to pre-build as many of the questions you wanted to ask of your data in the database.” “The new term is exploration and discovery,” he says.

The speed of DRAM memory also allows organisations to build systems that perform complex analysis in “real time”, in response to business events.

One company putting this to good use is Nike. The sportswear maker has launched a number of products, including shoes and wristbands, with embedded electronics that record the wearer’s performance – how far they ran, for example, and in what time.

Runners can upload their statistics to an online platform and compare them with those of other Nike customers in real time.

“Nike has something like eight or 10 million subscribers to this service, which generates a lot of data,” explains Pezzini. “To provide those statistics in real time, they have to process it very quickly, and that velocity is possible due to in-memory technology.”

This is more than a technically impressive gimmick, Pezzini says – it makes Nike’s customers more loyal. “Once people’s shoes have worn out, they will buy another pair from Nike as their data is sitting on their servers. That’s business innovation.”

Another example is Avanza, an online banking start-up based in Sweden. The company uses in-memory computing to calculate the risk profiles of new customers in real time when they set up an account.

“They are a small company, but they are adding something like 1,000 customers every week, so using in-memory technology allows them to scale out very rapidly,” he says. “It allows them to compute your risk profile in real-time by analysing live data, and they can offer you customised terms and conditions on the spot.”

Applications in-memory

SAP is one of in-memory computing’s most vocal cheerleaders. HANA, its in-memory columnar database platform, is of supreme strategic significance for the company – it hopes the system will tempt enterprise customers away from Oracle’s ubiquitous database platform and provide a new source of revenue growth.

So far, HANA has only been available as a platform for SAP’s business intelligence and data warehousing software.

But in January, the company announced the forthcoming availability of HANA for Business Suite, its range of mainstream ERP applications.

“That announcement validated our assumption that in-memory computing was not a niche technology that makes sense only in the context of a select few types of applications in vertical sectors,” says Gartner’s Pezzini. “I can’t think of anything more horizontal than ERP or CRM as they apply to every industry sector.”

Speaking to Information Age after the announcement, SAP UK and Ireland’s chief innovation officer Adrian Simpson explained that the company has rewritten the Business Suite applications so that some data processing that previously would have taken place on the application server is conducted by HANA at the database layer. This means that data-intensive processes and transactions can be performed much faster, Simpson claimed.

For example, a company could compile an end-of-year financial report from multiple data sources in “vastly reduced time”. In future, HANA will allow more sophisticated analytical functionality to be built into SAP’s applications, Simpson said. In fact, using HANA to support business applications may remove the need for a separate data warehouse for reporting and analytics, he added.
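SAP has not detailed the mechanics of this rewrite, but the general principle of pushing data-intensive work from the application server down to the database tier can be sketched in generic terms. The example below uses Python with SQLite – chosen only because both are freely available and SQLite can run entirely in memory – and is a hypothetical illustration, not HANA or Business Suite code.

```python
# A generic sketch of moving aggregation work from the application tier into
# the database. Illustrative only; this is not SAP HANA code.
import sqlite3

conn = sqlite3.connect(":memory:")  # an in-memory database for the example
conn.execute("CREATE TABLE ledger (account TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO ledger VALUES (?, ?)",
    [("sales", 100.0), ("sales", 250.0), ("costs", -80.0)],
)

def totals_on_app_server(conn):
    # Traditional pattern: pull every row across to the application server
    # and aggregate there - data-intensive and slow for large tables.
    totals = {}
    for account, amount in conn.execute("SELECT account, amount FROM ledger"):
        totals[account] = totals.get(account, 0.0) + amount
    return totals

def totals_in_database(conn):
    # Pushed-down pattern: a single aggregate query, so the heavy lifting
    # happens where the data lives and only the small result set moves.
    rows = conn.execute(
        "SELECT account, SUM(amount) FROM ledger GROUP BY account"
    ).fetchall()
    return dict(rows)

print(totals_on_app_server(conn))
print(totals_in_database(conn))
```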

Competitors have raised questions about SAP’s ability to pull this off – it is, after all, a relative newcomer to the database industry.

If it does, though, it could be transformational, says Gartner’s Pezzini. “It would be a huge cost saving because every enterprise, theoretically, could halve the storage costs that they have today.” This ties in with what Pezzini believes is a common misconception about in-memory computing.

It is true that on a component basis, DRAM is still more expensive than disk. But, he argues, the performance of in-memory computing may lead to systems with a lower total cost of ownership.

“The usual line of reasoning goes that if I have to put things in-memory, DRAM is more expensive than hard disk, so I’m going to spend more,” he says. “But you have to also consider that in-memory allows very efficient compression mechanisms, which reduce the amount of physical memory that you need in order to store large quantities of data.”
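Pezzini does not specify which compression mechanisms he has in mind, but dictionary encoding is one technique commonly associated with columnar in-memory stores, and it illustrates the principle: columns with many repeated values collapse into a small lookup table plus a vector of integer codes. The sketch below is a simplified, hypothetical illustration rather than any vendor’s actual implementation.

```python
# Simplified illustration of dictionary encoding, one common columnar
# compression technique. Not taken from any specific product.
def dictionary_encode(column):
    """Replace repeated values in a column with small integer codes."""
    dictionary = {}   # value -> integer code
    codes = []
    for value in column:
        if value not in dictionary:
            dictionary[value] = len(dictionary)
        codes.append(dictionary[value])
    return dictionary, codes

# A column with heavy repetition compresses to a tiny dictionary plus codes.
countries = ["UK", "UK", "Sweden", "UK", "Sweden", "UK"]
dictionary, codes = dictionary_encode(countries)
print(dictionary)  # {'UK': 0, 'Sweden': 1}
print(codes)       # [0, 0, 1, 0, 1, 0]
```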

So while it may seem like the expensive option today, in-memory computing may eventually prove to be as cost effective as it is performance-enhancing.
