Remember the titans: who will win the in-memory battle?

With SAP shaping its recent strategy around it, and database giants Oracle, Microsoft and IBM all battling to push their own offerings, in-memory computing has been one of the most talked-about new technologies of the past few years.

But the notion, at least, is not new at all – stretching back as far as 1951, when Alan Turing noted in his programming manual for the Ferranti Mk. I that information found in the machine’s ‘electronic store’ was preferable to its magnetic equivalent.

Over two decades ago the database ‘buffer cache’ won similar approval for avoiding disk I/O by keeping frequently accessed database blocks in memory. The first commercial in-memory database, TimesTen, followed in 1997 and was acquired by Oracle in 2005.
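
The principle is simple to sketch. The following minimal LRU cache (plain Python; the BufferCache class and read_from_disk callback are illustrative names, not any vendor’s implementation) shows the idea a buffer cache relies on: frequently accessed blocks are served from memory, and only misses pay the cost of disk I/O.

```python
from collections import OrderedDict

class BufferCache:
    """Minimal LRU buffer cache: keeps the most recently used blocks in memory."""

    def __init__(self, capacity, read_from_disk):
        self.capacity = capacity                # maximum number of blocks held in RAM
        self.read_from_disk = read_from_disk    # slow fallback used on a cache miss
        self.blocks = OrderedDict()             # block_id -> block contents, in LRU order

    def get(self, block_id):
        if block_id in self.blocks:
            self.blocks.move_to_end(block_id)   # mark as most recently used
            return self.blocks[block_id]        # cache hit: no disk I/O at all
        data = self.read_from_disk(block_id)    # cache miss: pay the disk cost once
        self.blocks[block_id] = data
        if len(self.blocks) > self.capacity:
            self.blocks.popitem(last=False)     # evict the least recently used block
        return data

# Cache two blocks; the second request for block 1 is served straight from memory.
cache = BufferCache(2, read_from_disk=lambda b: f"contents of block {b}")
print(cache.get(1), cache.get(2), cache.get(1))
```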

The advent of 64-bit operating systems around the year 2000 vastly increased the amount of memory that could be addressed by in-memory databases and caches; 32-bit operating systems were limited to a 4-gigabyte address space, of which typically only 2 to 3 gigabytes were usable per process.
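
The arithmetic behind that leap is straightforward: a 32-bit pointer can address 2^32 bytes, about 4GB, while a 64-bit pointer can in theory address 2^64 bytes, or 16 exbibytes. A quick back-of-the-envelope check in Python:

```python
# Addressable memory is bounded by pointer width: 2**bits bytes.
GIB, EIB = 2**30, 2**60
print(f"32-bit: {2**32 // GIB} GiB of address space")   # 4 GiB, with ~2-3 GiB usable per process
print(f"64-bit: {2**64 // EIB} EiB of address space")   # 16 EiB, far beyond any installed RAM
```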

The continual increase in memory chip capacity and decrease in the price of memory have grown the breadth of use cases where in-memory technology can be practically applied.

Such progression has brought us to where we are today: four of IT’s biggest vendors, amongst a wealth of smaller players, fighting to take a lead on a technology that Gartner predicts 35% of large and mid-sized organisations will have adopted by next year.

Naturally, Oracle, IBM and Microsoft have pushed in-memory technology as extensions to their existing database products using hybrid architectures, while SAP, new to the space altogether, has championed an ‘all or nothing’ architecture.

‘These new in-memory technologies represent a new approach that goes beyond just putting data in memory to eliminate slow disk I/O,’ says Andrew Mendelsohn, executive VP of database server technologies at Oracle.

‘They employ new techniques and algorithms, and in some cases leverage on-chip routines, to take advantage of how data in-memory can be processed for performance gains. This far exceeds the mere advantage of removing physical I/O to speed performance.’
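
To get a rough feel for the kind of gain Mendelsohn is describing, without claiming to reproduce Oracle’s techniques, compare a plain interpreted loop with a vectorised scan over the same in-memory array. The sketch below uses NumPy purely for illustration: both versions avoid disk entirely, yet the vectorised scan, which maps onto tight, SIMD-friendly machine loops, is typically an order of magnitude faster.

```python
import time
import numpy as np

# 5 million values already sitting in RAM; disk plays no part in either measurement.
prices = np.random.default_rng(0).uniform(1, 1000, 5_000_000)

t0 = time.perf_counter()
total_loop = sum(p for p in prices if p > 500)     # one-value-at-a-time interpreted loop
t1 = time.perf_counter()
total_vec = prices[prices > 500].sum()             # vectorised scan over contiguous memory
t2 = time.perf_counter()

print(f"loop: {t1 - t0:.2f}s  vectorised: {t2 - t1:.3f}s  "
      f"same result: {np.isclose(total_loop, total_vec)}")
```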

First to market

Being first to market, and in such gung-ho fashion, has granted SAP the privilege of acting as market educator.

The German software giant has aggressively marketed its HANA solutions, which are built on a shared data layer that prevents individual modules from being swapped out for the aforementioned rivals’ offerings.

Its confident approach has led many to perceive SAP as being ahead of the game in this space, but the only thing that is certain at this stage is that it has made a big head start in the awareness stakes.

One barrier it does face is convincing businesses to rewrite applications for the entirely in-memory database. CIOs must see through all the marketing spiel to find the best technical solution for their organisation.

‘The implementation is difficult, but not horrible, and shouldn’t put off CIOs,’ David Akka, MD of Magic Software Enterprises, says of SAP’s HANA. ‘Organisations are already accustomed to investing large sums of money and manpower to move from one version to another, and they realise that this is the price they need to pay in order to implement the latest technology.’

Wrong end of the curve

Actian CTO Mike Hoskins is more blunt in his appraisal: ‘I’m not sure that I agree with the characterisation that HANA is ahead of the game. CIOs should be wary of SAP HANA, which looks to be exactly at the wrong end of the price-performance curve, just when big data elevates price-performance to critical levels.

‘While some workloads will benefit from the highly proprietary and expensive HANA hardware and software architecture, its price-performance is out of bounds when measured against modern, parallel end-to-end offerings.’

The future, he believes, belongs to software-only solutions that scale up and out on commodity hardware, with HANA on the wrong side of that future.

The opinion of SAP executives, of course, differs. Put more memory and CPU into your computer, they argue, and you may find that your existing applications do not become that much faster, despite the investment you have made in your hardware, because those applications were never designed to run on, for example, a multi-core processor.

They will still run, and they will still run faster, but the improvement users see will be patchy rather than proportional to the investment.
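
That argument is easy to demonstrate in miniature. The sketch below (generic Python, not SAP’s code) runs the same CPU-bound work twice on a multi-core machine: once as a legacy-style serial loop, which extra cores do nothing to accelerate, and once restructured across a process pool, which is the kind of redesign SAP is arguing for.

```python
import time
from concurrent.futures import ProcessPoolExecutor

def count_primes(limit):
    # Deliberately naive CPU-bound work standing in for an application's hot loop.
    return sum(1 for n in range(2, limit)
               if all(n % d for d in range(2, int(n ** 0.5) + 1)))

if __name__ == "__main__":
    chunks = [40_000] * 8                                  # eight equal chunks of work

    t0 = time.perf_counter()
    serial = [count_primes(c) for c in chunks]             # legacy-style single-threaded loop
    t1 = time.perf_counter()

    with ProcessPoolExecutor() as pool:                    # redesigned to spread across cores
        parallel = list(pool.map(count_primes, chunks))
    t2 = time.perf_counter()

    print(f"serial: {t1 - t0:.1f}s  parallel: {t2 - t1:.1f}s  same answer: {serial == parallel}")
```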

‘This shouldn’t be putting CIOs off,’ says Stefan Sigg, SVP, HANA product and development at SAP. ‘A good optimisation of applications can change the value proposition.

‘Thus, CIOs shouldn’t see this as an inhibitor, but more as an opportunity to shoot applications into another orbit by leveraging the power of the in-memory database.’

In terms of the database incumbents, Hoskins is just as unflattering in his analysis of their capabilities: ‘Oracle, IBM and Microsoft are all built on 35-year-old row-set online transaction processing (OLTP) database technology, and while adding in-memory techniques will no doubt make their slow databases go faster, the truth is that the next generation of big data and analytic workloads should run on next-generation columnar analytic databases.’

Mendelsohn, on the other hand, assures users that, even though Oracle’s offering is not a completely in-memory database, nothing is given up by adding in-memory technology to a conventional one.

Rather, if a user wants to support both analytics and transactional workloads in the same database, forcing them into one data store format is a bad idea, he claims.

‘Columnar data structures are best for analytics, but row-store formats are best for transactions,’ he says. ‘In addition, if the new in-memory technology is added in such a way that it inherits the existing features for availability, reliability, security and robustness, one could be years ahead of a brand-new product that has to develop that functionality anew.’
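
A toy example makes that trade-off concrete. In the sketch below (illustrative Python structures, not any vendor’s storage format), the columnar layout lets an aggregate touch only the values it needs, while the row layout hands a transaction the whole record in a single lookup.

```python
# Row store: each record is kept together, ideal for fetching or updating one order.
orders_rows = [
    {"id": 1, "customer": "acme",    "amount": 120.0},
    {"id": 2, "customer": "initech", "amount": 75.5},
    {"id": 3, "customer": "acme",    "amount": 300.0},
]

# Column store: each attribute is kept together, ideal for scanning one attribute.
orders_cols = {
    "id":       [1, 2, 3],
    "customer": ["acme", "initech", "acme"],
    "amount":   [120.0, 75.5, 300.0],
}

# Analytics: the aggregate touches only the 'amount' values in the columnar layout.
total_revenue = sum(orders_cols["amount"])

# Transaction: the row layout returns a whole record in one lookup, while the
# columnar layout has to stitch the record back together from every column.
order_2 = orders_rows[1]
order_2_from_cols = {col: values[1] for col, values in orders_cols.items()}

print(total_revenue, order_2, order_2_from_cols)
```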

Cutting through the hype

With no less noise being made about the contrasting offerings in this space, CIOs face the challenge of getting beneath the hype to find the right solution for their business.

Most businesses will accept that to take on the avalanche of big data in the coming years, in-memory capabilities are necessary – but how can they decide on the right approach?

CIOs who want in-memory computing to impact their business directly need to evaluate visually driven business discovery, according to James Richardson, senior director of global product marketing at Qlik.

‘The growth of this space has shown that such systems deliver the speed, usability and relevance that business leaders need to aid decision-making,’ he says. ‘It does this by compressing millions of rows of data into RAM, and then allowing users to search, navigate and explore the data in an unfettered way, finding and following data associations where they lead.

‘This is not the case with SQL- or MDX-based BI on disks that, in order to achieve acceptable performance, channel users down pre-built, restrictive data-drill paths and so limit the questions that they can ask of the data.’
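
Stripped of the product specifics, the mechanism Richardson describes is easy to picture once the tables sit in RAM: an ad-hoc selection in one table can be followed through shared keys into the others, with no pre-built drill path involved. A toy sketch in plain Python (not QlikView’s engine, and with made-up table names):

```python
# All tables are held in memory as plain Python structures.
sales = [
    {"customer": "acme",    "product": "widget", "amount": 120},
    {"customer": "initech", "product": "gadget", "amount": 75},
    {"customer": "acme",    "product": "gadget", "amount": 300},
]
customers = [
    {"customer": "acme",    "region": "EMEA"},
    {"customer": "initech", "region": "APAC"},
]

selected_product = "gadget"                                   # the user's ad-hoc selection
matching_sales = [row for row in sales if row["product"] == selected_product]
linked_customers = {row["customer"] for row in matching_sales}        # follow the association
linked_regions = {c["region"] for c in customers if c["customer"] in linked_customers}

print(matching_sales, linked_customers, linked_regions)
```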

Businesses must define what they are looking for in making the most of their data, whether that be speeding up access to transactional data, minimising latency to enable high throughput or analysing data in real time.

Having determined internal priorities, many firms will find that the ideal outcome of moving to an in-memory solution is the ability to handle a much greater volume of transactions at high speed, enabling intelligent decision-making and rapid responses.

‘There is a range of very different in-memory solutions,’ says Dr Mohammed Haji, senior solutions architect at Software AG.

‘To enable a fully effective response, businesses are likely to fundamentally move away from traditional disk-based storage, through in-memory data management and grids, and ultimately to a best-practice ‘in-memory compute’ solution.

‘Here, all components of the platform, including, say, streaming analytics and low-latency messaging and real-time data visualisation, are happening in memory, to ensure fully informed and intelligent real-time responses to any customer query or need.’
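
As a rough illustration of what such an ‘in-memory compute’ pipeline looks like in miniature (generic Python, not Software AG’s products), the sketch below pushes events through an in-memory queue and keeps a running aggregate that can answer a query instantly, with nothing touching disk.

```python
from collections import deque, defaultdict

event_bus = deque()                  # stand-in for low-latency in-memory messaging
totals = defaultdict(float)          # streaming aggregate held entirely in RAM

def publish(customer, amount):
    event_bus.append({"customer": customer, "amount": amount})

def process_pending():
    while event_bus:
        event = event_bus.popleft()
        totals[event["customer"]] += event["amount"]   # update the live view immediately

publish("acme", 120.0)
publish("initech", 75.5)
publish("acme", 300.0)
process_pending()
print(dict(totals))                  # a real-time view, ready to answer a customer query
```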

It’s a marathon, not a sprint

Over the next year or so, big data and cloud will be the vital topics in the progression of in-memory computing in the enterprise. The pace at which both mature will dictate its rate of adoption.

For sectors such as finance, utilities and the public sector, which are already generating huge amounts of data, in-memory computing can’t come quickly enough.

For others, the choice is perhaps less straightforward, as they may not yet feel they have a big enough need to justify the cost of an in-memory implementation.

‘Although we already have big data, we are also starting to experience a new breed: the disorganised, non-regulated and non-structured data from machines,’ says Sigg.

‘It is important that we build capable and strong applications that are able to quickly gather and monetise this data. There are lots of opportunities and directions that in-memory can be taken in.’

According to Philip Adams, head of group IT at Mercury Engineering and chairman of the UK and Ireland SAP User Group, an element of handholding will be necessary to encourage enterprises to adopt. His advice is for CIOs to take advantage of all the sources of information provided by vendors and user groups in order to help them make their decision.

‘For many, they still want to see more real-life use cases before taking the plunge,’ he says.

Where is the market going? What the experts say

'In the past, only historic customer data was available to marketers looking to influence buying decisions. With more data now available and accessible around past and current customer behaviours – what they are doing now, as part of the current interaction with the business – this will lead to greater use of mixed storage infrastructures in improving analytics and developing more attractive propositions in real time to influence the buying decision.' – Artur Borycki, director international solutions marketing, Teradata

'Increasingly, organisations are realising the benefits of in-memory computing. In the recent past, and the trend will continue in the next couple of years, organisations have been moving from silo uses of in-memory computing into larger-scale, mission-critical uses. In spite of that, in the next couple of years disk-based storage will continue to be the most common storage approach for most applications.' – Alex Duran Panades, delivery manager, GFT

