Cloudera chief Tom Reilly on the evolution of big data

The last time we spoke to you was in 2014, when Cloudera had just announced a monster $740 million investment from Intel, along with a strategic partnership. Where has that investment gone, and how is the relationship going?

Yes, it was the two-year anniversary of the partnership in May, and it continues to be a game-changing relationship. I’m very impressed with the things we’ve done. There’s some pretty exciting work happening now with Intel around designing chips optimised for analytic workloads. Work has been done to make sure analytic workloads on Apache Hadoop are the first workloads deployed on those chips, and we’re working with Intel on transactional memory and other next-generation technologies.

It’s a five-year roadmap, and we’ve had many deliverables in the past two years. We have also been working with Intel on planning for how they see the data centre of the future and how it will be architected and designed. It’s a long-term view of the innovation that’s coming, and we need to prepare for it and make sure Apache Hadoop takes advantage of these architectures.

Just from a business perspective, we don’t anticipate ever needing to raise any funding again, which obviously gives us a lot of flexibility in how we build our business, and when to time our IPO.

Being partnered with Intel has allowed us to invest significantly in our roadmap, which allows us to mature the platform and advance the capabilities much faster than we could without that funding.


Cloudera has also been busy with acquisitions over the past few years. Can you talk us through them?

The first acquisition we did after the Intel investment was Gazzang, a big data encryption company. One of the first projects we had with Intel was taking advantage of the x86 chipset to do encryption in hardware, so when we saw what Gazzang was doing, we had a value proposition for our customers: any data landing in our enterprise data hub should be encrypted. Now the Gazzang team does all of our security development and manages our platform for cyber security use cases.

We recently acquired the San Francisco-based company Sense.io. Its data science workbench lets data scientists use their favourite programming language and framework, share libraries and algorithms, and roll those into production far more quickly, making data scientists more productive, helping them collaborate, and allowing non-data scientists to benefit from those libraries.

We also acquired Xplain.io, which is focused on helping customers analyse and optimise where they run their SQL workloads. After we introduced Impala, we wanted to help customers look at their analytic workloads, determine which run best on Hadoop, and optimise them for Hadoop, so it’s really about helping customers bring the right analytic workloads to the platform. The financing from Intel allows us to do this, and we certainly intend to continue making these kinds of acquisitions.

How do you think the big data market has evolved since 2014?

The industry and market have evolved significantly since 2014. Back then we spent a lot of time talking about what I called the ‘zoo animals’ – all the open source Apache projects, Pig, Sqoop, Hive – with a lot of the talk centred on the technology. In 2015, Cloudera helped lead the discussion to a much higher-level value proposition.

We introduced the Cloudera enterprise data hub, which combined those assets, so we could show customers where they fit into their IT landscape. That was a conversation CIOs got very interested in.

In 2016, it’s remarkable how the industry is maturing. It now fully understands the notion of data lakes and hubs; the question has become which use cases are most impactful for each industry. Everyone’s talking about high-value business use cases; in fact, I like to talk about what I call boardroom use cases: what is the boardroom use case this platform is addressing, and should it be discussed in the boardroom?

Two years ago, everyone was wondering what Hadoop was good for, and there were varying opinions on how it would be used and whether or not it would go mainstream.

Just a few months ago, Forrester Research published a paper predicting that within two years 100% of large enterprises will have adopted Apache Hadoop as part of their IT infrastructure. That means that in just two years Hadoop will be a core part of data management at every single large enterprise. I’ve never seen an industry mature that fast, and that’s incredible.

Is it the same story in Europe?

The European market is our fastest-growing market. I’m amazed at how quick the adoption is here; it may have started slightly behind the US, but it’s been incredibly fast. The reason is that a lot of European customers have been experimenting with the software and working with it for many years.

Whether in Southern Europe, the Middle East or Central Europe, we are seeing great adoption all across the region.

With your growing customer base in Europe, do you find they have far more concerns over the privacy and security of their sensitive data in the cloud, given the upcoming EU GDPR? And if so, how are you helping to reassure them?

The use cases are going to be the same, but there’s heightened concern in Europe around privacy and data moving across borders. Our product was designed with data security, data governance and data auditing as core strengths, and those capabilities are much more appreciated and valued in Europe. Of course, we are a big data company, capturing data about customers and purchase behaviours, a lot of which is very sensitive – that is why we have a data security platform that goes right down to encryption.

When the first databases, such as Oracle and DB2, were developed in the late ’80s and early ’90s, there was no such thing as the internet; at best there were wide area networks, so databases were not vulnerable to attack. It was only ten or 15 years later, when the internet emerged, that databases suddenly became vulnerable.


A cottage industry of third-party products emerged to protect databases: third-party role-based access control systems, third-party auditing and third-party encryption. But after this hodgepodge of products, why do we still see databases that are vulnerable to attack? What we’ve done with Hadoop is develop a modern data store for a connected world, designing in all the data security features that a modern database needs.

Our platform has native authentication, native authorisation, native auditing and native data encryption. Because we built those in natively, customers have realised that Hadoop is more secure than databases that have been around for 30 years.

Because it’s more secure, that in itself is a reason to put data on the platform. In the US, we did a project with MasterCard that required us to be PCI certified.

When their auditors inspected the security of our platform they called it the secure data vault – the safest place for data inside MasterCard.

Now their consulting arm is taking the platform to every merchant and institution that handles credit card data, recommending that credit card data be stored in our enterprise data hub because of those security features.

If you have sensitive data, you need tight controls over who is allowed access to it and what they’re allowed to see, and then you want full audit trails of who looked at what and where the data went. Again, we design those capabilities into our platform. In this modern world we are all leaving a tremendous data exhaust, so where privacy is concerned we want to make sure it’s used for positive purposes and not bad ones.
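To make that access-control-and-audit pattern concrete, here is a minimal sketch in Python. It is purely illustrative: the roles, users and in-memory log are hypothetical stand-ins for the kind of native authorisation and auditing Reilly describes, not Cloudera’s actual implementation.

```python
# Minimal sketch: role-based access control with an audit trail.
# Roles, users and the in-memory audit_log are hypothetical examples.
from datetime import datetime, timezone

ROLE_PERMISSIONS = {
    "analyst": {"read"},
    "admin": {"read", "write"},
}

USER_ROLES = {"alice": "admin", "bob": "analyst"}

# In a real system this would be an append-only, tamper-evident store.
audit_log = []

def access(user: str, action: str, record: str) -> bool:
    """Check whether `user` may perform `action`, and audit every attempt."""
    role = USER_ROLES.get(user, "")
    allowed = action in ROLE_PERMISSIONS.get(role, set())
    audit_log.append({
        "when": datetime.now(timezone.utc).isoformat(),
        "who": user,
        "action": action,
        "record": record,
        "allowed": allowed,
    })
    return allowed

access("bob", "write", "card#1234")   # denied, but still leaves an audit entry
access("alice", "read", "card#1234")  # allowed
for entry in audit_log:
    print(entry)
```

The key point, as in the interview, is that the denial itself is recorded: the audit trail captures who tried to look at what, not just successful access.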

The killer app for Apache Hadoop is cyber security intelligence. Across every industry, but mostly in large government organisations and financial services firms, our platform is being used by cyber security teams for protecting financial accounts, anti-money laundering and counter-terrorism. So we are becoming that cyber security intelligence platform.


Ben Rossi

