Whilst many organisations have publicly stated their aim to be 100% cloud in the near future, it is likely that for many a hybrid state will exist: some combination of Software-as-a-Service (SaaS) vendors such as Salesforce, Platform-as-a-Service (PaaS) vendors such as Amazon, Microsoft and Google, and some traditional on-premise applications.
There are compelling cost, skill and agility reasons to adopt cloud in whatever form, but this re-platforming exercise will reignite some traditional issues that many organisations have invested heavily in trying to solve.
In a world of increasing real-time analytics, mobile applications and more stringent data governance, many organisations are struggling to manage a complex data landscape: many applications located in many different data centres, supported by different skill sets.
The problem then was how to integrate, secure and deliver data for analytical purposes, creating value in a manner that would pass a regulator's scrutiny. In such organisations a data management strategy was invariably developed to tackle this siloed data phenomenon.
In the new hybrid world, data fragmentation will be taken to a whole new level, with data spread across multiple private clouds, public clouds and SaaS applications. Not only is the data more fragmented than ever before, but it will now likely reside outside the firewall, introducing new security fears over how sensitive data is managed and regulated.
As companies look to balance their data and processes between these three broad locations – private cloud, public cloud and on-premise – it’s essential that they have tools in place to enable a consolidated view over their data.
With a data management strategy and toolset they can support business agility whilst ensuring they don't lose track of critical data assets. Without one, the result could be worryingly low productivity or, worse, a major security breach or a large regulatory fine.
What’s the situation?
In the two decades since the internet boom, the enterprise world has gone through several infrastructure phases. First came on-premise, when companies built massive in-house data repositories and proprietary systems.
As the skill base eroded and costs spiralled just to keep the lights on, cloud became a logical alternative. Initially it was thought that eventually everything would move into the cloud.
Now there is another, more moderate change in how businesses talk about cloud – companies are putting different data in different places – cloud or on-premise – depending on their requirements.
For example, a bank might keep its customer account data on-premise, to ensure it has total control over its security. That helps it to meet regulatory demands and reassure its customers.
On the other hand, a ticketing company might move its bookings system into a private cloud, to give it scalability as well as security so that it can handle huge numbers of requests without worrying about scams or data loss, and make it cheaper to stand up and shut down new services.
A global professional services company, however, might decide to move its day-to-day running data (for sales and marketing, say) into a public cloud, enabling its employees to access key information wherever they are – helping them to remain agile, with workers collaborating across the globe.
What’s the problem?
Cloud adoption is rapid and accelerating quarter on quarter.
A single company can now hold essential information spread across a whole range of locations. That might sound efficient and sensible, but it comes at a price: keeping company data in these silos makes it much harder to get a single view over information. People continue to make the same mistakes, whatever the technology.
Cloud is easy to adopt: it often requires no hardware or installation, and with its promise of agility many tech-savvy line-of-business managers have rushed to implement new applications and environments in the cloud.
This often benefits their own business unit, but it frequently causes a headache for the wider business, as it becomes another silo of data that is not easily integrated or governed.
It’s important not to quash this entrepreneurial spirit: cloud, if managed correctly, can deliver lower costs, and supporting both these elements offers a competitive advantage.
However, this cannot be done in a silo. It’s essential that companies have a comprehensive view of their operational data if they are to drive business efficiencies and new strategic initiatives.
If sales can’t use marketing data, for example, or if finance can’t access customer databases, then each department is going to miss out on insights that could benefit the company as a whole. After all, the more you know, the more accurate and effective your strategy will be.
But hybrid cloud silos are not just a problem for operational efficiency. Security too can be compromised by a segmented approach to data management. Using multiple clouds and on-premise solutions increases the chance of a breach, as more potential entry points arise. If not properly tracked, this will expose the business to unnecessary risk.
This problem is compounded by the fact that many organisations are now coming to the realisation that major cloud providers are better able to protect their data than they are themselves.
This makes companies more likely to ‘file and forget’, trusting their providers without considering the damage that could result from operating across multiple providers and clouds.
What’s the solution?
Clearly, organisations need to approach hybrid cloud with caution to avoid these pitfalls. The good news is that although there may be functional cloud silos in the hybrid world (gaps between HR, marketing, sales and so on), in the technical sense there are no silos: it’s just a case of getting information out of multiple clouds and into a central view for business analytics purposes.
With that in mind, companies must ensure they have a strong data management strategy and toolset in place to integrate, aggregate, sort, clean and monitor data from across their hybrid cloud infrastructure and deliver it for analysis or use by a person or application.
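The consolidation step described above can be illustrated with a minimal sketch. The source names, field mappings and records here are all hypothetical, invented purely to show the idea of mapping differently shaped exports (say, an on-premise database and a SaaS application) onto one canonical schema with provenance tracking; real data management toolsets do far more.

```python
from typing import Dict, List

# Hypothetical exports: each source uses its own field names for the same entity.
ON_PREM_EXPORT = [
    {"cust_id": "C001", "full_name": "Ada Lovelace", "region": "UK"},
]
SAAS_EXPORT = [
    {"id": "C002", "name": "Alan Turing", "country": "UK"},
]

# Per-source mapping from local field names to an assumed canonical schema.
FIELD_MAPS: Dict[str, Dict[str, str]] = {
    "on_prem": {"cust_id": "customer_id", "full_name": "name", "region": "location"},
    "saas":    {"id": "customer_id", "name": "name", "country": "location"},
}

def consolidate(sources: Dict[str, List[dict]]) -> List[dict]:
    """Map every record onto the canonical schema and tag its origin."""
    unified = []
    for source_name, records in sources.items():
        mapping = FIELD_MAPS[source_name]
        for record in records:
            row = {canon: record[local] for local, canon in mapping.items()}
            row["source"] = source_name  # provenance, useful for governance
            unified.append(row)
    return unified

view = consolidate({"on_prem": ON_PREM_EXPORT, "saas": SAAS_EXPORT})
for row in view:
    print(row)
```

Keeping a `source` tag on every row is the small but important design choice: the unified view stays auditable, so governance questions ("where did this customer record come from?") remain answerable after consolidation.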
Wherever data originates, whether on-premise, in a public SaaS app or on a private cloud server, businesses must be able to bring it under a unified management process if they are to get the most out of their information assets and remain compliant.
A holistic view of data will enable them to drive decisions based on the expertise of the company business-wide.
Finally, it’s essential to track data wherever it moves to ensure it is secure both at its source and at its destination.
Companies using a hybrid cloud environment must be able to identify sensitive information wherever it resides and ensure that it does not move anywhere without appropriate security – for example, ensuring that customers’ personal information is never allowed to move onto the public internet through a system backdoor.
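That kind of movement control can be sketched as a simple policy gate. This is an illustrative sketch only, not any vendor's API: the sensitive field names and destination labels are assumptions, standing in for the classification and policy rules a real data management tool would apply.

```python
# Assumed classification: which field names count as sensitive, and which
# destinations are approved to hold sensitive data. Both sets are illustrative.
SENSITIVE_FIELDS = {"email", "ssn", "card_number"}
APPROVED_FOR_SENSITIVE = {"private_cloud", "on_prem"}

def can_export(record: dict, destination: str) -> bool:
    """Allow a move only if the record carries no sensitive fields,
    or the destination is approved to hold them."""
    has_sensitive = bool(SENSITIVE_FIELDS & record.keys())
    return (not has_sensitive) or destination in APPROVED_FOR_SENSITIVE

record = {"customer_id": "C001", "email": "ada@example.com"}
print(can_export(record, "private_cloud"))    # sensitive data, approved target
print(can_export(record, "public_internet"))  # sensitive data, blocked
```

The point of the sketch is the shape of the check, not the rules themselves: every data movement passes through one gate that knows both what is sensitive and where sensitive data may live, which is exactly the "identify and control wherever it resides" discipline described above.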
Granular data management is a must for a secure and efficient hybrid cloud. If businesses are to make this new, varied environment work, they must equip themselves to view their information from a single, comprehensive viewpoint – or risk being undermined by divided, insecure data.
Sourced by Greg Hanson, VP EMEA cloud, Informatica