Information lifecycle management: key considerations in 2016

Information lifecycle management (ILM), in essence, is the idea that data has a beginning, middle and end. It used to be that data had a fairly clear lifecycle, and so the flow of information through an enterprise could be managed at its various stages from birth to obsolescence. But with the introduction of the cloud, that lifecycle has become less distinct, and the deletion and retention of data is now a slightly more complicated topic.

Most clouds use what’s known as a ‘multi-tenancy’ architecture, whereby many cloud customers share a hidden, common infrastructure, yet utilise a defined set of secure services. It’s the big secret behind many low-cost, efficient cloud deployments, and often the most logical way to separate environments for different SaaS customers.

But when it comes to ILM, it’s important to understand the nuances of this architecture, and how those invisible tenants might affect your data. There is always the potential for human carelessness and error, and complete control over data throughout its lifecycle might not always be possible.

>See also: By any other name: why Information Lifecycle Management principles still hold true

‘Because of the way public cloud functions, multi-tenancy can give you some big data control issues,’ says Howard Frear, sales and marketing director at Easy Software UK. ‘You have to be sure that you have visibility and security of the management of that data, especially around archiving and retention policies.’

Frear goes so far as to say that if you’re an organisation that for strategic reasons has to be deletion-focused – and you have to know that data is deleted for good, when required – ‘then I really wonder if a public cloud provider is your best option, and either private or hybrid should be your route of travel instead’.

But fortunately, multi-tenancy cloud platforms have evolved considerably over the past few years. With newer deployments of this kind, most users will now find they can use granular controls to access and manage all of their data just as they would if it were stored on-premises.

‘Fears over accidental data deletion are often unfounded,’ argues Mark Ebden, strategic consultant at Trustmarque. ‘Providing that the IT team has planned their migration (i.e. ensuring that ongoing services are not disrupted), managing ILM shouldn’t look any different in a multi-tenancy environment than it would elsewhere.’

For those needing to manage the granular retention and deletion of their data, the service level agreement (SLA) is where organisations should focus their attention.

A provider will set a basic service level on which all others are based. Any tenant requiring a higher SLA will therefore need additional configuration, management and risk parameters.

‘Some service providers won’t be willing to do this as it will drive up complexity and risk on their side, so it will be key to look at the basic SLAs of different providers and look at which most closely meet your needs,’ says Ebden.

‘But your biggest single issue here is lack of clarity about where data is. That is part of the USP of a public cloud, after all. Data can just get moved around; it could live in different geographies on a variety of different media – that’s part of the way the provider makes this cost effective for it and you.’

Data destruction

The biggest mistake people make when they think about their data in the cloud, argues Frear, is to treat the cloud as an infinite resource. In a way, it is: you can buy more storage and vendors love to sell it to you, although you do have to pay per gigabyte.

But what that wealth of resource tempts you to do is skimp on the disciplines that carry over from the physical to the virtualised world. These are mainly around the correct way to delete data.

‘Not all data should live forever, basically,’ Frear says. ‘There is plenty that for compliance reasons needs to be deleted at set times, which is really what ILM is about – and which matters in the cloud, though not all enterprises have woken up to this fact yet.

‘The big worry there is how sure you can be that it’s all deleted, everywhere, permanently, at the time you – or your regulator – need it to be.’
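In practice, that kind of time-based deletion is usually automated rather than left to manual housekeeping. The sketch below is a minimal, hypothetical illustration of the principle in Python: each record carries a retention class, and anything older than its retention period is flagged for permanent deletion. The class names and periods are assumptions for illustration, not any regulator’s or vendor’s actual schedule.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical retention classes mapped to maximum retention periods.
# Real periods come from the relevant regulator or records-management policy.
RETENTION_PERIODS = {
    "invoice": timedelta(days=7 * 365),           # e.g. seven years for financial records
    "marketing_contact": timedelta(days=2 * 365),
    "support_ticket": timedelta(days=365),
}


def due_for_deletion(record_class, created_at, now=None):
    """Return True when a record has outlived its retention period."""
    now = now or datetime.now(timezone.utc)
    period = RETENTION_PERIODS.get(record_class)
    if period is None:
        # Unknown classes are kept and escalated for review rather than deleted.
        return False
    return now - created_at > period


# A support ticket created in mid-2013 is well past its one-year retention period.
print(due_for_deletion("support_ticket", datetime(2013, 6, 1, tzinfo=timezone.utc)))  # True
```

The hard part in the cloud is not writing this check but proving that every copy – primary storage, replicas, backups – is actually destroyed once it fires, which is exactly the assurance Frear is questioning.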

Data deletion in the cloud presents a pressing issue for companies that have to comply with the EU General Data Protection Regulation (GDPR). Even post-Brexit, many UK businesses will still face the burden of complying with the EU’s recently introduced data protection rules, which are due to come into force in 2018.

This is because every data-processing business, no matter where it is based, is an international business, and the GDPR will apply to all firms handling data on EU citizens, even if the supplier is based outside the EU, such as in Switzerland.

UK organisations that are dealing with the data of EU customers and companies will have to ensure that they’re fully compliant with the regulation or face fines as a result.

‘While upcoming EU GDPR legislation has “right to be forgotten” clauses, this is typically only focussed on personal information,’ explains Ebden. ‘In order to effectively delete data, organisations need to ensure that the data stewardship processes in place are fully documented and acted upon regularly.’

Indeed, data destruction plans drawn up as part of an ISO 27001 certification are another method through which data can be permanently deleted. However, the most effective way is simply to take ownership of this responsibility away from the cloud provider.

‘By encrypting an organisation’s data with a key known only to that organisation, then, as the data is deleted, the only method of recall requires the encryption key to make sense of the data,’ says Ebden. ‘This is data protection by obfuscation.’
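This technique is commonly known as crypto-shredding. The sketch below illustrates the principle using the open-source Python cryptography library’s Fernet interface: only ciphertext is ever handed to the cloud, so destroying the customer-held key renders any surviving copies unreadable. It is an illustration of the approach Ebden describes, not a description of any particular provider’s service.

```python
from cryptography.fernet import Fernet, InvalidToken

# The key never leaves the organisation; only ciphertext is sent to the cloud.
key = Fernet.generate_key()
ciphertext = Fernet(key).encrypt(b"customer record: Jane Doe, DOB 1980-01-01")

# Normal operation: the key recovers the plaintext on demand.
assert Fernet(key).decrypt(ciphertext) == b"customer record: Jane Doe, DOB 1980-01-01"

# 'Deletion': destroy every copy of the key. Any ciphertext still lurking in
# backups, replicas or another tenant's shared storage is now unreadable.
key = None

try:
    Fernet(Fernet.generate_key()).decrypt(ciphertext)  # any other key fails
except InvalidToken:
    print("Without the original key, the data cannot be recovered.")
```

Deleting the key then amounts to deleting the data, wherever the ciphertext happens to live.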

Decision time for cloud users

Another big question for cloud users is how you choose the right cloud provider: one that aids you in managing information throughout its whole lifecycle.

The obvious, but often forgotten, question is what you are looking to achieve. Whether it is consolidation, an increase in efficiency or the ability to archive over a long period of time, this outcome needs to be mapped to a solid strategy in order to see success.

‘Organisations need to consider the regulatory controls that may legislate how a strategy can be deployed for that industry,’ says Ebden. ‘When looking for a solution to enable full ownership of the ILM process, organisations need to look for de-duplication functionality, DR/BC [disaster recovery/business continuity], search, automation and the effective tracking and logging of data.’

While there’s a great deal of cloud support available to organisations, it’s important to avoid any option that seems to be offering a ‘one-size-fits-all’ approach – every cloud journey is different.

Plus, organisations should remember that while many vendors are happy to sell a cloud service to their customers, they’re not always able to provide ongoing support should any issues arise. This is especially important for organisations without vast resources to manage it themselves, such as those in the public sector.

‘If cloud-based ILM is a more cost-effective choice for an organisation,’ says Ebden, ‘then its provider needs to be able to support its ILM processes from start to finish, in order to ensure that every penny invested in moving to the cloud is maximised and can demonstrate a return on investment.’

>See also: Don’t panic! 10 ways to manage major IT incidents

Experts also suggest partnering with a more mature provider. Smaller or younger cloud providers are primarily focused on growing their footprint, and as a consequence they might not really understand the full lifecycle story of information.

‘That suggests that if you are a financial services firm or a big government or public sector body, you can only really contract with a provider that has the experience and track record to work with you on this issue to the level of satisfaction and clarity you need them to,’ says Ebden.

‘I would also suggest looking at the longevity of your prospective supplier. What happens if they go under in five years’ time? If you have a pension database that needs to be accessed for decades, that’s not good enough.’

Organisations can’t predict this, of course, but it’s important to factor it into their partner selection process.

‘A great backup is to have a clear statement in the contract that your data is always your data – that you can always take it back, in the form you need it, at any time, which could be your insurance if the company looks as if it could be struggling or you decide that another storage mechanism would be closer to your business goals.’
