Evolving technologies, more stringent compliance policies, and the need for big data analytics and long-term data retention are all important factors that IT professionals must consider when managing their data.
There are two main reasons why data planning has become a higher priority for today’s organisations. First, the sheer volume of data that is created each day has increased immensely and will continue to grow exponentially over time.
According to IDC, the digital universe was estimated to double every two years, reaching 44 zettabytes (ZB) by 2020. This astonishing forecast places unprecedented pressure on both IT teams and CIOs.
Secondly, the length of time organisations are expected to retain their data has also increased drastically. More stringent compliance and regulatory policies have been introduced globally, requiring data to be kept for longer periods of time.
For example, the Payment Card Industry Data Security Standard requires companies that process, store or transmit credit card information to store this data safely for seven years, which could be lengthened to ten years or longer in the near future.
As a result, IT managers need to be much more savvy about prioritising information within the data centre.
What impact does data planning have on businesses?
Data storage has been considered an integral component of any successful company for quite some time, but it is now viewed as a necessity, crucial to the functioning of an enterprise.
As more companies deploy private or hybrid cloud technologies, they are essentially provisioning services to the end user, which requires careful management and resource allocation.
A successful business cannot invest unlimited quantities of resources into IT storage. Instead, storage provisioning, the process of assigning storage to optimise the overall performance of the storage area network, must be implemented.
As some data becomes less important over time, it can be moved to deeper tiers of storage. For example, rather than treating backups as long-term archives, organisations should develop the habit of real archiving, where the primary copy of data is moved to archive storage (ensuring that there are two copies in the archive), thus removing the need to back up that archived file. The end result is a much more efficient data centre, from both a performance and a cost perspective.
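As a rough illustration, an age-based tiering rule of this kind can be sketched in a few lines. The tier names and thresholds below are purely hypothetical, not part of any particular product:

```python
from datetime import datetime, timedelta

# Hypothetical policy: how long a file may sit untouched before it is
# moved to a deeper (cheaper) tier. Ordered from coldest to hottest.
# In a real archive, the "archive" tier would hold two copies of the data.
TIER_POLICY = [
    ("archive", timedelta(days=365)),   # primary copy moves to archive media
    ("nearline", timedelta(days=90)),   # colder, cheaper disk
    ("primary", timedelta(days=0)),     # fast, expensive storage
]

def choose_tier(last_accessed: datetime, now: datetime) -> str:
    """Return the storage tier a file belongs in, given its last access time."""
    age = now - last_accessed
    for tier, threshold in TIER_POLICY:
        if age >= threshold:
            return tier
    return "primary"

now = datetime(2016, 1, 1)
print(choose_tier(datetime(2015, 12, 20), now))  # recently used -> "primary"
print(choose_tier(datetime(2015, 6, 1), now))    # stale -> "nearline"
print(choose_tier(datetime(2014, 1, 1), now))    # old -> "archive"
```

A production system would of course key such a policy on business value and compliance requirements as well as age, but the principle is the same: classify the data, then place it on the cheapest medium that still meets its access needs.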
Also important to consider is the impact data planning has on the working processes of colleagues across the business. In this situation, setting expectations is essential. For example, if users are initially provided with limitless storage, but are then told later on that there are constraints, you can expect to hear complaints.
However, such limits are necessary, and timing is everything: once the data centre has become overloaded with data, there is very little that can be done to streamline processes.
As data planning becomes more integral to businesses’ success, it should be overseen at the highest level. CIOs must understand the value and layout of the tiered system to make sure data is stored at the right level, with the correct level of access. Ultimately, the CIO’s role is to ensure the data centre is as cost effective and high performing as possible.
Consider the occupant of a cluttered home: if they don’t decide what to dispose of, what to put into storage and what to keep close at hand, their possessions will end up managing them.
Because the value of data directly impacts a company’s bottom line, it is only common sense to go through this sorting process.
By using active archive technologies, a tiering system can be created that will seamlessly and transparently manage the data tiers while keeping overall costs to a minimum by placing data on the most cost-effective medium.
Only by understanding and prioritising its data can an organisation store that data in the most efficient way.
Sourced from Matt Starr, CTO at Spectra Logic