Neil Jones, director of cyber security evangelism at Egnyte, discusses how organisations can rein in content sprawl to master data migration
Amongst the various IT challenges that impacted organisations last year, one of the most significant was the massive growth in their volume of unmanaged content.
What began as a logical consequence of the overnight move to remote working in early 2020 has only worsened. For companies everywhere, the volume of business-critical content they create grows unchecked, and its management is widely dispersed amongst cloud services, corporate hard drives and personal devices.
Solutions like Microsoft Teams, Slack and Skype are valuable tools that have helped remote teams around the world foster collaboration and connection. Despite their benefits for communication, they also play a part in dispersing content outside of the organisation’s master content repository, which can present security risks in today’s Work from Home environment.
If they are not careful, companies can quickly find themselves with lots of content in disparate locations, which poses challenges for IT to govern, legal to manage compliance efforts, and end users to maximise productivity. This phenomenon is commonly referred to as ‘content sprawl’.
Content sprawl represents one of the most complex information management challenges that businesses face today, particularly when data needs to be migrated at scale. As a result, most IT professionals approach data migration projects with a sense of dread. From system outages to a constant flow of support tickets from unhappy internal customers who cannot access mission-critical data, to malfunctioning databases and unexpected costs, the list of potential pitfalls can be daunting.
In particular, migrating content to a new data management and collaboration platform during a global pandemic (with a Work from Home user base) presents even the most organised companies with complications. While the pandemic may have created greater urgency around these projects, effective migration still relies on a systematic approach and ongoing end-user education if organisations are to avoid common pitfalls – and reap the rewards.
A checklist of potential business-level and project-level considerations includes the following:
Business level considerations
- What are the data sources? Where are they physically located? This information will play an essential role in determining what tools are available to migrate the datasets in question.
- How big is the data payload? More specifically, how many gigabytes, terabytes or more must be moved? In addition, how many objects, files and folders does that include?
- What are the timescales and milestones? First, is there a firm completion date, and what contingencies might occur if deadlines are missed? Ideally, teams should allow ample time to seed data, run tests and decide when users will stop using the original data source and be required to transition their activity to the new data repository.
- What migration skillsets and experience are available in-house? Does the team responsible for the migration have relevant experience related to the project being undertaken? Depending on the answer to that question, organisations can make an informed decision about whether to engage outside resources for further assistance.
- How far does the migration budget go? Although certain migration solutions can appear expensive, they can actually simplify the process, making it usable by staff of any skill level. In contrast, others may be cheaper but require more management and deeper expertise – meaning you may need to bring in a third-party expert to assist you.
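Several of the questions above – particularly sizing the data payload – can be answered with a simple inventory script before any tooling decisions are made. The sketch below is a minimal, hypothetical example that assumes the data source is a POSIX-style file tree; a real assessment would also need to capture permissions and metadata where the target platform supports them.

```python
import os
from collections import defaultdict

def inventory(root):
    """Tally file count and total bytes under each top-level folder of `root`.

    Hypothetical scoping helper: returns {folder: [file_count, total_bytes]},
    with files directly under `root` grouped under ".".
    """
    totals = defaultdict(lambda: [0, 0])
    for dirpath, _dirnames, filenames in os.walk(root):
        rel = os.path.relpath(dirpath, root)
        top = rel.split(os.sep)[0] if rel != "." else "."
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                size = os.path.getsize(path)
            except OSError:
                continue  # skip unreadable files rather than abort the scan
            totals[top][0] += 1
            totals[top][1] += size
    return dict(totals)
```

A per-folder breakdown like this also helps answer the later question of whether the migration should be split into smaller jobs.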
Project level considerations
- Should the migration process be carried out as a group of smaller jobs? In any migration scenario there will be an optimal breakdown of the work, and organisations may not be able to move an entire directory structure at once. Instead, it is often more practical to focus on migrating manageable batches of directories/data in a series of unique projects.
- How is the network configured? For instance, what are the anti-malware, firewall and general IT security settings? These factors will impact how easily data can be moved and how deeply the network and security teams need to be involved in the migration process.
- Do permissions need to be migrated? This is a critical question to ask from the outset, as not all tools support permissions migration.
- Does the migration tool permit bulk uploads of larger files? This offers an important benefit in that migrating “chunks” of data in parallel enables faster transfer of larger files. In the event of an error, the chunk in question can be re-uploaded, rather than running the entire file migration process over again.
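The chunked-upload idea in the last point can be sketched generically. In the hypothetical example below, `upload_chunk` stands in for whatever per-chunk API the target platform actually provides; the point is that each chunk is retried independently, so a transient error costs one chunk rather than the whole file.

```python
def upload_in_chunks(path, upload_chunk, chunk_size=8 * 1024 * 1024,
                     max_retries=3):
    """Upload `path` in fixed-size chunks, retrying each failed chunk alone.

    `upload_chunk(index, data)` is a hypothetical callable supplied by the
    target platform's API. On failure, only the failed chunk is re-sent,
    not the entire file. Returns the number of chunks uploaded.
    """
    with open(path, "rb") as f:
        index = 0
        while True:
            data = f.read(chunk_size)
            if not data:
                break
            for attempt in range(max_retries):
                try:
                    upload_chunk(index, data)
                    break  # chunk sent successfully
                except IOError:
                    if attempt == max_retries - 1:
                        raise  # give up after exhausting retries
            index += 1
    return index
```

Real migration tools typically add parallelism and integrity checks (for example, per-chunk checksums) on top of this basic pattern.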
Effective migration isn’t just about moving data from one venue to another – it’s also a vital component of any comprehensive lifecycle management strategy to avoid inefficient content sprawl.
For instance, data migration is a common requirement of the typical merger and acquisition process, a required component of business expansion strategy and even a potential result of a compliance audit. In each situation, organisations need access to the right tools at the right time in order to retain full control over their content.
And, in an era when many employees are working from home and companies are building highly distributed workforces across multiple time zones, pressure is growing on organisations to deliver practical, high-performance and secure data management systems that meet those diverse needs. Companies that succeed will improve their ability to comply with data privacy and protection rules, while gaining maximum value from organisational content.