How can organisations escape the scandals and get big data right?

Fuelled by high-profile data breaches, there is a well-documented sense of wariness towards big data applications and the commercial use of personal data by private companies. When headlines are dominated by the likes of Yahoo! – which was fined $35m by the US financial watchdog, the SEC, for failing to tell anyone about one of the world’s largest ever computer security breaches – and Facebook, which admitted that user data had been misused by the data analytics firm Cambridge Analytica, it’s perhaps no surprise that these concerns are front of mind for many.

What we need to come to terms with is that big data is here to stay. It powers healthcare, travel, shopping and even the way we meet and fall in love. Put simply, data makes the world go round. The onus is now firmly on the companies that deal with data to build (or rebuild) public trust. They must demonstrate that the data they rely on to do business is stored, processed and used properly – and, importantly, that it’s powering applications we want and use. Only then will consumers realise that the good outweighs the bad.

How to get big data right

The data centre sits firmly at the heart of an organisation. You might be forgiven for thinking that the IT department isn’t the natural home of innovation and business leadership, but the big data revolution can only be delivered from purpose-built, highly efficient data centres. Getting the data centre strategy right means that companies have an intelligent and scalable asset that enables choice and growth; get it wrong, and the entire business could fail. A sound data centre strategy can also help address security concerns – one of the cornerstone arguments of big data naysayers.

There’s a lot of data in the world – more than one million megabytes are generated by mobile connections alone, every single minute of every single day. And what companies want to do with it is increasingly complex. Many applications demand real-time or near real-time responses, and information drawn from big data is increasingly used to make vital business decisions.

All this means intense pressure on the security, servers, storage and networks of any organisation – and the impact of these demands is being felt across the entire technology supply chain. IT departments need to adopt more forward-looking capacity management so they can proactively meet the demands that come with processing, storing and analysing machine-generated data.
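
As a rough illustration of what forward-looking capacity management can look like in practice, the sketch below fits a simple linear trend to historical storage consumption and estimates how many months of headroom remain. It is a minimal, hypothetical example in Python – the figures and the linear-growth assumption are illustrative, not drawn from this article.

```python
# Minimal capacity-planning sketch: project storage growth from past usage
# and estimate when currently provisioned capacity will be exhausted.
# All figures are illustrative assumptions, not data from the article.

def months_until_full(usage_tb, capacity_tb):
    """Estimate months of headroom left, assuming roughly linear growth."""
    if len(usage_tb) < 2:
        raise ValueError("Need at least two historical data points")
    # Average month-on-month growth across the observed period.
    growth_per_month = (usage_tb[-1] - usage_tb[0]) / (len(usage_tb) - 1)
    if growth_per_month <= 0:
        return None  # usage is flat or shrinking; no exhaustion forecast
    return (capacity_tb - usage_tb[-1]) / growth_per_month

# Hypothetical monthly storage consumption, in terabytes.
historical_usage = [120, 135, 152, 170, 191, 215]
installed_capacity = 400  # TB currently provisioned

months_left = months_until_full(historical_usage, installed_capacity)
print(f"Estimated months of headroom remaining: {months_left:.1f}")
```

Real capacity teams would of course use richer models and live monitoring data, but the principle is the same: project demand ahead of time rather than react once servers and storage are already saturated.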

For even the biggest organisations, the cost of building (and maintaining) a wholly owned data centre can be prohibitively high, so in the perennial build-versus-buy debate, buy is winning. Outsourcing to a third party provides the best protection against increasing data centre complexity, cost and risk, and removes the need to worry about uptime. Carrier-neutral connectivity, offered by many providers, means that companies within the data centre environment can choose the carrier service provider that best fits their needs, while leasing a facility carries a substantially lower up-front cost than building one. In addition, data centre providers allow companies to scale seamlessly, handling growing storage needs quickly and easily.
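
To make the build-versus-buy trade-off concrete, here is a back-of-envelope comparison of cumulative cost over several years, again in Python. Every figure is a placeholder assumption for illustration only – real costs vary enormously with location, power prices and contract terms.

```python
# Back-of-envelope build-vs-buy comparison. All inputs are placeholder
# assumptions for illustration; they are not figures from the article.

def cumulative_build_cost(years, capex, annual_opex):
    """Owned facility: large up-front capital cost plus yearly running costs."""
    return capex + annual_opex * years

def cumulative_colo_cost(years, monthly_fee):
    """Colocation: no capex, just a recurring monthly fee."""
    return monthly_fee * 12 * years

capex = 5_000          # hypothetical build and fit-out cost (£k)
annual_opex = 600      # hypothetical power, cooling, staff, maintenance (£k/year)
monthly_colo_fee = 80  # hypothetical leased space, power and connectivity (£k/month)

for years in (1, 3, 5, 10):
    build = cumulative_build_cost(years, capex, annual_opex)
    colo = cumulative_colo_cost(years, monthly_colo_fee)
    print(f"Year {years:>2}: build £{build:,}k vs colocation £{colo:,}k")
```

The point is not the specific numbers but the shape of the curves: an owned facility front-loads cost, while colocation spreads it out and lets capacity scale with demand.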

Perhaps most crucially, the ‘buy’ option addresses reliability and security concerns. For many organisations, these concerns mean that a wholesale move to cheap, standard cloud platforms – where security isn’t as advanced – isn’t an option, even as part of a hybrid model. Instead, the savviest organisations are recognising that moving into a shared environment allows IT to expand and grow more easily, without compromising security or performance.

By choosing colocation instead, organisations gain access to a range of security services – including DDoS mitigation, intrusion detection management, managed security monitoring, penetration testing and vulnerability assessments, and compliance advice – that are unlikely to be available at the same level in-house.

Colocation or managed services can also help with disaster recovery. There’s a growing recognition and acceptance that, wherever your data resides, sooner or later it will be compromised, so it’s important to plan for the inevitable rather than try to defend against every possible attack. When you buy a service from an expert, it’s their business to get you up and running again quickly.

By choosing colocation, companies effectively get the best of both worlds: renting a small slice of the best uninterruptible power and grid supply, backup generators, super-efficient cooling, 24/7 security and resilient-path multi-fibre connectivity that money can buy, with direct access to public cloud platforms to provide the full array of IT infrastructure – all for a fraction of the cost of buying and implementing it themselves.

So, the big data cat is firmly out of the bag: we need data in our business and personal lives. As the web-browsing public grows warier of data breaches, pressure will come – and already has – to bear on the business community to pay more attention to securing, storing and using data in the right way.

Proper infrastructure and good data management can help to control the bad and make the good better. Whilst we’ll continue to see big data scandals making waves, the savviest companies will be focusing on what happens under the hood, not in the media limelight. IT infrastructure will make or break the big data revolution.


Sourced by Darren Watkins, managing director for VIRTUS Data Centres
