Will overconfidence kill big data?

A new Qubole report has revealed big data initiatives are at risk due to high demand, false confidence and immature processes

Qubole, the big data-as-a-service company, today announced the results of its report – a survey of IT and data professionals on the progress of their big data initiatives.

The survey revealed a clear reality gap: while data teams have high confidence they can enable self-service insights to meet growing demands across the enterprise, few have delivered on that promise.

According to the survey, 76% of respondents said their company currently has a big data initiative, and another 20% said they plan to soon. In addition, 93% of respondents said business demand for big data analysis is growing.

Over two-thirds of IT teams recognise that achieving ubiquitous access to data and analytics requires a self-service DataOps approach. And most respondents – 87% – said they were confident or extremely confident that they could deliver self-service analytics.

Yet, respondents characterised their big data processes as still in the earliest stages of maturity: only 8% of respondents consider their big data initiatives to be fully mature.

A deeper dive reveals that IT is besieged by operational and technological challenges that interfere with improving big data maturity:

  • Only 12% of respondents said they have multiple big data projects running.
  • 98% said they face numerous challenges with their big data initiatives.
  • 78% still support data requests on a project-by-project basis.
  • 45% can’t satisfy business needs and expectations.
  • 61% rely on third-parties for big data expertise.

“Having experienced this firsthand at Facebook, delivering on the promise of self-service access to data and analytics across the enterprise is extremely difficult and goes way beyond technology, involving rethinking processes, company culture and the operational model of the data team,” said Qubole founder and CEO, Ashish Thusoo.

“Until IT teams adopt a DataOps approach versus a more traditional command-and-control model, they’ll remain a primary bottleneck to insights and their big data initiatives will continue to struggle. But there is a path — some companies have successfully made the transformation, and others can learn from their experiences.”

Additional findings

Data analytics is moving rapidly to the cloud: nearly six in ten companies now use cloud resources for big data processing. Of these, 14% run all of their big data processing in the cloud, while 41% run at least some of it there.

Another 30% of respondents say that while they currently run big data processes on-premises, they are considering the cloud as a future option.

Amazon Web Services (AWS) leads the pack, with 32% of respondents saying they use Amazon's cloud platform for big data processing.

Microsoft Azure, however, is not far behind, with 26% of respondents using it for big data projects. Google Cloud Platform is used by 12% of respondents and Oracle Cloud is used by 11%.

Businesses are in need of big data expertise:

  • 83% of respondents said their data teams are growing.
  • 36% of respondents said they are having difficulty finding people with big data expertise.
  • 31% said there aren’t enough technical resources to run big data operations effectively.
  • 61% of respondents reported that their organisation uses third-party consultants with big data expertise.
