Two thirds of high performance computing sites use big data – IDC
The IDC market study shows that the share of supercomputer sites employing co-processors and accelerators has more than tripled in the past two years
There has been a substantial increase in the number of high performance computing (HPC) sites around the world applying big data technologies and methods to their problems, according to a new study by analyst firm International Data Corporation (IDC).
A surprising two thirds of high performance computing (HPC) sites are now performing big data analysis as part of their workloads, IDC announced at the International Supercomputing Conference in Leipzig, Germany yesterday.
67% of sites in 2013 said they perform big data analysis, with an average of 30% of the available computing cycles devoted to big data analysis work, according to the 2013 “IDC Worldwide Study of HPC End-User Sites.”
The figures show how the boundaries between HPC-based big data work and high-end commercial analytics are increasingly dissolving, highlighting how HPC systems are becoming viable options for executives looking to boost revenue and cut risk. Early use cases include fraud detection, genomics and personalised medicine.
HPC vendors are increasingly targeting commercial markets, while commercial vendors such as Oracle, SAP and SAS are increasingly encountering HPC requirements from their customers.
Co-processors and accelerators are increasingly gaining momentum among HPC sites, with the proportion of sites employing co-processors or accelerators leaping from 28.2% in 2011 to 96.9% in 2013. Intel Xeon Phi co-processors and NVIDIA GPUs were the most popular, with FPGAs a close third.
The end-user report also confirmed IDC’s supply-side research finding that storage is the fastest-growing technology area at HPC sites. Storage will play a pivotal role in big data for the next few years as businesses become increasingly data-driven.
“Data-driven businesses with a constant thirst for storing and analysing large quantities of data will force suppliers to develop big data–friendly solutions — solutions that are designed to minimize the movement of data and at the same time provide the economies of scale needed to store this data at the cost of pennies per gigabyte,” said IDC’s report on worldwide storage in big data published in May.
Cloud computing is also being used for HPC workloads, with the proportion of sites exploiting cloud computing rising from 13.8% in 2011 to 23.5% in 2013, with public and private cloud use about equally represented among the 2013 sites.
IDC forecasts that revenue for HPC servers acquired primarily for big data use will approach $1 billion in 2015.