Some experts have likened the evolution of software-defined networking (SDN) – the ability to enhance network capabilities with software technology – to the jump from bicycles to cars. Like the intelligent dashboard control system in a car, SDN can deliver intelligent, automated network control. And as a pillar of this intelligent control system, big data has an important role to play.
What was once impossible under traditional network architecture – the ability to collect a massive amount of network logs to track operation and running states – is now a built-in part of SDN. But how can enterprises make best use of these capabilities, and in turn use SDN technology to drive better analytics across an organisation?
As Len Padilla, VP of product strategy at NTT Europe, explains, the information gleaned from network logs can serve as an important reference for daily network maintenance, as a new tool for diagnosing and analysing network faults, and as a means of predicting network problems or attack behaviour.
> See also: Paving the way for enterprise mobility with SDN
‘To really get the most out of big data for SDN, you need to make sense of the information that resides in the network and turn it into actionable guidance,’ says Padilla. ‘SDN has the power to automate and make the network faster, whereas big data is the brains behind the operation. It allows you to classify, qualify and quantify, not only how much traffic is flowing, but also what applications are driving it.’
For example, armed with insights on the traffic a certain application needs, SDN enables you to direct it accordingly, boosting the quality of service.
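As a minimal sketch of that idea, the snippet below maps application flows to QoS queues, the kind of rule an SDN controller might install once analytics has identified which applications drive the traffic. The flow records, application names and queue labels are illustrative assumptions, not a real controller API.

```python
# Sketch: assign each application's traffic to a QoS queue, as an SDN
# controller might once analytics has classified the flows.
# All names and figures below are illustrative assumptions.

# Flows observed in the network: (app_name, mbps)
OBSERVED_FLOWS = [
    ("voip", 2), ("backup", 400), ("video", 80), ("web", 30),
]

# Per-application policy: latency-sensitive apps get the priority queue.
APP_POLICY = {
    "voip": "priority",
    "video": "priority",
    "web": "default",
    "backup": "bulk",     # large but delay-tolerant transfers
}

def assign_queues(flows):
    """Return a list of (app, queue) rules the controller would install."""
    return [(app, APP_POLICY.get(app, "default")) for app, _ in flows]

rules = assign_queues(OBSERVED_FLOWS)
print(rules)
```

The point is the division of labour Padilla describes: analytics supplies the classification, and SDN turns it into enforceable forwarding policy.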
‘It’s about looking at ways to make the network support the data as best it can,’ says Padilla.
A comprehensive overview of how the network and applications are behaving makes it easier for businesses to pinpoint correlations and patterns. If a business identifies a bottleneck in its traffic, it can quickly do some detective work to find the cause. Once the cause is found, SDN can be the remedy that fixes it in real time.
‘For example, if a business notices sluggish app response times and sees that part of the network is experiencing a lot of latency at the same time, it could immediately take the matter in hand and re-route traffic to a stronger connection,’ he adds. In doing so, high-performance applications can be restored.
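The re-routing decision in that example can be sketched in a few lines: watch per-link latency and, when the current path exceeds a threshold, switch to the fastest alternative. The link names and the 50 ms threshold are illustrative assumptions, not drawn from any specific SDN product.

```python
# Sketch: detect a high-latency path and re-route traffic to a
# healthier link. Link names and thresholds are illustrative assumptions.

LATENCY_THRESHOLD_MS = 50

def pick_route(link_latencies_ms, current_link):
    """If the current link is too slow, return the fastest alternative."""
    if link_latencies_ms[current_link] <= LATENCY_THRESHOLD_MS:
        return current_link                      # no change needed
    return min(link_latencies_ms, key=link_latencies_ms.get)

links = {"mpls-a": 120, "mpls-b": 18, "internet-vpn": 35}
new_link = pick_route(links, current_link="mpls-a")
print(new_link)  # "mpls-b": the lowest-latency alternative
```

In a real deployment the controller would push the resulting path change as flow rules; here the decision logic alone is shown.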
Keeping it real
One of the biggest advantages of SDN is that it puts analytics to work in real-world applications. A typical use case combining SDN and analytics is an anti-DDoS cloud 'cleaning' (traffic-scrubbing) solution. Because SDN provides visible, flexible network control, the source of an attack can be located and blocked quickly, while SDN's agility makes the traffic cleaning itself intelligent.
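The locate-and-block step can be sketched as a simple aggregation over flow records: count traffic per source and emit block rules for sources exceeding a threshold. The flow records, IP addresses (from documentation ranges) and the packets-per-second cut-off are illustrative assumptions.

```python
# Sketch: locate attack sources from flow records and list the IPs an
# SDN-based scrubber would block. All inputs and thresholds below are
# illustrative assumptions.

from collections import Counter

PPS_THRESHOLD = 10_000   # per-source packets/sec above which we block

def find_attackers(flow_records):
    """flow_records: list of (src_ip, packets_per_sec). Return IPs to block."""
    per_source = Counter()
    for src_ip, pps in flow_records:
        per_source[src_ip] += pps
    return sorted(ip for ip, pps in per_source.items() if pps > PPS_THRESHOLD)

flows = [
    ("203.0.113.7", 9_000), ("203.0.113.7", 4_000),   # bursty source
    ("198.51.100.2", 200),                            # normal client
    ("192.0.2.99", 50_000),                           # volumetric flood
]
blocked = find_attackers(flows)
print(blocked)  # ['192.0.2.99', '203.0.113.7']
```

A production scrubber would of course use richer signals than raw packet rates, but the shape is the same: analytics identifies the sources, and the SDN controller installs drop rules at the edge.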
On an e-commerce platform, big data can capture the network load generated by each promotional campaign, helping operators gauge how the scale of a promotion affects network traffic and adjust for better network performance.
‘Data about end user behavior is the important piece of information to understand for successfully deploying SDx infrastructures,’ says Matt Goldberg, VP of strategic solutions at infrastructure management firm SevOne. ‘Understanding traffic patterns and looking for data anomalies throughout the course of a day, week and/or month within that environment is a key indicator of what future metrics will look like.’
In addition, he adds, taking the time to examine more granular data points at times of high capacity within the infrastructure gives the best insight into what kind of traffic – such as voice or video – to prepare for.
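The anomaly-hunting Goldberg describes can be sketched as a basic outlier check over a traffic time series: flag any interval whose load deviates sharply from the baseline. The traffic figures and the 3-sigma cut-off are illustrative assumptions.

```python
# Sketch: flag hours whose traffic deviates sharply from the baseline.
# The series and the 3-sigma cut-off are illustrative assumptions.

import statistics

def find_anomalies(hourly_mbps, z_cutoff=3.0):
    """Return indices of hours whose traffic is a z_cutoff outlier."""
    mean = statistics.mean(hourly_mbps)
    stdev = statistics.pstdev(hourly_mbps)
    if stdev == 0:
        return []
    return [i for i, v in enumerate(hourly_mbps)
            if abs(v - mean) / stdev > z_cutoff]

# A day of fairly steady traffic with one sudden spike at hour 13
traffic = [100, 98, 103, 97, 101, 99, 102, 100, 98, 101,
           99, 103, 97, 400, 100, 102, 98, 101, 99, 100,
           102, 98, 101, 99]
print(find_anomalies(traffic))  # [13]
```

Real deployments would use seasonal baselines (comparing like hour with like hour across weeks) rather than a single global mean, but the principle – establish what "normal" looks like, then flag departures – is the same.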
Just the beginning
While there is huge interest in both SDN and big data, as a cross-technology practice the application of big data to network analysis lacks maturity. Right now there are no clear standards or reference architectures for how these technologies should be coupled, and the non-standard, non-systematic information provided by different device manufacturers makes data processing difficult.
> See also: Muddy waters: the future of networking
‘One initial challenge is enabling all vendor technologies to be agnostic, both in terms of SDN controller integration and the instrumentation itself,’ says Tony Kenyon, director of product management, Office of the CTO, A10 Networks. ‘Considerable progress is being made through initiatives such as OpenStack and OpenFlow, and there is a clear trend among vendors to provide open APIs based on RESTful principles – key for data centre orchestration and automation.’
Another major challenge is how to secure big data lakes from external hacking, and how to ensure data integrity is maintained over time – beyond technologies such as Kerberos. ‘With so much control now centralised, how do we ensure the network is not hijacked?’ asks Kenyon. ‘Again there are several initiatives here, as well as emerging technologies such as blockchain that will help when paired with traditional security and fine-grained security policy.’
There are further challenges around instrumentation, storage and analytics; if organisations want analytics over real-time as well as long-term data then the granularity of the instrumentation needs to be considered.
According to Kenyon, the main questions that need to be covered include: ‘Where should data be collected from? How frequently? Do we store it in a Hadoop cluster or move old data into a SAN? What other applications need access to this data?’
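One common answer to the "Hadoop cluster or SAN" question is age-based tiering of telemetry. The sketch below is one illustrative policy, not a reference architecture; the tier names and age thresholds are assumptions.

```python
# Sketch: a simple age-based retention policy for network telemetry,
# one possible answer to where collected data should live.
# Tier names and thresholds are illustrative assumptions.

def storage_tier(age_days):
    """Route a telemetry record to a storage tier based on its age."""
    if age_days <= 7:
        return "hot"       # real-time analytics cluster (e.g. Hadoop/HDFS)
    if age_days <= 365:
        return "warm"      # cheaper batch store for trend analysis
    return "archive"       # long-term SAN or object archive

tiers = [storage_tier(d) for d in (1, 30, 800)]
print(tiers)  # ['hot', 'warm', 'archive']
```

The trade-off it encodes is the one raised above: real-time analytics needs fine-grained, fast-access data, while long-term trend analysis can tolerate coarser, cheaper storage.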
These open questions highlight that we are at the start of something new, but there are signs from early adopters that this converged solution offers the potential for significant cost savings, as well as new opportunities to restructure operations and target new business.
‘For those waiting in line it’s clear that some of these use cases may not be made public too soon,’ says Kenyon, ‘since the process of getting this right requires serious investment and appropriate skills to analyse, integrate and fine-tune these environments. Skills such as data science are in very short supply, and they need to be combined with deep systems engineering knowledge, so any competitive advantage gained is likely to remain a closely guarded secret where advances may be difficult to protect intellectually.’
Despite the challenges that remain, technology in the areas of SDN and analytics is accelerating quickly, and Padilla believes our learnings from server virtualisation in particular are going to drive the development of SDN and big data.
‘As a result of the cultural change brought about by virtualisation, people are now more comfortable with the virtual concept rather than the physical, and are also more accustomed to running server infrastructure in an automated way,’ he says.
> See also: A bird’s eye view of software-defined networking
Industry analysts expect the blurring between hardware and virtualised infrastructure to continue, with hardware still needed for core and edge scalability and for high-performance, low-latency applications – but, importantly, with everything instrumented and accessible through common APIs and policy.
Many facets of SDN and analytics are essentially new territory for data centre designers and the data science community – after all, this kind of meta-coupling goes well beyond managing ‘state’.
‘Nevertheless it’s clear that the marriage of big data analytics and soft-programmable infrastructure will promote the ability for organisations to deploy services and support users in a way that was simply not possible in the past, either because it was too resource-intensive, too costly, or simply too risky to attempt,’ he argues.
‘We should expect to see fully automated data centres becoming the norm, using closed feedback loops of policy-driven programmable elements and instrumentation, all informed by analytics.’