The concept of software-defined infrastructure has gained significant momentum as organisations seek to reduce their reliance on expensive hardware and find better ways to cope with data growth.
What started as a server trend – compute virtualisation is now widely deployed – has spread across the data centre to the network and storage layers.
By virtualising the components, and wrapping them with highly automated software, organisations can gain new levels of scalability and the ability to deliver applications on any hardware.
Now, security is also joining the party. Security conversations are often steered by the individual solutions that can make IT environments less vulnerable – but what about the model in which they are implemented and managed?
With software-defined security, most or all security controls are automated and managed through software, to a degree that depends on how virtualised or cloud-based the underlying infrastructure is.
Under such a model, any new device in the environment is automatically brought under the base security policy, allowing environments to scale as resources grow and to be moved or migrated seamlessly if necessary.
‘This term is most often used by vendors to describe an approach to automation and virtualisation that abstracts infrastructure to the point that it’s primarily controlled through higher-level functions and policies,’ says Paul Briault, director at CA Technologies.
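The idea of a base policy that new devices inherit automatically can be sketched in a few lines of Python. This is purely illustrative – the policy keys, device names and `Environment` class are invented for the example, not drawn from any vendor's product:

```python
# Illustrative sketch: a base security policy automatically applied to
# any device that joins a software-defined environment. All names here
# are hypothetical.

BASE_POLICY = {
    "firewall": "default-deny",
    "disk_encryption": True,
    "patch_level": "latest",
}

class Environment:
    def __init__(self, base_policy):
        self.base_policy = dict(base_policy)
        self.devices = {}

    def register(self, device_id):
        # Every new device inherits the base policy on arrival --
        # no manual, per-box configuration step.
        self.devices[device_id] = dict(self.base_policy)

    def update_policy(self, key, value):
        # A single policy change propagates to every managed device.
        self.base_policy[key] = value
        for policy in self.devices.values():
            policy[key] = value

env = Environment(BASE_POLICY)
env.register("web-01")
env.register("db-01")
env.update_policy("firewall", "allow-list-only")
print(env.devices["db-01"]["firewall"])  # allow-list-only
```

The point of the sketch is the `update_policy` step: the policy lives in software, so scaling out or changing the rules is a single operation rather than a per-appliance task.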
Not the end
By automating the bulk of the infrastructure heavy lifting, IT organisations can shift resources to more innovation-oriented and application-centric endeavours.
It is one part of a set of defences that organisations must now consider in order to protect their assets and identify potential attack vectors.
However, that doesn’t mean the end of hardware-based security.
‘Software-defined security services need to be part of a larger process,’ says David Robinson, chief security officer at Fujitsu UK and Ireland. ‘The more efficient it is, the more effective it will be.’
Just like traditional security, software solutions still need maintenance, updates and reviews of their efficacy, and will still require some hardware.
‘I think there will continue to be a blend of hardware- and software-based security, with perhaps an increased focus on the software aspect,’ says Kevin Linsell, head of service development at Adapt.
‘But the move towards this type of security is more around enabling devices and solutions to be driven by software calls from a wider software-defined environment.’
Nevertheless, reliance on security hardware will gradually decrease. The significance of access controls will move up the stack, while the hard network boundary approach to security will diminish in importance – a trend that has already begun.
Of course, anyone expecting a sudden shift away from hardware will be disappointed, and organisations that have been relying on hardware for authentication purposes will not be able to go ‘all software’ quickly.
‘Part of the problem is that organisations are not yet ready to embrace an all-software approach,’ says Briault. ‘Many are still struggling to properly implement BYOD and, as such, the industry can expect a significant phasing-out period, throughout which hardware will continue to play its part in IT security.’
In response to the software-defined paradigm, more and more security vendors are attempting to become hardware agnostic.
There are still appliances that are being specifically tweaked to run a vendor’s software, but many organisations have realised that implementing software-defined services is easier and cheaper – especially in cloud and virtualised solutions.
‘Creating an organisation that is hardware agnostic is the way that many businesses are heading,’ says Robinson. ‘Hardware will still be needed because, without it, software cannot be run, but development and services are now more reliant upon the management of the software.’
The shift towards software-defined security will also result in much more granular and appropriate security policies.
The focus will be on using digital identity attributes to enforce fine-grained access entitlements to systems and applications.
It will make security more intrinsic and integrated within a business, which will be particularly obvious from a change process perspective.
‘Removing the human error risk can be a big positive, but there will still be a need for strong governance, control, testing and ultimately accountability,’ says Linsell.
Briault adds, ‘Security will become more real-time and transaction based, with a focus on data and user access requests, irrespective of the channel being used.’
So, going forward, will the real value and intelligence of security come predominantly from software?
Common issues in relation to authentication and access management could be solved by software-defined behavioural analytics, which will vastly improve organisations’ risk posture and real-time transaction decisions.
‘It will also lead to better user experience, which is key to business success today,’ says Briault.
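A minimal sketch of what behavioural analytics for an access request might look like: a risk score built from simple signals, with the score driving a real-time decision. The signals, weights and thresholds here are invented for illustration:

```python
# Sketch of behaviour-based risk scoring for an access request.
# Signals, weights and thresholds are illustrative assumptions.

def risk_score(request, profile):
    score = 0
    if request["country"] not in profile["usual_countries"]:
        score += 40  # unfamiliar location
    if request["hour"] not in profile["usual_hours"]:
        score += 30  # unusual time of day
    if request["device_id"] not in profile["known_devices"]:
        score += 30  # unrecognised device
    return score

profile = {
    "usual_countries": {"GB"},
    "usual_hours": set(range(8, 19)),   # normally active 08:00-18:59
    "known_devices": {"laptop-123"},
}

# A known device and country, but at 23:00 -> moderate risk.
request = {"country": "GB", "hour": 23, "device_id": "laptop-123"}
score = risk_score(request, profile)
print(score)  # 30

# A policy layered on top might allow low scores, demand step-up
# authentication for moderate ones, and block high ones outright.
```

In practice such a system would learn the profile from historical behaviour rather than hard-coding it, but the shape of the decision – score the transaction, act in real time – is the same.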
But, as always, there needs to be a balance between technology, people and process.
The technology piece will always be a mix of software and hardware, but the change in ratio will enable faster design and deployment – without having to invest in lots of training, hardware and assets.
It’s important to remember, however, that value cannot be attributed to software alone.
‘It will always be a mix,’ says Robinson. ‘You can’t run a service without people or process, and you can’t run software without hardware.’
But as long as it is hardware agnostic, software-defined architecture will be straightforward for organisations to implement, helping to drive down costs and reduce operational time.
Usability depends on how the software is written: the more complex it is, the more effort a business will need to implement and use it.
As with any project, greater size and complexity demand more people and more project management, and ease of implementation will shape the project throughout its life cycle.
If an organisation is already using a virtualised service, the process of moving environments is pretty straightforward. But if businesses are thinking of embracing a completely software-defined data centre, they must first ensure that it is the right step and do a thorough risk assessment and due diligence.
‘This transition has already happened,’ says Robinson. ‘There is no specific challenge – the barriers are often cultural and fear.
‘When you have invested in software-defined security, you aren’t reliant upon customised hardware and the need to have a return on investment for the purpose it was bought.
‘Moving onto something that is capable of switching software services onto standard architecture is a positive step. But like all software, it has to be kept up to date, and organisations cannot just fit and forget.’
It is important for businesses to measure risk and configure software in a way that is right for the organisation.
Vendors that make software easier to implement are faring well – a clear sign that this trend is on the rise.
Although hardware is getting cheaper, margins are tightening. The virtualised approach will only become more attractive, with businesses moving away from hardware and investing instead in ‘as a service’ models.
‘We are seeing people move away from traditional hardware to a more agile approach,’ says Robinson. ‘There is always going to be a mix of software and hardware defences, but either way an organisation’s protection needs to be based upon the risks that it faces and how it can manage them.
‘The adoption of software-defined security is a natural evolution, but it’s vital that businesses embrace it in the right way – not get frightened – and have a balanced approach.’