The key message from SUSECON 2017 – the annual global technical conference for SUSE Linux Enterprise customers, partners and community enthusiasts – highlighted the importance of IT infrastructure in enterprise digital transformation goals.
In today’s business environment, every company is a digital company. IT infrastructure needs to not only keep pace, but also move fast enough to accommodate business transformation initiatives related to emerging technologies.
This hybrid IT infrastructure, explained Nils Brauckmann – chief executive officer of SUSE, part of Micro Focus (the 7th largest pure-play software company in the world) – is crucial in helping “bridge the value of legacy infrastructure and future enterprise innovations”.
>See also: How to get the board on board with hybrid I.T.
The “open, open source” approach is necessary in helping enterprise customers and partners transform IT infrastructure, create a more agile business, make room for innovation and ultimately, spur digital transformation underpinned by software-defined applications infrastructure.
Open source, concluded Brauckmann in his press briefing with journalists at SUSECON 17, represents the “innovation engine” for the enterprise.
It will lead society and business into the future, through collaboration, shared knowledge and technology.
Delving into the world of open source and its potential for hybrid IT and subsequently enterprise innovation that is being explored at SUSECON 2017, Information Age was granted an exclusive interview with Thomas Di Giacomo, chief technology officer at SUSE.
Within the extensive interview, Di Giacomo discussed SUSE’s place in the current market and dissected the importance of bridging legacy infrastructure with the ‘new stuff’ to accelerate digital transformation goals and stave off disruption.
>See also: Hybrid IT organisations struggle with IT skills gap and security concerns
He also elaborated on the significant role a hybrid IT environment would have in complying with new, more strict data regulations, while at the same time envisaging the future of the IT department.
Can you explain your role within SUSE?
I’m chief technology officer at SUSE, and I work with the management team – together with the president of marketing and alliance and the president of engineering – and I help in identifying the industry trends and the key requests or requirements from our large customers as well. I have an external and an internal role.
Externally, I communicate at industry events and in the press about where SUSE is going in the future, and internally we work together to define the technology strategy for the next 12 to 24 months.
>See also: Can you afford to ignore hybrid network training?
Even though it is a very dynamic market, no one has a crystal ball to know exactly what is going to happen five years from now. But keeping track of what’s happening in the open source community and industry is important.
What is SUSE’s current strategy?
SUSE has been a Linux company for 25 years. It started with Linux, and one thing that stays is that SUSE started with getting Linux ready for the enterprise. Because open source is great – you have a lot of innovation going on – but to be used in mission-critical production you need to make sure that it is integrated with other technologies, that it is secure, that it’s stable, and that you can support your customers. So that’s what SUSE started to do with Linux, and the tricky things are the partnerships with the hardware vendors and with the other software, and also the enterprise-readiness of open source technologies.
Over those 25 years it has evolved to more software-defined infrastructure, like OpenStack, software-defined storage, software-defined networking, virtualisation and containers.
Today, we are still focusing on software-defined infrastructure, and we add on top some solutions for application delivery, because at the end of the day you have traditional IT people or infrastructure people, and you have the developers – but they have to work together to be faster for the business requirements to deliver applications to support the business. Especially now that every industry, even if you are a taxi company, is getting disrupted.
>See also: Hybrid cloud: the key to enterprise innovation
Every company has to become more and more digital, and quicker, and so we have to facilitate the way that they develop an application on top of the software-defined infrastructure. So, we also provide application delivery to help customers.
You mentioned the importance of merging technologies in this hybrid manner. How important is this for enterprises to continue innovating and staving off disruption?
It’s absolutely key, because we see a lot of new technologies that are great. For example, cloud native – starting from developing things on AWS, public clouds and those kinds of things. It’s great if you are a startup: if you start today from scratch then you can easily adopt all the new technologies. But more than 90–95% of companies have an existing IT department, existing infrastructure and existing software.
So, we have to make sure that it keeps running, and that the time the company spends to run its infrastructure is reduced. We need to help them automate some of their legacy, but also adopt some of the new things to get innovation out at the same time. It’s not about one or the other. It’s not legacy vs the new stuff. They have to work together in order to innovate, and you have to be able to move some of the applications from legacy to new when it makes sense financially, and when it makes sense in the processes of the company.
>See also: Carmakers must introduce a new “hybrid” when it comes to data
It’s really about hybrid IT in the sense of legacy and new stuff having to co-exist. And hybrid is also about the hosting: private data centres together with public clouds. So you need to make sure there are good bridges between what companies have internally and what they might want externally – move to a public cloud, and vice versa, or move between different public clouds as well. There is not a single solution that fits everybody, and at SUSE we provide a lot of solutions and tools to facilitate hybrid IT like that.
Have you come across any challenges in facilitating a relationship between legacy and the new technologies? And if so, what best practice advice can you provide that will help ease the transition?
It’s mostly on a case-by-case basis, because depending on where the company is today and where they want to go, you have different strategies to adopt. The first thing is on the technology side. There are solutions that can support the existing and help you move to the new stuff – containers on top of virtualisation is one, or OpenStack can facilitate those kinds of things. But it’s also about processes and culture inside a company, because you have teams that have been working the same way for 20 years, and it’s as much about technology as it is about people. And sometimes it’s also about companies understanding that they need to change some of their processes.
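The “containers on top of virtualisation” bridge Di Giacomo describes can be pictured with a minimal container-orchestration manifest. The sketch below is illustrative only, not a SUSE-specific configuration: it assumes a Kubernetes cluster whose worker nodes are VMs provisioned on existing virtualised infrastructure (for example via OpenStack), and the application name and image are placeholders.

```yaml
# Hypothetical sketch: a containerised workload scheduled onto
# cluster nodes that are themselves VMs running on legacy-managed
# virtualised infrastructure. All names and the image reference
# are illustrative placeholders.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: legacy-bridge-app
spec:
  replicas: 2                      # run two copies for resilience
  selector:
    matchLabels:
      app: legacy-bridge-app
  template:
    metadata:
      labels:
        app: legacy-bridge-app
    spec:
      containers:
        - name: app
          image: registry.example.com/app:1.0   # placeholder image
          ports:
            - containerPort: 8080
```

Because the container layer only assumes it is given compute nodes, the same manifest can in principle be applied whether those nodes are on-premises VMs or public cloud instances – which is the portability argument behind running containers on top of existing virtualisation.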
>See also: Hybrid cloud and blockchain solutions will be the future for data backup
What we try to avoid with customers is a big-bang kind of approach; we prefer a phased one. If there is a new solution that a company is willing to build, they can do it the modern way and keep the existing legacy way, and progress over time – maybe it will take ten years. It’s not like you have a destination to reach; it’s a constant journey, because the technology will continue to innovate. So, you have to be ready as a company to embrace new technologies when it makes sense for you, and keep what you have running. It’s an interesting challenge.
Can you provide any insight into those companies that are overcoming this culture challenge?
First of all, diversity is important in IT, but also in people. It’s good to have different profiles of developers and of IT people. Management also needs to understand that they need to transform their business and their IT. And you can also train existing people to do new things. So, we have partners that are doing that, helping companies adopt new ways of developing applications, or architecting applications. Working on the people is important.
Technology is not always the bottleneck, but there are still things that need to be improved on the technology side, because everybody talks a lot about hybrid cloud and hybrid cloud management, yet there are still not a lot of standardised solutions for moving containers from private cloud to public cloud.
There are still things to be looked at to make services work together better: how they talk to each other, how you can have interoperable services between platforms, and what the network looks like between those services.
>See also: On-prem or In cloud? Most suitable location for apps in hybrid environment
Again, there are still a lot of things to be done on the technology: how to migrate old workloads to new ones automatically is also still a technical challenge.
There are solutions we are doing with partners, but the open source community still has a lot of things to do, and that’s why you see projects like the Cloud Native Computing Foundation.
In terms of new data regulations, how will adopting hybrid IT help with ensuring compliance?
That’s a very interesting question, because the regulations are changing. Regulation A is today, and you can find a solution that will meet that regulation. But your solution is only good until that regulation is changed. You need to think about something that can help you move your data from public to private very easily, without disrupting your business.
So, when regulations change, you can change and adapt your IT based on the new laws. It’s important for the present, but also for the future: when changes come from regulation, or when public cloud providers change their business model (their pricing), you might want to have the freedom to move to another cloud service provider. It has to be flexible, because it is not enough to just find a solution for today – it will not be the solution for tomorrow.
A lot has been made of the security of the cloud. Can you discuss the security implications of adopting a hybrid IT, cloud-first strategy?
This is again about diversity. If you rely on a single provider, whether it’s public or private, then you are more at risk than if you split across multiple. It’s the basis of disaster recovery, in a way. If you put everything in the same place then you have a major risk if it breaks – and that is true of public cloud, and it’s also true of private data centres.
>See also: Taking the IT departments pain away
The best thing is to have different private data centres and capabilities to go to public cloud, or have different public cloud providers, so your data is not all under the same roof. If that house disappears then you lose everything.
The good thing with open source technology is that when you have security issues, they are very quickly identified and very quickly fixed. I think that is one of the benefits of open source. There’s a lot of discussion about whether open source is more secure than traditional proprietary solutions, and I tend to think that open source is a lot more secure. Of course everyone sees the code, including attackers, but you have so many eyes looking at that code, and the Linux kernel is a good proof point of that.
The Linux kernel is still being patched for security, but it is being patched a lot faster than whatever you would find in non-open-source software. You have less visibility on what’s happening in public cloud – that’s the whole point of it, you don’t want to go into the infrastructure or have access to everything – but open source will be used as much as possible in public cloud. In public cloud you have a chance to benefit from all the security patches from open source.
>See also: Cloud computing now the norm in a predominantly hybrid IT market
The IT department is arguably the only department that runs throughout an organisation in its entirety. What role will it play in the future of enterprise and enterprise innovation?
Looking at virtualisation, it’s going to be more and more self-service based and automated. We might not need someone to activate ‘that VM’ anymore, and those kinds of things.
There’s a lot of complexity that is hidden from applications. It will still be there, but it will be automated and compute, network and storage resources will simply be abstracted from the end user.
Information Age will be bringing readers all the latest announcements from SUSE in the coming days