Why getting digital transformation fundamentals right keeps customers happy

Digital transformation has been significantly accelerated for most companies by the changes driven by Covid-19, particularly those surrounding working practices. But this accelerated pace of change has had several consequences.

Firstly, businesses now expect the pace of digital transformation to remain high, rather than reverting to the slow programme mentality that many had pre-pandemic. Secondly, while the accelerated pace has been necessary for businesses to keep up with the fast-moving pandemic era, some have lost sight of the need to take stock of the changes they are making. Yet this is vital to ensure that crucial elements, such as cost-effectiveness, security and the overall value of the changes made, meet a business’s goals and needs.

For many companies, the past year has also been the first time that they have truly begun to embark on a digital transformation journey. This also means it is the first time that business-led change is driving the technology agenda, instead of the other way round. As a result, IT teams have been forced to quickly react to business needs, ideas or innovations and find solutions.

This has largely involved combining capabilities that exist as individual components to build a solution or service that meets specific requirements. These components may be off-the-shelf software-as-a-service (SaaS) offerings, which are then integrated into a bespoke solution and deployed as containers or microservices that can span several hyperscale cloud providers.


The importance of having a clear and concise strategy

The consequence of this segmented approach is that some crucial elements, such as networking and security, can be forgotten, particularly as the digital agenda moves from a simple cloud-based application to something more complex.

What we are witnessing now is a need to reconsider end-to-end design: how best to merge ready-made cloud services from hyperscale or SaaS providers with the telecoms world, while providing connectivity in a secure manner. When IoT or edge services are added to the mix, connectivity and security become even more important, because data is being collected across a much wider variety of locations and situations.

How does this manifest itself during a transformation within an organisation, and importantly, how can a business align its strategy to its implementation? The answer lies in having concise messaging built upon a clear strategy.

Both should be well structured and clearly articulated across the entire business, free of private agendas, local solutions or individual preferences. These can be highly disruptive, leading to cost and time-to-value challenges that risk losing buy-in from the wider business.

The crux of the modern challenge

For the first time, digital transformation puts the network right at the heart of a business and its internal changes, which is vital as the value of data increases.

Solutions will increasingly either span multiple hyperscale cloud providers, with data moving between them, or be delivered concurrently from multiple clouds for true portability of service. As such, the following becomes true:

  1. Understanding where data flows from and to becomes a key design point, not only from a capacity and processing-speed perspective, but also for the security of the data.
  2. When a component fails, resilience is imperative. This is especially true in regulated environments, such as financial services or healthcare, where recovery actions need to be conducted in a specific order and actively managed.
  3. Due to concentration risk and regulatory frameworks, services will need to be able to operate on multiple clouds either separately or concurrently in the future. Therefore, load balancing and data synchronisation of transactions will become key challenges to overcome.
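To make the third point concrete, here is a minimal routing sketch in Python. The endpoint names and health-check structure are illustrative assumptions, not any real provider's API; it simply shows how a service might steer traffic only to clouds that currently pass health checks, and fail explicitly when none are available so that ordered recovery actions can take over:

```python
import random

# Hypothetical endpoints for the same service on two clouds.
# These URLs are placeholders, not real provider APIs.
ENDPOINTS = {
    "cloud_a": "https://eu.cloud-a.example/api",
    "cloud_b": "https://eu.cloud-b.example/api",
}

def healthy_endpoints(health_checks):
    """Return the names of endpoints whose latest health check passed."""
    return [name for name, ok in health_checks.items() if ok]

def route_request(health_checks, weights=None):
    """Pick a healthy cloud, optionally weighted for load balancing.

    Raising an error when no cloud is available marks the point at
    which actively managed, ordered recovery actions must kick in.
    """
    candidates = healthy_endpoints(health_checks)
    if not candidates:
        raise RuntimeError("no healthy cloud endpoint available")
    if weights:
        w = [weights.get(name, 1) for name in candidates]
        return random.choices(candidates, weights=w, k=1)[0]
    return random.choice(candidates)
```

With `{"cloud_a": True, "cloud_b": False}`, requests are routed to `cloud_a`; real deployments would also have to handle the harder problem the list mentions, keeping transactional data synchronised between the clouds.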

With all of this in mind, non-functional designs, as well as the functional elements, remain crucial and cannot simply be left to the cloud provider, as many businesses still believe they can.

Resilience of a service and recovery actions in the event of a failure need deep thought and consideration. Ideally, these should be automated via robotic process automation (RPA), but for this to succeed, instrumentation of a service and event correlation are needed to determine where in the service chain an error has occurred. For example, a group of customers may experience a delay when using a mobile application. The problem may have occurred in a number of places: the mobile data stream, local Wi-Fi, or a disk drive supporting a back-end database.
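As a rough illustration of event correlation, the sketch below works through exactly that example. The event log and component names (`wifi_gateway`, `database_disk` and so on) are toy data, not output from any real monitoring tool; the idea is to walk back from a customer-visible symptom to the earliest fault recorded in the service chain:

```python
from datetime import datetime, timedelta

# Toy, time-ordered event log: (timestamp, component, event).
# In practice these would come from per-hop service instrumentation.
events = [
    (datetime(2021, 6, 1, 10, 0, 1), "mobile_network", "latency_normal"),
    (datetime(2021, 6, 1, 10, 0, 2), "wifi_gateway", "latency_normal"),
    (datetime(2021, 6, 1, 10, 0, 3), "database_disk", "io_error"),
    (datetime(2021, 6, 1, 10, 0, 4), "app_backend", "slow_response"),
    (datetime(2021, 6, 1, 10, 0, 5), "mobile_app", "user_delay"),
]

def first_fault(events, symptom, window=timedelta(seconds=30)):
    """Name the component with the earliest fault preceding a symptom."""
    symptom_time = next(t for t, _, e in events if e == symptom)
    faults = [
        (t, comp, e)
        for t, comp, e in events
        if symptom_time - window <= t <= symptom_time
        and ("error" in e or "slow" in e)
    ]
    # The earliest correlated fault is the likeliest root cause.
    return min(faults)[1] if faults else None
```

Here the customers' `user_delay` correlates back to `database_disk`: the failing drive, not the mobile network or Wi-Fi, is the root cause an RPA recovery action should target.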

This is the crux of the modern challenge: a single service could span multiple clouds, SaaS or home-grown applications, and a back-end data store in a traditional data centre.

Building in the correct service instrumentation and monitoring from the start, and carefully correlating events within a service, are far easier to do at the outset than after a failure has occurred, by which time you could have alienated hundreds or thousands of potential customers.

Good design takes these items into account and reacts, while great architecture is able to anticipate potential errors and pre-empt failures or issues, using AI to make changes before a customer notices the problem in the first place.

Having the right expertise significantly aids great design by bringing all the required perspectives together: hyperscale cloud knowledge, modern and traditional application construction, and internetworking expertise. Layer in integrated security and you have the ideal blend to deliver successful digital services to customers that are not only performant, but also resilient and secure.


Confidence is key

Bringing together the worlds of cloud managed services, security and networking will enable digital transformations to be better constructed and designed from the outset, helping to deliver world-class solutions that can be future-proofed as technologies evolve and move into the mainstream.

This allows business owners to focus on providing an outstanding and differentiated customer experience without worrying about the capabilities of the respective components, confident in their network’s ability to provide always-on availability and data security for customers.

Written by Simon Bennett, EMEA CTO at Rackspace Technology
