Why the wise man built his mainframe upon the rock

More than 50 years after it was first introduced, the mainframe remains the bedrock of some of the largest and most important IT systems in the world. Every time someone accesses their bank account, or a retail stock system is updated, the mainframe is called into action to provide the immense compute power needed.

These are just two examples, but recent research found that 88% of organisations expect the mainframe to remain a core business asset over the next decade. The platform is clearly still a priority for most, and it’s easy to see why the wise man chose the mainframe.

The stability and reliability of the mainframe have made it the perfect asset to support the growing number of critical business services that keep today’s digital world turning. However, maximising the return on mainframe investments requires careful management of tools and resources. In a world where the mainframe is often – wrongly – seen as a legacy burden, how can you build a solid rock of arguments for current and future investments?

>See also: IBM celebrates 50 years of its landmark mainframe computer

The rising flood of challenges

The generation of IT workers with the skills needed to keep the mainframe on its feet is starting to retire, and those coming in to replace them lack the experience needed to preserve and advance these business-critical applications.

As such, enterprises need to fully leverage new tools to ease the transition, equipping inexperienced staff with the means to do their jobs faster when developing, testing and maintaining mainframe applications.

However, that’s easier said than done when IT budgets have been tight for some time – the latest market forecast from Gartner indicates that IT spending will shrink by 5.5% this year, so the purse strings won’t get any looser in the foreseeable future.

As a result, IT teams will be under even greater pressure to justify the business case for any investment decisions to the board, using rock-solid information to prove they’re based on fact, rather than the shifting sands of assumption.

The problem within many organisations is that once an investment has been made, this evidence-based rigour is usually abandoned, leaving the business unclear as to whether or not the investment paid off.

Whether it pays off depends entirely on whether the system is used consistently and effectively, as intended, across the organisation. For example, a more user-friendly development interface designed to help inexperienced workers maintain mainframe applications could be hugely beneficial in bridging the growing skills gap.

However, if the only people using the tool are the experienced mainframe teams, whilst the less experienced workers who were expected to benefit most fail to adopt it, it’s unlikely to deliver its full potential.

Without visibility into how tools are being used by the workforce after they’ve been deployed, it’s impossible for businesses to tell whether their investment has paid off, or whether their hard-won budgets are simply being washed away from under them.

Turning the tide with visibility

Businesses clearly need to move from apathy to action in order to regain control of the ROI on their IT budgets. That means adopting a more evidence-based approach: analysing product usage statistics from deployed toolsets to identify where intervention can significantly improve efficiency and business ROI.

Reviewing the deployment of an IT solution, and comparing its rate and maturity of adoption across different areas of the business, will quickly reveal any gaps where it has yet to be used to full advantage.

For example, by pinpointing functionality that is being under-utilised, IT teams can identify where training may be needed to accelerate the adoption of new technologies.

Not only can this help to identify where internal staff could become more effective, but it can also be useful for identifying areas where outsourcing partners could be making better use of the tools provided to them.
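
The analysis behind this need not be complicated. As a minimal sketch (in Python), suppose the toolset can export usage events as a CSV file; the file name tool_usage.csv and the columns team and feature below are hypothetical stand-ins for whatever a given product actually records:

```python
# A minimal sketch, not a real product's API: assumes usage events were
# exported to a CSV with hypothetical columns "team" and "feature".
import csv
from collections import Counter, defaultdict

def feature_usage_by_team(path):
    """Count how often each team exercises each feature."""
    usage = defaultdict(Counter)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            usage[row["team"]][row["feature"]] += 1
    return usage

def underused_features(usage, threshold=10):
    """Flag features a team has barely touched: candidates for training."""
    all_features = {feat for counts in usage.values() for feat in counts}
    return {team: sorted(f for f in all_features if counts[f] < threshold)
            for team, counts in usage.items()}

if __name__ == "__main__":
    usage = feature_usage_by_team("tool_usage.csv")  # hypothetical export
    for team, features in sorted(underused_features(usage).items()):
        print(f"{team}: low usage of {', '.join(features) or 'nothing'}")
```

Even a crude threshold like this is enough to turn raw usage logs into a concrete, per-team training agenda.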

Increasing adoption in this way can significantly improve service quality and delivery rates, as IT teams become more productive and agile – rapidly shortening the time to value on an investment. This capability will be critical as experienced mainframe workers continue to retire.

With such a high level of staff turnover in both in-house and outsourcer IT teams, it’s an ongoing challenge for businesses to maintain the knowledge base needed to effectively use the tools they provide.

There needs to be a structure in place to ensure that young developers coming onto the mainframe and new staff at outsourcers can be brought on board quickly, to minimise disruption to the business.

>See also: The IBM mainframe: Golden oldie or modern marvel?

By harnessing usage statistics, organisations can identify their top users and those most in need of training. Those using the tools most effectively can then be engaged to foster a culture of continuous improvement and best-practice sharing across their teams, helping less experienced staff get to grips with everything much faster.
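
Under the same hypothetical CSV export, ranking users by the breadth of features they exercise gives a first cut at both lists: the top of the ranking suggests potential mentors, the bottom suggests training candidates. A minimal sketch:

```python
# Same hypothetical CSV export as above, with a "user" column: rank users
# by how many distinct features they use.
import csv
from collections import defaultdict

def feature_breadth_by_user(path):
    """Map each user to the set of distinct features they have used."""
    breadth = defaultdict(set)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            breadth[row["user"]].add(row["feature"])
    return breadth

if __name__ == "__main__":
    ranked = sorted(feature_breadth_by_user("tool_usage.csv").items(),
                    key=lambda kv: len(kv[1]), reverse=True)
    print("Potential mentors:   ", [user for user, _ in ranked[:3]])
    print("Training candidates: ", [user for user, _ in ranked[-3:]])
```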

This level of control and visibility can also help to support greater collaboration with outsourcers, as in-house IT teams can work with their partners to bring new members of their account team on board much faster, ensuring a better quality of service.

Analysing product usage statistics can also help to show where there are tools with overlapping functions, or that are simply not being used. This can help organisations strip out unnecessary costs and identify where staff training can help increase the adoption of under-utilised tools to drive better performance. In the long run, this can significantly help save time and money.
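
A hedged sketch of the same idea, this time assuming a hypothetical tool column and a made-up licence inventory, shows how little is needed to surface tools with no recorded activity:

```python
# Same hypothetical export, with a "tool" column; the licence inventory
# below is made up for illustration.
import csv
from collections import Counter

def events_per_tool(path):
    """Count recorded usage events per tool over the export window."""
    counts = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            counts[row["tool"]] += 1
    return counts

if __name__ == "__main__":
    counts = events_per_tool("tool_usage.csv")
    licensed = {"EditorA", "AnalyzerB", "DebuggerC"}  # hypothetical inventory
    for tool in sorted(licensed):
        note = "  <- no recorded usage, review the licence" if counts[tool] == 0 else ""
        print(f"{tool}: {counts[tool]} events{note}")
```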

Ultimately, product usage statistics hold the key for businesses to safeguard the future of the mainframe for the next generation of IT workers. This will be critical at a time when the mainframe remains a core asset supporting critical business services, whilst budgets are increasingly squeezed.

By building their mainframe upon the rock of hard usage information, rather than the sand of assumption and blind faith, IT teams can take targeted, corrective action to realise the full return on their investment decisions and achieve the business results they aimed for.


Sourced from Dr Elizabeth Maxwell, IAPP and Compuware

