The great integration

The financial services sector has been in the process of making risk management more transparent and more accurate since long before the credit crunch.

The first of the Basel Accords, which set a minimum capital requirement for banks (how much capital they must hold to protect against collapse), became law in 1992 and was updated in 2004 with Basel II. The equivalent regulation for the insurance industry, Solvency I, was introduced in 2002.

But the events of 2008 and 2009 changed what was a matter of good corporate citizenship into something far more crucial. Not only did they prove that near-total collapse is possible, they also made financial services regulation a matter of genuine public concern.

Since the credit crunch, updated versions of both the Basel Accords and Solvency have been drafted – Basel III will be enforced from 2019, Solvency II from 2013 – and the banking and insurance industries are currently figuring out how to accommodate the new requirements of this regulation.

Financial services organisations have historically operated risk calculation and reporting as a distinct function of the business, and not without reason. Calculating the risk exposure of a retail bank, for example, requires its own kind of data models, analysis and expertise.

“For many years, risk organisations have operated entirely separately from the marketing and finance divisions,” explains Rick Hawkins, principal advisor at KPMG Performance & Technology. “And each of those separate functions have used separate databases, tailored for their own needs.” However, the information produced by the risk function is invaluable for those other divisions of the business. Likewise, integrating the data collected by the finance and customer departments can improve risk calculations.

This is not a new realisation for the financial services sector, but increased demand for transparency and a requirement to invest capital more effectively are today providing the impetus to put it into action.

The risk of customers

The integration of risk data and analysis with customer and finance operations will be a multi-year, high-investment initiative for the financial services sector, and as such it will test the ability of the sector’s IT departments to cost-justify and implement long-term programmes of change.

According to Edwin van der Ouderaa, global head of Accenture’s financial services analytics practice, though they operate separately, the risk function and marketing department of a financial institution are fundamentally trying to answer the same question. “They are both trying to understand what the different segments among the customers are,” he explains.

In fact, van der Ouderaa says data that has statistical significance for risk calculations often overlaps considerably with the data that is needed to target marketing campaigns or evaluate the right product for a given customer.

“We’ve done some very large customer segmentation analyses for big banks, where we collect up to 800 different variables and find out what the statistically significant clusters of customers are,” he explains. “When we look back at the data, it is usually only a few tens of variables that make a difference in defining those clusters. And you might find that as many as two-thirds of those variables are the same for risk segmentation as they are for marketing segmentation.”
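Van der Ouderaa's point can be illustrated with a minimal sketch: collect many variables, then screen for the few whose values actually separate the segments. The data, variable names and threshold below are entirely hypothetical, and the crude mean-gap test stands in for a proper significance test.

```python
# Sketch: of many collected variables, only a handful turn out to
# discriminate between customer segments. All data here is synthetic.
import random

random.seed(0)

VARIABLES = [f"var_{i}" for i in range(50)]    # stand-in for the ~800 collected
DISCRIMINATING = {"var_3", "var_7", "var_11"}  # only these truly differ

def make_customer(segment):
    row = {}
    for v in VARIABLES:
        base = 5.0 if (v in DISCRIMINATING and segment == "high_risk") else 0.0
        row[v] = base + random.gauss(0, 1)
    return row

customers = [make_customer("high_risk") for _ in range(200)] + \
            [make_customer("low_risk") for _ in range(200)]
labels = ["high_risk"] * 200 + ["low_risk"] * 200

def mean(xs):
    return sum(xs) / len(xs)

# Screen each variable: how far apart are the two segments' means?
significant = []
for v in VARIABLES:
    a = [c[v] for c, l in zip(customers, labels) if l == "high_risk"]
    b = [c[v] for c, l in zip(customers, labels) if l == "low_risk"]
    if abs(mean(a) - mean(b)) > 1.0:  # crude stand-in for a significance test
        significant.append(v)

print(sorted(significant))  # only the few genuinely discriminating variables survive
```

In a real engagement the clusters would come from an unsupervised method and the screening from a formal test, but the shape of the result is the same: most variables drop out.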

But while both departments analyse what is essentially the same information – who a customer is, what transactions they have conducted, their account history, etc – they define the data in very different ways.

“They might not even share a definition of what a customer is,” he says. “For example, the customer for the risk department might be the breadwinner in a household, whereas for the marketing department it might be the entire family.”
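The breadwinner-versus-household example can be made concrete with a small sketch over hypothetical records: the same raw people data yields different "customers" depending on which definition a department applies.

```python
# Hypothetical household data: risk treats the breadwinner as the customer,
# marketing treats the whole household as the customer unit.
people = [
    {"person_id": 1, "household_id": "H1", "income": 52000},
    {"person_id": 2, "household_id": "H1", "income": 8000},
    {"person_id": 3, "household_id": "H1", "income": 0},
    {"person_id": 4, "household_id": "H2", "income": 31000},
]

# Risk view: one customer per household -- the highest earner.
risk_customers = {}
for p in people:
    h = p["household_id"]
    if h not in risk_customers or p["income"] > risk_customers[h]["income"]:
        risk_customers[h] = p

# Marketing view: the household itself is the customer.
marketing_customers = {p["household_id"] for p in people}

print([p["person_id"] for p in risk_customers.values()])  # individuals only
print(marketing_customers)                                # whole families
```

Joining data across the two views therefore requires agreeing a mapping between definitions, not just matching keys.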

This separation of data models has in the past had some absurd consequences, says Tony Brown, senior finance industry consultant at data warehouse vendor Teradata and a former marketing executive in the banking sector.

“I’ve seen cases where the marketing department would identify thousands of customers to be offered a new product,” he says, “but when those customers accepted the offer, as many as half of them would be rejected by the risk department.”

As well as preventing wasted effort such as this, integrating risk and marketing data allows for risk-based pricing, i.e. pricing a financial product such as a loan based on the risk that the customer will default.

This is by no means a new concept – it is standard practice for large, institutional loans, and it is practically the basis of the insurance industry. However, the desire among banks to invest their capital more efficiently means that their retail functions are making greater use of risk-based pricing.
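In its simplest form, risk-based pricing sets the rate so that it covers expected loss on top of funding cost and margin. The sketch below uses illustrative parameter values (the probability-of-default and loss-given-default figures are assumptions, not real bank numbers).

```python
# Minimal risk-based pricing sketch for a one-year loan: the quoted rate
# covers funding cost, expected loss (PD x LGD), and a profit margin.
def risk_based_rate(pd_1y, lgd, funding_cost=0.02, margin=0.01):
    """Annual rate: pd_1y = probability of default, lgd = loss given default."""
    expected_loss = pd_1y * lgd
    return funding_cost + expected_loss + margin

low_risk = risk_based_rate(pd_1y=0.01, lgd=0.45)   # 2% + 0.45% + 1% = 3.45%
high_risk = risk_based_rate(pd_1y=0.08, lgd=0.45)  # 2% + 3.6% + 1% = 6.6%
print(f"{low_risk:.4f} {high_risk:.4f}")
```

Real pricing models also account for capital costs and portfolio effects, but the principle is the same: the riskier customer pays more.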

There has been some public opposition to risk-based pricing (in fact, it is illegal in the US for certain products) and some banks have made a virtue of not using it. But according to Brown’s colleague James Hunt, “the long-term trend is against that, realistically. Risk-based pricing is getting more common and more sophisticated.”

Beyond risk-based pricing lies risk-based portfolio management, explains KPMG’s Hawkins. “There has been a view that greater profitability comes from winning a greater share of a customer’s wallet – that’s been a driver for the likes of Barclays as they have realised that industry consolidation is starting to squeeze them,” he says.

“But offering more products to a customer may actually expose a bank to greater risk. Integrating the risk and customer data allows them to make decisions about whether taking a greater share of your wallet is a good or a bad risk.”

Hawkins reports that some banks see the integration of customer and risk data as a source of competitive advantage. “Santander is probably the leader in this,” he explains. “If you phone a Santander call centre, the agents will probably have some idea of what kind of risk profile you have.”

Meanwhile, more conservative banks see it as a way to minimise their risk exposure. “Knowing the risk profile of a customer when they call in allows them to know how interacting with that customer will affect their overall position,” he explains.

Integrating risk and finance

Parallel to this integration of risk and customer data is the demand from finance departments to integrate the operational data used in risk calculations into their accounting systems.

Unlike in the case of marketing, however, the datasets that the risk and finance departments use are quite different. While risk analysis is based on frequently updated customer and transaction data, the finance department has traditionally been concerned with aggregating numerical data, principally for periodic financial reports.

According to Stephen Skrobala, senior director for financial services at systems and software giant Oracle, integrating operational data, including risk analyses, into financial systems permits what is known as ‘managerial accounting’.

“Rather than just producing a financial report, which is a snapshot of the balance sheet at one time, the finance department wants to be asking whether the institution is doing enough of the right kind of transactions to achieve its goals,” he explains.

There is also a desire to include risk data in financial reports, in order to provide greater transparency not only to regulators but also to potential investors. “Investors want to see the risk that the bank is exposed to,” he explains. “That is not to say that risk is necessarily bad, but they need to get a handle on what they might be investing in.”

Furthermore, there is a good reason to integrate financial data in risk calculations, Skrobala argues. “You can be taking risks based on the algorithmic calculations, but if you don’t include the finance data, you don’t know if the balance sheet can support that risk,” he explains. “So from a transactional point of view, the risk department might say it’s fine, but the finance department might say ‘this is beyond our limits’.”
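Skrobala's scenario amounts to a two-gate approval: a transaction must clear both the risk desk's own test and a finance-set balance-sheet limit. The sketch below is purely illustrative; the threshold, limit and field names are assumptions.

```python
# Sketch: a deal can pass the risk desk's test yet breach a
# balance-sheet exposure limit set by finance. Figures are illustrative.
RISK_SCORE_THRESHOLD = 0.7            # risk desk: minimum acceptable score
FINANCE_EXPOSURE_LIMIT = 10_000_000   # finance: what the balance sheet supports

current_exposure = 9_500_000

def approve(transaction):
    risk_ok = transaction["risk_score"] >= RISK_SCORE_THRESHOLD
    finance_ok = current_exposure + transaction["amount"] <= FINANCE_EXPOSURE_LIMIT
    return risk_ok and finance_ok

deal = {"amount": 750_000, "risk_score": 0.9}
print(approve(deal))  # risk says fine, but finance says 'beyond our limits'
```

Without integrated data, the risk check would run alone and the deal above would be booked.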

A step-by-step approach to multi-year integration programmes

It appears, then, that allowing the risk, customer and financial divisions to use the same data offers numerous, substantial benefits. How might that be best achieved?

The ideal is a single data repository, with a common data model that serves the requirements of all three departments. But much stands in the way of this ideal.

“Integrating these separate sets of data is not just a question of putting them in tables and joining them up,” explains Teradata’s Hunt. “It may not necessarily be consistent, so this integration will need a lot of work around data quality, data governance and master data management.”
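Hunt's point about inconsistency can be sketched in miniature: the same customer sits in two departmental databases under different spellings, so a naive key join finds no match. The records, normalisation rule and field names below are hypothetical, standing in for a real master data management step.

```python
# Sketch: the same customer recorded differently in two systems must be
# standardised and matched before the data can be joined. All data is made up.
risk_db = [{"cust": "SMITH, JOHN", "dob": "1975-03-02", "pd": 0.04}]
marketing_db = [{"cust": "John Smith", "dob": "1975-03-02", "segment": "affluent"}]

def match_key(record):
    # Crude normalisation: sorted lowercase name tokens plus date of birth.
    tokens = sorted(record["cust"].replace(",", " ").lower().split())
    return ("|".join(tokens), record["dob"])

golden = {}
for rec in risk_db + marketing_db:
    golden.setdefault(match_key(rec), {}).update(rec)

merged = list(golden.values())
print(len(merged))  # 1 -- both rows resolve to one master customer record
```

Production master data management uses far more robust matching, but the principle holds: data quality and governance work comes before the join.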

What’s more, large, multi-year projects to create a single database to support multiple departments have a notorious track record, says Accenture’s van der Ouderaa.

“We aspire to having a single data model, in a single data warehouse, providing a single version of the truth, but I’ve never seen an organisation that has been able to do that at a large scale,” he says. “The problem is that it’s a moving target. If you embark on a five-year programme to build a universal data warehouse, by the time you have finished the requirements will have changed.”

In the current climate of tight cost control, he adds, no organisation will invest hundreds of millions of pounds on a project that may or may not deliver results five years down the line.

Instead, van der Ouderaa argues, organisations must figure out what a unified data model for all departments would look like, and work towards it in stages.

“At Accenture, we have identified three kinds of IT project,” he explains. “First, there is the kind that is self-funding because it is being driven by a specific business need. It is up to the CIO to make sure that these projects support the ideal data model.

“Then there are data warehouse rationalisation projects. If you have multiple instances of Teradata, for example, you can reduce the cost to as little as a quarter by harmonising the versions and consolidating the hardware,” he explains. “Again, the CIO needs to use these projects to work towards the universal data model.”

But while these kinds of project are self-funding, they will not lead to the desired end state on their own, van der Ouderaa says. “There are also infrastructural projects, such as standardising the extract, transform and load (ETL) systems, that are essential for this universal data model but that no-one wants to pay for.”
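Standardising ETL means every source system feeds the warehouse through the same pipeline shape rather than its own ad-hoc loader. A minimal sketch, with entirely hypothetical field names and transformation rules:

```python
# Sketch of a standardised extract-transform-load step: one shared pipeline
# shape that each source system plugs into. Names and rules are illustrative.
def extract(source_rows):
    return list(source_rows)

def transform(rows, rules):
    # Map each source row to warehouse fields via per-field rules.
    return [{target: fn(row) for target, fn in rules.items()} for row in rows]

def load(rows, warehouse):
    warehouse.extend(rows)
    return len(rows)

warehouse = []
branch_feed = [{"acct": "001", "bal_pence": 150000}]
rules = {
    "account_id": lambda r: r["acct"],
    "balance_gbp": lambda r: r["bal_pence"] / 100,  # unify units on the way in
}
load(transform(extract(branch_feed), rules), warehouse)
print(warehouse)
```

The value of the standard is that a new source system only needs a new `rules` mapping, not a new pipeline.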

It is up to CIOs to make the case for this kind of project. “They need to take leadership and tell the business ‘we’re doing this because it’s going to enable all manner of projects that I know you are going to want in five years’ time’.”

That said, van der Ouderaa advises that CIOs exploit self-funding projects to develop the universal data model as far as is possible. “If you look at a data integration programme in that way, the amount that you have to invest in infrastructure in good faith becomes much smaller, maybe just a fifth of the cost,” he says. “It’s much easier for the CIO to do that than to say ‘I need £100 million and five years to do this’.”

What will the impact of this integration be for the financial services sector?

For Teradata’s James Hunt, it is a matter of maintaining the status quo in the face of a growing regulatory burden. “Regulation is becoming more and more of a drain on the sector’s resources, so it needs to become more efficient,” he says. “At the end of all this integration, financial services institutions may only end up being where they are today. But if they don’t do it, they would be in serious danger of becoming very inefficient indeed. And of course, that would be passed on to customers through higher prices.”

But for Oracle’s Skrobala, making risk management more transparent and better integrated is critical for the financial services sector’s ability to attract its most prized resource – capital investment. “Banks and insurance companies are not just competing amongst themselves; they are also competing with other industries for investment capital,” he explains. “If they can demonstrate that they can mitigate risks better and can provide greater transparency than other industries, there is an opportunity to attract more capital.”

Pete Swabey

Pete was Editor of Information Age and head of technology research for Vitesse Media plc from 2005 to 2013, before moving on to be Senior Editor and then Editorial Director at The Economist Intelligence...