Ageing systems affecting your operational risk

By Graeme Wood | 25 June 2014

In recent years, financial institutions and regulators alike have invested a great deal of time and effort in tightening their risk monitoring and reporting capabilities. The importance now afforded to risk management is no coincidence: all parties agree that a return to the dangerous levels of market-wide exposure experienced in late 2008 would be devastating. Improvements have undoubtedly been made as market-leading institutions embrace a progressive approach, one that questions the traditional view of risk management as an end product and instead extracts value from effective, proactive practice.

Despite firms’ best intentions, barriers to effective operational risk management abound. Most institutions operate with a system architecture that has evolved and grown ad hoc over the years, typically to facilitate trading. The proliferation of incompatible, ageing systems has made it impossible to gain a holistic view of exposure across all departments, an obvious source of operational risk. This fragmentation did not occur overnight but was born of years of modifying multiple legacy systems rather than retiring them and implementing new solutions: the need for agility and speed to market took precedence over control, and systems were updated at disparate intervals. The result is information sitting in silos that cannot easily be accessed or compared across departments.

This issue was compounded by the pressure on so many organisations to cut costs, which led to the widespread offshoring of functions and a simultaneous reduction in highly skilled onshore experts. Carving up functions in this way had far-reaching implications: with personnel scattered far and wide, processes became over-complicated and individuals took ownership only of their small section of the workflow. Much like the ageing systems, then, a fundamental concern is the lack of key personnel with a truly holistic view of the organisation’s risk profile. This realisation has inevitably led to the emergence of ‘back-shoring’, where firms reverse decisions on which roles are suitable for offshore locations based on their overall exposure.

Although attention has turned sharply to the shortcomings of disparate, ageing systems, they are certainly not the only reason for some of the landmark losses banks have incurred in recent years. Such incidents have called into question the effectiveness of many common operational risk management practices, exposing in particular a failure to embed a culture of risk and control across functions, which leads to frequent and severe inter-departmental discrepancies. Often, where processes have been approved and deployed, they merely give managers a false sense of security. Perceiving a weakness, managers then feel ‘the need to act’ and respond by deploying additional tactical controls, which usually have negative long-term impacts.

Transforming the siloed systems that currently leave firms exposed, whilst simultaneously effecting significant change to the business process model, is a costly endeavour, so firms need to be proactive in how they make decisions. With change budgets largely consumed by regulatory compliance, firms face a difficult choice over how much of this transformation they can afford.

The changing regulatory landscape has also pushed firms to adapt their risk assessment methodology to encompass a far greater range of inputs and quantitative measures. These typically include tailored scoring of the design and performance characteristics of controls, producing much more tangible results: a combination of reduced residual risk and increased operating efficiency becomes the basis for prioritisation. There is clearly a cost-benefit analysis to consider, and much of this ‘new data’ is still being sourced manually, but decisions are at least being based on information rather than instinct.
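
By way of illustration, the hypothetical Python sketch below shows one way such a scoring scheme might work: control design and performance are each rated, combined into an effectiveness factor, and used to discount inherent risk, with the remaining residual risk driving prioritisation. The scales, weightings and control names are invented for the example rather than drawn from any particular framework.

```python
from dataclasses import dataclass

@dataclass
class ControlAssessment:
    """A single control's scored characteristics (illustrative 1-5 scales)."""
    name: str
    inherent_risk: float      # exposure before the control is applied
    design_score: float       # how well the control is designed (5 = strong)
    performance_score: float  # how reliably it operates in practice (5 = strong)

def residual_risk(a: ControlAssessment) -> float:
    """Discount inherent risk by control effectiveness.

    Effectiveness is the average of design and performance, normalised to 0-1,
    so a perfectly designed and performing control leaves no residual risk.
    """
    effectiveness = (a.design_score + a.performance_score) / 2 / 5
    return a.inherent_risk * (1 - effectiveness)

# Prioritise remediation by residual risk, highest first.
controls = [
    ControlAssessment("trade reconciliation", inherent_risk=4.5,
                      design_score=4, performance_score=2),
    ControlAssessment("access review", inherent_risk=3.0,
                      design_score=3, performance_score=3),
]
for c in sorted(controls, key=residual_risk, reverse=True):
    print(f"{c.name}: residual risk {residual_risk(c):.2f}")
```

The point of the sketch is that the prioritisation criterion is explicit and repeatable: remediation effort goes first to wherever the most residual risk remains, rather than to whichever issue was raised most loudly.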

Underlying many of these challenges is the issue of data integrity. Organisations struggle with this on many levels and, because of operational risk’s pervasive nature, it has traditionally been captured as a series of historical data points supported by trending analytics. Whilst this form of data gathering clearly has a role to play, I would argue that operational risk management should be the starting point rather than an end product. With regulators pushing for increased accountability for failure, some banks are eagerly embracing the opportunity to conduct a thorough review of their operating models. It is no longer acceptable for senior managers to claim ignorance and renounce responsibility on the basis that they were not alerted to a particular risk.

The ultimate objective for all risk departments is to be able to confidently report an automatically generated single risk exposure number for any given counterparty, across all business areas and products, at a given point in time. Although few organisations can currently achieve this, many have it clearly in their sights and are actively working towards it.

To achieve this, a firm will need to create a metrics dashboard that receives and processes the currently inconsistent data from all departments. Before this can be done, interdepartmental data must be analysed so that relevant, universal trends can be identified and highlighted by the dashboard. Once that groundwork is in place, centralised risk management becomes a reality: firms aggregate data from different business areas to gain a clear, enterprise-wide view of risk exposures across the organisation. This is supported by the emergence of ‘big data’ technologies, which ease the challenge of accessing and aggregating information from hundreds of different computer systems in order to feed the risk calculators. All of this is in line with the proactive, preventative approach to risk management that firms should aspire to if they are seeking efficiency improvements and reductions in errors and losses.
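
As a minimal sketch of that aggregation step, assume each department can already export its positions as records keyed by a shared counterparty identifier; the feeds, identifiers and figures below are hypothetical.

```python
from collections import defaultdict

# Hypothetical departmental feeds, already normalised to a common schema of
# (counterparty_id, exposure). In practice each would come from a different
# legacy system and first need mapping onto shared counterparty identifiers.
fx_desk = [("CPTY-001", 1_200_000.0), ("CPTY-002", 450_000.0)]
rates_desk = [("CPTY-001", 800_000.0), ("CPTY-003", 300_000.0)]
equities_desk = [("CPTY-002", 150_000.0)]

def aggregate_exposure(*feeds):
    """Sum exposures per counterparty across all departmental feeds,
    yielding a single enterprise-wide number per counterparty."""
    totals = defaultdict(float)
    for feed in feeds:
        for counterparty, exposure in feed:
            totals[counterparty] += exposure
    return dict(totals)

snapshot = aggregate_exposure(fx_desk, rates_desk, equities_desk)
for counterparty, total in sorted(snapshot.items()):
    print(f"{counterparty}: {total:,.0f}")
```

The summation itself is trivial; the hard work, as noted above, lies in the normalisation that precedes it, mapping each legacy system’s own counterparty codes and product conventions onto the shared identifiers the dashboard depends on.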

There is no doubt that progress has been made in recent years, with many financial institutions having significantly improved their risk monitoring and reporting capabilities. Despite this, organisations cannot be complacent. Firms continue to face challenges, and most would agree that the upfront cost of implementing a ‘fit for purpose’ risk capability represents good value when compared with the market and reputational costs of a failure. There is no simple answer to the operational risk challenge, but by combining smarter systematic solutions with a stronger culture of awareness around potential issues, firms can lessen the likelihood of finding themselves at the heart of the next investment banking media scandal.

By Graeme Wood, Specialist in Operational Risk and Process at Rule Financial
