Risk has gone through three distinct phases over the last 30 years. The changes have reflected both the external focus on the financial services industry from regulators and the way that technology has changed what’s possible. It’s a process that’s likely to continue as the industry strives to achieve a balance between hazard and opportunity, suggests Anthony Pereira, founder and CEO of Percentile, a specialist in risk management and regulatory compliance technology.
In the late eighties and early nineties, when you talked about risk to a financial institution, the chances are you’d be focusing on credit risk: the potential that a trading partner or counterparty might not be able to meet their obligations and leave a bank with bad debt. Recognising the threat to their bottom lines, most financial institutions became adept at developing mechanisms and systems to warn them about potential issues, and, broadly speaking, the industry arrived at an effective way of quantifying trust.
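In its simplest form, that quantification of trust boils down to an expected-loss calculation: probability of default times loss given default times exposure at default. A minimal sketch, with every figure assumed purely for illustration:

```python
# Illustrative expected-credit-loss calculation for a single counterparty.
# All inputs are assumed example values, not real data.
pd_default = 0.02      # probability of default over one year
lgd = 0.45             # loss given default, as a fraction of exposure
ead = 10_000_000       # exposure at default

# Expected loss = PD x LGD x EAD; here roughly 90,000
expected_loss = pd_default * lgd * ead
print(f"Expected loss: {expected_loss:,.0f}")
```

Real credit risk systems estimate these inputs from ratings, collateral and exposure models, but the basic arithmetic is the same.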
Around the turn of the century, the focus moved from credit risk to market risk, and the potential for adverse market movements to expose positions and cause losses. Again, financial institutions became proficient at ensuring that trading positions were balanced, taking a wider view of their markets to minimise the potential that they could be caught with an exposed position. While some traders would complain at the time that this reduced their room for manoeuvre, most now accept that it’s improved the financial services industry’s stability.
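One common way institutions came to quantify exposure to adverse market movements is value-at-risk. A minimal historical-simulation sketch, using randomly generated P&L in place of a real trading book:

```python
import numpy as np

# Hypothetical daily P&L history for one trading book (simulated, illustrative).
rng = np.random.default_rng(42)
daily_pnl = rng.normal(loc=0.0, scale=1_000_000, size=250)  # 250 trading days

# One-day 99% historical-simulation VaR: the loss exceeded on only
# the worst 1% of past days (reported as a positive number).
var_99 = -np.percentile(daily_pnl, 1)
print(f"1-day 99% VaR: {var_99:,.0f}")
```

A real implementation would revalue actual positions against historical market moves; the point of the sketch is only that a single, comparable number can summarise an exposed position.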
The era of operational risk
We are now in the era of operational risk, where measurement has become so finely calibrated that risk analysis focuses on the potential for actual losses to differ significantly from forecast losses: losses arising from what Basel II defines as “inadequate or failed internal processes, people and systems, or from external events”.
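The gap between forecast and actual losses is often framed as expected loss (the average year) versus unexpected loss (a severe year, taken at a high quantile). A toy simulation sketch, not a regulatory model; the frequency and severity distributions and all parameters are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy operational loss model: number of loss events per year ~ Poisson,
# severity of each event ~ lognormal (both choices purely illustrative).
n_years = 10_000
annual_losses = np.array([
    rng.lognormal(mean=10, sigma=2, size=rng.poisson(5)).sum()
    for _ in range(n_years)
])

# Expected loss: the average simulated year.
expected_loss = annual_losses.mean()

# "Unexpected loss": a 1-in-1,000 year, over and above the expected year.
unexpected_loss = np.percentile(annual_losses, 99.9) - expected_loss
print(f"Expected: {expected_loss:,.0f}  Unexpected: {unexpected_loss:,.0f}")
```

The heavy tail is the point: a handful of severe years dwarf the average, which is why operational risk analysis concentrates on the divergence rather than the forecast itself.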
Operational risk encompasses the entire institution, covering activities far beyond the trading desk. This evolution reflects both changes in focus and the growing capabilities of institutions to understand themselves and where they’re exposed.
Several things have made it possible to understand risk at an operational level, but one of the main factors is the availability, flexibility and credibility of technology.
Watching the horizon
When computers first started making their way onto trading desks, tools were developed around specific individual needs. This suited the silo approach that was prevalent in many institutions 25 years ago.
With time, however, systems have been developed that can support multiple products. And because they support multiple products, they can also report across an institution, making it possible to get a far broader view of risk.
There are now also discussions about the potential to create industry utilities that pool data, which could provide an even wider view of risk. Whether these amount to anything is difficult to say at this stage, but the fact that they are even being talked about shows how far the industry has come since the early nineties.
Embracing the future
Whether the driver is regulation or competitiveness, technology is the key to staying ahead of the curve. Machine learning and Artificial Intelligence are making great strides and are slowly being adopted by the industry. We believe AI as Augmented Intelligence will be the key to risk management success: the hard graft of data crunching is performed by machines, while humans are empowered to make better decisions based on insights surfaced by transparent algorithms.
There will always be risk, of course; without risk, there is no opportunity. But what the industry can achieve with the right mix of technology, enthusiasm, innovation and collaboration is far greater, and far safer, than anything we can achieve by working alone.