The future of robust financial risk management lies in integrated risk management systems in which the various risk-related processes are digitized, so that manual touch points are minimal and the processes are formalized and uniform across the financial institution.
Various technology vendors have upped the ante in providing technology platforms for risk management activities to their financial clients. Yet a disconnect often exists between what risk managers expect and what the technology vendors claim to provide. In this article we try to fill that gap by stating those expectations from a financial risk manager's perspective.
The first step in any risk management process is business understanding. Risk managers should have a clear understanding of the portfolio whose risk they plan to quantify, and of how it fits the company's business objectives.
Any digitized framework should make it easy for risk managers to record their understanding of the business. It should enable them to document discussions with business managers and to map both the business processes feeding the model under development and the processes affected by the model's outputs.
The second step is understanding the data associated with the business process, followed by its cleansing. The digitized framework should:
- Enforce basic properties of the data: for example, a probability of default cannot exceed 1, and stock prices must be greater than zero.
- Help the risk manager triangulate the data back to its origin. For example, a given loan portfolio may have three different models predicting Probability of Default (PD), Loss Given Default (LGD) and Exposure at Default (EAD), developed by separate teams. These teams should be able to ensure that the data they use is consistent. Interestingly, it often happens that a 100B portfolio has 6 different PD models (say 10B, 20B, 30B, 10B, 20B and 20B), 4 LGD models (25B each), and an EAD model catering to a 250B portfolio of which the 100B is a part.
- Record the data-cleansing experience, so that cleansing can potentially be automated as the process becomes standardized.
- Where data is limited, point the risk manager to similar portfolios within the organization whose data may deepen the understanding of the business risk problem.
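A minimal sketch of what such automated data-property checks could look like, in Python. The field names and the sample records are illustrative assumptions, not a prescribed schema:

```python
# Sketch of automated data-property checks on a loan portfolio,
# assuming records arrive as dicts; field names are illustrative.
def validate_record(record):
    """Return a list of rule violations for one portfolio record."""
    errors = []
    if not (0.0 <= record["pd"] <= 1.0):
        errors.append("probability of default must lie in [0, 1]")
    price = record.get("stock_price")
    if price is not None and price <= 0:
        errors.append("stock price must be positive")
    if record["ead"] < 0:
        errors.append("exposure at default cannot be negative")
    return errors

portfolio = [
    {"pd": 0.02, "stock_price": 41.5, "ead": 1_000_000},
    {"pd": 1.20, "stock_price": -3.0, "ead": 500_000},  # two violations
]
issues = {i: validate_record(r) for i, r in enumerate(portfolio)
          if validate_record(r)}
```

A framework could run such rules on every data load and surface the violations to the risk manager before any modeling begins.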
Often a disconnect exists among the risk modeling teams: the team developing the PD model for a portfolio is unaware of the LGD model developed for the same portfolio. As discussed in the second step, a given portfolio may not have its own dedicated PD, LGD or EAD model, which makes the problem even more complex. The risk system should be able to tell, at the data-set level, where and how any data is being used.
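A minimal sketch of dataset-level usage tracking that could answer "where and how is this data used"; the class design and the dataset and model names are illustrative assumptions:

```python
from collections import defaultdict

# Sketch of a dataset-level lineage registry: every model registers
# which dataset it consumes and in what role, so any team can see
# who else is using the same extract.
class DataLineage:
    def __init__(self):
        self._usage = defaultdict(list)

    def register(self, dataset, model, role):
        """Record that `model` consumes `dataset` in the given role."""
        self._usage[dataset].append((model, role))

    def who_uses(self, dataset):
        return list(self._usage[dataset])

lineage = DataLineage()
lineage.register("retail_mortgage_2023Q4", "PD_model_A", "development sample")
lineage.register("retail_mortgage_2023Q4", "LGD_model_1", "development sample")
lineage.register("retail_mortgage_2023Q4", "EAD_model", "calibration")

# The PD team can now see that an LGD team shares the same extract:
users = lineage.who_uses("retail_mortgage_2023Q4")
```

In a real system this registry would sit on top of the data warehouse, but even this simple mapping removes the blind spot between the PD, LGD and EAD teams.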
The third step of model risk management (MRM) is the development, testing and implementation of the model. The choice of methodology is specific to the underlying data and its properties. For simple regression-based models, the digitized framework should provide all the statistical measures relevant to ensuring the regression is sound, and it should warn the risk manager when a spurious regression is possible. Two practical examples we would like to share:
- When data is limited, regression relationships are held hostage by a few outlier observations; their inclusion or exclusion makes or breaks the relationship.
- In the absence of stationarity, statistical measures may falsely suggest that the regression relationship is valid.
There should be a provision to record such experience studies so that future model developers and validators can use them to ensure the robustness of risk models.
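The stationarity pitfall can be demonstrated with a small simulation: regressing one random walk on another, independent one often yields a deceptively high R-squared, which typically collapses once the series are differenced to stationarity. A self-contained sketch, with the sample size and seed chosen purely for illustration:

```python
import random

# Spurious-regression demo: two independent random walks regressed on
# each other. Because neither series is stationary, R-squared is often
# large even though the series are unrelated by construction.
def ols_r_squared(y, x):
    n = len(y)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return (sxy * sxy) / (sxx * syy)

def random_walk(n, rng):
    level, path = 0.0, []
    for _ in range(n):
        level += rng.gauss(0, 1)
        path.append(level)
    return path

rng = random.Random(7)
x = random_walk(500, rng)
y = random_walk(500, rng)           # independent of x by construction

r2_levels = ols_r_squared(y, x)     # often spuriously large
dx = [x[i + 1] - x[i] for i in range(len(x) - 1)]
dy = [y[i + 1] - y[i] for i in range(len(y) - 1)]
r2_diffs = ols_r_squared(dy, dx)    # near zero once series are stationary
```

A framework that runs this kind of check automatically, for example by comparing the fit on levels against the fit on differences, could flag exactly the false validity described above.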
When models are simply implementations of standard methodologies, they are easily built into risk systems. But it is also common for a model methodology to represent a business process, and developing such models entirely within a system is a challenge.
Such models require advanced tools like R, MATLAB, SAS or C++. The digitized framework should be robust enough to ensure that:
- Data inputs and outputs are easily imported into and exported out of these programming languages.
- It supports automated execution of code written in various programming languages.
It is also the developers' responsibility, when they code the model externally, to make it robust enough to run in an automated fashion with minimal human intervention. They should follow coding best practices, such as avoiding hard-coded numbers and writing the model in a single language of their choice.
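One way such automated execution could look, sketched in Python: an externally coded model exchanges data as JSON and is run as a subprocess with no manual touch point. Here a tiny Python script stands in for code that might really be delivered in R or SAS, and the toy scorecard formula is purely illustrative:

```python
import json
import os
import subprocess
import sys
import tempfile

# Stand-in for an externally coded model: reads loans as JSON on stdin,
# writes PDs as JSON on stdout. The 0.01 + 0.05 * LTV scorecard is a
# made-up illustration, not a real model.
MODEL_SOURCE = """
import json, sys
loans = json.load(sys.stdin)
out = [{"id": l["id"], "pd": min(1.0, 0.01 + 0.05 * l["ltv"])} for l in loans]
json.dump(out, sys.stdout)
"""

def run_external_model(source, payload):
    """Write the delivered model code to disk and run it hands-free."""
    with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
        f.write(source)
        path = f.name
    try:
        proc = subprocess.run(
            [sys.executable, path],
            input=json.dumps(payload),
            capture_output=True, text=True, check=True,
        )
        return json.loads(proc.stdout)
    finally:
        os.unlink(path)

results = run_external_model(MODEL_SOURCE, [{"id": 1, "ltv": 0.8}])
```

The key design point is the contract, structured input in, structured output out, which is what lets the framework schedule and rerun externally coded models without anyone opening the code.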
The fourth step is model documentation. This is the most important aspect of risk management and probably where a digitized framework can add the most value. The digitized framework should:
- Allow the documentation to be developed in parallel with the work itself, from business understanding through to the final step of the model monitoring process.
- Thereby serve as a live representation of the status of the risk management exercise for any senior manager or leader.
The next step in MRM is model validation. There are two schools of thought in the financial risk management industry around model validation:
- Model validation starts after model development is complete.
  - Benefits: Ensures that model validation is independent of model development.
  - Limitations: It is a sequential and hence time-consuming process, even more so when the model is failed by the validators or later by auditors.
- Model validation runs in parallel with model development.
  - Benefits: Model validators take part in discussions from the very beginning and give independent opinions on the modeling process in an agile fashion, so the whole risk measurement exercise is completed in a shorter duration.
  - Limitations: A higher possibility that the independence of validation from development is compromised.
In both scenarios, the digitization engine should ensure that validators can independently critique the business process, test the model for conceptual validity, and perform their own back-testing and sensitivity analysis.
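As one example of an independent back-test such a framework could support, here is a sketch of a standard binomial calibration check (using the normal approximation) that a validator might run on a rating grade. The grade PD, obligor count and default count are illustrative numbers:

```python
import math

# Binomial back-test of PD calibration for one rating grade: compare
# observed defaults against the count implied by the model PD, using
# a normal approximation to the binomial distribution.
def pd_backtest(pd_model, n_obligors, n_defaults, z_crit=1.96):
    expected = pd_model * n_obligors
    std = math.sqrt(n_obligors * pd_model * (1 - pd_model))
    z = (n_defaults - expected) / std
    # |z| beyond the critical value suggests miscalibration at ~5% level
    return {"z": z, "reject": abs(z) > z_crit}

# Grade with model PD of 2%, 1,000 obligors, 35 observed defaults:
result = pd_backtest(0.02, 1000, 35)
```

Because the test needs only model PDs and realized defaults, a validator can run it from the framework's stored data without touching the development code, preserving independence in either school of thought.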
Once the model is validated and implemented, the digitized framework should support ongoing monitoring of the model by enabling risk managers to:
- Keep track of model outputs, back-test results and sensitivity results.
- Compare these results with those of similar models in the company, and hence draw inferences about the business and any model risk.
- Use the historical inputs and outputs to measure model sensitivity to risk factors and to recognize patterns using advanced analytics.
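One common monitoring statistic the framework could compute automatically is the population stability index (PSI), which compares the current score distribution against the development sample. A minimal sketch; the bucket shares are illustrative, and PSI above roughly 0.25 is conventionally read as a significant shift warranting review:

```python
import math

# Population stability index (PSI): compares the share of the portfolio
# in each score bucket today against the shares at model development.
def psi(expected_shares, actual_shares, eps=1e-6):
    # eps guards against log(0) when a bucket is empty
    return sum(
        (a - e) * math.log((a + eps) / (e + eps))
        for e, a in zip(expected_shares, actual_shares)
    )

dev_shares = [0.10, 0.20, 0.40, 0.20, 0.10]  # score buckets at development
cur_shares = [0.05, 0.15, 0.35, 0.25, 0.20]  # same buckets today
drift = psi(dev_shares, cur_shares)
```

Tracked over time for every model, a statistic like this gives the framework an objective trigger for escalating a model to review, rather than relying on ad hoc judgment.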
Apart from smoothing the process of model risk management, the biggest impact digitization is bringing, and will bring, is a sense of discipline and a degree of uniformity and standardization in the management of financial risk across the industry. This will make it easy for financial institutions to share best practices in risk management. Standardization will also make it possible for banks to share relevant risk experience at "arm's length" with peers, for example:
- Experience related to operational risk.
- Experience related to Loss Given Default, for industry benchmarking.
- Experience related to money laundering so that strong and robust anti-money laundering strategies could be developed.