The Cato Summit on Financial Regulation – June 2 2015

On June 2, 2015, I attended a summit in New York City titled Capital Unbound: The Cato Summit on Financial Regulation.

I would like to present a brief summary of the summit.

The summit was about the industry's response to the challenges that current regulatory policies pose to financial institutions. It was a philosophical critique of the SEC's actions and their counterproductive impacts.

The speakers argued that, because of regulatory policies, the current growth was late to arrive and that there is no guarantee it will be sustainable.

Key points:

  • The reason the last recession arrived was the irresponsible underwriting of housing mortgages to subprime customers. This underwriting practice was started by Fannie Mae and Freddie Mac and then followed by the private players, so the fall in house prices created a systemic challenge. Because both organizations were government sponsored, it was the government that started this trend.
  • The speakers claim that the government (the Fed, the SEC, etc.) has not addressed this key issue but has instead rigorously imposed risk limitations on other commercial and consumer lending. They also claim that its interference in banks' trading books has severely limited business viability.
  • They claim that government policies are creating opportunities for even more SIFIs to emerge while limiting their activities, and that smaller banks are becoming more and more vulnerable because of the burden of regulatory compliance. They claimed that this places severe limitations on banks acting as small-business lenders and thus hampers the long-term growth of the country.
  • The speakers claimed that it would have been better to let the failing banks fail; in their view, that would have resulted in a stronger financial system.

The key speakers were:

  • Commissioner J. Christopher Giancarlo, U.S. Commodity Futures Trading Commission
  • Kevin Dowd, Professor of Finance & Economics, Durham University and Adjunct Scholar, Cato Institute
  • Commissioner Michael Piwowar, U.S. Securities and Exchange Commission

Further details can be found at: http://www.cato.org/events/capital-unbound-cato-summit-financial-regulation

Books provided at the summit:

As I said earlier, these comments were more rhetorical views given the limited time span of the summit; to explain their convictions, the organizers provided several complimentary books. The books I was able to collect are the following:

  • The Leadership Crisis and the Free Market Cure: Why the Future of Business Depends on the Return to Life, Liberty, and the Pursuit of Happiness

http://www.amazon.com/Leadership-Crisis-Free-Market-Cure/dp/0071831118/ref=sr_1_1?ie=UTF8&qid=1433345037&sr=8-1&keywords=1%29%09The+Leadership+crisis+and+the+free+market+cure

  • The Financial Crisis and the Free Market Cure: Why Pure Capitalism is the World Economy’s Only Hope

http://www.amazon.com/Financial-Crisis-Free-Market-Cure/dp/0071806776/ref=sr_1_2?ie=UTF8&qid=1433345037&sr=8-2&keywords=1%29%09The+Leadership+crisis+and+the+free+market+cure

  • The Libertarian Mind: A Manifesto for Freedom

http://www.amazon.com/Libertarian-Mind-Manifesto-Freedom/dp/1476752842/ref=sr_1_1?s=books&ie=UTF8&qid=1433345183&sr=1-1&keywords=the+libertarian+mind

  • Reckless Endangerment: How Outsized Ambition, Greed, and Corruption Created the Worst Financial Crisis of Our Time

http://www.amazon.com/Reckless-Endangerment-Outsized-Corruption-Financial/dp/1250008794/ref=sr_1_1?s=books&ie=UTF8&qid=1433345228&sr=1-1&keywords=reckless+endangerment

I have checked the reviews of these books on Amazon; all are rated 4.5/5.

PRMIA Model Risk Management Conference, NYC, Oct 15, 2015

I attended a model risk management conference organized by PRMIA:

http://www.prmia.org/civicrm/event/info?id=6716&reset=1

It was a fairly general conference on model risk management, not focused on any specific agenda. I would like to list the key points that were discussed:

  1. Documentation of models is a key challenge for model risk. There are many reasons for the lack of proper documentation of models, namely:
    1. Model developers are quants and are not interested in documenting their models, because documentation does not interest them.
    2. Business owners do not want to document the business activity that translates into a model, because they do not want to disclose their business strategies; once documented, those strategies would be in the public domain.
  2. There should be a specific team whose responsibility is documentation.
  3. The panel stressed the importance of translators: people who understand the business, quant, and IT aspects. They coordinate with the respective teams and hence limit model risk.
    1. This will give rise to the role of consultants.
  4. Model risk rises due to a lack of understanding of the ecosystem. When two banks merge, in addition to merging balance sheets they should also merge their models. There was an incident where two banks merged and two of their divisions were selling the same derivative at two different prices. This created an arbitrage opportunity within the bank.
  5. It is observed that when banks fail a regulatory test (CCAR), they make huge investments in hiring quants for model development and model validation. Ideally they should hire more business professionals who understand their business and can streamline their processes. This streamlining will help robust model development and hence limit model risk.
  6. Model risk also arises because risk managers do not ask enough questions when they receive risk numbers from the business. There needs to be a cultural shift where risk managers insist on explanations of the numbers the business provides.
  7. Ongoing monitoring is very important for keeping model risk in check. Best practices need to be developed to handle these processes. Banks can:
    1. Automate the whole process: this will give rise to new model risk.
    2. Transfer the process to countries where costs are low.
    3. Prepare dashboards which alert the stakeholders when the model is not performing on expected lines.
  8. The irony is that banks invest in risk management when they fail a regulatory test and not when they pass, which itself gives rise to model risk.

My opinion

I have an opinion on one point: business owners being afraid of disclosing their strategies.

Being in business because you have a secret sauce is a thing of the past now. The winners will be those who understand their customers well.

 

Update: Oct 16, 2016

An interesting article by one of the speakers, Martin Goldberg:

https://www.linkedin.com/pulse/all-model-risk-comes-from-models-martin-goldberg-ph-d-?trk=hp-feed-article-title-like

 

Simulation of a Non-Normal Distribution in Simple Steps

In risk management, the assumption about the data distribution is important because risk managers use it to compute the required risk numbers (especially Value at Risk and Potential Future Exposure). Accurate calculation of these numbers ensures robust risk management in the organization.

The assumption of a normal distribution is common for many data series, for example return series, because of business judgment.

Unfortunately, in reality most data series are far from normal, even though by theoretical intuition they should follow normal distributions (for example, the distribution of price returns). These series demonstrate high skew and kurtosis, due to which statistical tests for normality fail. There is also a limit to the availability of data, which prevents the risk practitioner from using a historical simulation approach to come up with the required risk number. Under such circumstances, the practitioner has to rely on a Monte Carlo simulation approach to represent the actual data distribution by a simulated distribution.

This issue is widely researched in academia, and good practical solutions have been proposed; but most of those research papers are quite mathematical in their discussion and pose a challenge for any practitioner who has not received rigorous training in a quantitative subject.

In this article, without going into the mathematical rigor, we present a simple, step-by-step approach to developing a non-normal distribution using Monte Carlo simulation. By simple we do not mean that we simplify any of the mathematics, but rather that we refrain from going into the details of the proofs. Any reader interested in the details can read the papers mentioned in the references of this article. Our purpose is that, after reading this article, a practitioner with a little quant background should be able to understand and then develop his own non-normal distribution.

In the next section we present the methodology to generate the non-normal distribution, using a non-linear (cubic) transformation. This involves an optimization exercise, so in the subsequent section we provide SAS code that derives the parameters which generate the required skew and kurtosis values. We then summarize the steps in a simple manner.

For users who do not have SAS, we provide a table (Appendix B) of the parameters needed for the transformation for various values of skew and kurtosis.

Methodology

Fleishman (1978) suggested a method for generating non-normal variables with specified values of skew and kurtosis. The idea is simple: generate a random standard normal deviate X and compute

Y = a + bX + cX² + dX³     (equation 1)

Here Y will have mean 0, variance 1, and the required skew and kurtosis. However, the difficult part is to determine a, b, c, and d such that Y has the specified values of mean, standard deviation, skew, and kurtosis. Given that X is standard normal, the expected value of Y will be

E(Y) = a + b·E(X) + c·E(X²) + d·E(X³)     (equation 2)

Since E(X) = 0, E(X²) = 1 and E(X³) = 0, this reduces to

E(Y) = a + c     (equation 3)

As discussed, E(Y) = 0; hence a = −c. The transformation now becomes

Y = −c + bX + cX² + dX³     (equation 4)

For Y to have unit variance and the required skew and (excess) kurtosis, b, c, and d must satisfy the following three equations:

b² + 6bd + 2c² + 15d² = 1
2c(b² + 24bd + 105d² + 2) = skew
24[bd + c²(1 + b² + 28bd) + d²(12 + 48bd + 141c² + 225d²)] = kurtosis     (equation 5)

But before going further, please ensure that the combination of skew and kurtosis falls in the feasible region of the power method (the dotted region in the graph below); for other combinations this methodology is not appropriate.

[Figure: feasible skew–kurtosis region for Fleishman's power method]

The boundary of this region, as reported in the references, is approximately

kurtosis ≥ −1.2264489 + 1.6410373 × skew²     (equation 6)
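As a quick illustration of this check, using the approximate boundary in equation 6: for a target skew of 0.5, the excess kurtosis must be at least −1.2264489 + 1.6410373 × 0.5² ≈ −0.82, so a target pair such as skew 0.5 and kurtosis 1 is admissible.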

Summarizing Simulation Steps
Suppose the user wants to generate a non-normal random distribution with mean (μ), standard deviation (σ), and given skew and kurtosis. He/she should follow these steps:
1) Confirm that the skew and kurtosis combination falls in the feasible region (the graph above, or equation 6).

2) Derive the values of the parameters b, c, and d using the SAS code in Appendix A or from the table (link in Appendix B).

3) Generate X as a standard normal deviate.

4) Derive Y using equation 4 (equation 1 with a = −c).

5) Get the required random numbers as Z = μ + σ × Y (see the SAS sketch below).
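To make steps 3–5 concrete, here is a minimal SAS sketch. The values of mu, sigma, b, c, and d below are placeholders: b, c, and d must come from the Appendix A optimization or the Appendix B table for your target skew and kurtosis (the placeholder values b = 1, c = 0, d = 0 simply reproduce a normal distribution).

/* Steps 3-5: simulate Z with the required moments */
data nonnormal;
   call streaminit(12345);            /* fix the seed for reproducibility */
   mu = 0; sigma = 1;                 /* required mean and standard deviation */
   b = 1; c = 0; d = 0;               /* replace with coefficients from Appendix A/B */
   a = -c;                            /* from equation 3 */
   do i = 1 to 100000;
      x = rand('NORMAL');             /* step 3: standard normal deviate */
      y = a + b*x + c*x**2 + d*x**3;  /* step 4: equation 4 */
      z = mu + sigma*y;               /* step 5: rescale */
      output;
   end;
   keep z;
run;

/* Verify that the simulated moments match the targets */
proc means data=nonnormal n mean std skew kurt;
   var z;
run;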

Conclusion

The assumption of normality is simple and intuitive for data modeling exercises. But in spite of sound theoretical intuition (a typical example is that asset returns are assumed to be normally distributed), real data series fail statistical tests of normality because of skewness and fat tails (kurtosis). Researchers have made a lot of progress in incorporating the skew and kurtosis of the data, but most of the papers describing the process are mathematically overwhelming for practitioners and business professionals. We by no means undermine the importance of mathematical rigor; we believe that as long as the right references are known and available, a business professional should be able to get the assignment done correctly without a detailed understanding of the rigor. Hence, in this article, without including the mathematical rigor, we show practitioners how they can implement their required non-normal distribution without losing conceptual soundness.

References

1. Fleishman, A. I. (1978). A method for simulating non-normal distributions. Psychometrika, 43(4):521-532.

2. Luo, H. (2011). Generation of Non-normal Data – A Study of Fleishman's Power Method.

Appendix A: SAS Code

proc nlp tech=trureg outest=res;

min y;   /* minimize the sum of squared residuals */

decvar b c d;   /* the Fleishman coefficients to solve for */

/* Residuals of the three conditions in equation 5. Replace RequiredSkewValue
   and RequiredKurtosisValue with the target numeric skew and excess kurtosis
   before running. */

f1 = b*b + 6*b*d + 2*c*c + 15*d*d - 1;

f2 = 2*c*(b*b + 24*b*d + 105*d*d + 2) - RequiredSkewValue;

f3 = 24*(b*d + c*c*(1 + b*b + 28*b*d) + d*d*(12 + 48*b*d + 141*c*c + 225*d*d)) - RequiredKurtosisValue;

y = f1*f1 + f2*f2 + f3*f3;

run;
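For example, to target a skew of 0.75 and an excess kurtosis of 1.0, substitute 0.75 for RequiredSkewValue and 1.0 for RequiredKurtosisValue; the fitted b, c, and d are written to the output data set res. If the optimizer stalls at the default starting point, supplying starting values for b, c, and d in the DECVAR statement usually helps.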

Appendix B:

Fleishman’s Power Method Coefficient table
Table: https://chandrakant721.wordpress.com/fleishmans-power-method-coefficient-table/

Ways to look at Valuation vs Pricing

I would like to share some perspectives on valuation vs pricing:

Definition wise

Valuation: The valuation of a portfolio is the discounted value of its future cash flows. Accurate knowledge of these cash flows is important, but the future cash flows are projected assuming current economic scenarios. Value depends largely on utility, and utility is stable most of the time; hence value is stable. It is assumed that whatever relationship the portfolio holds with external factors today will continue to hold in the future.

Pricing: Price, as discussed, depends on current demand and supply, which is very much dependent on current circumstances.

Held for Sale and Held to Maturity wise

In the simplest terms, assume we have bought a new car. The car's price is 20,000 dollars.

From the pricing perspective: once we drive out of the showroom and try to sell the car, it won't fetch 20,000 dollars, because any new buyer will consider two scenarios: first, that the car has some defect; second, that the car is perfect. Each scenario has some probability, so from a risk-neutral perspective the buyer would want a discount on the money he pays, to be indifferent to the risk that the car has a defect.
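As a purely hypothetical illustration with made-up numbers: if the buyer assigns a 10% probability to a hidden defect that would cost 5,000 dollars to fix, a risk-neutral buyer would pay at most 20,000 − 0.10 × 5,000 = 19,500 dollars for the car.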

From the valuation perspective: if we do not intend to sell the car, it is still worth 20,000 dollars to us, because we need the car, and if it were taken away from us we would need to replace it with an identical one costing 20,000 dollars. Also, we know the car's history and that it is perfect, so from our perspective the car is worth 20,000 dollars.

Value and price in the context of financial derivatives

Another way to look at price and value is to compare model prices (~value) with actual traded prices.

Models in risk and front-office systems use certified methodologies for pricing financial instruments. However, while model prices act as a benchmark, market prices are only finalized after rigorous negotiations. The level of urgency and the current demand/supply play an important part in the final negotiated price, which may not reflect the observed market parameters a model uses for pricing.

Pricing systems use data from market data providers to price the securities in the portfolio, but there are two major limitations on the data they provide: (1) providers try to capture all the known deals in the market and derive market parameters from that information; and (2) the market parameters depend on the immediate demand and supply, which may not be reflective of a bank's portfolio size when marking the portfolio to market.

Regarding the first limitation: to keep themselves competitive, banks often do not disclose the deals they offer their prime customers. So even though data providers' reach into the market is deep, they still do not have complete knowledge of the market.

The second limitation, meanwhile, makes it challenging for market participants to know the exact price of an existing security, unless a bank decides to go to the market to sell/buy it. Indeed, there is a fair chance that the realized price of an instrument will differ from the model price.

I will keep updating this discussion as I collect/understand/gather more insights.

Management Challenges in a Model Validation Exercise

In financial risk management, small models are primarily those which cater to a low portfolio size in comparison to the total portfolio of the company. On the other side, big models:

  1. Cater to a huge chunk of the company's portfolio and hence have high visibility with senior management.
  2. Are generally network models, that is, their inputs are the outputs of other models, so those input models come into the validation scope of the main model.

When big models are to be validated, a team of validators is constructed. This is an ad hoc team assembled from personnel from different teams. The members work together for the requisite time and then go back to their respective teams. It is common that these members have other responsibilities which they have to attend to while validating the model at hand.

Generally the team is composed of personnel who are subject matter experts (SMEs) of the business the model caters to, or PhDs or masters in a quantitative field with many years of business experience. Such a team, even though composed of highly qualified and mature individuals, has its own management challenges. In this article I would like to discuss some of them.

Deciding the team structure: The selection of two members is extremely critical for the success of the validation exercise: the lead validator and the SME of the business the model caters to.

The lead validator has to be not only a functional expert (in quantitative and risk management concepts) but also a people manager. The SME has to be a business expert and also an expert in quantitative techniques or in risk management; it is rare that an SME is an expert in all three: business, quantitative techniques, and risk management. Again, the SME should be a person who gels well with the team, because most of the time he will be the only go-to person for any business-related understanding.

The team can be composed of bank employees and external consultants. The lead validator needs to ensure that the roles and responsibilities of each individual are decided at the very beginning. The most important responsibility is documentation of the report. He may ask each analyst to document his own work and finally consolidate it, or assign someone in the team to keep track of all the analysis.

Another challenge is finding team members who have worked on the same business and class of models. Some classes of models are similar; for example, suppose there is a portfolio "X": one model is about its valuation, another about stress testing it. Ideally the validation team for these models should be the same because of the similarity in understanding required, but this is not always possible.

Limiting the number of meetings: Meetings are important for sharing knowledge, analysis, and ideas, and for brainstorming. Meetings with model developers are also very important. But excess meetings result in wasted time. The challenges are:

  1. Deciding who should attend which meeting: involving every member in every meeting is a clear waste of time and energy. On the other side, it is important that every member has clarity on the bigger picture so that they can provide relevant analysis and perspective.
  2. The lead validator needs to ensure that the model developers' time is not wasted by different validators asking them the same questions multiple times. Asking the same question repeatedly not only wastes the developers' time but also creates an impression that the validation team lacks consensus, reducing its credibility.
    • The discipline of maintaining a written record of all communication needs to be followed.

Adherence to timelines: There are occasions when some other project/model comes in with higher priority and requires the immediate attention of the validation team. This can delay analysis of the model at hand.

As discussed, the team members belong to different teams, so there can be situations when some members have to temporarily de-prioritize the model validation exercise at hand. As a matter of fact, the model at hand may not be of equal priority for each team member.

Reshuffling of resources is common; back-up plans should be in place.

Deciding the limit of analysis and the number of findings: Every validator wants his findings and analysis to find a place in the final validation report. Also, some validators, in their enthusiasm for deep-dive analysis, tend to over-analyze an issue.

The lead validator has to strike the right balance between the number of findings related to the model and the clarity of the validation team's opinion. Too few findings may give the impression that the team has not gone into the details; too many findings may distract the model developers from the crucial issues.

The lead validator also needs to ensure that all the analysis and findings are ironclad: accurate, to the point, and consistent. Any flaw in the findings caught by the model owners will put the credibility of the whole validation report into question. Such a situation goes into a loop where the model owners start validating the analysis and findings of the validators, and vice versa.

Managing the egos of model validators: A model validation exercise is similar to mountain climbing. In mountain climbing the target is to reach the top. Each mountaineer has to reach the top, and on the way he has to find trails and treasures. Each one has his own tools and physical strengths. Some mountaineers may climb faster, some slower; accordingly, there will be variations in the trails and treasures they find. On reaching the top, the lead mountaineer has to decide which set of trails and treasures best describes the mountain.

Coming back to model validation: just replace the mountain with the model, the trails with analysis, and the treasures with findings.

This creates a very competitive atmosphere in the team. Validators are often very emotional about their own analysis and findings but critical of their peers'. This results in ego clashes. The lead validator needs to ensure that this competition remains healthy and productive, and that each validator feels he is given an optimal opportunity to present his analysis and given fair credit.

Reverse Mentoring – Critical to Success in Analytics Business

In the last 10 years there has been an unprecedented rise in information, coupled with growth in technology that has made access to this information easy. This has given rise to innovation, making the analytics business move faster than ever. Every six months we see a new technology or perspective arriving.

In a company that wants to be in the analytics business, the organization is nothing but its people, and the organization's culture is its people's beliefs. Its asset is the openness of its people to learn new ideas, enhance their capabilities, and innovate.

An employee wants a good mentor in order to develop and be successful. In the current fast-changing times, it is impossible for mentors to keep tabs on market changes. But it is important for leaders and managers to stay on top if the company wants to progress in the analytics consulting domain.

We need to turn the tables: it is the employees who should now mentor their managers.

This idea of reverse mentoring is not new in the industry. Years back, Jack Welch, when he was chief executive of General Electric, directed his top executives to learn how to use the internet from their junior colleagues. But this was short-term coaching, not mentoring in the true sense.

In the current dynamic world, where technology and ideas move in the blink of an eye, reverse mentoring needs to be taken to a totally different paradigm. Managers and leaders cannot always be aware of the arrival of new technologies and innovations. They need to depend more on their direct reports (DRs) to stay aware of the markets.

This will be possible if, at all levels, we give employees the responsibility and opportunity to help their managers decide how they (the managers) should develop in this fast-changing business world.

I would like to present a simple approach by which we can improve an organization's culture and make it robust to change in the analytics domain.

We will need to change our attitude, starting from hiring talent as well as nurturing that talent. A typical industry mantra is: hire DRs smarter than you. But the challenge is how to recognize that fact. Hiring managers should ask themselves before hiring a candidate:

  1. Will I be happy to report to this candidate after 5 years?
  2. What steps do I need to take to unleash his talents so that the organization can bank on him?

The candidate should be hired based on answers to such and similar questions.

Once the manager hires the candidate, he should give him opportunities to express himself. In appraisal interviews it is the manager who should ask his DR:

  1. What should I do to get my next promotion?
  2. How can you help me?
  3. What should I do to help you in helping me?

The first question is important in promoting innovative thinking. As new ideas arrive every moment, the DR will be encouraged to explore those ideas and innovate, and hence will be able to develop a vision. Questions 2 and 3 will ensure that the innovative thought is implementable, measurable, and aligned to the company's business objectives.

A development plan for the manager, combining the company's business objectives, the manager's interests, and all the DRs' inputs, should be developed. Each DR should be given responsibility for part of this development plan, obviously above their regular responsibilities. A manager should be given due credit if he can demonstrate the role of his DRs in his development.

Even if the manager has not hired his resources himself, the suggested approach will motivate him to have the right understanding of, and expectations from, his resources.

Final thought

It is observed that this attitude is followed by every company at the CXO level. The differentiator between great organizations and others is that in great organizations this strategy is followed down to the most junior level. As this strategy gets diluted in the lower hierarchy, the organization slides lower in the food chain.

Model Documentation Best Practices – To help Model Validators

With the rise in regulatory scrutiny, modeling standards have risen, and so have documentation standards. I have seen models developed 15 years back that were not even documented!

As such, no documentation is perfect, because the model developer and the validator are different people, with different past experiences and hence different perspectives and different ways of looking at issues. The best documentation is that which helps the validators identify the gaps in their understanding and makes it easier to ask the right questions of the developers.

Still, a lot more can be done by model developers to improve their documentation. Some suggestions:

  1. Some models are models within models. Model developers, thankfully, are developing a separate document for each model, including flowcharts that clearly show which are the feeder models and which the receiver models. It would be great if they also provided a flowchart of the model documents themselves. Such a flowchart would guide model validators on where to start reading so that they can understand the model in minimum time without getting confused.
    1. It is helpful if the model owners provide pointers in the document (there will be many) to the various connecting models, indicating where the conceptual details are provided.
  2. In model documentation, developers present the thought process of the model development and the statistical analysis they went through. This is helpful and important. But when the model is a model within a model, it would be great if each submodel's documentation described the model briefly in the first few pages, so that model validators can see the big picture quickly before going through the complete documentation.
  3. When there are multiple models within a model, triangulation of the data is a big challenge. A document focusing especially on this issue should be developed.
  4. When a model is within a model, a prototype model (preferably Excel based) should be developed to help the model validators understand the conceptual details of the model structure in an unambiguous manner.
    1. Combining this point with point 2: it is important to have a quick mechanical overview of the model so that the validator understands what the components of the model are, how they are mathematically related, and so on.
  5. There are various Excel sheets associated with a model. It would be really helpful if these sheets contained a "read me" tab explaining the other tabs and their inter-dependencies, as well as the source of the sheet and where it acts as an input.
    1. Developers often include various analyses to defend their model. They should document each of these analyses so that the validators can replicate them, and document each analysis's purpose so that the validators understand why it adds value to the model's defense.

This page will be updated regularly.