The Impact of Big Data in Market Risk Management

Big data techniques and applications have given data-intensive industries a pathway to extract latent information from the plethora of data being generated each second. Given the volume of data and the computational power at its disposal, the financial services industry has begun to focus on the “value-add” that big data techniques can bring, both in optimizing current practices and in designing new ones.

For accurate risk measurement, managing these huge volumes of data is essential. Because each area of risk faces its own unique challenges, there is a need to review how those challenges can be addressed with specific big data applications.

Since the measurement of market risk is governed by regulatory guidelines, applying big data techniques to market risk management is particularly tricky. The issue is further complicated by the fact that the industry has grown accustomed to conventional risk measurement methodologies.

It would be naïve to expect a revolutionary overnight change in the way we measure risk simply because we now have large volumes of data about our trading positions. That said, it would be imprudent to ignore the effect that big data technologies can have on the robustness of risk measurement.

This article focuses on current challenges in market risk management and how big data techniques can help to address those challenges without disturbing current practices and regulatory requirements.

Market Risk in a Nutshell

Before delving deeper into the role of big data, we first need to explain market risk.

Market risk is defined as any shift in the valuation of a portfolio due to a change in any of four market parameters: interest rates, foreign exchange rates, commodity prices and equity prices.

Interest rate changes heavily impact companies that deal in loans. Banks, for example, borrow from one set of customers and lend to another. Changes in foreign exchange rates, on the other hand, have a large effect on companies with a global presence, because such companies transact in multiple currencies.

Commodity prices, of course, are most significant for companies that deal in physical goods (e.g., coffee, gold and wheat), because of their dependence on raw materials for business processes. Fluctuations in stock prices, meanwhile, have a great impact on publicly-listed companies.

To assess market risk, many firms make use of value-at-risk (VaR), a statistical technique used to measure and quantify the level of risk within a firm or an investment portfolio over a specific time frame, at a given confidence level.
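As a minimal illustration, the Python sketch below computes one-day VaR by historical simulation; the portfolio P&L series, the 99% confidence level and all parameters are hypothetical, not taken from the article.

```python
import numpy as np

def historical_var(pnl_history, confidence=0.99):
    """One-day VaR by historical simulation: the loss exceeded on only
    (1 - confidence) of historical days."""
    # P&L is signed; VaR is reported as a positive loss figure.
    return -np.percentile(pnl_history, 100 * (1 - confidence))

# Hypothetical daily P&L history (USD) for a trading portfolio.
rng = np.random.default_rng(42)
pnl_history = rng.normal(loc=0.0, scale=1_000_000, size=500)

print(f"99% one-day VaR: ${historical_var(pnl_history):,.0f}")
```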

The treasury departments of almost all companies trade in bonds and other financial derivatives to meet various business requirements. They may trade to meet hedging needs, or they may engage in proprietary trading, depending on company policy. Regardless of policy, the fact remains that all firms are exposed to market risk, and they consequently have to manage this risk effectively.

Market Risk Measurement Process

Given the critical impact of market risk, financial organizations have processes in place to measure it. Typical steps include collecting the necessary data (trading positions, along with historical interest rates, foreign exchange rates, commodity prices and equity prices); using the data to model risk measures like VaR and economic capital; and stress testing the model by shocking parameters such as interest rates, foreign exchange rates or commodity prices.

All of these activities need to be documented and reported to senior management for the formulation of hedging policies. Moreover, traders need to be informed of exposure limits, both to ensure proper collateral management and to give the firm the ability to uncover any malicious trading activity.

It is also important to remember that risk measurement has many iterations, because of potential changes in market conditions, new regulations and IT upgrades. Typically, organizations go for IT upgrades every two to three years.

Challenges in Market Risk Management

Due to fast-moving markets, financial institutions face several recurring challenges in the measurement of market risk. For example, choosing the right data is critical yet difficult, because of the dynamic nature of the market.

As a consequence of the various types of yield curves available (e.g., Treasury rates and LIBOR rates), interest rates pose a particular problem. Specifically, choosing the right curve for the right product is a daunting task.

What’s more, yield-curve construction itself poses challenges, such as selecting the right instruments for the particular tenors (e.g., five-year or 10-year) of the curve. The instruments need to be vetted for liquidity and market integration, and the assumptions underlying the smoothing of the resulting yield curves need to be realistic. Issues like missing data, stale data or incorrect data feeds also arise from time to time.
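As a simple illustration of the missing-data problem, the sketch below fills a gap in a yield curve by linear interpolation between liquid neighbors; the tenors and rates are hypothetical, and real curve construction would use bootstrapping and more careful smoothing.

```python
import numpy as np

# Hypothetical curve: observed tenors (years) and zero rates (%),
# with the 7-year point missing from the day's feed.
tenors = np.array([1, 2, 5, 10, 30])
rates  = np.array([0.50, 0.85, 1.60, 2.30, 3.00])

# Interpolate the missing 7-year rate from its liquid neighbors.
rate_7y = np.interp(7, tenors, rates)
print(f"Interpolated 7-year rate: {rate_7y:.2f}%")
```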

Simulation-based models are among the tools used to measure risk, and the accuracy of their outputs has to be controlled. These models are simplifications of real-world behavior, so multiple future scenarios should be constructed.

Assuming that the right simulation model is chosen, the larger the number of simulations or future scenarios, the better the accuracy of the outputs. A larger number of simulations, however, also requires more time, so an optimal combination of speed and accuracy should be the goal.
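The sketch below illustrates this trade-off by estimating 99% VaR with increasing simulation counts under a simple (and purely illustrative) normal return model:

```python
import numpy as np

rng = np.random.default_rng(0)
value, mu, sigma = 10_000_000, 0.0, 0.02  # hypothetical one-day return params

for n in (1_000, 10_000, 100_000, 1_000_000):
    returns = rng.normal(mu, sigma, size=n)   # n simulated one-day returns
    var_99 = -np.percentile(value * returns, 1)
    print(f"n = {n:>9,}  99% VaR estimate: ${var_99:,.0f}")
```

Because Monte Carlo error shrinks roughly as 1/√n, a tenfold improvement in accuracy costs roughly a hundredfold in simulations, which is why execution speed matters so much.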

In the world of portfolio management, computing the optimal capital allocation among different assets requires a meticulous approach. Given that a portfolio holds many financial products, the right correlation (or other dependency measure) needs to be computed to allocate the right amount of capital for the overall portfolio. A proper measure of correlation is important, because it captures the diversification benefit.
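As a minimal sketch of this point, the snippet below shows how the estimated correlation feeds the covariance matrix and drives the diversification benefit for a two-asset portfolio; the weights, volatilities and correlation value are hypothetical.

```python
import numpy as np

# Hypothetical two-asset portfolio: weights and annual volatilities.
w    = np.array([0.6, 0.4])
vols = np.array([0.20, 0.30])
rho  = 0.3  # estimated correlation between the two assets

# Covariance matrix built from volatilities and correlation.
corr = np.array([[1.0, rho], [rho, 1.0]])
cov  = np.outer(vols, vols) * corr

portfolio_vol = np.sqrt(w @ cov @ w)
undiversified = w @ vols  # volatility if the assets were perfectly correlated

print(f"Portfolio vol: {portfolio_vol:.2%}  vs undiversified: {undiversified:.2%}")
```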

Currently, with huge datasets, handling the velocity and volume of data is a daunting task. Given practices like high-frequency trading, it is vital to capture rapidly moving global market conditions. The windows for market inefficiencies are typically small, and traders face the challenge of exploiting them before they close.

Designing a robust stress-testing framework is another challenge risk managers must face. Since past performance is no guarantee of future performance, an optimal set of scenarios must be selected. What’s more, reports must be developed that clearly inform risk managers about the scenarios that endanger the organization’s solvency.
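One simple way to organize such scenarios is sketched below, with hypothetical shocks applied through stylized first-order sensitivities; a real stress-testing engine would fully reprice the portfolio under each scenario.

```python
# Hypothetical stress scenarios: shocks to three market factors.
scenarios = {
    "rates +200bp":  {"ir": 0.02, "fx": 0.00, "equity": 0.00},
    "USD -10%":      {"ir": 0.00, "fx": -0.10, "equity": 0.00},
    "equities -30%": {"ir": 0.00, "fx": 0.00, "equity": -0.30},
}

# Stylized first-order sensitivities of portfolio value (USD per unit shock).
sensitivities = {"ir": -50_000_000, "fx": 20_000_000, "equity": 80_000_000}

for name, shocks in scenarios.items():
    pnl = sum(sensitivities[f] * s for f, s in shocks.items())
    print(f"{name:<15} P&L: ${pnl:>13,.0f}")
```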

While reporting risk numbers may seem like a trivial step, it is equally critical. Developing such a report (which should explain the overall risk of a company’s portfolio to the management team) remains a challenge, because of dynamic market conditions, changes in regulatory requirements and revisions of company policy.

Applications of Big Data Techniques

Thanks to the rise in data quantity and computational power, big data techniques can help risk measurement in various ways. Data handling, data storage, enhanced computational efficiency and cross-validation of numbers (through robust interconnectivity among modules) are among the areas these techniques can aid.

Enterprise data is expected to grow significantly in the next five years. Given the complexity of the data and its structure, the technology should, first and foremost, be simple to use and understand. Managing this amount of data, and extracting value from it, requires the following:

  1. Handle the data. It is important to test the value of the in-house data. While some could argue that every bit of information is useful to someone, the data tool should be able to identify sparsely used data. This avoids unnecessary analytics on low-value data.

    The issue of finding value in limited data is very much evident in credit-default swaps (CDS). CDS values are more reliable in calculating credit risk, but the constraint is that not all the obligors in the portfolio would have actively-traded CDS. We have to use the available limited data to get extrapolated/interpolated values for the remaining ones.

  2. Create a visualization toolbox. Humans have limited imagination and find it easier to understand situations that they can visualize. A chart that demonstrates the relationship between CDS spreads and stock prices is an example of a visualization tool specific to market risk.

  3. Develop a sanity-check toolbox. This toolbox should enable the risk manager to make some initial data checks and should assist in the development of a market-risk-specific remediation plan. It should include the following (a sketch of two of these checks appears after this list):

     a. An application for cleaning up missing, stale or garbage data.

     b. A tool that can identify spikes in data that are unreasonable given existing conceptual relationships (e.g., a spike in foreign exchange rates despite stable interest rates, or a sudden jump in some commodity forward prices). This tool should have built-in adaptive intelligence that triggers warning signals for risk managers when anomalous data patterns occur.

     c. A warning device for distributions. Financial risk managers have assumed distributions for various market risk variables: stock prices follow geometric Brownian motion, interest rates exhibit mean-reverting behavior, and foreign exchange rates tend to follow both geometric Brownian motion and mean reversion. Real-world data, of course, does not follow these distributions exactly, and the toolbox should warn risk managers when the data deviates from the assumed distributions, and by how much.
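As a minimal sketch of two of these checks, the Python snippet below flags stale quotes and suspicious spikes; the repeat window, z-score threshold and toy FX series are illustrative assumptions, and a production system would use robust statistics over much longer histories.

```python
import numpy as np

def find_stale(series, max_repeat=5):
    """Flag indices where a quote has repeated more than max_repeat times in a row."""
    flags, run = [], 1
    for i in range(1, len(series)):
        run = run + 1 if series[i] == series[i - 1] else 1
        if run > max_repeat:
            flags.append(i)
    return flags

def find_spikes(series, z_threshold=2.0):
    """Flag points whose day-over-day move is a z_threshold-sigma outlier."""
    diffs = np.diff(series)
    z = (diffs - diffs.mean()) / diffs.std()
    return list(np.where(np.abs(z) > z_threshold)[0] + 1)

# Toy FX series containing a stale patch and a one-day spike.
fx = np.array([1.10, 1.11, 1.11, 1.11, 1.11, 1.11, 1.11, 1.35, 1.12, 1.11])
print("stale points:", find_stale(fx), " spike points:", find_spikes(fx))
```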

The Execution Angle

Execution is another area in which big data techniques can help. By execution, we mean the risk measurement process in which models are implemented to assess risk.

As discussed previously, most of the measurement methodologies for market risk are simulation based. The larger the number of simulations, the better the accuracy.

A parallel computing approach can be used to perform such simulations, because the simulations are independent of one another: each starts from an identical initial value and follows an identical procedure. Consequently, a larger number of simulations can be carried out more quickly, yielding more accurate results in a shorter window of time.
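A minimal sketch of this idea, using Python’s standard multiprocessing pool to run independent simulation batches on separate workers (the return model, seeds and batch sizes are illustrative):

```python
import numpy as np
from multiprocessing import Pool

def simulate_batch(args):
    """Run one independent batch of one-day P&L simulations."""
    seed, n = args
    rng = np.random.default_rng(seed)  # per-worker seed keeps batches independent
    return 10_000_000 * rng.normal(0.0, 0.02, size=n)  # hypothetical params

if __name__ == "__main__":
    # Four workers, each simulating 250,000 paths in parallel.
    with Pool(processes=4) as pool:
        batches = pool.map(simulate_batch, [(i, 250_000) for i in range(4)])
    pnl = np.concatenate(batches)
    print(f"99% VaR from {len(pnl):,} paths: ${-np.percentile(pnl, 1):,.0f}")
```

Because the batches are independent, the run time scales down roughly with the number of workers.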

Data parallelism is another advantageous big data technique. Rather than extracting huge chunks of data from another platform, it brings the computation to where the data resides, speeding up access to the data used by computational algorithms and models.

Parting Thoughts

Sooner or later, with the rise of big data techniques, we are going to see an overall change in perspective on the way market risk is quantified. Existing methodologies are more than a decade old, based on the data and computational resources that were available 10 years ago.

Now, with much more data available and more resources in hand, processes are going to evolve. However, this change won’t happen overnight, because markets will need time to see the effectiveness of more modern techniques.

This article has highlighted some of the quick-win areas where big data techniques can directly help market risk managers improve their comprehensive view of risk management.

This article was published by GARP

http://www.garp.org/risk-news-and-resources/2014/september/the-impact-of-big-data-in-market-risk-management.aspx
