Andraz Grum

Abstract

Slovenian commercial banks have two possibilities for calculating capital charges for the market risks to which they are exposed. Under the capital decree legislated by the Bank of Slovenia, they can use the standardized methodology or apply an internal model. An internal model can be based on different risk measures, each of which has its strengths and weaknesses. Consequently, the volume of risk calculated will vary among risk measures. Basel II regulation assumes VaR methodology for calculating capital requirements for the market risks to which commercial banks are exposed. There are two commonly used methods for VaR calculation: historical simulation and the variance-covariance method. Each has its strengths and weaknesses. The goal of this paper is to present the methodology of volatility and time weighted historical simulation as an internal model for market risk measurement in Slovenian commercial banks. The methodology is based on historical simulation and tries to remove the disadvantages of this method with GJR GARCH volatility modelling and the time weighting of returns.

Key words: Commercial banks, Value at risk, Risk measurement, Internal model

JEL: C22, E58, G21

DOI: 10.2478/v10033-007-0009-x

1. Introduction

In this paper we regard commercial banks as financial investors. Slovenian commercial banks have two possibilities for calculating capital charges for the market risk to which they are exposed. Under the capital decree legislated by the Bank of Slovenia, Slovenian commercial banks can apply internal models for capital requirements calculation for currency risk and selected market risks (general position risk in line with debt and equity instruments, price change risk for commodities) as an alternative to or in combination with the standardized methodology.
The standardized approach, which is based on the capital decree legislated by the Bank of Slovenia, has to be used by banks that do not have an internal model. Alternatively, commercial banks can apply an internal model for risk management purposes and can use several risk measures. Each risk measure has its strengths and weaknesses; consequently, the volume of risk calculated using a specific risk measure will vary. The risk management process in a commercial bank is based on the calculated value of a risk measure, so it is very important for a commercial bank to fully understand the interpretation of the selected risk measure. If the volume of risk varies across risk measures, decisions about changes in portfolio positions will differ as well. One of the most frequently used measures of market risk is the VaR (value at risk) risk measure.1 Basel II regulation assumes VaR methodology for capital requirements calculations for the market risks to which commercial banks are exposed. There are two commonly used methods for VaR calculation: historical simulation and the variance-covariance method. Each has its strengths and weaknesses. The goal of this paper is to present the methodology of volatility and time weighted historical simulation as an internal model for market risk measurement in a commercial bank. The methodology is based on historical simulation and tries to remove the disadvantages of this method with GARCH volatility modelling and the time weighting of returns.

1 More about VaR can be found in Jorion (2001).

* Andraz Grum, PhD, IFIN, Institute of Finance, Ulica Jozeta Jame 14, SI-1000 Ljubljana, email@example.com

2. Methodology

2.1. Distribution of Returns

The historical simulation methodology of VaR calculation is commonly used because of its independence from the risk factor distribution.
If a commercial bank wants to calculate the general position risk for securities in its portfolio, the results of historical simulation will not depend on the portfolio return distribution. As Andersen et al. (2000) point out, the asset return distribution is usually not normal but leptokurtic: it has fat tails (excess kurtosis) and is asymmetric (skewed). This is bad news for the variance-covariance method of VaR calculation, which is based on the assumption of normally distributed asset returns and consequently underestimates the possibility of extreme returns. One of the reasons for the distribution to be leptokurtic is that asset return volatility tends to cluster in time (Poon and Granger, 2003). As Campbell, Lo and MacKinlay (1997) point out, the distribution of asset returns in time does not follow the independently and identically distributed (IID) normal model assumed by efficient capital market theory. If the asset return distribution conditional on volatility is normal, then the unconditional distribution will have fatter tails than the normal distribution.

2.2. Historical Simulation VaR Calculation and Disadvantages

The capital charges for market risk are usually calculated on the basis of no less than 250 daily returns of a commercial bank's securities portfolio. The historical simulation VaR calculation for a portfolio of securities is based on the empirical distribution of historical portfolio returns (in n = 250 days), where the weights of each security (i = 1, ..., N) in the portfolio are assumed to be constant over time. The calculated historically simulated returns can be represented by a histogram, from which the VaR measure for a given significance level can be obtained. If we are looking for the VaR measure at a 99% significance level, then the third (1% of 250 is 2.5, rounded up to 3) biggest negative return is the percentage VaR. The absolute value of VaR is obtained by multiplying the percentage VaR by the current portfolio value.

The historical simulation method has its disadvantages. The main methodological disadvantage is the fact that returns are not time weighted; they are all assigned the same weight (1/number of observations). This weighting scheme indirectly assumes the risk factors, and consequently the historical returns, to be IID over time. Such an assumption is problematic, especially in emerging markets, which are assumed to be inefficient and where volatility clustering and autocorrelation of returns are very common. Another disadvantage comes from the fact that the calculated VaR measure depends on the actual returns observed in the chosen past time period, from which several problems emerge:

· If volatility in the chosen period is low/high compared to current volatility, the calculated VaR measure will underestimate/overestimate the actual current market risk.
· Historical simulation is not an adequate market risk measure if extreme market volatility movements or shocks are observed in the chosen period, because it reacts slowly to such movements.
· Extreme negative returns that are unlikely to reoccur can unrealistically overestimate the VaR measure.
· Extreme negative returns that occurred in the past usually result in a ghost effect: they continuously affect the calculated VaR measure for a long period of time before suddenly disappearing as they fall out of the chosen period.
· VaR calculated with historical simulation is limited by the biggest negative return in the chosen period. Bigger negative returns cannot be extrapolated, even if they are possible at present.

The problems stated above cannot be solved by classical historical simulation.
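The classical historical simulation calculation described above (sort the returns, take the k-th biggest negative return as the percentage VaR) can be sketched in a few lines of Python. This is an illustrative sketch, not the author's code; the function name and the synthetic 250-return sample are ours:

```python
import numpy as np

def historical_var(returns, confidence=0.99, portfolio_value=None):
    """Classical historical simulation VaR.

    With n = 250 daily returns and a 99% level, the cutoff index is
    ceil(0.01 * 250) = 3, i.e. the third biggest negative return.
    """
    returns = np.sort(np.asarray(returns, dtype=float))   # ascending: worst first
    k = int(np.ceil((1.0 - confidence) * len(returns)))   # e.g. 3 for 250 obs
    var_pct = returns[k - 1]                              # k-th worst return
    if portfolio_value is not None:
        return var_pct * portfolio_value                  # absolute VaR
    return var_pct

# usage on a synthetic sample of 250 daily returns
rng = np.random.default_rng(0)
sample = rng.normal(0.0, 0.01, 250)
print(historical_var(sample, confidence=0.99))
```

Note that the result is bounded by the worst observed return, which is exactly the extrapolation limitation listed among the disadvantages.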
A combination of historical simulation and volatility modelling, namely volatility and time weighted historical simulation, has proven to be a good solution. This method is especially appropriate for emerging market economies and can be used as an internal model for market risk assessment at a commercial bank.

2.3. Volatility and Time Weighted Historical Simulation

If we assume that unconditional returns are not IID, then it can be assumed that data on returns from the more recent past are more representative of future risk. As a possible solution to the problem, Boudoukh, Richardson and Whitelaw (1998) suggested a generalized historical simulation method known as the BRW model. The BRW model assigns different weights to returns, depending on the time of their origin. The last historical return rt has an assigned weight w1, the return before that rt-1 has an assigned weight w2, where w2 = w1 · λ, and so on. λ represents the exponential decay factor, with values on the interval between 0 and 1. The largest weights are assigned to the returns from the more recent past. The value of the weights decreases in time according to the exponential decay factor, and the sum of all weights is equal to one. When the weights are assigned to each return, VaR is calculated on the basis of the empirical distribution of weighted returns, from the cumulative distribution function. The BRW model is commonly used and was accepted by RiskMetrics (RiskMetrics Group, 1999). RiskMetrics uses a decay factor λ = 0.94 for daily data and λ = 0.97 for monthly data, though practical experience has shown that the optimal value of the decay factor varies depending on the specifics of the financial market. The problem of assigning different weights to returns on the basis of their occurrence can be solved by another method. Hull and White (1998) suggested that returns be weighted by volatility. The idea behind this approach is to adapt past returns to the change in volatility that occurred most recently.
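The BRW weighting scheme can be sketched as follows: exponentially decaying weights are assigned by age, normalized to sum to one, and VaR is read off the weighted empirical distribution. This is a hedged illustration (the function name and sample data are ours, and the paper does not prescribe an implementation):

```python
import numpy as np

def brw_var(returns, lam=0.97, confidence=0.99):
    """BRW-style time weighted historical simulation (sketch).

    The most recent return gets weight w1, the one before it w1 * lam,
    and so on; weights are normalized so they sum to one, and VaR is the
    return at which the cumulative weight reaches 1 - confidence.
    """
    returns = np.asarray(returns, dtype=float)
    n = len(returns)
    ages = np.arange(n - 1, -1, -1)        # 0 for the most recent observation
    weights = lam ** ages
    weights /= weights.sum()               # normalize: weights sum to 1
    order = np.argsort(returns)            # sort returns ascending
    cum = np.cumsum(weights[order])        # weighted empirical CDF
    idx = np.searchsorted(cum, 1.0 - confidence)
    return returns[order][idx]
```

With lam = 1.0 the scheme collapses to equal weights, i.e. classical historical simulation, which is a convenient sanity check.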
The prediction of VaR on day (T) depends on the latest historical returns (r(t,i)) and the GARCH prediction of volatility (σ(t,i)) for the period at the end of day (t-1). The predicted volatility (σ(T,i)) at time (T) serves as a multiplier by which the historical returns (r(t,i)) at time (t), weighted for the volatility (σ(t,i)) at time (t), are multiplied:

r*(t,i) = (σ(T,i) / σ(t,i)) · r(t,i)

The method assures the rescaling of past returns: they increase or decrease depending on the current market volatility. The volatility weighted returns are then represented by a histogram, from which the VaR measure for a given significance level can be obtained.

The GARCH (Generalized Autoregressive Conditional Heteroscedasticity) model (Bollerslev, 1986 and Taylor, 1986) will be used for future volatility prediction, as it was specially designed to model the volatility clustering observed in financial markets. In the GARCH model, the conditional variance of returns changes in time and is a function of past variance and the square of past returns. The model assumes future variance can be predicted from past returns and volatility. If we take into consideration only one past period, then volatility is predicted by the GARCH (1, 1) model, which is the most commonly used (Poon and Granger, 2003):

σ²(t) = K + α · r²(t-1) + β · σ²(t-1)

The simple GARCH (1, 1) model captures most of the variability in the return series. Small lags are common in empirical applications. The model is adequate for modelling volatilities even over long sample periods (Bollerslev, Chou, Kroner, 1992). For commercial banks, risk is represented by negative rather than positive returns, so the model should be able to treat negative returns asymmetrically to positive returns. For that reason, the asymmetric GARCH (AGARCH) model or the GJR (Glosten, Jagannathan, Runkle, 1993) GARCH model can be used. GJR GARCH models are sometimes known as Threshold-GARCH or TGARCH models.2 They are similar to GARCH models, but include a term to capture the leverage effect, or negative correlation, between asset returns and volatility.
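The GARCH (1, 1) recursion and the Hull-White volatility weighting can be sketched together as below. This is a hedged sketch: the initialization of the recursion with the first squared return is our assumption (the paper does not state one), and the function names and parameter values are illustrative:

```python
import numpy as np

def garch_recursive_sigma(returns, omega, alpha, beta):
    """Recursive GARCH(1, 1) conditional standard deviation.

    Implements sigma2[t] = omega + alpha * r[t-1]**2 + beta * sigma2[t-1],
    initialized with the first squared return (a common convention; the
    paper does not spell out its initialization).
    """
    r = np.asarray(returns, dtype=float)
    sigma2 = np.empty_like(r)
    sigma2[0] = r[0] ** 2
    for t in range(1, len(r)):
        sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
    return np.sqrt(sigma2)

def hull_white_weighted_returns(returns, sigma, sigma_forecast):
    """Hull-White volatility weighting: scale each past return by the ratio
    of the current volatility forecast to the volatility that prevailed
    when the return was observed."""
    return sigma_forecast * np.asarray(returns, dtype=float) / np.asarray(sigma, dtype=float)
```

The weighted returns produced by `hull_white_weighted_returns` are then fed into the histogram-based VaR reading described above.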
For certain asset classes, most notably equities, but excluding foreign exchange, volatility tends to rise in response to lower than expected returns and to fall in response to higher than expected returns. Such an effect suggests models that include an asymmetric response to positive and negative surprises. The GJR GARCH (1, 1) model can be represented as:

σ²(t) = K + α · r²(t-1) + γ · I(t-1) · r²(t-1) + β · σ²(t-1),

where the indicator I(t-1) equals 1 if r(t-1) is negative and 0 otherwise, so that negative returns increase the predicted variance by an additional γ · r²(t-1). For the conditional mean, a constant mean model is often sufficient to describe a financial return series; most financial return series do not require the comprehensiveness that an ARMA (Auto Regressive Moving Average) model provides, though for emerging markets, or for markets with evident trends, different specifications can be used.

With the GARCH prediction of volatility, the Hull-White model solves most problems of classical historical simulation; two problems, however, remain. First, the GARCH volatility prediction assigns greater importance to historical volatility predictions than to the current return: if volatility on the market is growing, the VaR measure will also grow, but it will be slow to fall when volatility falls. Second, if extreme negative returns are observed in the chosen period, the ghost effect distorts the true level of risk. The remaining problems can be solved by a procedure from the BRW model, using an exponential weighting scheme. For that purpose, however, the value of the decay factor should be readjusted. If exponential weighting is used on modified data (as in the case of volatility weighted returns), then the importance of the decay factor should be much lower, and therefore its value should be close to one. If the importance of the decay factor were high, the multiplication of current volatility with historical returns would no longer be sensible, as its importance would quickly fall. The suggested level for the decay factor is above λ = 0.99. To lower the importance of the decay factor, we use a factor of λ = 0.997 in the estimation.

The volatility and time weighted historical simulation (VTWHS) model represents the combination of the Hull-White model and the BRW model. The VTWHS model starts with Hull-White volatility weighted returns, but uses a modified AGARCH or GJR GARCH model rather than the GARCH model.

2 Many references mention the GJR model as a TGARCH, or Threshold GARCH, model. However, others make a very clear distinction between GJR and TGARCH models: a GJR model is a recursive equation for the conditional variance, whereas a TGARCH model is the identical recursive equation for the conditional standard deviation (see, for example, Hamilton (1994, p. 669) and Bollerslev et al. (1994, p. 2970)).

3. Data

We tested the adequacy of the methodology of volatility and time weighted historical simulation as an internal model for market risk measurement in a commercial bank. It was assumed that a Slovenian commercial bank invests its trading book positions mainly in stocks included in the main Slovenian stock exchange index, the SBI20. The methodology was tested on 500 daily returns of the SBI20 stock exchange index in the period between 19th June 2005 and 14th June 2007.

4. Results

The parameters of the GARCH (1, 1) and GJR GARCH (1, 1) models were estimated using MATLAB 7 software.

Table 1. Estimated Parameters of the GARCH (1, 1) model (columns: Parameter, Value, Standard Error, T-Statistic).

Table 2. Estimated Parameters of the GJR GARCH (1, 1) model (columns: Parameter, Value, Standard Error, T-Statistic).

Table 3. Example of GJR GARCH (1, 1) volatility and time weighted historical simulation VaR calculation

Date | SBI20 matching portfolio value | Daily return (%) | Normalized return | Volatility weighted return | Time weighted return (λ = 0.997)
… | 3,129.81 | -0.184017 | -1.000000 | -0.42180203 | -0.420536624
… | 3,124.70 | -0.163269 | -0.515068 | -0.217256848 | -0.216605078
… | 3,112.07 | -0.404199 | -1.127978 | -0.475783256 | -0.474355907
… | … | … | … | … | …
… | 4,600.06 | … | … | … | …

(The table also reports, for each day, the squared daily return, the recursive variance and the recursive standard deviation; among the last rows of the sample are the portfolio values 4,570.60, 4,585.22, 4,599.64 and 4,600.06.)

Column 2 in Table 3 represents daily index values, from which the daily returns (column 3) are calculated. In column 4 the daily returns are squared, as they are needed as input to the GJR GARCH (1, 1) recursive variance calculations (column 5). The recursive variance is calculated with the GJR GARCH (1, 1) equation, on the basis of the estimated parameter values from Table 2. The square root of the variance, namely the recursive standard deviation, is presented in column 6. The daily return divided by the recursive standard deviation gives the normalized return (column 7). For every day on which the simulation was performed, the recursive standard deviation calculated for that day was multiplied with all normalized returns from the sample of the chosen period, resulting in the volatility weighted returns (column 8). To estimate VaR for 15th June 2007, the first day after the end of the chosen period, the normalized returns were multiplied with the recursive standard deviation estimated on 14th June 2007. To achieve the best possible sensitivity to market changes, the volatility weighted returns were exponentially time weighted with a decay factor of λ = 0.997 (column 9). The VaR at the chosen significance level for the next day can be obtained if the volatility and time weighted returns (column 9) are represented with a histogram.
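The full VTWHS procedure (GJR GARCH recursive variance, volatility weighting by the one-step-ahead standard deviation, exponential time weighting, then a histogram/quantile reading) can be sketched as below. This is a hedged sketch, not the author's code: the variance initialization, the treatment of the decay weights as probability weights in the BRW spirit, and all parameter values are our assumptions:

```python
import numpy as np

def gjr_sigma(returns, omega, alpha, gamma, beta):
    """GJR GARCH(1, 1) recursive standard deviation:
       sigma2[t] = omega + (alpha + gamma * 1{r[t-1] < 0}) * r[t-1]**2 + beta * sigma2[t-1]
    """
    r = np.asarray(returns, dtype=float)
    sigma2 = np.empty_like(r)
    sigma2[0] = r[0] ** 2                       # initialization convention (assumed)
    for t in range(1, len(r)):
        lever = gamma if r[t - 1] < 0 else 0.0  # extra weight on negative surprises
        sigma2[t] = omega + (alpha + lever) * r[t - 1] ** 2 + beta * sigma2[t - 1]
    return np.sqrt(sigma2)

def vtwhs_var(returns, omega, alpha, gamma, beta, lam=0.997, confidence=0.99):
    """Volatility and time weighted historical simulation VaR (sketch)."""
    r = np.asarray(returns, dtype=float)
    sigma = gjr_sigma(r, omega, alpha, gamma, beta)
    # one-step-ahead volatility forecast, used as the common multiplier
    lever = gamma if r[-1] < 0 else 0.0
    sigma_fc = np.sqrt(omega + (alpha + lever) * r[-1] ** 2 + beta * sigma[-1] ** 2)
    vol_weighted = sigma_fc * r / sigma         # normalize by sigma[t], rescale by forecast
    ages = np.arange(len(r) - 1, -1, -1)        # 0 = most recent observation
    w = lam ** ages
    w /= w.sum()                                # BRW-style probability weights
    order = np.argsort(vol_weighted)
    cum = np.cumsum(w[order])
    idx = np.searchsorted(cum, 1.0 - confidence)
    return vol_weighted[order][idx]
```

Under the assumed initialization the first normalized return is ±1, which happens to be consistent with the -1.000000 in the first row of Table 3; note also that the paper's column 9 multiplies the weighted returns themselves by the decay factor, whereas this sketch applies the decay to the probability weights as in the BRW model.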
Table 4. Percentage VaR values for 15th June 2007 for the portfolio matching the index SBI20, calculated on the basis of different methods

Method | VaR% (99% significance) | VaR% (95% significance)
Simple historical simulation | -1.5379 | -0.8371
Volatility and time weighted historical simulation with GARCH (1, 1) model and λ = 0.997 | -1.0693 | -0.6778
Volatility and time weighted historical simulation with GJR GARCH (1, 1) model and λ = 0.997 | -1.0679 | -0.6772

From Table 4 it can be seen that the volatility and time weighted historical VaR was much smaller (regardless of the chosen volatility model) than the VaR calculated with the simple historical method. The reason for this was the low volatility represented by the recursive standard deviation, which at the time of the estimation was about 20% below the average volatility of the chosen period. Because simple historical simulation is slow to react to changes in market volatility, a commercial bank using such a method would overestimate the market risk, the resulting capital charges would be too high, and capital would be spent inefficiently.
South East European Journal of Economics and Business – de Gruyter
Published: Nov 1, 2007