
“Risk Management in Banks and Other Financial Institutions: Lessons from the Crash of Long-Term Capital Management (LTCM)”

AUTHORS        Mete Feridun

ARTICLE INFO   Mete Feridun (2006). Risk Management in Banks and Other Financial Institutions: Lessons from the Crash of Long-Term Capital Management (LTCM). Banks and Bank Systems, 1(3)

RELEASED ON    Thursday, 05 October 2006

JOURNAL        “Banks and Bank Systems”

FOUNDER        LLC “Consulting Publishing Company “Business Perspectives””

© The author(s) 2023. This publication is an open access article.

businessperspectives.org

RISK MANAGEMENT IN BANKS


AND OTHER FINANCIAL INSTITUTIONS:
LESSONS FROM THE CRASH
OF LONG-TERM CAPITAL MANAGEMENT (LTCM)
Mete Feridun

Abstract
Risk management is a rapidly developing discipline and there are many and varied views and de-
scriptions of what risk management involves, how it should be conducted and what it is for. Mar-
ket risk, in particular, is an area that has received increasing attention in the last decade as financial
institutions’ trading activities have significantly grown. This paper analyses the failure of the LTCM hedge fund in 1998 from a risk management perspective, with the aim of deriving implications for the managers of financial institutions and for the regulating authorities. The study concludes that LTCM’s failure can be attributed primarily to its Value at Risk (VaR) system, which failed to estimate the fund’s potential risk exposure correctly.
Key words: Risk Management, Value at Risk, Market Risk, Risk Control.
JEL Classifications: D81, D82, G24.

1. Introduction
Risk management is a constant challenge to all financial institutions. Banks need to consistently
develop and improve their operational and technical practices. Credit Risk Management is assum-
ing greater importance in the current environment. With the implementation of the Basel Accord,
banks are increasingly moving towards quantitative risk evaluation of their loan portfolios. The
areas of market risk have long been subject to quantitative risk management scrutiny, but credit risk has gradually emerged as a further area for quantitative treatment. This area presents a unique set of challenges and opportunities for quantitative risk managers, including both quantitative modeling challenges and change management issues. The quantitative challenges include the calculation of default probabilities for a bank’s clients, the estimation of loss rates, and the inclusion of the various structural elements of credit deals. Further, these models have to be back-tested and validated. There are also significant data issues, including the variability of accounting numbers, the relatively low number of actual defaults, and missing information on the key drivers of default.
The new Basel Capital Accord, also known as “Basel II”, is an update of “The 1988 Accord”,
which was adopted by more than 100 countries worldwide. The new Basel II regulations put for-
ward by the Basel Committee on Banking Supervision are aimed at improving the safety and
soundness of the financial system by aligning capital adequacy assessment more closely with the
underlying risks in the banking industry, providing a thorough supervisory review process, and
enhancing market discipline. They further seek to maintain the current overall level of capital in
the system and enhance competitive equality.
If Basel II is implemented correctly, intelligently and in a timely manner, it can potentially reduce
capital reserves for some banks and, consequently, increase the amount of capital available for
investments, thus enabling banks to achieve optimum risk-reward profiles and helping to increase
shareholder value. To comply with Basel II and accurately calculate risk measures, banks need to
ensure the timeliness and integrity of their data, and aggregate different risk types. Automating and
standardizing the aggregation of data residing in various systems and repositories across the bank
will bring efficiency to the process while maintaining data integrity.
Banks affected by Basel II will have to implement more sophisticated approaches to measuring
and managing credit, operational and market risk over geographic and departmental boundaries,

and determine capital reserves to cover potential losses. The first challenge banks may face is to
identify and aggregate their exposure data to meet the new data validation and corroboration re-
quirements for credit risk management, and to implement a system that can report a complete view
of risk across the bank.
Further, estimating default probabilities of corporate, private, SME and other counterparties will
call for employing various estimation models to reflect different counterparty profiles over differ-
ent asset classes. Most banks decide to develop and implement their credit rating and credit scor-
ing models internally rather than using third-party models, because it enables them to customize
models to reflect their own proprietary knowledge, ensures fast implementation and enables them
to adapt to changing market conditions quickly. All of these advantages lead to more accurate risk
assessment. Furthermore, to meet Basel II’s supervisory review requirements, each step in the
model development and scoring process should be documented, thus providing the required trans-
parency for compliance.
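As a purely illustrative sketch of what internal default-probability estimation can look like in practice, the example below fits a logistic regression to synthetic borrower data; the feature names, coefficients, sample size and choice of library are assumptions made for illustration only, not a description of any particular bank’s model.

```python
# Illustrative sketch only: estimate default probabilities (PD) by fitting a
# logistic regression to synthetic borrower data. All names and numbers are
# assumptions; a real internal rating model would be far richer.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5_000
leverage = rng.uniform(0.1, 0.9, n)      # hypothetical debt-to-assets ratio
coverage = rng.uniform(0.5, 10.0, n)     # hypothetical interest-coverage ratio

# Assumed "true" default process: high leverage and low coverage raise the PD
logit = -4.0 + 5.0 * leverage - 0.4 * coverage
default = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

X = np.column_stack([leverage, coverage])
model = LogisticRegression().fit(X, default)

# Estimated PD for a new counterparty with leverage 0.6 and coverage 2.0
pd_estimate = model.predict_proba(np.array([[0.6, 2.0]]))[0, 1]
print(f"Estimated probability of default: {pd_estimate:.2%}")
```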
Generally speaking, Risk Management is the process of measuring, or assessing risk and then de-
veloping strategies to manage the risk. In general, the strategies employed include transferring the
risk to another party, avoiding the risk, reducing the negative effect of the risk, and accepting some
or all of the consequences of a particular risk. Traditional risk management focuses on risks stem-
ming from physical or legal causes (e.g. natural disasters or fires, accidents, death, and lawsuits).
Financial risk management, on the other hand, focuses on risks that can be managed using traded
financial instruments. Intangible risk management focuses on the risks associated with human
capital, such as knowledge risk, relationship risk, and engagement-process risk. Regardless of the
type of risk management, all large corporations have risk management teams, while small groups and corporations practice informal, if not formal, risk management. In ideal risk management, a priori-
tization process is followed whereby the risks with the greatest loss and the greatest probability of
occurring are handled first, and risks with lower probability of occurrence and lower loss are han-
dled later. In practice the process can be very difficult, and balancing between risks with a high
probability of occurrence but lower loss vs. a risk with high loss but lower probability of occur-
rence can often be mishandled.
The fundamental difficulty in risk assessment is determining the rate of occurrence since statistical in-
formation is not available on all kinds of past incidents. Furthermore, evaluating the severity of the con-
sequences (impact) is often quite difficult for immaterial assets. Asset valuation is another question that
needs to be addressed. Thus, best educated opinions and available statistics are the primary sources of
information. Nevertheless, risk assessment should provide the organization’s management with information that makes the primary risks easy to understand and allows risk management decisions to be prioritized. Thus, there have been several theories and attempts to quantify risks.
Market risk, in particular, is an area that has received increasing attention in the last decade as fi-
nancial institutions’ trading activities have significantly grown. Huge losses suffered by organiza-
tions such as Long-Term Capital Management (LTCM), Barings Bank, and Metallgesellschaft
within the last decade due to speculative trading, failed hedging programs, or derivatives, have
increased the importance of risk management to a great extent.
In 1998, the failure of LTCM, the world's largest hedge fund, nearly devastated the world’s finan-
cial system. LTCM was so big that the Federal Reserve Bank of New York had to facilitate a bail-
out to the fund, fearing that the liquidation might wreck the global financial markets. The primary
factor contributing to LTCM’s failure has been widely identified as its poor risk management. Another factor was the fund’s investment strategy, which relied heavily upon convergence arbitrage. Hence, the fund had to maintain a high level of leverage in order to achieve a satisfactory rate of return. However, this kind of investment strategy brings high risks in derivatives trading and requires a thorough risk management framework (Brian, 1995). The fund’s risk management, however, failed to provide this, eventually leading to the collapse of the fund.

This study aims at exploring the reasons behind the failure of LTCM from a risk management per-
spective, analyzing the impact of VaR implementation in the formation of the collapse. It also
seeks to point out any lessons that can be derived from this experience for similar hedge funds
as well as for the regulators of the financial markets and institutions. This paper is structured as
follows. Section 2 provides a brief account of VaR and three VaR methods used commonly by
financial institutions. Section 3 presents an overview of the LTCM and its investment strategies as
well as an overview of its VaR model used before the collapse. This section also provides a review
of how VaR can be used to assess the capital base needed to support a leveraged portfolio. Section
4 points out the lessons for the managers of financial institutions and for financial regulators that
emerge from the analysis. Section 5 presents the conclusions of the study.

2. Value-at-Risk Models
Value-at-Risk models are the primary means through which financial institutions measure the
magnitude of their exposure to market risk. These models are designed to estimate, for a given
portfolio, the maximum amount that a bank could lose over a specific time period with a given
probability (Jorion, 1997). This way, they provide a summary measure of the risk exposure gener-
ated by the given portfolio. Management then decides whether it feels comfortable with this level
of exposure or not and acts accordingly. Value-at-Risk models are extensively used for reporting
and limiting risk, allocating capital, and measuring performance (Brian, 1995).
Calculation of VaR depends on the method used. It essentially involves using historical data on market
prices and rates, the current portfolio positions, and models for pricing those positions. These inputs are
then combined in various ways depending on the method used, to derive an estimation of a particular
percentile of the loss distribution, typically the 99th percentile loss. According to the Basle Committee
Proposal (1995, 1996), the computation of VaR should be based on a set of uniform quantitative inputs,
namely a horizon of 10 trading days, or two calendar weeks, a 99% level of confidence, and an observa-
tion period based on at least a year of historical data. Three methods are commonly used for computing
VaR. This section provides a brief account of these three methods.
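As a rough, hedged illustration of these quantitative inputs, the sketch below estimates a one-day 99th-percentile loss from a year of simulated daily P&L observations and scales it to the 10-day horizon using the common square-root-of-time rule; the data are invented and the example is not a prescription of the Basle methodology.

```python
# Illustrative sketch: 99% one-day VaR from a year of (simulated) daily P&L,
# scaled to the 10-day Basle horizon with the square-root-of-time rule.
import numpy as np

rng = np.random.default_rng(1)
daily_pnl = rng.normal(0.0, 1.0e6, 252)   # one year of hypothetical daily P&L ($)

confidence = 0.99
var_1d = -np.percentile(daily_pnl, 100 * (1 - confidence))   # 1-day 99% VaR
var_10d = var_1d * np.sqrt(10)                               # scaled to 10 days

print(f"1-day 99% VaR:  ${var_1d:,.0f}")
print(f"10-day 99% VaR: ${var_10d:,.0f}")
```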
2.1. Delta-Normal Approach
The delta-normal approach is the simplest method to implement. However, it has several drawbacks, such as the instability of the parameters used and the assumptions that all risk factors are normally distributed and that all securities are linear in the risk factors. This method consists of going back in time and computing
variances and correlations for all risk factors. Portfolio risk is then computed by a combination of linear
exposures to numerous factors and by the forecast of the covariance matrix (Dunbar, 1998).
For this method, positions on risk factors, forecasts of volatility, and correlations for each risk fac-
tor are required. The delta-normal approach is generally not appropriate for portfolios that hold options or instruments with embedded options, such as mortgage-backed securities, callable bonds, and many structured notes. This approach is relatively easy to compute and compare, and it is also easy to compute the marginal contribution of each position to VaR.
RiskMetrics, a particular implementation of the delta-normal approach, assumes a particular structure for the evolution of market prices and rates through time. It then transforms all portfolio positions into their constituent cash flows and performs the VaR computation on those (Dowd, 1998). This model was launched by JP Morgan in 1994 with the aim of promoting the use of value-at-risk among the firm’s clients. The service comprised a technical document describing how to implement a VaR measure and a covariance matrix for several hundred key factors updated daily on the Internet. It is an entirely logical approach, particularly for portfolios without a lot of non-linearity, and is credited with popularizing VaR.
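The following is a minimal sketch of the delta-normal calculation, not RiskMetrics itself: invented dollar exposures to a few risk factors are combined with an assumed covariance matrix to give a 99% VaR and the marginal contribution of each factor.

```python
# Illustrative delta-normal sketch: portfolio VaR from linear exposures and a
# forecast covariance matrix. Exposures, volatilities and correlations are
# invented for illustration.
import numpy as np
from scipy.stats import norm

exposures = np.array([5e6, -2e6, 3e6])            # $ exposure to three factors
vols = np.array([0.010, 0.012, 0.008])            # assumed daily factor vols
corr = np.array([[1.0, 0.3, 0.2],
                 [0.3, 1.0, 0.5],
                 [0.2, 0.5, 1.0]])
cov = np.outer(vols, vols) * corr                 # daily covariance matrix

portfolio_sigma = np.sqrt(exposures @ cov @ exposures)   # daily $ std. deviation
var_99 = norm.ppf(0.99) * portfolio_sigma                # delta-normal 99% VaR

# Marginal contribution of each factor to VaR (straightforward in this setting)
marginal = norm.ppf(0.99) * (cov @ exposures) / portfolio_sigma
print(f"1-day 99% VaR: ${var_99:,.0f}")
print("Marginal VaR per factor ($):", np.round(marginal, 0))
```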
2.2. Historic or Back-Simulation Approach
The historic approach is also a relatively simple method in which distributions can be non-normal and securities can be non-linear. It involves keeping a historical record of preceding price
changes. It is essentially a simulation technique that assumes that the changes in prices and rates realized in an earlier period are representative of the changes that could occur over the forecast horizon. It
takes those actual changes, applies them to the current set of rates, and then uses those to revalue the
portfolio. The outcome is a set of portfolio revaluations corresponding to the set of possible realizations
of rates. From that distribution, the 99th percentile loss is taken as the VaR (Dowd, 1998).
However, historic approach uses only one sample path, which may not efficiently represent future
distributions. For this approach, specification of a stochastic process for each risk factor is re-
quired. Also required are the positions on various securities, and valuation models for all assets in
the portfolio. This method involves going back in time and applying current weights to a time-series of historical asset returns. This reconstructs the history of a hypothetical portfolio us-
ing the current position. Obviously, if asset returns are all normally distributed, the VaR obtained
under the historical-simulation method should be the same as that under the delta-normal method
(Dowd, 1998). This approach is easy to compute and to understand. It allows for non-normality
and non-linearity. It can also easily be adapted to scenario analysis. However, it has several drawbacks, such as unstable parameters and changing variances. In addition, the model may not work well if it is based on a small sample (Stulz, 2000).
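A minimal sketch of the historical-simulation idea follows; the return history and positions are simulated purely for illustration, whereas a real implementation would apply actual past market moves to the current portfolio.

```python
# Illustrative historical-simulation sketch: apply past (here simulated) market
# moves to current positions and read off the 99th-percentile loss.
import numpy as np

rng = np.random.default_rng(2)
n_days, n_assets = 252, 3
hist_returns = rng.normal(0.0, 0.01, (n_days, n_assets))   # past daily returns
positions = np.array([4e6, 3e6, 2e6])                      # current $ positions

# Revalue the current portfolio under each historical return scenario
pnl_scenarios = hist_returns @ positions

var_99 = -np.percentile(pnl_scenarios, 1)    # 99th-percentile loss
print(f"1-day 99% historical-simulation VaR: ${var_99:,.0f}")
```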
2.3. Monte-Carlo Approach
The Monte Carlo approach is widely regarded as the most sophisticated VaR method. Monte Carlo analyses look easy to code, but they can take hours or even days to run, and to speed them up, complicated techniques such as variance reduction need to be implemented (Dowd, 1998). In essence, the Monte Carlo method makes assumptions about the distribution of
changes in market prices and rates. Then, it collects data to estimate the parameters of the distribu-
tion, and uses those assumptions to give successive sets of possible future realizations of changes
in those rates. For each set, the portfolio is revalued and, as in the historic method, outcomes are
ranked and the appropriate VaR is selected. Monte-Carlo method makes it easier to cope with ex-
treme non-linearities as it allows for non-linear securities. It can also easily be adjusted according
to the distribution of risk factors. However, it is computationally burdensome, which constitutes a problem for routine use (Dunbar, 1998).
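The sketch below illustrates the Monte Carlo approach on a deliberately simple portfolio, a single hypothetical call-option position, so that the non-linearity the method is meant to capture is visible; all parameters and the Black-Scholes revaluation are assumptions for illustration only.

```python
# Illustrative Monte Carlo sketch: simulate one-day moves in an underlying,
# revalue a (hypothetical) call-option position, rank the outcomes and take
# the 99th-percentile loss. Parameters are invented for illustration.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)
n_sims = 100_000
s0, strike, vol, rate, t = 100.0, 100.0, 0.20, 0.05, 0.25   # hypothetical option

def bs_call(s, k, sigma, r, tau):
    """Black-Scholes call value, used here to capture the option's non-linearity."""
    d1 = (np.log(s / k) + (r + 0.5 * sigma**2) * tau) / (sigma * np.sqrt(tau))
    d2 = d1 - sigma * np.sqrt(tau)
    return s * norm.cdf(d1) - k * np.exp(-r * tau) * norm.cdf(d2)

# Simulate one-day lognormal moves in the underlying and revalue 1,000 calls
daily_sigma = vol / np.sqrt(252)
s1 = s0 * np.exp(rng.normal(-0.5 * daily_sigma**2, daily_sigma, n_sims))
pnl = 1_000 * (bs_call(s1, strike, vol, rate, t - 1 / 252)
               - bs_call(s0, strike, vol, rate, t))

var_99 = -np.percentile(pnl, 1)
print(f"1-day 99% Monte Carlo VaR on 1,000 calls: ${var_99:,.2f}")
```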

3. Risk Management at LTCM


Due to its sophisticated hedging strategies, LTCM became one of the most highly leveraged
hedge funds in history. It had a capital base of $3 billion, controlled over $100 billion in assets
worldwide, and held derivatives positions whose notional value exceeded $1.25 trillion. To predict and mitigate
its risk exposures, LTCM used a combination of different VaR techniques. LTCM claimed that its
VaR analysis showed that investors might experience a loss of 5% or more in about one month in
five, and a loss of 10% or more in about one month in ten. Only one year in fifty should it lose at
least 20% of its portfolio (Lowenstein, 2000).
LTCM also estimated that a 45% drop in its equity value over the course of a month was a 10 stan-
dard deviation event. In other words, this scenario would not be expected to occur even once in the history of the
universe (Prabhu, 2001). Unfortunately for Long-Term Capital Management and its investors, this
event did happen in August 1998. This reliance on Value-at-Risk may also indicate one of the prob-
lems that eventually led to Long-Term’s demise. LTCM believed that historical trends in securities
movements were an accurate predictor of future movements. Their faith in this belief led them to sell
options in which the implied volatility was higher than the historical volatility (Prabhu, 2001).
Similarly, the Value-at-Risk models used by LTCM rely on historical data to project information
about future price movements. These models project the probability of various losses based on the
prior history of similar events. Unfortunately, the past is not a perfect indicator of the future. On
October 18, 1987, for example, two-month S&P futures contracts fell by 29%. Under a lognormal
hypothesis, with annualized volatility of 20% (approximately the historical volatility on this secu-
rity), this would have been a –27 standard deviation event. In other words, the probability of such
an event occurring would have been 10⁻¹⁶⁰. This is such a remote probability that it would be
virtually impossible for it to happen (Prabhu, 2001). On October 13, 1989 the S&P 500 fell about
6%, which under the above assumptions would be a five standard deviation event. A five standard
deviation event would only be expected to occur once every 14,756 years (Jackwerth and Rubin-
stein, 1998). There are many other examples of abnormal market events happening with greater
frequency than these models would lead one to expect. It would appear then, that lognormal mod-
els for expected returns do not fully account for these large losses, and that prior estimates of vola-
tility may not be able to accurately predict future price movements. This reliance on a risk model
that tends to underestimate the probability of large downward movements in securities prices may
have led Long-Term Capital to be overconfident in its hedging strategies.
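The “sigma event” arithmetic above can be reproduced, at least approximately, under the stated assumptions. The sketch below uses a 20% annualized volatility and a 252-trading-day year, so its output only roughly matches the cited figures, which depend on the exact conventions used by the original authors.

```python
# Approximate reproduction of the "sigma event" arithmetic under a lognormal
# assumption with 20% annualized volatility; figures are indicative only.
import numpy as np
from scipy.stats import norm

daily_sigma = 0.20 / np.sqrt(252)              # assumed daily log-return volatility

# October 1987: roughly a 29% one-day drop in two-month S&P futures
z_1987 = np.log(1 - 0.29) / daily_sigma
log10_prob = norm.logcdf(z_1987) / np.log(10)
print(f"1987 crash: about {z_1987:.0f} standard deviations, probability ~1e{log10_prob:.0f}")

# Expected waiting time for a 5-standard-deviation daily drop
waiting_years = 1 / (norm.cdf(-5.0) * 252)
print(f"5-sigma daily drop: expected roughly once every {waiting_years:,.0f} years")
```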
In August 1998, an unexpected non-linearity occurred that was beyond the detection scope of the
VaR models used by LTCM (Davis 1999). Russia defaulted on its sovereign debt, and liquidity in
the global financial markets began to dry up as derivative positions were hastily unwound. The LTCM VaR models had estimated that the fund’s daily loss would be no more than $50 million of capital. However, the fund soon found itself losing around $100 million every day. On the fourth day after the Russian default, the fund lost $500 million in a single trading day. As a result, LTCM began preparations for declaring bankruptcy (Davis, 1999). However, the US Federal Reserve, fearing that LTCM’s collapse could paralyze the entire global financial system due to its enormous, highly leveraged derivatives positions, orchestrated a $3.6 billion bailout of the fund, creating a major moral hazard for other adventurous hedge funds (Dong et al., 1999). Consequently, LTCM’s failure can be attributed in large part to its VaR system. Because regulation is applied less strictly to securities firms and other non-bank financial institutions, LTCM’s chosen VaR horizon was in fact shorter than the 10-day period required by the Basle Committee (Hunter and Power, 2000). For a hedge fund,
the horizon should match the period required to raise additional funds, which is rather difficult as
additional capital will be needed precisely after the fund has suffered a large loss. Indeed, by the
end of August 1998, LTCM had $2.3 billion of equity capital and $1 billion in excess liquidity. The
firm faced a dilemma between reducing risk and raising additional capital. Because of the magni-
tude of its positions, it was unable to reduce its risk exposure promptly. Neither was it able to at-
tract new investors. It soon became clear that the firm had underestimated its capital needs. In ad-
dition, the positions of LTCM were allocated so as to maximize expected returns subject to the
single constraint that the fund’s perceived risk would not exceed that of the U.S. securities market.
In a nutshell, the fund was maximizing return while not carefully monitoring its volatility. Besides,
the fund determined a target daily volatility figure of $45 million based on the simplistic assump-
tion that volatility would remain constant, while in reality it could easily double in turbulent times.
Along with its exposure to volatility, the fund also achieved extraordinary levels of leverage. In-
deed, at the time of its near-failure, the LTCM Fund was the most highly leveraged hedge fund
reporting to the Commodity Futures Trading Commission (CFTC). Combination of its large capi-
tal base and high degree of leverage enabled the fund to possess total assets exceeding $125 billion
in value, more than three times that of the next largest hedge fund. The fund was betting on the convergence of spreads, and during turbulent times spreads between related bonds would widen, causing the fund to lose on its positions. In order to maintain its positions, additional funds would
be needed to meet margin calls (Davis 1999). However, the fund would not be able to liquidate
quickly enough in case of adverse market conditions, leading to a risk of liquidity and insolvency.
Indeed, LTCM faced harsh liquidity problems when its investments began losing value. As men-
tioned earlier, liquidity risk is not factored into VaR models as they assume that normal market
conditions will prevail (Bangia et al., 1999). Therefore, LTCM’s actual exposure to liquidity and
solvency risks was not evaluated accurately by its VaR model.
Nor were the Fund’s investors and counterparties adequately aware of the nature of the exposures and risks the hedge fund had accumulated, as they exercised minimal scrutiny of the fund’s risk management practices and risk profile. This insufficient monitoring was compounded by LTCM’s practice of disclosing only minimal information to these parties. Financial statements disclosed by LTCM did not include details about the fund’s risk profile and concentration of
exposures in certain markets (Stulz, 2000). Counterparties’ current credit exposures were in most cases
covered by collateral. However, their potential future exposures were not adequately assessed,
priced, or collateralized relative to the potential price shocks the markets were facing at that time.
Figure 1 indicates the losses incurred by LTCM in 1998 by trade types.

Source: Lowenstein (2000).

Fig. 1. Losses by trade type

At the same time, not only did LTCM’s liquidity, credit and volatility spreads widen, but the liquidity of most global financial markets also dried up. This compounded the problem faced by LTCM’s creditors, because a liquidation of LTCM’s positions would have been disorderly and could have had adverse market effects on their positions and those of many other market participants. The possibility of this
situation occurring was not fully considered by either LTCM or its creditors. This raises the issue of
how events that are assumed to be extreme and very improbable should be incorporated into risk-
management and business practice, and how they should be dealt with by public policy. Figure 2
presents the losses incurred by financial institutions through collapse of LTCM.

Source: Prabhu (2001).


Fig. 2. Losses by Financial Institution

As the analysis indicates, LTCM relied too much on VaR models and not enough on stress testing,
gap risk and liquidity risk. There was an assumption that the portfolio was sufficiently diversified
across world markets to produce low correlation. But in most markets LTCM was replicating basi-
cally the same credit spread trade (Shirreff, 1999). Several risk management and policy implica-
tions emerge from this analysis, which will be pointed out in the next section.

4. Lessons for Financial Institutions and Financial Regulators


4.1. Lessons for Financial Institutions
As pointed out in the previous section, LTCM relied excessively on the VaR models for managing
risk, which failed to predict the maximum potential loss due to its assumption that the market state
would remain stable. This leads us to the conclusion that relying solely on VaR as a means of risk
management would be erroneous. What can be implemented as a complementary measure to alle-
viate the limitations and disadvantages of VaR is another functional risk management method,
namely stress testing. This procedure incorporates the impact of particular extreme-case scenarios
in the analysis of risk and naturally complements VaR. While VaR gives fund managers an esti-
mate of what they might lose with a certain maximum probability, stress tests enable them to have
a clear idea of what they stand to lose in case the worst case scenario actually takes place.
A further implication for the managers of financial institutions is the importance of imposing
stress-loss limits on their portfolio to protect against extreme shocks. Stress limits provide protec-
tion against extreme shocks in both individual risk factors and groups of risk factors. They might also prevent
the management from concentrating in a single strategy or project, or from maintaining a single
position. What is more, the information provided by stress tests can be very useful in determining
capital allocation within an institution (Dowd, 1998).
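To illustrate how stress tests and stress-loss limits complement a VaR figure, the following minimal sketch revalues a set of hypothetical factor sensitivities under a few hand-picked extreme scenarios and flags breaches of an assumed stress-loss limit; the scenarios, sensitivities and limit are all invented for illustration.

```python
# Illustrative stress-testing sketch: revalue hypothetical factor sensitivities
# under hand-picked extreme scenarios and compare losses to a stress-loss limit.
import numpy as np

# Dollar P&L sensitivities per unit shock to three hypothetical risk factors:
# credit spreads (per basis point), equities (per 1% move), FX (per 1% move)
sensitivities = np.array([-150_000.0, 400_000.0, -80_000.0])

scenarios = {                          # shocks: [spread bp, equity %, FX %]
    "Equity crash":           np.array([ 20.0, -25.0,  0.0]),
    "Flight to quality":      np.array([100.0, -15.0, 10.0]),
    "Severe credit widening": np.array([250.0,  -5.0,  0.0]),
}

stress_limit = 20_000_000.0            # assumed stress-loss limit ($)

for name, shocks in scenarios.items():
    loss = -float(sensitivities @ shocks)      # positive number = loss
    status = "LIMIT BREACHED" if loss > stress_limit else "within limit"
    print(f"{name:24s} stress loss: ${loss:12,.0f}  ({status})")
```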
Another implication lies in the philosophy of the trading strategies. As pointed out in the previous
sections, LTCM’s trading strategy exploited the intrinsic weaknesses in its risk management sys-
tem, as it aimed at maximizing expected returns subject to a constraint on VaR. The fund’s daily trading basically relied on the convergence arbitrage strategy. The fund took non-directional bets so that it was not dependent upon the direction of markets. This strategy induced the fund to make undiversified and highly leveraged bets on more subtle risks such as credit risk, political risk, or market disruptions. Hence, it can be concluded that the convergence arbitrage strategy does not necessarily produce safety merely because it avoids directional bets; it remains a risky strategy.
In brief, the reason for LTCM’s failure is that it had severely underestimated its risk. The trading positions were singularly undiversified, as they were exposed to liquidity, credit, and volatility spreads, which are all similar risk factors. Although LTCM’s VaR system appears to be the primary reason behind the fund’s collapse, owing to its failure to measure the fund’s potential risk exposures, VaR is still the best available tool for risk measurement. Analysis of LTCM’s collapse illustrates the perils of relying on VaR without the complement of a proper stress-testing system. As explained earlier, LTCM’s strategy was to maximize return subject to a constraint on VaR, which worked well only under normal market conditions; the market crash of 1998 was not anticipated by a system of this nature.
To mitigate the limitations of VaR systems, financial entities have to develop and properly implement stress-testing systems, which are capable of helping to measure risk exposures in extreme cases. In addition, traditional risk management models ignore asset and funding liquidity (Bangia et al., 1999). When positions are relatively large and leveraged, it is important to account for the price impact of forced sales. The final implication of the LTCM story for the managers of financial institutions is that a fund should diversify its portfolio not only geographically but also wisely and strategically.
4.2. Lessons for Financial Regulators
The risk management limitations revealed by the LTCM incident were not unique to LTCM or to
its creditors. As new technology has cultivated a major expansion in the volume and the leverage
of transactions, VaR models have underestimated the probability of severe losses. This shows the need for regulatory attention both to the appropriateness of the VaR models used and to the level of capital held against risky positions taken by financial institutions.
As mentioned earlier, LTCM managed to establish leveraged trading positions of a size that posed
potential systemic risk, primarily because the banks and derivatives firms that were its creditors
and counterparties failed to enforce their own risk management standards. Other market partici-
pants and federal regulators relied upon these large banks and securities and futures firms to follow
sound risk management practices in providing LTCM with credit. However, limitations in the risk
management practices of these creditors and counterparties allowed LTCM to use leverage to
grow beyond control. Federal financial regulators did not identify the extent of weaknesses in
banks’ and funds’ risk management practices until after LTCM’s collapse. Although regulators
were aware of the potential systemic risk that hedge funds can pose to markets and the perils of
declining credit standards, until LTCM’s collapse, they seemed to believe that creditors and
counterparties were appropriately constraining hedge funds’ leverage and risk-taking (Hunter and
Power, 2000). However, LTCM’s collapse revealed weaknesses in credit risk management by
banks and broker-dealers that allowed LTCM to become excessively leveraged. The existing regu-
latory approach, which focuses on the condition of individual institutions, did not sufficiently con-
sider systemic threats that can arise from non-regulated financial institutions such as LTCM.
Similarly, periodic financial information provided in financial statements received from LTCM, its
creditors, and its counterparties did not reveal the potential threat posed by LTCM. Regulators for
each industry generally continued to focus on individual firms and markets, the risks they face, and
the soundness of their practices, but they failed to address interrelationships across each industry.
The risks posed by LTCM crossed traditional regulatory and industry boundaries, and the regulators would have had to coordinate their activities to have any chance of identifying these risks. Although
regulators recommended improvements to information reporting requirements, they did not rec-
ommend ways to better identify risks across markets and industries. Lack of authority over certain
affiliates of investment firms limited the ability of the Securities and Exchange Commission (SEC)
and CFTC to identify the kind of systemic risk that LTCM posed (Crouhy et al., 2001).
A number of policy implications for the regulating authorities of financial markets and institutions
emerge from this study. To begin with, regulators should ensure that they address the risk man-
agement weaknesses that have been pointed out earlier. They should also outline sound practices
for the institutions’ interactions with highly leveraged institutions. In addition, regulators need to
ensure that the capital and risk management practices of the entities they are responsible for regulating are appropriate to the scale and complexity of the credit services these entities provide, the investments they make, and the liabilities they incur. In
this respect, banks should ensure that their counterparties develop meaningful measures of poten-
tial future credit exposure and use these measures to set exposure limits. Regulators should en-
courage banks to develop policies setting out the situations in which potential future exposures
should be collateralized. Another important issue that needs special attention from the regulating authorities is providing guidance on enhancing the quality of financial information reporting by financial institutions, especially hedge funds.

5. Conclusions
This paper analyzed the collapse of the LTCM hedge fund with the aim of deriving implications for the managers of financial institutions and for policy makers. A number of significant conclusions emerge from the study, as described in detail in the previous sections. For instance, LTCM’s VaR model is a major factor that accounts for its collapse in 1998, as it underestimated the fund’s potential risk exposure. This was mainly due to its inability to capture risk factors such as liquidity and volatility. However, VaR is still an invaluable instrument for assessing market risk. Nevertheless, it is not the solution for all risk management challenges, and is certainly
not an appropriate measure upon which to build optimal decision rules. As a result, we must draw
some valuable risk management lessons from the LTCM experience regarding VaR.

First of all, managers of financial institutions should not rely only on VaR for their market risk
management practices. As mentioned earlier, LTCM’s strategy was to maximize return subject to a constraint on VaR, but this risk management system only works well under normal market conditions. The market crash of 1998, for instance, was not anticipated due to the nature of VaR. This illustrates the danger of relying on VaR without the complement of a proper stress-testing system.
In order to mitigate the limitations of VaR systems, managers of financial institutions have to de-
velop and properly implement stress-testing systems, which may enable them to measure risk ex-
posures in extreme cases.
Moreover, traditional risk management models do not take into account asset and funding liquid-
ity. When positions are relatively large and leveraged, it is important to account for the price im-
pact of forced sales. In this respect, the LTCM experience has important lessons for convergence-
arbitrage strategies. It can be concluded that a fund should diversify its portfolio not only geographically around the global financial markets, but also wisely and strategically. Banks
also face the challenge of calculating the statistical measures that express volatility in the market.
The volatility associated with individual exposures must then be aggregated across the portfolio of
all exposures to a Value-at-Risk (VaR) figure. VaR and risk adjusted performance figures must
then be disseminated to decision makers, often with figures broken down by a variety of catego-
ries. These reports will help management adjust their business strategies with regard to market
risk; further, regulators who have determined that VaR serves as the basis for regulatory capital
requirements require the reports.
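A minimal sketch of this aggregation and reporting step is given below; the desk names, volatilities, correlations and P&L figures are invented, and the return-to-VaR ratio is used only as one simple example of a risk-adjusted performance figure.

```python
# Illustrative sketch: aggregate desk-level volatilities into a diversified
# portfolio VaR and report a simple VaR-based risk-adjusted performance figure.
import numpy as np
from scipy.stats import norm

desks = ["Rates", "Credit", "FX"]
stdev = np.array([2.0e6, 1.5e6, 0.8e6])       # assumed daily $ volatility per desk
corr = np.array([[1.0, 0.4, 0.1],
                 [0.4, 1.0, 0.2],
                 [0.1, 0.2, 1.0]])
annual_pnl = np.array([30e6, 22e6, 9e6])      # hypothetical annual P&L per desk

cov = np.outer(stdev, stdev) * corr
portfolio_var = norm.ppf(0.99) * np.sqrt(np.ones(3) @ cov @ np.ones(3))

# Simple risk-adjusted performance: annual P&L per dollar of stand-alone VaR
standalone_var = norm.ppf(0.99) * stdev
for desk, pnl, v in zip(desks, annual_pnl, standalone_var):
    print(f"{desk:7s} return / stand-alone VaR = {pnl / v:.1f}")
print(f"Diversified portfolio 99% daily VaR: ${portfolio_var:,.0f}")
```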
To meet the market discipline requirements of Basel II, a data access and storage mechanism must
be created that is comprehensive; provides timely and appropriate information to analysis tools and
robust reporting engines; provides a complete view of risk exposure across the enterprise in a vari-
ety of reporting formats; and provides consistent market risk measures in terms of both required
data and applied analytics. Implementing a bank-wide risk management strategy is the only guar-
anteed way that banks can begin to comply with the minimum requirements of the new regula-
tions, by ensuring that your risk calculations are accurate, complete and reliable. This study also
suggests several implications for the regulating authorities. For instance sound practices need to be
outlined for financial institutions’ levels of exposure to leverage risk. Consequently, accurate
measurement of potential credit exposure should be enforced. Last but not least, regulators should
take steps in order to enforce improvement of the quality of information reporting by not only the
hedge funds but also all types of financial institutions.

References
1. Basle Committee on Banking Supervision, (1995), An Internal Model-Based Approach to
Market Risk Capital Requirements, BIS, 101-105.
2. Basle Committee on Banking Supervision, (1996), Supervisory Framework for the Use of
Backtesting in Conjunction with the Internal Models Approach to Market Risk Capital Re-
quirements, BIS, 123-126.
3. Bangia, A., F. Diebold, T. Schuermann, J. Stroughair, (1999), “Modeling Liquidity Risk With Implications For Traditional Market Risk Measurement and Management”, Financial Institutions Center, 15-17.
4. Brian, A., (1995) “Financial Risk Management”, McGraw-Hill, 23-25.
5. Crouhy, M., D. Galai, R. Mark, (2001), Risk Management, McGraw-Hill, 45-51.
6. Davis E.P., (1999), Russia/LTCM and Market Liquidity Risk, Bank of England, 13-17.
7. Dong-Chan K., D. Lee, R.M. Stulz, (1999), U.S. Banks, Crises, and Bailouts: From Mexico to
LTCM, American Economic Association, 22-25.
8. Dorfman, M.S. (1997), “Introduction to Risk Management and Insurance” (6th ed.), Prentice
Hall.
9. Dowd, K., (1998), Beyond Value At Risk, The New Science of Risk Management, John
Wiley & Sons, 123-130.
10. Dunbar, N., (1998), “Meriwether’s Meltdown”, Risk (October), 12-14.
11. Hunter, B. and M. Power, (2000), “Risk Management and Business Regulation”, The Centre for Analysis of Risk and Regulation at the London School of Economics and Political Science, 21-13.
12. Jackwerth, J.C. and M. Rubinstein, (1998), “Recovering Probability Distributions from Op-
tions Prices”, The Journal of Finance, 6, 1611-1631.
13. Jorion, P., (1997), “Value At Risk – The New Benchmark for Controlling Market Risk”,
McGraw-Hill, 14-29.
14. Lowenstein, R., (2000), When Genius Failed: The Rise and Fall of Long-Term Capital Man-
agement. New York: Random House, 21-45.
15. Prabhu, Siddharth (2001), “Long-Term Capital Management: The Dangers of Leverage”,
Duke Journal of Economics, 4, 24-41.
16. Shirreff, D., (2003), Lessons From the Collapse of Long Term Capital Management, The Risk
Web Site [online]. Available from: http://risk.ifci.ch/146480.htm, [Accessed 14th July 2003].
17. Stulz, R.M., (2000), “Why Risk Management Is Not Rocket Science”, Financial Times, Mas-
tering Risk Series, June 27, 2000. 21-24.
18. ___________ (2003), “Risk Management & Derivatives” (1st ed.), Mason, Ohio: Thomson
South-Western.
