
Chapter 2 Problem Background
Portfolio Management
Successful portfolio management involves allocating funds between different asset groups, such as
equities, fixed-income securities, commodities and real estate, and determining which individual
securities within each major group to buy and sell, with the goal of maximizing portfolio performance
while satisfying constraints such as risk and style.
Quantitative equity portfolio management (QEPM) uses mathematical and statistical techniques to
evaluate fundamental and technical data - or any other quantifiable data. Quantitative analysis uses the
results to build models that evaluate securities in the context of an investment decision.
The following section outlines the theories behind the models employed in QEPM.
Modern Portfolio Management Theory
One of the first works on modern financial theory was Bachelier’s 1900 thesis “The Theory of
Speculation”, which concluded that prices were random and impossible to predict. This was also the
conclusion reached by Alfred Cowles in the 1930s after studying the trades of investment firms. Graham
was the first to challenge the theory of random prices when he published “The Intelligent Investor”,
which advocated the ‘value’ style of investing: finding stocks that are fundamentally undervalued.
Investors were encouraged to choose a small number of securities on which they would perform
in-depth analysis. Warren Buffett was one of Graham’s students, and he has used the ‘value’ style to
outperform the S&P 500 index over many years.
Harry Markowitz significantly changed modern financial theory with his work on the relationship
between risk and return. He showed that by holding a large diversified portfolio one could obtain a
higher return for the same level of risk.
The work of Sharpe on the capital asset pricing model (CAPM) was another big step in financial theory. It
introduced the concept of beta (β), a measure of the sensitivity of an asset’s return to the return of the
market index. By assuming that the relationship between the asset’s return and the market return was
the only factor determining its risk, it provided a way to determine an asset’s value from its β; for this
reason it is considered a single-factor model. One outcome of the work on CAPM was a restatement of
the efficient-market hypothesis (EMH), which asserts that assets are priced rationally and fairly with
respect to the information available in the public domain and that it is not possible to outperform the
market. CAPM is often used within QEPM to determine risk and to measure a portfolio’s performance
against a benchmark index.
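For reference, the standard CAPM relation (stated in the usual textbook notation, not specific to this
document) is

    E[R_i] = R_f + β_i (E[R_m] - R_f),    where β_i = Cov(R_i, R_m) / Var(R_m)

with R_i the asset’s return, R_m the market return and R_f the risk-free rate.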
The main alternative to CAPM is arbitrage pricing theory (APT), which extended the CAPM by allowing
for non-diversifiable risk factors other than the asset’s beta. A portfolio’s expected return could then be
determined from its exposure to multiple risk factors - thus laying the foundations for the first
multi-factor models.
In 1992, Fama and French produced a three-factor model (beta, market capitalization and book-to-market
ratio) which explained 95% of the variability in their sample of stock returns.
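In its usual regression form (again the standard notation, not taken from this document), the three-factor
model explains an asset’s excess return through three exposures:

    R_i - R_f = α_i + β_i (R_m - R_f) + s_i SMB + h_i HML + ε_i

where SMB (small minus big) captures the size effect and HML (high minus low book-to-market) captures
the value effect.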
While modern financial theory has produced the mechanism for optimizing a portfolio’s return in
relation to market risk, it has not produced a strong foundation for the possibility of producing alpha
(α), the return above that implied by the risk of the portfolio. In order to justify the overhead costs of
actively-managed funds over passively-managed funds, such as those which simply track an index, active
portfolio managers must produce risk-adjusted returns above their benchmark index. The theories
behind such returns are those associated with market behavior rather than portfolio management.
The next section gives a brief overview of two general types of security analysis before describing two
conflicting theories of market behavior and their effect on the practice of QEPM.
Security Analysis
Two common approaches to analyzing securities and markets for trading decisions are technical analysis
and fundamental analysis.
Technical analysis is based on the assumptions that security prices follow trends and that, by analyzing
past price and volume data, these trends can be identified and exploited. Decisions to buy and sell ignore
the fundamental characteristics of the company or market.
Fundamental analysis, on the other hand, looks at the structure and environment of a security. In the
case of a stock, it evaluates the company’s business model and trading environment in order to put a
value on its future earnings, and with this information it estimates a reasonable price for the stock. If the
market price of the stock is sufficiently different from the estimate, a buy or sell decision is made. This is
considered a ‘bottom-up’ approach.
Market Behavior Theory
Efficient-Market Hypothesis
Over the years a number of different theories explaining how securities markets behave have influenced
the practice of portfolio management. The first theory, called the Efficient-Market Hypothesis (EMH),
asserts that assets are priced rationally and fairly with respect to the information available in the public
domain and that it is not possible to outperform the market. Since its emergence as a prominent theory
in the 1960s it has been further refined into three forms:
Strong-form efficiency.
Based on an idealized and theoretical view of markets, it asserts that a security’s price reflects all public
and private information. Researching individual securities would be pointless, as no insight could be
gained that is not already priced in.
Semi-strong-form efficiency.
Similar to the strong form, but allows that private information may not be reflected in the price. There is
still no justification for research, as research depends on public information, which is already priced in.
Weak-form efficiency.
Allows for differences between the price of a security and its fundamental value, but insists that
historical price movements have no bearing on the future direction of the price.
The EMH in all its forms challenges the use of technical analysis, while permitting only the limited use of
fundamental analysis under weak-form efficiency. However, it is obvious from the media coverage of
financial markets that the use of both technical and fundamental analysis is widespread.
Portfolio managers have long justified their fees by their supposed ability to ‘beat the market’. The
theoretical justification for such above-normal returns is based on views of the markets from behavioral
finance and on other possible abnormalities such as the noisy market hypothesis.
Behavioral Finance
The EMH assumes that individual actors within the market act rationally on the market-related
information they receive. Behavioral economists believe that the limitations of human rationality can
affect investors’ actions and cause market abnormalities that can be exploited. Behavioral finance seeks
to explain bubbles in the market through the use of behavioral models. For example, behavioral
economists often attribute stock market bubbles, which are anomalies according to the EMH, to human
herding behavior: large numbers of investors basing their actions on the actions of other investors.
Noisy Market Hypothesis
Another explanation for inefficient markets, and justification for actively-managed funds, is the noisy
market hypothesis (NMH), which explains the mispricing of assets by the fact that many investors and
investment management firms make decisions to buy and sell for reasons other than the asset’s
expected return, such as liquidity and tax obligations.
Despite the lack of clear theoretical justification, many active portfolio managers and investors, such as
Warren Buffett, have outperformed the market, and the assets under management (AUM) of the
actively-managed fund industry, especially the hedge fund industry, have continued to grow.
The next section describes some of the common strategies employed by actively-managed portfolios.
Active Trading Strategies
A number of different strategies, based more on empirical evidence than on theory, have gained
popularity. The following is a list of some popular strategies:
Momentum
A technical trend based on the observation that an increase in price and volume will lead to further
increases in price.
Growth
Stocks which have reported strong growth in the recent past are more likely to show growth in the
future.
Valuation
A bottom-up analysis to find stocks that are undervalued. Introduced by Graham, popularized by Warren
Buffett.
Earnings Trend
Based on the rationale that companies that beat their earnings estimates, or have their earnings
estimates revised up, tend to do well in future periods also.
Quality
Based on the theory that stocks which are leaders within their industry/sector tend to continually
outperform their rivals.
The performance of each strategy is dependent on market timing. It has been shown that from 1979 to
2008 growth strategies outperformed valuation strategies during the following years: 1982, 1985, 1987,
1989-91, 1995-99 and 2007 (Ameritrade, 2008). In order to use each strategy efficiently it is important
to predict the correct market timing. It is assumed by many that the correct timing for each strategy is
based on the economic outlook; however, in order to benefit from the related price changes, investors
must predict changes in the economic outlook ahead of others.
As described above, a common quantitative technique is to find factors which explain differences in
equity returns and their associated risk, and then to use statistical analysis of the relationship between
the factors and equities to build models for various tasks such as stock screening, risk and performance
analysis, and portfolio construction.
The form of the models can vary greatly, from simple linear models for stock screening to far more
complex high-dimensional non-linear models for risk modeling and portfolio optimization.
Some firms, with sufficient experience of and investment in quantitative techniques, will base their
decisions to buy and sell equities entirely on their quantitative models and may also automate the
process of buying and selling stocks. For many traditional investment management firms the prospect of
trusting a new and unproven quantitative model is unappealing. Instead, fund managers will use the
model to screen stocks and combine the results with human judgment before deciding which equities to
include in their portfolio. This is very common in firms that are new to quantitative techniques.
The preceding sections gave an overview of market theory and its effects on active portfolio
management and outlined some common active investment strategies. The next section concentrates
on the problem of quantitative equity selection.
Aggregate Z-Score Stock Ranking
The number of securities to choose from is vast: according to the World Federation of Exchanges there
are over 5,000 equities on the NYSE and NASDAQ alone and over 53,000 across its worldwide members.
An investment management firm’s ability to evaluate securities is limited by the resources of its research
department, and research is expensive. A key problem for portfolio managers is how to evaluate
investment opportunities while restraining overhead costs.
One method is to use a quantitative multifactor stock-screen model to find and rank securities based on
an investment strategy. For example, a portfolio manager who wishes to use a valuation strategy would
construct a stock screen which uses a number of factors associated with valuation, such as free cash
flow yield or EV/Sales.
The raw data for the factors is obtained from company reports in the case of fundamental data, or from
stock exchange prices in the case of technical data. Economic data such as interest and inflation rates
can also be used; however, the exposure of a security to such economic factors must first be determined
by regression analysis.
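As a minimal sketch of how such an exposure might be estimated (the data below is synthetic and the
factor choice illustrative), a time-series regression of the security’s returns on changes in the economic
variables yields the exposures as fitted coefficients:

    import numpy as np

    rng = np.random.default_rng(1)

    # Illustrative monthly history: changes in interest and inflation rates.
    econ = rng.normal(0, 0.01, size=(60, 2))          # 60 months, 2 factors
    returns = econ @ np.array([-2.0, -0.5]) + rng.normal(0, 0.02, 60)

    # Regress the security's returns on the economic factor changes;
    # the fitted coefficients are its exposures (betas) to each factor.
    X = np.column_stack([np.ones(60), econ])          # add an intercept
    betas, *_ = np.linalg.lstsq(X, returns, rcond=None)
    exposures = betas[1:]                             # drop the intercept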
The raw data itself cannot be used by the model; it must be normalized before it can be used for stock
ranking. A common method of normalizing the data is to transform the raw value into a z-score, which is
the number of standard deviations by which the security’s value differs from the mean of the population
of candidate securities. Once the z-score has been calculated for each of the factors, the scores are
combined in a linear function, with each factor given a weight, to produce an aggregate score. There are
two common methods for determining the weight of each of the factors.
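As a minimal sketch of this scheme (factor values and weights below are purely illustrative; note that the
sign of a weight must match the direction of the factor, e.g. a lower EV/Sales is better, so it enters
negatively):

    import numpy as np

    def z_scores(raw: np.ndarray) -> np.ndarray:
        """Standard deviations from the mean of the candidate population."""
        return (raw - raw.mean()) / raw.std()

    # Rows are candidate securities; columns are factors
    # (free cash flow yield, EV/Sales).
    raw_factors = np.array([
        [0.08, 1.2],
        [0.03, 2.5],
        [0.11, 0.9],
    ])

    weights = np.array([0.6, -0.4])                    # hypothetical weights
    z = np.apply_along_axis(z_scores, 0, raw_factors)  # normalize per factor
    aggregate = z @ weights                            # weighted linear score
    ranking = np.argsort(-aggregate)                   # best securities first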
Information coefficient
The weights are assigned based on each factor’s information coefficient (IC), which is calculated by
breaking the population of candidates into decile portfolios for each factor, where the securities in the
top 10% bracket of z-scores are grouped together, and so on. The IC for each factor is calculated using
historical data, and the factors with the highest IC receive the highest weights. One disadvantage of this
method is that it does not consider the interdependent relationships between factors.
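One simplified way to realize this method, assuming the IC is taken as the rank correlation between a
factor’s z-scores and the securities’ subsequent returns (a common definition; the decile-portfolio
construction above is one way to estimate it):

    import numpy as np
    from scipy.stats import spearmanr

    def information_coefficient(factor_z, fwd_returns):
        """Rank correlation between factor scores and subsequent returns."""
        ic, _ = spearmanr(factor_z, fwd_returns)
        return ic

    def ic_weights(z_matrix, fwd_returns):
        """Assign each factor a weight proportional to its historical IC."""
        ics = np.array([information_coefficient(z_matrix[:, j], fwd_returns)
                        for j in range(z_matrix.shape[1])])
        ics = np.clip(ics, 0.0, None)   # ignore factors with a negative IC
        return ics / ics.sum()          # normalize so the weights sum to one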
Cross-sectional regression analysis
A cross-sectional regression analysis can be performed over a number of time periods to build a
variance-covariance matrix that can be used to find the optimal combination of weights. Procuring the
necessary data for this method may be difficult.
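A rough sketch of this approach, assuming per-period factor exposure matrices and subsequent security
returns are available:

    import numpy as np

    def factor_returns(exposures_by_period, returns_by_period):
        """Cross-sectional OLS per period: regress security returns on
        factor exposures to estimate that period's factor returns."""
        rows = []
        for X, r in zip(exposures_by_period, returns_by_period):
            f, *_ = np.linalg.lstsq(X, r, rcond=None)
            rows.append(f)
        return np.array(rows)                      # shape: (periods, factors)

    def optimal_weights(f_hist):
        """Mean-variance optimal weights from historical factor returns."""
        mu = f_hist.mean(axis=0)                   # average factor return
        sigma = np.cov(f_hist, rowvar=False)       # variance-covariance matrix
        w = np.linalg.solve(sigma, mu)             # return per unit of risk
        return w / np.abs(w).sum()                 # normalize the weights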
Both of these methods are based on historical data and do not take into account forecasts of changes in
the economic and market outlook, which are very important to the performance of the investment
strategies detailed above.
Factor Groups
Factor groups are used to group factors by the investment strategy they are based on. The multifactor
model then becomes a linear function of the groups, which can make adjusting the weights based on
economic and market forecasts simpler. While the weights of the factors within each group can be set
using either of the methods outlined above, the overall trading strategy can be adjusted at a higher level
to deal with changes in the market outlook.
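A sketch of this two-level structure, with made-up group names, factors and weights:

    # Hypothetical factor groups; per-factor weights would be set using
    # one of the methods above (IC or cross-sectional regression).
    groups = {
        "momentum":  {"price_12m": 0.6, "volume_3m": 0.4},
        "valuation": {"fcf_yield": 0.7, "ev_sales": 0.3},
    }

    # Group-level weights the manager adjusts to match the market outlook.
    group_weights = {"momentum": 0.4, "valuation": 0.6}

    def aggregate_score(z: dict) -> float:
        """Sum over groups of: group weight * weighted factor z-scores."""
        total = 0.0
        for group, factors in groups.items():
            group_score = sum(fw * z[f] for f, fw in factors.items())
            total += group_weights[group] * group_score
        return total

    # Example: z-scores of one security on each factor.
    score = aggregate_score({"price_12m": 1.2, "volume_3m": 0.4,
                             "fcf_yield": -0.3, "ev_sales": 0.8})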
Problem Scenario
In our real-world problem there are five groups in the factor model. Each group represents a different
trading strategy whose performance depends on market timing, which is forecast from the portfolio
manager’s and research department’s economic and market outlook.
The correct weights will be based on the IC of each of the factors as well as the portfolio manager’s
market and economic outlook for each industry, sector and country within the universe of candidate
equities. Obtaining and quantifying this much information from the portfolio manager on a regular basis
may be difficult. Alternatively, we can infer the manager’s preferences from the factor exposures of the
current portfolio holdings. We can also analyze the portfolio’s benchmark index and compare the
difference between factor exposures.
This reconciliation process between the model and the portfolio can be framed as an optimization
problem: find the set of weights which will yield the maximum weighted value of the positions in the
portfolio.
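Under one plausible formalization (the notation here is ours, not fixed by the text), with h_p the holding
weight of position p and z_p,i its z-score on factor i, the task is to choose factor weights w_i that

    maximize  Σ_p h_p · Σ_i w_i · z_p,i    subject to  Σ_i w_i = 1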
Evolutionary algorithms can be successfully applied to this type of optimization problem. However,
evolutionary algorithms differ from most other algorithms in that they have certain control parameters,
outside of the data they operate on, that affect their operation and are therefore considered part of
algorithm design. The correct parameters are also dependent on the data. This relationship between
data and control parameters means that the design of the algorithm is tied to the data it operates on,
which makes it difficult to implement EAs within user applications where the data changes over time. As
the positions of the portfolio will change over time, it is beneficial to reconcile the model and portfolio
on a regular basis, which may not be practical if the algorithm requires a lot of time to design.
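Purely as an illustration (the population size, mutation rate and generation count below are placeholder
choices, precisely the kind of control parameters discussed above), a minimal evolutionary search for the
factor weights might look like:

    import numpy as np

    rng = np.random.default_rng(0)

    def fitness(w, holdings, z):
        """Weighted value of the portfolio positions under factor weights w."""
        return holdings @ (z @ w)

    def evolve(holdings, z, pop_size=50, generations=200, mutation=0.1):
        pop = rng.random((pop_size, z.shape[1]))
        pop /= pop.sum(axis=1, keepdims=True)            # weights sum to one
        for _ in range(generations):
            scores = np.array([fitness(w, holdings, z) for w in pop])
            parents = pop[np.argsort(-scores)[: pop_size // 2]]  # keep best half
            children = parents + rng.normal(0, mutation, parents.shape)
            children = np.clip(children, 0, None)
            children /= children.sum(axis=1, keepdims=True)
            pop = np.vstack([parents, children])
        scores = np.array([fitness(w, holdings, z) for w in pop])
        return pop[np.argmax(scores)]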