Performance Analysis
Managing risk, according to Michael Halls-Moore, comes down to:
- identifying and mitigating intrinsic and extrinsic factors that can affect the performance or execution of the algo
- managing the portfolio to maximise growth rate and minimise capital drawdown
Strategy and Model risk
This includes the risk from backtesting bias and from assumptions in the statistical model that can be overlooked, e.g. a linear regression assumes homoscedasticity, which can be checked by looking for constant variance in the residual plot. If this assumption is not tested, the linear regression will give less accurate parameter estimates.
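As an illustration (not from the original post), the homoscedasticity assumption can also be tested programmatically, e.g. with a Breusch-Pagan test from the statsmodels library; the data below is synthetic and purely hypothetical:

# a minimal sketch of testing for heteroscedasticity with statsmodels,
# using synthetic data purely for illustration
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_breuschpagan

rng = np.random.default_rng(42)
x = rng.normal(size=500)
y = 2.0 * x + rng.normal(size=500)  # homoscedastic noise by construction

X = sm.add_constant(x)              # design matrix with an intercept column
model = sm.OLS(y, X).fit()

# Breusch-Pagan test: a low p-value suggests heteroscedastic residuals
lm_stat, lm_pvalue, f_stat, f_pvalue = het_breuschpagan(model.resid, X)
print("Breusch-Pagan p-value: %0.3f" % lm_pvalue)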
Other models use moments of the data, such as the mean, variance, skewness and kurtosis of strategy returns, and implicitly assume these moments are constant in time. However, if the market regime changes then these moments change too and the model will not fit as well as it should. Models with rolling parameters can help.
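As a hedged sketch (not from the original post), rolling estimates of these four moments can be computed directly with pandas; the returns series below is synthetic and purely hypothetical:

# rolling moments of a daily returns series with pandas
import numpy as np
import pandas as pd

# synthetic daily strategy returns purely for illustration
rets = pd.Series(np.random.default_rng(0).normal(0.0005, 0.01, 1000))

window = 126  # roughly six months of trading days (an assumed choice)
rolling_mean = rets.rolling(window).mean()
rolling_var = rets.rolling(window).var()
rolling_skew = rets.rolling(window).skew()
rolling_kurt = rets.rolling(window).kurt()
print(rolling_mean.tail())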
Portfolio risk
Portfolio allocation may be concentrated in certain sectors, so it is often important for institutional investors to override particular strategies to account for overloaded factor risk, particularly when the preservation of capital is more important than the long-term growth rate of the capital.
Institutional investors may also face the risk of causing market impact, particularly in illiquid assets. Trading a large percentage of the daily volume can invalidate the results of the backtest, which typically neglects market impact. Setting a limit, such as a small percentage of the running average daily volume over a certain period, can therefore help, as sketched below.
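A minimal sketch (not from the original post) of such a participation limit, with an assumed 1% cap on a 20-day running average daily volume and a synthetic volume series:

# capping order size at a small percentage of running average daily volume
import numpy as np
import pandas as pd

# synthetic daily volume series purely for illustration
volume = pd.Series(np.random.default_rng(1).integers(1_000_000, 5_000_000, 252))

adv_20 = volume.rolling(20).mean()   # 20-day running average daily volume
max_participation = 0.01             # trade at most 1% of ADV (assumed)
max_order_size = max_participation * adv_20.iloc[-1]
print("Maximum order size today: %0.0f shares" % max_order_size)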
Strategies in the portfolio should also not be correlated. This correlation can be measured with the Pearson Product Moment Correlation Coefficient (Pearson's R), though it is best to design the portfolio so that the strategies are uncorrelated by construction, e.g. through different asset classes or trading frequencies, particularly as correlations can change during financial contagion. Rolling correlations can therefore be estimated over a long time frame and included in the backtest.
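A hedged sketch (not from the original post) of a rolling Pearson correlation between two strategies' daily returns, using synthetic series purely for illustration:

# one-year rolling Pearson correlation between two return series
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
strat_a = pd.Series(rng.normal(0.0004, 0.010, 1000))  # synthetic returns
strat_b = pd.Series(rng.normal(0.0003, 0.012, 1000))  # synthetic returns

rolling_corr = strat_a.rolling(252).corr(strat_b)     # 252-day window
print(rolling_corr.tail())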
Counterparty risk
In the context of algo trading we are more concerned with the risk of default by suppliers such as an exchange or brokerage (though this is rare). Michael Halls-Moore factors in the risk of brokerage bankruptcy and recommends spreading capital across multiple brokerages, though this can make operations more difficult.
Money Management
Measuring account drawdowns (drops in account equity) allows traders to establish how much drawdown they are able to tolerate, which in turn determines the leverage that is optimal for the long-term growth rate of the portfolio.
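A minimal sketch (not from the original post) of measuring drawdowns from an equity curve with pandas; the curve below is synthetic and purely hypothetical:

# drawdown and maximum drawdown of an equity curve
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
# synthetic equity curve purely for illustration
equity = pd.Series(1e6 * np.cumprod(1 + rng.normal(0.0004, 0.01, 1000)))

running_max = equity.cummax()            # high-water mark so far
drawdown = equity / running_max - 1.0    # drop from the high-water mark
print("Maximum drawdown: %0.2f%%" % (100 * drawdown.min()))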
The Kelly Criterion enables control over this balance: it tells us the leverage to use and the allocation towards the various strategies.
Assumptions:
- Each algo strategy is assumed to have returns that follow a Gaussian distribution. Furthermore, each strategy has its own fixed and constant mean and standard deviation of returns.
- The returns are excess returns, having subtracted margin account interest and transaction costs (and management and performance fees for institutions), i.e. mean annual return minus the risk-free borrowing rate.
- The strategies have no correlation and therefore the covariance matrix between strategy returns is diagonal.
The Kelly Criterion f_i for the optimal leverage of each strategy, to maximise growth rate and minimise drawdowns, is:

f_i = μ_i / σ_i^2 for each strategy i = 1, ..., N of the leverage vector f

i.e. the mean excess return divided by the variance of excess returns for that strategy, where

f_i = optimal leverage = optimal size of portfolio / own equity
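As a hypothetical worked example (not from the original post): a strategy with an annualised mean excess return of μ = 0.08 and an annualised standard deviation of σ = 0.16 would have f = 0.08 / 0.16^2 = 0.08 / 0.0256 ≈ 3.1, i.e. roughly 3x leverage.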
growth rate = rf + S^2/2

where rf is the risk-free interest rate (the rate at which you can borrow from the broker), and S is the annualised Sharpe Ratio of the strategy.
Sharpe Ratio = annualised mean excess returns / annualised standard deviation of excess returns
The Kelly Criterion is obviously not static and should therefore be recalculated regularly, e.g. via a lookback window of 3-6 months of daily returns. Sometimes maintaining this level of leverage means selling into a loss, though this is mathematically the way to maximise the long-term growth rate. Because the mean and variance of excess returns are uncertain in practice, traders tend to use the half Kelly, i.e. the Kelly value divided by two, with the full value serving as an upper bound on the leverage to use. Otherwise, using the full Kelly value can lead to a complete wipeout of account equity given the non-Gaussian nature of returns for each strategy.
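A hedged sketch (not from the original post) of recalculating the Kelly leverage over a rolling lookback window; the returns series and the six-month window are assumptions for illustration:

# rolling Kelly leverage from a window of daily excess returns
import numpy as np
import pandas as pd

rng = np.random.default_rng(4)
# synthetic daily excess returns purely for illustration
rets = pd.Series(rng.normal(0.0004, 0.01, 1000))

window = 126                                 # roughly six months of trading days
mu_ann = rets.rolling(window).mean() * 252   # annualised mean excess return
var_ann = rets.rolling(window).var() * 252   # annualised variance of excess returns

kelly = mu_ann / var_ann                     # full Kelly leverage f = mu / sigma^2
half_kelly = kelly / 2.0                     # more conservative half Kelly
print("Latest half-Kelly leverage: %0.2f" % half_kelly.iloc[-1])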
Risk Management
Value at Risk
VaR is an estimate, at a certain confidence level, of the size of a loss from a portfolio over a certain timeframe. This time period is one that would lead to the least market impact if the portfolio were to be liquidated.
Where L is the loss, VaR is the value of that loss, and c is the confidence level:

P(L <= -VaR) = 1 - c
So, to express mathematically the fact that there is a 95% chance of losing no more than US$100 000 the next day:

P(L <= -1.0 x 10^5) = 0.05
Though straightforward to calculate, VaR does not tell us by how much the loss can exceed this value (that is the expected shortfall). It also assumes typical market conditions (not tail risk), and the volatility and correlation of assets need to be known, which can be difficult, especially if a change in market regime shifts these parameters significantly.
Three techniques to calculate VaR are:
1. variance-covariance method: assumes a normal distribution
2. Monte Carlo: can assume a non-normal distribution
3. historical bootstrapping: uses the historical returns of the asset/entire strategy
For the variance-covariance method for an asset/strategy, the daily VaR of a portfolio of P dollars, with confidence level c and alpha being the inverse of the cumulative distribution function* of a normal distribution, is:

VaR = P - P(alpha(1-c) + 1)
*Cumulative frequency analysis is the analysis of the frequency of occurrence of values of a phenomenon less than a reference value.
The ppf method (norm.ppf) under the SciPy library in Python enables us to generate values of the inverse cumulative distribution function of a normal distribution, with the mean and standard deviation values obtained from the historical daily returns of an asset (we would replace these with the returns of a strategy).
More info that describes the various ways in which R can calculate VaR and ES:
http://www.r-bloggers.com/the-estimation-of-value-at-risk-and-expected-shortfall/
#!/usr/bin/python
# -*- coding: utf-8 -*-
# var.py

# importing the required packages
from __future__ import print_function

import datetime
import numpy
import pandas.io.data
from scipy.stats import norm


# creating the function to calculate VaR with the following parameters
def var_cov_var(P, c, mu, sigma):
    """
    Variance-Covariance calculation of daily Value-at-Risk
    using confidence level c, with mean of returns mu
    and standard deviation of returns sigma, on a portfolio
    of value P.
    """
    alpha = norm.ppf(1-c, mu, sigma)
    return P - P*(alpha + 1)


# reading the table of historical prices from Yahoo Finance for JP Morgan
# over the period 2012 to 2016
if __name__ == "__main__":
    start = datetime.datetime(2012, 1, 1)
    end = datetime.datetime(2016, 1, 1)
    # note: pandas.io.data has since been split out into the
    # separate pandas_datareader package
    jp = pandas.io.data.DataReader("JPM", 'yahoo', start, end)

    # converting prices into percentage returns
    jp["rets"] = jp["Adj Close"].pct_change()

    P = 1e6    # 1,000,000 USD portfolio
    c = 0.99   # 99% confidence interval
    mu = numpy.mean(jp["rets"])
    sigma = numpy.std(jp["rets"])

    var = var_cov_var(P, c, mu, sigma)
    print("Value-at-Risk: $%0.2f" % var)
Value-at-Risk: $31676.22
There is a 99% chance of losing no more than $31 676.22 from a $1 000 000 portfolio the next day.
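For comparison, a minimal sketch (not from the original post) of the historical approach (technique 3 above): the loss quantile is read straight off the empirical distribution of daily returns, assuming jp["rets"], P and c are defined as in the script above.

# historical-simulation VaR: the (1-c) empirical quantile of daily returns,
# scaled by the portfolio value; assumes jp["rets"], P and c from the script above
hist_alpha = numpy.nanpercentile(jp["rets"], (1 - c) * 100)
hist_var = -P * hist_alpha
print("Historical Value-at-Risk: $%0.2f" % hist_var)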