ψ(t) = π^(−1/4) exp(i 2π f0 t) exp(−t^2 / 2)    (2)
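Equation 2 is the complex Morlet mother wavelet: a complex exponential at centre frequency f0 under a unit-variance Gaussian envelope. As a quick check of its shape, the sketch below samples it on a grid; the value f0 = 0.8 is purely illustrative and not a parameter taken from this paper.

```python
import numpy as np

def morlet(t, f0=0.8):
    """Complex Morlet wavelet of equation 2: pi^(-1/4) exp(i 2 pi f0 t) exp(-t^2/2)."""
    return np.pi ** (-0.25) * np.exp(1j * 2 * np.pi * f0 * t) * np.exp(-t ** 2 / 2)

t = np.linspace(-4, 4, 801)   # step 0.01, so index 400 is t = 0
psi = morlet(t)

# The Gaussian envelope peaks at t = 0 with height pi^(-1/4) ~ 0.7511
print(abs(psi[400]))
```

Scaling and translating this mother wavelet gives the analysing functions used by the CWT of equation 1.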
The CWT of a signal is given by equation 1. Equation 5 presents the MI equation. These two equations can be combined as shown in equation 7. The Wavelet-MI algorithm computes the mutual information.

which can be simplified to:

p(xn | x1, ..., xn−1) = p(xn | xn−1)    (10)

which is a simplified form of a first-order Markov model.
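Equation 10 is the first-order Markov assumption: the next value depends only on the current one. Two practical consequences are that the joint probability of a path factorises into one-step terms, and that k-step-ahead state distributions are obtained from powers of the transition matrix. The sketch below illustrates both with a 2-state transition matrix whose values are illustrative, not those estimated in this paper.

```python
import numpy as np

# Illustrative 2-state transition matrix: P[i, j] = p(next = j | current = i)
P = np.array([[0.99, 0.01],
              [0.03, 0.97]])

def chain_prob(states, p0, P):
    """Joint probability under equation 10: p(x1)*prod_n p(xn | xn-1)."""
    prob = p0[states[0]]
    for prev, cur in zip(states[:-1], states[1:]):
        prob *= P[prev, cur]
    return prob

p0 = np.array([1.0, 0.0])                 # start in state 0 with certainty
print(chain_prob([0, 0, 1, 1], p0, P))    # 0.99 * 0.01 * 0.97

# k-step-ahead state distribution: p_k = p0 @ P^k
p5 = p0 @ np.linalg.matrix_power(P, 5)
print(p5)
```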
5.2 Hidden Markov Model

A Hidden Markov Model (HMM) is a statistical model consisting of a set of observations produced by an unobservable set of latent Markov model states. It is widely used within the speech recognition sector and, due to its numerous advantages in inferring the hidden states of a dynamic system, it is increasingly being used in the financial sector as well. The aim of using a HMM is to infer the hidden states from a set of observations. Mathematically, the model can be represented by [2]:

p(X, Z | θ) = p(z1 | π) [∏_{n=2}^{N} p(zn | zn−1, A)] ∏_{m=1}^{N} p(xm | zm, B)    (11)

where X = x1, ..., xN is the observation set, Z = z1, ..., zN is the set of latent variables, and θ = {π, A, B} represents the set of parameters governing the model. The HMM is represented using Markov chains in Figure 2, showing the hidden layer with states zt and the observed layer with observation set xt. Also shown in the figure are the state transition probability P(z(t+1) | z(t)) and the emission model probability P(x(t) | z(t)).

Let γk[t] denote the probability of being in state k at time t. The log likelihood of the ICA observation model, with unmixing matrix W and M sources, can be written as [6]:

log p(xt) = log |det(W)| + Σ_{i=1}^{M} log p(ai[t])    (13)

Substituting the ICA log likelihood, equation 13, into the HMM auxiliary function, equation 12, gives:

Qk = log |det(Wk)| + (1/γk) Σ_t Σ_i γk[t] log p(ai[t])    (14)

The auxiliary function, summed over all states k, becomes:

Q = Σ_k Qk    (15)

The HMICA model finds the unmixing matrix Wk for state k by minimizing the cost function given by equation 15 over all underlying parameters.
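As a rough illustration of equations 13–15, the sketch below evaluates the ICA log likelihood and the state-weighted auxiliary functions on toy data. The Laplacian source density, the unmixing matrices, and the state responsibilities γk[t] are all assumptions made for illustration; the model in [6] uses a more general exponential source density and estimates these quantities from data.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_p_laplace(a):
    # Illustrative source density p(a) = 0.5 * exp(-|a|); a stand-in for
    # the generalised exponential density used in [6].
    return -np.abs(a) - np.log(2.0)

def ica_loglik(x, W):
    """Equation 13: log p(x) = log|det W| + sum_i log p(a_i), with a = W x."""
    a = W @ x
    return np.log(np.abs(np.linalg.det(W))) + log_p_laplace(a).sum()

def auxiliary_Qk(X, gamma_k, W_k):
    """Equation 14: state-k auxiliary, weighting each sample by gamma_k[t]."""
    g = gamma_k.sum()
    per_t = np.array([log_p_laplace(W_k @ x).sum() for x in X])
    return np.log(np.abs(np.linalg.det(W_k))) + (gamma_k * per_t).sum() / g

X = rng.standard_normal((100, 2))            # 100 observations, 2 channels
gammas = [rng.random(100), rng.random(100)]  # state responsibilities (illustrative)
Ws = [np.eye(2), np.array([[1.0, 0.4], [0.0, 1.0]])]

# Equation 15: total auxiliary summed over states
Q = sum(auxiliary_Qk(X, g, W) for g, W in zip(gammas, Ws))
print(Q)
```

With uniform responsibilities (γk[t] = 1 for all t), each Qk reduces to the average per-sample log likelihood of equation 13, which is a useful sanity check on the weighting.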
Figure 2: Hidden Markov Model graphical representation, showing the hidden layer with states z(t), z(t+1), the observed layer with observations x(t), x(t+1), the state transition probability P(z(t+1)|z(t)), and the emission model probability P(x(t)|z(t)).

6 RESULTS

This section presents the results obtained when the Wavelet-MI model presented in section 4 and the HMICA model presented in section 5 are simulated in Matlab. Figure 3 presents the Viterbi diagrams and the mutual information plots obtained using FX data at various time scales.

From the plots it is evident that there are significantly long periods of state stability. There is also some evidence of recurring patterns, which can prove extremely useful in building a trading strategy. The HMICA code also gives the state transition matrix as an output. The state transition matrix gives the probability of a change of state from state i to state j, i.e.:
Figure 3: Viterbi diagrams showing state transitions in the hidden layer for USDJPY-EURJPY at different time scales. Also shown are the Mutual Information (MI) plots of the currency pairs.
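The Viterbi diagrams in Figure 3 trace the most likely hidden-state path given the observations. A minimal log-domain Viterbi decoder for a two-state HMM with Gaussian emissions is sketched below; all parameters are illustrative, not those fitted to the FX data.

```python
import numpy as np

def viterbi(obs, logA, log_emis, logpi):
    """Most likely state path via dynamic programming in the log domain."""
    T, K = len(obs), len(logpi)
    delta = np.zeros((T, K))            # best log score ending in each state
    back = np.zeros((T, K), dtype=int)  # backpointers
    delta[0] = logpi + log_emis(obs[0])
    for t in range(1, T):
        scores = delta[t - 1][:, None] + logA   # scores[i, j]: from state i to j
        back[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + log_emis(obs[t])
    path = np.zeros(T, dtype=int)
    path[-1] = delta[-1].argmax()
    for t in range(T - 2, -1, -1):
        path[t] = back[t + 1, path[t + 1]]
    return path

# Illustrative 2-state model: unit-variance Gaussian emissions, means 0 and 3
means = np.array([0.0, 3.0])
log_emis = lambda x: -0.5 * (x - means) ** 2    # additive constants dropped
logA = np.log(np.array([[0.95, 0.05], [0.05, 0.95]]))
logpi = np.log(np.array([0.5, 0.5]))

obs = np.array([0.1, -0.2, 0.3, 2.9, 3.1, 2.8])
print(viterbi(obs, logA, log_emis, logpi))      # [0 0 0 1 1 1]
```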
P12.5 = [0.9915  0.0085]
        [0.0273  0.9727]    (21)

It is interesting to note that, for significant portions of time, the state stays constant for over 100 samples (50 seconds). These periods of state stability are hence well-suited for placing a trade order. The state transition probability matrix, Pij, can be used to make predictions about future states. Simulations conducted with equities data using the models presented in this paper also give encouraging results.

7 CONCLUSIONS

This paper presents a statistical model for analysing the dynamics of multivariate financial time series. The CWT is presented as a useful tool for the analysis of financial data sets at various frequencies. HMICA is used to extract the hidden states from multivariate financial time series. The hidden states stay constant for significant periods of time, which is potentially useful for building efficient trading models. It is also shown that the hidden states are indicative of changes in mutual information between two FX returns time series.

ACKNOWLEDGEMENTS

The authors are grateful to the Oxford-Man Institute of Quantitative Finance for their support. The first author would also like to thank Exeter College (Oxford) for funding this research.

References

[1] International Banking Systems Journal/Supplements/Trading Platforms Supplement. International Banking Systems Journal, June 2007.
[2] C.M. Bishop. Pattern Recognition and Machine Learning. Springer, 2006.
[3] A.P. Dempster, N.M. Laird, and D.B. Rubin. Maximum Likelihood from Incomplete Data via the EM Algorithm. Journal of the Royal Statistical Society, Series B (Methodological), 39(1):1–38, 1977.
[4] F. Long and C. Ding. Feature Selection Based on Mutual Information: Criteria of Max-Dependency, Max-Relevance, and Min-Redundancy. IEEE Transactions on Pattern Analysis and Machine Intelligence, 27(8):1226–1238, 2005.
[5] P. Oswiecimka, J. Kwapien, S. Drozdz, and R. Rak. Investigating Multifractality of Stock Market Fluctuations Using Wavelet and Detrending Fluctuation Methods. Acta Physica Polonica B, 36(8):2447, 2005.
[6] W. Penny, R. Everson, and S.J. Roberts. Hidden Markov Independent Components Analysis. In Advances in Independent Component Analysis, pages 3–22. Springer, 2000.