Predicting Stock Trend Using Fourier Transform And Support Vector Regression
Abstract—Predicting the stock price is an important as well as difficult problem. Stock price prediction, the task of estimating the future value of a company's stock, depends on many factors and their complex relationships. Successful prediction of a stock's future price can yield significant profit. This paper demonstrates the applicability of a framework that combines support vector regression and the Fourier transform for predicting the stock price by learning from historic data. The Fourier transform is used for noise filtering, and support vector regression is used for model training. Our results suggest that the proposed framework is a powerful predictive tool for stock prediction in the financial market.

We regard the price of a stock as some function over time, and we learn this function by the supervised learning method SVR. Unlike traditional regression models, SVR not only minimizes the empirical risk (training error) but also minimizes a term that makes the objective function as flat as possible. Additionally, before SVR is applied, a Fourier transform is used for noise filtering, which helps improve the subsequent learning. This two-step learning framework, shown in Figure 1, is a reasonable way to model the stock price trend.
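The noise-filtering step of the framework can be sketched as a low-pass Fourier filter: transform the series, zero out the high-frequency coefficients, and invert. The cutoff fraction and the synthetic price series below are illustrative assumptions, not values from the paper:

```python
import numpy as np

def fourier_denoise(prices, keep_fraction=0.1):
    """Low-pass filter a price series: keep only the lowest-frequency
    Fourier coefficients and zero out the rest (cutoff is assumed)."""
    coeffs = np.fft.rfft(prices)
    cutoff = max(1, int(len(coeffs) * keep_fraction))
    coeffs[cutoff:] = 0.0          # discard high-frequency (noisy) components
    return np.fft.irfft(coeffs, n=len(prices))

# Illustrative series: a slow cycle plus high-frequency noise
np.random.seed(0)
t = np.arange(256)
trend = 100 + 5 * np.sin(2 * np.pi * t / 256)
noisy = trend + np.random.normal(0, 2.0, size=t.size)
smooth = fourier_denoise(noisy, keep_fraction=0.05)
```

The filtered series `smooth` follows the underlying cycle far more closely than the raw series, which is the property that makes the subsequent regression easier to learn.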
III. SUPPORT VECTOR REGRESSION

A. Support Vector Regression
Support Vector Machines (SVMs) are grounded in statistical learning theory, or VC theory, which was first developed by Vapnik [5]. The advantages of SVMs, such as a solid theoretical background, geometric interpretation, unique solution, and mathematical tractability, have attracted researchers' interest and led to applications in many fields with impressive performance, such as computer vision, speech recognition, and text classification. The idea of an SVM is to construct a hyperplane, or a set of hyperplanes, in a high- or infinite-dimensional space, which can be used for classification. Intuitively, a good separation is achieved by the hyperplane that has the largest distance to the nearest training data point of any class, because in general the larger the margin, the lower the generalization error of the classifier [6, 7].

The margin concept used in SVMs is also used in the regression problem. In this case, the goal is to construct a hyperplane that lies "close" to as many of the data points as possible. Therefore, the objective is to choose a hyperplane with small norm while simultaneously minimizing the sum of the distances from the data points to the hyperplane [8]. When SVMs are used to solve the regression problem, they are usually called Support Vector Regression (SVR). The aim of SVR is to find a function f with parameters w and b by minimizing the following regression risk:

    R(f) = (1/2)⟨w, w⟩ + C Σᵢ Γ(f(xᵢ), yᵢ),    (1)

where C is a tradeoff term, the first term is the margin in SVMs, used to measure the VC-dimension [9], and Γ is a loss function which can be defined as one needs. The function f is defined by

    f(x, w, b) = ⟨w, φ(x)⟩ + b,

where φ: Rⁿ → F is the feature map induced by the kernel function, mapping x into a high-dimensional space.

B. Loss function
We use the ε-insensitive loss function proposed in [9] as the loss in SVR:

    Γ(f(x), y) = 0,                if |y − f(x)| ≤ ε,
                 |y − f(x)| − ε,   otherwise.

The slack variables ξ and ξ* measure the up error and down error for sample (xᵢ, yᵢ), respectively. A standard method to solve the above minimization problem is to construct the dual of this optimization problem (the primal problem) by the Lagrange method and to maximize its dual function.

C. Kernel function
A kernel function should satisfy Mercer's Theorem. Four common kernel functions are:
(1) Linear: K(xᵢ, xⱼ) = ⟨xᵢ, xⱼ⟩;
(2) Polynomial with degree d: K(xᵢ, xⱼ) = (⟨xᵢ, xⱼ⟩ + 1)^d;
(3) Radial Basis Function (RBF) with parameter γ: K(xᵢ, xⱼ) = exp(−γ‖xᵢ − xⱼ‖²);
(4) Hyperbolic tangent: K(xᵢ, xⱼ) = tanh(2⟨xᵢ, xⱼ⟩ + 1).
In this paper, we use the Radial Basis Function (RBF) as the kernel function.

D. Implemented Algorithm
Several SVR packages are available on the internet: for example, SVMlight by Joachims [10], libSVM by Chih-Jen Lin [11], and the Matlab SVM Toolbox by Steve Gunn [12]. We use libSVM in our project.

IV. PREDICTION OF STOCK TREND

Our data set has six attributes: Open price, High price, Low price, Closing price, Volume, and Adjusted closing price. The regression analysis focuses on how the Open price on the (t+1)-th day changes as the Open price, High price, Low price, Volume, and Adjusted close price on the t-th day vary. Our goal is to fit the following relationship by regression analysis:

    Open_{t+1} = f(Open_t, High_t, Low_t, Vol_t, Adj_close_t)
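The regression setup above can be sketched with scikit-learn's `SVR`, which is built on the same libSVM library the paper uses. The synthetic feature matrix stands in for the five daily attributes (Open, High, Low, Volume, Adjusted close on day t), the target for the next day's Open price, and the values of C, gamma, and epsilon are illustrative assumptions, not the paper's tuned parameters:

```python
import numpy as np
from sklearn.svm import SVR

# Synthetic stand-in for the paper's five features on day t; the target
# is the next day's Open price. C, gamma, and epsilon are assumed values.
rng = np.random.default_rng(0)
n_days = 300
X = rng.normal(size=(n_days, 5))
y = 100 + X @ np.array([2.0, 1.0, -1.0, 0.3, 0.5]) + rng.normal(0, 0.1, n_days)

# RBF-kernel SVR, matching the kernel choice in Section III.C
model = SVR(kernel="rbf", C=100.0, gamma=0.1, epsilon=0.1)
model.fit(X[:250], y[:250])          # train on the first 250 days
pred = model.predict(X[250:])        # predict the held-out 50 days
```

In practice the C and gamma values would be chosen by a grid search such as the rough and refined contour-map selections shown in Figures 5 and 6.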
Figure 5. SVR parameters rough selection results (contour map)

Figure 6. SVR parameters refined selection results (contour map)

REFERENCES
[1] C. Chatfield, The Analysis of Time Series: An Introduction, 5th ed. Chapman and Hall, 1996.
[2] K. G. Srinivasa, K. R. Venugopal, and L. M. Patnaik, "An efficient fuzzy based neuro-genetic algorithm for stock market prediction," International Journal of Hybrid Intelligent Systems, vol. 3, no. 2, pp. 63-81, 2006.
[3] M. A. Kaboudan, "Genetic programming prediction of stock prices," Computational Economics, vol. 16, no. 3, pp. 207-236, 2000.
[4] M. T. Farrell and A. Correa, "Gaussian process regression models for predicting stock trends," Relation, vol. 10, p. 3414, 2007.
[5] C. Cortes and V. Vapnik, "Support vector networks," Machine Learning, vol. 20, pp. 273-297, 1995.
[6] M. Pontil and A. Verri, "Properties of support vector machines," Technical Report, Massachusetts Institute of Technology, 1997.
[7] E. E. Osuna, R. Freund, and F. Girosi, "Support vector machines: training and applications," Technical Report, AI Memo No. 1602, Massachusetts Institute of Technology, Artificial Intelligence Laboratory, 1997.
[8] Y. Xia, Y. Liu, and Z. Chen, "Support vector regression for prediction of stock trend," in Proc. 6th International Conference on Information Management, Innovation Management and Industrial Engineering (ICIII), vol. 2, IEEE, 2013.
[9] V. N. Vapnik, The Nature of Statistical Learning Theory. Springer, New York, 1995.
[10] T. Joachims, "Making large-scale SVM learning practical," 1999.
[11] C.-C. Chang and C.-J. Lin, "LIBSVM: a library for support vector machines," ACM Transactions on Intelligent Systems and Technology (TIST), vol. 2, no. 3, p. 27, 2011.
[12] S. R. Gunn, "Support vector machines for classification and regression," ISIS Technical Report 14, 1998.