Application of an integrated support vector regression method in prediction of financial returns

Full Text (PDF, 469 KB), pp. 37-43


Author(s)

Yuchen Fu 1,*, Yuanhu Cheng 1

1. School of Computer Science & Technology, Soochow University, Suzhou, China

* Corresponding author.

DOI: https://doi.org/10.5815/ijieeb.2011.03.06

Received: 3 Mar. 2011 / Revised: 10 Apr. 2011 / Accepted: 2 May 2011 / Published: 8 Jun. 2011

Index Terms

SVR, GA-CPSO, financial returns, forecasting

Abstract

Many novel forecasting approaches have been proposed to improve forecasting accuracy in financial markets. The Support Vector Machine (SVM), a modern statistical learning tool, has been successfully applied to nonlinear regression and time series problems. Unlike most conventional neural network models, which are based on the empirical risk minimization principle, SVM applies the structural risk minimization principle, minimizing an upper bound on the generalization error rather than the training error alone. To build an effective SVM model, however, its parameters must be set carefully. This study proposes a novel approach to predicting financial returns: Support Vector Regression (SVR) combined with a genetic algorithm (GA) for feature selection and chaotic particle swarm optimization (CPSO) for parameter optimization. The advantage of the resulting GA-CPSO-SVR model is that it handles feature selection and SVM parameter optimization simultaneously. A numerical example is employed to compare the performance of the proposed model with alternative approaches, and the experimental results show that the proposed model outperforms the others in forecasting financial returns.
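The abstract names the components of the hybrid model but not how they fit together. The following is a minimal, hypothetical sketch of the general GA-CPSO-SVR idea, not the authors' implementation: a genetic algorithm evolves binary feature masks, and for each candidate mask a chaotic PSO, which uses a logistic-map sequence in place of the usual uniform random factors, tunes the SVR hyper-parameters (C, gamma, epsilon). All function names, population sizes, parameter ranges, and the scikit-learn cross-validation fitness are illustrative assumptions.

```python
# Illustrative GA-CPSO-SVR sketch (assumed design, not the paper's code).
# Requires numpy and scikit-learn.
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def chaotic(n, x=0.7):
    """Logistic-map sequence in (0, 1), used instead of uniform noise in CPSO."""
    out = np.empty(n)
    for i in range(n):
        x = 4.0 * x * (1.0 - x)
        out[i] = x
    return out

def fitness(mask, params, X, y):
    """Negative CV MSE of an SVR on the selected features; higher is better.
    NOTE: for financial time series, TimeSeriesSplit would be a sounder CV choice."""
    if not mask.any():
        return -np.inf
    C, gamma, eps = params
    model = SVR(C=C, gamma=gamma, epsilon=eps)
    return cross_val_score(model, X[:, mask], y, cv=3,
                           scoring="neg_mean_squared_error").mean()

def cpso_tune(mask, X, y, n_particles=8, iters=10,
              lo=np.array([0.1, 1e-3, 1e-3]), hi=np.array([100.0, 1.0, 0.5])):
    """Chaotic PSO over (C, gamma, epsilon) for a fixed feature mask."""
    pos = lo + (hi - lo) * rng.random((n_particles, 3))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_f = np.array([fitness(mask, p, X, y) for p in pos])
    g = pbest[pbest_f.argmax()].copy()
    for _ in range(iters):
        r = chaotic(2 * n_particles).reshape(n_particles, 2)  # chaotic r1, r2
        vel = (0.7 * vel + 1.5 * r[:, :1] * (pbest - pos)
                         + 1.5 * r[:, 1:] * (g - pos))
        pos = np.clip(pos + vel, lo, hi)
        f = np.array([fitness(mask, p, X, y) for p in pos])
        better = f > pbest_f
        pbest[better], pbest_f[better] = pos[better], f[better]
        g = pbest[pbest_f.argmax()].copy()
    return g, pbest_f.max()

def ga_cpso_svr(X, y, pop=8, gens=5, pmut=0.1):
    """GA over binary feature masks; each mask's fitness uses a CPSO-tuned SVR."""
    n = X.shape[1]
    masks = rng.random((pop, n)) < 0.5
    best_mask, best_params, best_f = None, None, -np.inf
    for _ in range(gens):
        scored = []
        for m in masks:
            params, f = cpso_tune(m, X, y)
            scored.append(f)
            if f > best_f:
                best_mask, best_params, best_f = m.copy(), params, f
        scored = np.array(scored)
        # Tournament selection, one-point crossover, bit-flip mutation.
        children = []
        for _ in range(pop):
            i, j = rng.integers(pop, size=2)
            a = masks[i] if scored[i] >= scored[j] else masks[j]
            i, j = rng.integers(pop, size=2)
            b = masks[i] if scored[i] >= scored[j] else masks[j]
            cut = rng.integers(1, n)
            child = np.concatenate([a[:cut], b[cut:]])
            child ^= rng.random(n) < pmut
            children.append(child)
        masks = np.array(children)
    return best_mask, best_params
```

Under these assumptions, calling ga_cpso_svr(X, y) on a matrix of lagged returns would yield a selected feature subset together with tuned (C, gamma, epsilon); a faithful replication would need the paper's data, search ranges, and a time-series-aware validation split.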

Cite This Paper

Yuchen Fu, Yuanhu Cheng, "Application of an integrated support vector regression method in prediction of financial returns", International Journal of Information Engineering and Electronic Business (IJIEEB), vol. 3, no. 3, pp. 37-43, 2011. DOI: 10.5815/ijieeb.2011.03.06
