Neural Networks Trained by Randomized Algorithms

Authors

  • Qin Qin
  • Qing-Guo Wang National University of Singapore
  • Shuzhi Sam Ge National University of Singapore
  • Chao Yu National University of Singapore

DOI:

https://doi.org/10.14738/tmlai.21.39

Abstract

In this paper, a new model framework is proposed in which a group of neural networks is trained with randomized algorithms. By incorporating the randomization of random forests into the training algorithm of a neural network, repeated runs of the revised algorithm yield multiple independent neural networks. Jointly, this group of models may outperform any individual model. Simulation studies are conducted on various examples, including practical ones such as stock markets, and show that the proposed model group overall performs better than both a single neural network and a random forest. When there is significant noise in the data set, the performance of the model group drops relatively less than that of the alternatives; in particular, it produces a much lower deviation and a higher mean of performance. The proposed method therefore has a strong ability to classify noisy data and performs robustly.
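The abstract describes the core idea: borrow the two randomization devices of random forests (bootstrap resampling of the training rows and random feature subsets) and apply them to repeated runs of a neural network training algorithm, then combine the resulting networks by vote. The paper's exact training algorithm is not reproduced here; the following is only a minimal sketch of that scheme, using a tiny hand-rolled one-hidden-layer sigmoid network and majority voting. All function names, network sizes, and hyperparameters are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def train_mlp(X, y, hidden=8, lr=0.5, epochs=300, rng=None):
    # Tiny one-hidden-layer sigmoid network trained by full-batch
    # gradient descent on the cross-entropy loss (illustrative only).
    if rng is None:
        rng = np.random.default_rng(0)
    n, d = X.shape
    W1 = rng.normal(0.0, 1.0, (d, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0.0, 1.0, hidden);      b2 = 0.0
    sig = lambda z: 1.0 / (1.0 + np.exp(-z))
    for _ in range(epochs):
        h = sig(X @ W1 + b1)          # hidden activations, shape (n, hidden)
        p = sig(h @ W2 + b2)          # output probabilities, shape (n,)
        g = (p - y) / n               # gradient of mean cross-entropy w.r.t. output logit
        W2 -= lr * (h.T @ g); b2 -= lr * g.sum()
        gh = np.outer(g, W2) * h * (1.0 - h)   # backprop through the hidden layer
        W1 -= lr * (X.T @ gh); b1 -= lr * gh.sum(axis=0)
    return W1, b1, W2, b2

def predict(params, X):
    W1, b1, W2, b2 = params
    sig = lambda z: 1.0 / (1.0 + np.exp(-z))
    return (sig(sig(X @ W1 + b1) @ W2 + b2) > 0.5).astype(int)

def train_random_nn_group(X, y, n_models=7, feat_frac=0.6, seed=0):
    # Random-forest-style randomization: each network is trained on a
    # bootstrap sample of the rows and a random subset of the features,
    # so repeated runs yield multiple independent networks.
    rng = np.random.default_rng(seed)
    n, d = X.shape
    k = max(1, int(feat_frac * d))
    models = []
    for _ in range(n_models):
        rows = rng.integers(0, n, n)                  # bootstrap sample
        feats = rng.choice(d, size=k, replace=False)  # random feature subset
        params = train_mlp(X[rows][:, feats], y[rows], rng=rng)
        models.append((feats, params))
    return models

def predict_group(models, X):
    # Majority vote over the group of networks.
    votes = np.array([predict(p, X[:, f]) for f, p in models])
    return (votes.mean(axis=0) > 0.5).astype(int)

# Toy demo on two well-separated Gaussian blobs.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-1, 1, (100, 5)), rng.normal(1, 1, (100, 5))])
y = np.array([0] * 100 + [1] * 100)
group = train_random_nn_group(X, y)
acc = (predict_group(group, X) == y).mean()
```

Because each network sees a different resampling of rows and features, the group's vote averages out individual training noise, which is the mechanism behind the robustness to noisy data reported in the abstract.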

References

Trevor Hastie, Robert Tibshirani and Jerome Friedman, The Elements of Statistical Learning, Springer New York, 2011.

Ben Kröse and Patrick van der Smagt, An Introduction to Neural Networks, 1996.

Hadzibeganovic, Tarik & Cannas, Sergio A., A Tsallis' statistics based neural network model for novel word learning, Physica A: Statistical Mechanics and its Applications 388 (5): 732–746. DOI:10.1016/j.physa.2008.10.042, 2009.

Van den Bergh, F. and Engelbrecht, A.P., Cooperative Learning in Neural Networks using Particle Swarm Optimizers, CIRG, 2000.

Zou Liping, Zou Tao, The application of neural network for Sneak Circuit Analysis on the aircraft electrical system, Prognostics and System Health Management Conference (PHM-Shenzhen), Page(s): 1-5, 2011.

Arsene, C.T.C. and Lisboa, P.J., Bayesian Neural Network with and without compensation for competing risks, Neural Networks (IJCNN), The International Joint Conference, pp. 1-8, 2012.

Lou Haichuan, Su Hongye, Xie Lei, Gu Yong, Rong Gang, Multiple-prior-knowledge neural network for industrial processes, Automation and Logistics (ICAL), IEEE International Conference, Page(s): 385-390, 2010.

Kondo, T. and Ueno, J., Nonlinear system identification by feedback GMDH-type neural network with architecture self-selecting function, Intelligent Control (ISIC), IEEE International Symposium, pp. 1521-1526, September 2010.

Shuang Han, Yongqian Liu, Jie Yan, Neural Network Ensemble Method Study for Wind Power Prediction, Power and Energy Engineering Conference (APPEEC), Asia-Pacific, Page(s): 1-4, March 2011.

McLeod, P. and Verma, B., Clustered ensemble neural network for breast mass classification in digital mammography, Neural Networks (IJCNN), The International Joint Conference, pp. 1-6, June 2012.

E. Gelenbe, Random neural networks with negative and positive signals and product form solution, Neural Computation, vol. 1, no. 4, pp. 502–511, 1989.

Safavian, S.R. and Landgrebe, D., A survey of decision tree classifier methodology, IEEE Transactions on Systems, Man and Cybernetics, vol. 21, pp. 660-674, 1991.

Myles, A.J., Feudale, R.N., Liu, Y., Woody, N.A. and Brown, S.D., An introduction to decision tree modeling, Journal of Chemometrics, 18: 275-285, 2004.

Pavlov, Yu. L., Random Forests, VSP, Utrecht, 2000.

Breiman, L., Random Forests, Machine Learning, 45(1), 5-32, 2001.

V. Svetnik, A. Liaw, C. Tong, J. C. Culberson, R. P. Sheridan and B. P. Feuston, Random forest: a classification and regression tool for compound classification and QSAR modeling, Journal of Chemical Information and Computer Sciences, vol. 43, pp. 1947-1958, 2003.

Maragoudakis, M. and Serpanos, D., Towards stock market data mining using enriched random forests from textual resources and technical indicators, AIAI, IFIP AICT 339, pp. 278-286, 2010.

Manish Kumar, Thenmozhi M., Forecasting Stock Index Movement: A Comparison of Support Vector Machines and Random Forest, Indian Institute of Capital Markets 9th Capital Markets Conference, January 2006.

Zhenping Yi, Jingchang Pan, Application of random forest to stellar spectral classification, Image and Signal Processing (CISP), 3rd International Congress, Volume: 7, Page(s): 3129-3232, 2010.

Dittman, D., Khoshgoftaar, T.M., Wald, R. and Napolitano, A., Random forest: A reliable tool for patient response prediction, Bioinformatics and Biomedicine Workshops (BIBMW), IEEE International Conference, pp. 289-296, 2011.

Moschidis, E. and Graham, J., Automatic differential segmentation of the prostate in 3-D MRI using Random Forest classification and graph-cuts optimization, Biomedical Imaging (ISBI), 9th IEEE International Symposium, pp. 1727-1730, 2012.

Published

2014-02-10

How to Cite

Qin, Q., Wang, Q.-G., Ge, S. S., & Yu, C. (2014). Neural Networks Trained by Randomized Algorithms. Transactions on Engineering and Computing Sciences, 2(1), 1-17. https://doi.org/10.14738/tmlai.21.39