The Hybrid Dynamic Prototype Construction and Parameter Optimization with Genetic Algorithm for Support Vector Machine

Authors

  • Chun-Liang Lu
  • I-Fang Chung
  • Tsun-Chen Li

Abstract

The optimized hybrid artificial intelligence model is a potential tool for dealing with construction engineering and management problems. The support vector machine (SVM) has achieved excellent performance in a wide variety of applications; nevertheless, effectively reducing its training complexity remains a serious challenge. In this paper, a novel order-independent approach to instance selection, called the dynamic condensed nearest neighbor (DCNN) rule, is proposed to adaptively construct prototypes from the training dataset and to remove redundant or noisy instances from the classification process for the SVM. Furthermore, a hybrid model based on the genetic algorithm (GA) is proposed to simultaneously optimize the prototype construction and the SVM kernel parameter settings to enhance classification accuracy. Several UCI benchmark datasets are used to compare the proposed hybrid GA-DCNN-SVM approach with a previously published GA-based method. The experimental results show that the proposed hybrid model outperforms the existing method and effectively improves SVM classification performance.
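
To make the idea in the abstract concrete, the sketch below (not taken from the paper) illustrates the two ingredients of the hybrid model in Python with scikit-learn: a condensation pass that keeps only hard-to-classify training instances as prototypes, and a small genetic algorithm that searches the RBF-kernel parameters (C, gamma) by cross-validated accuracy. The condense, fitness, and ga_search functions, the parameter ranges, and the GA operators are illustrative assumptions; in particular, the sketch uses Hart's classic order-dependent condensation and a fixed prototype set, whereas the paper's DCNN rule is order-independent and its GA also adapts the prototype construction.

    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    def condense(X, y):
        # Hart-style condensed nearest neighbor pass: keep an instance only if
        # the 1-NN classifier built from the instances kept so far misclassifies it.
        keep = [0]
        changed = True
        while changed:
            changed = False
            for i in range(len(X)):
                if i in keep:
                    continue
                d = np.linalg.norm(X[keep] - X[i], axis=1)
                if y[keep][int(np.argmin(d))] != y[i]:
                    keep.append(i)          # misclassified -> becomes a prototype
                    changed = True
        return np.array(keep)

    def fitness(Xp, yp, log2_c, log2_g):
        # GA fitness: 5-fold cross-validated accuracy of an RBF SVM trained on
        # the condensed prototypes with the candidate (C, gamma).
        svm = SVC(C=2.0 ** log2_c, gamma=2.0 ** log2_g, kernel="rbf")
        return cross_val_score(svm, Xp, yp, cv=5).mean()

    def ga_search(X, y, pop_size=20, generations=30, seed=0):
        # Tiny real-coded GA over (log2 C, log2 gamma) in [-5, 15] x [-15, 3].
        rng = np.random.default_rng(seed)
        idx = condense(X, y)                # fixed prototype set in this sketch
        Xp, yp = X[idx], y[idx]
        lo, hi = np.array([-5.0, -15.0]), np.array([15.0, 3.0])
        pop = rng.uniform(lo, hi, size=(pop_size, 2))
        for _ in range(generations):
            scores = np.array([fitness(Xp, yp, c, g) for c, g in pop])
            parents = pop[np.argsort(scores)[-(pop_size // 2):]]     # truncation selection
            children = []
            while len(parents) + len(children) < pop_size:
                a, b = parents[rng.integers(len(parents), size=2)]
                child = (a + b) / 2.0 + rng.normal(0.0, 0.5, size=2)  # crossover + mutation
                children.append(np.clip(child, lo, hi))
            pop = np.vstack([parents] + children)
        scores = np.array([fitness(Xp, yp, c, g) for c, g in pop])
        best_log2_c, best_log2_g = pop[int(np.argmax(scores))]
        return 2.0 ** best_log2_c, 2.0 ** best_log2_g, idx

Encoding log2 C and log2 gamma rather than the raw values keeps the GA's mutation steps meaningful across the wide ranges typically searched for RBF-kernel SVMs.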

References

V. N. Vapnik, Statistical Learning Theory, 1st ed. New York: Wiley, 1998.

D. Anguita, A. Ghio, L. Oneto, and S. Ridella, "In-sample and out-of-sample model selection and error estimation for support vector machines," IEEE Transactions on Neural Networks and Learning Systems, vol. 23, pp. 1390-1406, 2012.

C. L. Huang and C. J. Wang, "A GA-based feature selection and parameters optimization for support vector machines," Expert Systems with Applications, vol. 31, pp. 231-240, 2006.

P. E. Hart, "The condensed nearest neighbor rule," IEEE Transactions on Information Theory, vol. 14, pp. 515-516, 1968.

F. Angiulli and A. Astorino, "Scaling up support vector machines using nearest neighbor condensation," IEEE Transactions on Neural Networks, vol. 21, pp. 351-357, 2010.

F. Angiulli, "Fast nearest neighbor condensation for large data sets classification," IEEE Transactions on Knowledge and Data Engineering, vol. 19, pp. 1450-1464, 2007.

A. Abroudi and F. Farokhi, "Prototype selection for training artificial neural networks based on fast condensed nearest neighbor rule," IEEE Conference on Open Systems (ICOS), pp. 1-4, 2012.

X. Zhao, D. Li, B. Yang, H. Chen, X. Yang, C. L. Yu, and S. Y. Liu, "A two-stage feature selection method with its application," Computers & Electrical Engineering, vol. 47, pp. 114-125, Oct. 2015.

W. Shu and H. Shen, "Incremental feature selection based on rough set in dynamic incomplete data," Pattern Recognition, vol. 47, pp. 3890-3906, Dec. 2014.

J. R. Rico-Juan and J. M. Iñesta, "Adaptive training set reduction for nearest neighbor classification," Neurocomputing, vol. 138, pp. 316-324, Aug. 2014.

J. L. Chen and C. S. Yang, "Optimizing the proportion of prototypes generation in nearest neighbor classification," International Conference on Machine Learning and Cybernetics (ICMLC), vol. 4, pp. 1695-1699, 2013.

Y. Miao, X. Tao, Y. Sun, Y. Li, and J. Lu, "Risk-based adaptive metric learning for nearest neighbor classification," Neurocomputing, vol. 156, pp. 33-41, May 2015.

J. H. Holland, Adaptation in Natural and Artificial Systems, Cambridge, MA: MIT Press, 1992.

X. Sun, J. Wang, F. H. Mary, and L. Kong, "Scale invariant texture classification via sparse representation," Neurocomputing, vol. 122, pp. 338-348, Dec. 2013.

S. Cateni, V. Colla, and M. Vannucci, "A hybrid feature selection method for classification purposes," European Modelling Symposium (EMS), pp. 39-44, 2014.

C. C. Chang and C. J. Lin, "Training nu-support vector regression: theory and algorithms," Neural Computation, vol. 14, pp. 1959-1977, 2002.

Published

2015-10-01

How to Cite

[1] C.-L. Lu, I.-F. Chung, and T.-C. Li, “The Hybrid Dynamic Prototype Construction and Parameter Optimization with Genetic Algorithm for Support Vector Machine”, Int. j. eng. technol. innov., vol. 5, no. 4, pp. 220–232, Oct. 2015.

Issue

Vol. 5, No. 4 (2015)

Section

Articles