Abstract and Applied Analysis

A Weighted Voting Classifier Based on Differential Evolution

Yong Zhang, Hongrui Zhang, Jing Cai, and Binbin Yang



Ensemble learning employs multiple individual classifiers and combines their predictions, which can achieve better performance than any single classifier. Since different base classifiers contribute differently to the final classification result, this paper assigns greater weights to the classifiers with better performance and proposes a weighted voting approach based on differential evolution. After optimizing the weights of the base classifiers by differential evolution, the proposed method combines the results of the classifiers according to the weighted voting combination rule. Experimental results show that the proposed method not only improves classification accuracy but also has strong generalization ability and universality.
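The combination scheme described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: base classifiers are represented only by their class predictions on a validation set, and SciPy's `differential_evolution` (a standard implementation of the Storn–Price algorithm) stands in for whatever DE variant the paper uses. The function names `weighted_vote` and `optimize_weights` are hypothetical.

```python
import numpy as np
from scipy.optimize import differential_evolution


def weighted_vote(weights, predictions, n_classes):
    """Weighted plurality vote: each classifier adds its weight to the
    score bin of its predicted class; the highest-scoring class wins.

    predictions: int array of shape (n_classifiers, n_samples).
    """
    n_samples = predictions.shape[1]
    scores = np.zeros((n_samples, n_classes))
    for w, preds in zip(weights, predictions):
        scores[np.arange(n_samples), preds] += w
    return scores.argmax(axis=1)


def optimize_weights(predictions, y_true, n_classes, seed=0):
    """Search weights in [0, 1] per classifier that minimize the
    weighted-vote error rate on a held-out validation set."""
    def error(w):
        return np.mean(weighted_vote(w, predictions, n_classes) != y_true)

    bounds = [(0.0, 1.0)] * predictions.shape[0]
    return differential_evolution(error, bounds, seed=seed).x
```

DE suits this objective because the error rate is a non-differentiable, piecewise-constant function of the weights, so gradient-based optimizers are inapplicable; a population-based global search over the weight box works directly on the 0/1 loss.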

Article information

Abstr. Appl. Anal., Volume 2014, Special Issue (2014), Article ID 376950, 6 pages.

First available in Project Euclid: 6 October 2014


Zhang, Yong; Zhang, Hongrui; Cai, Jing; Yang, Binbin. A Weighted Voting Classifier Based on Differential Evolution. Abstr. Appl. Anal. 2014, Special Issue (2014), Article ID 376950, 6 pages. doi:10.1155/2014/376950. https://projecteuclid.org/euclid.aaa/1412606373


