
From dynamic classifier selection to dynamic ensemble selection. (English) Zbl 1140.68466

Summary: In handwritten pattern recognition, multiple classifier systems have been shown to be useful for improving recognition rates. One of the most important tasks in optimizing a multiple classifier system is to select a group of adequate classifiers, known as an Ensemble of Classifiers (EoC), from a pool of classifiers. Static selection schemes select a single EoC for all test patterns, whereas dynamic selection schemes select different classifiers for different test patterns. Nevertheless, it has been shown that traditional dynamic selection performs no better than static selection. We propose four new dynamic selection schemes which explore the properties of the oracle concept. Our results suggest that the proposed schemes, using the majority voting rule for combining classifiers, perform better than the static selection method.
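The idea behind oracle-based dynamic selection can be sketched in a few lines: for each test pattern, keep only the classifiers from the pool that are correct on the pattern's nearest validation neighbours, then combine the survivors by majority voting. The sketch below is illustrative only, under assumed toy data and a "correct on all k neighbours" selection rule; it is not the authors' exact algorithm.

```python
# Illustrative sketch of oracle-based dynamic ensemble selection
# with majority voting. The pool, data, and the selection rule
# ("keep classifiers correct on all k nearest validation points")
# are assumptions for demonstration, not the paper's method.
from collections import Counter

def euclidean(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def majority_vote(labels):
    # Return the most frequent predicted label.
    return Counter(labels).most_common(1)[0][0]

def dynamic_ensemble_predict(pool, val_X, val_y, x, k=3):
    """Select the classifiers that are correct on the k validation
    points nearest to x, then combine them by majority voting."""
    idx = sorted(range(len(val_X)), key=lambda i: euclidean(val_X[i], x))[:k]
    selected = [c for c in pool if all(c(val_X[i]) == val_y[i] for i in idx)]
    if not selected:  # fall back to the whole pool (static behaviour)
        selected = pool
    return majority_vote([c(x) for c in selected])

# Toy pool: one classifier matching the true boundary, one always wrong,
# and one constant classifier (correct only for class-0 points).
pool = [
    lambda p: 0 if p[0] < 0.5 else 1,
    lambda p: 1 if p[0] < 0.5 else 0,
    lambda p: 0,
]
val_X = [(0.1, 0.1), (0.2, 0.3), (0.8, 0.9), (0.9, 0.7)]
val_y = [0, 0, 1, 1]
print(dynamic_ensemble_predict(pool, val_X, val_y, (0.85, 0.8), k=2))  # → 1
```

For the test point (0.85, 0.8), only the first classifier is correct on both nearest validation neighbours, so it alone determines the vote; a static EoC would instead use the same fixed subset for every test pattern.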

MSC:

68T10 Pattern recognition, speech recognition

Software:

PRTools
Full Text: DOI

References:

[1] Brown, G.; Wyatt, J.; Harris, R.; Yao, X., Diversity creation methods: a survey and categorisation, Int. J. Inf. Fusion, 6, 1, 5-20 (2005)
[2] Kittler, J.; Hatef, M.; Duin, R.; Matas, J., On combining classifiers, IEEE Trans. Pattern Anal. Mach. Intell., 20, 3, 226-239 (1998)
[3] Kuncheva, L. I.; Whitaker, C. J., Measures of diversity in classifier ensembles and their relationship with the ensemble accuracy, Mach. Learn., 51, 2, 181-207 (2003) · Zbl 1027.68113
[4] Ho, T. K., The random subspace method for constructing decision forests, IEEE Trans. Pattern Anal. Mach. Intell., 20, 8, 832-844 (1998)
[5] A. Grove, D. Schuurmans, Boosting in the limit: maximizing the margin of learned ensembles, in: Proceedings of the 15th National Conference on Artificial Intelligence, 1998, pp. 692-699.
[6] Kuncheva, L. I.; Skurichina, M.; Duin, R. P.W., An experimental study on diversity for Bagging and Boosting with linear classifiers, Int. J. Inf. Fusion, 3, 2, 245-258 (2002)
[7] Schapire, R. E.; Freund, Y.; Bartlett, P.; Lee, W. S., Boosting the margin: a new explanation for the effectiveness of voting methods, Ann. Stat., 26, 5, 1651-1686 (1998) · Zbl 0929.62069
[8] D. Ruta, B. Gabrys, Classifier selection for majority voting, Int. J. Inf. Fusion (2005) 63-81. · Zbl 0980.68914
[9] Cao, J.; Ahmadi, M.; Shridhar, M., Recognition of handwritten numerals with multiple feature and multistage classifier, Pattern Recognition, 28, 2, 153-160 (1995)
[10] Didaci, L.; Giacinto, G.; Roli, F.; Marcialis, G. L., A study on the performances of dynamic classifier selection based on local accuracy estimation, Pattern Recognition, 38, 11, 2188-2191 (2005) · Zbl 1077.68797
[11] L. Didaci, G. Giacinto, Dynamic classifier selection by adaptive \(K\)-nearest-neighbourhood rule, Pattern Recognition 37 (9) (2004).
[12] G. Giacinto, F. Roli, Methods for dynamic classifier selection, International Conference on Image Analysis and Processing (ICIAP 1999), 1999, pp. 659-664.
[13] Gunes, V.; Ménard, M.; Loonis, P.; Petit-Renaud, S., Combination, cooperation and selection of classifiers: a state of the art, Int. J. Pattern Recognition Artif. Intell., 17, 8, 1303-1324 (2003)
[14] Kuncheva, L. I., Switching between selection and fusion in combining classifiers: an experiment, IEEE Trans. Syst. Man Cybern. Part B, 32, 2, 146-156 (2002)
[15] Woods, K.; Kegelmeyer, W. P.; Bowyer, K., Combination of multiple classifiers using local accuracy estimates, IEEE Trans. Pattern Anal. Mach. Intell., 19, 4, 405-410 (1997)
[16] R.P.W. Duin, Pattern Recognition Toolbox for Matlab 5.0+, available free at: ⟨ftp://ftp.ph.tn.tudelft.nl/pub/bob/prtools⟩.
[17] D. Whitley, Functions as permutations: regarding no free lunch, Walsh analysis and summary statistics, Parallel Problem Solving from Nature (PPSN 2000), 2000, pp. 169-178.
[18] D.H. Wolpert, W.G. Macready, No free lunch theorems for search, IEEE Trans. Evol. Comput. 1 (1) (1997) 67-82.
[19] G. Tremblay, R. Sabourin, P. Maupin, Optimizing nearest neighbour in Random Subspaces using a multi-objective genetic algorithm, in: Proceedings of the 17th International Conference on Pattern Recognition (ICPR 2004), 2004, pp. 208-211.
[20] P. Radtke, T. Wong, R. Sabourin, An evaluation of over-fit control strategies for multi-objective evolutionary optimization, IEEE World Congress on Computational Intelligence (WCCI 2006)—International Joint Conference on Neural Networks (IJCNN 2006), 2006.
[21] Oliveira, L. S.; Sabourin, R.; Bortolozzi, F.; Suen, C. Y., Automatic recognition of handwritten numerical strings: a recognition and verification strategy, IEEE Trans. Pattern Anal. Mach. Intell., 24, 11, 1438-1454 (2002)
[22] J. Milgram, M. Cheriet, R. Sabourin, Estimating accurate multi-class probabilities with support vector machines, International Joint Conference on Neural Networks (IJCNN 05), 2005, pp. 1906-1911.