Abstract
Error-Correcting Output Coding (ECOC) is a well-known class of decomposition schemes for multi-class classification. It represents any multi-class classification problem as a set of binary classification problems. Due to code redundancy, ECOC schemes can significantly improve generalization performance on multi-class classification problems. However, they can face a computational-complexity problem when the number of classes is large.
In this paper we address the computational-complexity problem of the decomposition schemes. We study a particular class of minimally-sized ECOC decomposition schemes, namely the class of minimally-sized balanced decomposition schemes (MBDSs) [14]. We show that MBDSs do not face a computational-complexity problem for large numbers of classes. However, we also show that MBDSs cannot correct the classification errors of the binary classifiers in MBDS ensembles. Therefore we propose voting with MBDS ensembles (VMBDSs). We show that the generalization performance of VMBDSs ensembles improves with the number of MBDS classifiers. However, this number can become large, and thus VMBDSs ensembles can have a computational-complexity problem as well. Fortunately, our experiments show that VMBDSs are comparable with ECOC ensembles and can outperform one-against-all ensembles using only a small number of MBDS ensembles.
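The core ideas can be illustrated with a small sketch (not the authors' exact construction; all function names are hypothetical). For k classes, a minimally-sized decomposition uses only ceil(log2 k) binary classifiers; when k is a power of two, encoding each class by its binary representation also balances every column, giving one simple MBDS. Decoding picks the class whose code word is nearest in Hamming distance to the predicted bits, and VMBDS-style voting combines several such schemes obtained here by permuting the class-to-code-word assignment:

```python
import math
import random

def mbds_matrix(n_classes):
    """Code matrix of one minimally-sized decomposition: row i is the
    ceil(log2 k)-bit binary representation of class i.  When k is a
    power of two, every column splits the classes into two equal
    halves, so the scheme is also balanced."""
    n_bits = max(1, math.ceil(math.log2(n_classes)))
    return [[(i >> b) & 1 for b in range(n_bits)] for i in range(n_classes)]

def hamming_decode(predicted_bits, matrix):
    """Assign the class whose code word is nearest in Hamming distance
    to the bits predicted by the binary classifiers."""
    dists = [sum(p != c for p, c in zip(predicted_bits, row)) for row in matrix]
    return dists.index(min(dists))

def random_mbds(n_classes, rng):
    """Another MBDS over the same classes: permute which class is
    assigned which code word.  VMBDS combines several such schemes."""
    base = mbds_matrix(n_classes)
    perm = list(range(n_classes))
    rng.shuffle(perm)
    return [base[perm[c]] for c in range(n_classes)]

def vmbds_vote(bits_per_scheme, matrices):
    """VMBDS-style prediction: decode each MBDS ensemble separately,
    then take a majority vote over the per-scheme class decisions."""
    votes = [hamming_decode(b, m) for b, m in zip(bits_per_scheme, matrices)]
    return max(set(votes), key=votes.count)

M = mbds_matrix(8)
# 8 classes need only ceil(log2 8) = 3 binary classifiers ...
assert len(M[0]) == 3
# ... but the minimum Hamming distance between code words is 1, so a
# single wrong binary prediction already changes the decoded class --
# exactly why a single MBDS cannot correct binary-classifier errors:
assert hamming_decode([1, 0, 1], M) == 5   # correct bits -> class 5
assert hamming_decode([1, 0, 0], M) == 1   # one bit flipped -> class 1
```

The trade-off shown in the abstract follows directly: each MBDS is cheap (logarithmically many binary problems) but has no error-correcting margin, so accuracy is recovered by voting over several randomly chosen MBDSs instead of lengthening the code.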
References
Allwein, E., Schapire, R., Singer, Y.: Reducing multiclass to binary: A unifying approach for margin classifiers. J. Machine Learning Research 1, 113–141 (2002)
Asuncion, A., Newman, D.J.: UCI machine learning repository, http://www.ics.uci.edu/~mlearn/MLRepository.html
Cohen, W.: Fast effective rule induction. In: Prieditis, A., Russell, R. (eds.) Proc. the 12th Int. Conf. Machine Learning, Tahoe City, CA, pp. 115–123. Morgan Kaufmann, San Francisco (1995)
Dietterich, T.G., Bakiri, G.: Solving multiclass learning problems via error-correcting output codes. J. Artif. Intell. Research 2, 263–286 (1995)
Domingos, P.: A unified bias-variance decomposition for zero-one and squared loss. In: Kautz, H., Porter, B. (eds.) Proc. the 17th National Conf. Artif. Intell. and 12th Conf. Innovative Applications Artif. Intell., pp. 564–569. AAAI Press, Menlo Park (2000)
Freund, Y., Schapire, R.: A decision-theoretic generalization of on-line learning and an application to boosting. J. Comp. Syst. Sci. 55, 119–139 (1997)
Fürnkranz, J.: Round robin classification. J. Machine Learning Research 2, 721–747 (2002)
Keerthi, S.S., Shevade, S.K., Bhattacharyya, C., Murthy, K.R.K.: Improvements to Platt’s SMO algorithm for SVM classifier design. Neural Comp. 13, 637–649 (2001)
Kong, E.B., Dietterich, T.G.: Error-correcting output coding corrects bias and variance. In: Prieditis, A., Russell, S.J. (eds.) Proc. the 12th Int. Conf. Machine Learning, Tahoe City, CA, pp. 313–321. Morgan Kaufmann, San Francisco (1995)
le Cessie, S., van Houwelingen, J.C.: Ridge estimators in logistic regression. Applied Statistics 41, 191–201 (1992)
Libor, S.: Face recognition database (2011), http://cswww.essex.ac.uk/mv/allfaces/index.html
Lissoni, F., Llerena, P., Sanditov, B.: Inventors’ small worlds: academic and CNRS researchers in networks of inventors in France. In: Proc. the DIME Final Conf., Maastricht, The Netherlands (2011)
Lorena, A.C., De Carvalho, A.C.P.L.F., Gama, J.M.P.: A review on the combination of binary classifiers in multiclass problems. Artif. Intell. Rev. 30, 19–37 (2008)
Mayoraz, E., Moreira, M.: On the decomposition of polychotomies into dichotomies. In: Fisher, D.H. (ed.) Proc. the 14th Int. Conf. Machine Learning, Nashville, TN, pp. 219–226. Morgan Kaufmann, San Francisco (1997)
Nadeau, C., Bengio, Y.: Inference for the generalization error. Machine Learning 52, 239–281 (2001)
Nilsson, N.: Learning machines: foundations of trainable pattern-classifying systems. McGraw-Hill, New York (1965)
Peterson, W., Weldon, J.: Error-correcting codes. MIT Press, Cambridge (1972)
Rifkin, R., Klautau, A.: In defense of one-vs-all classification. J. Machine Learning Research 5, 101–141 (2004)
Weston, J., Watkins, C.: Support vector machines for multi-class pattern recognition. In: Verleysen, M. (ed.) Proc. the 7th European Symp. Artif. Neural Networks, Bruges, Belgium, pp. 219–224 (1999)
Witten, I., Frank, E., Hall, M.: Data mining: Practical machine learning tools and techniques. Morgan Kaufmann, San Francisco (2011)
© 2011 Springer-Verlag Berlin Heidelberg
Cite this chapter
Smirnov, E.N., Moed, M., Nalbantov, G., Sprinkhuizen-Kuyper, I. (2011). Minimally-Sized Balanced Decomposition Schemes for Multi-class Classification. In: Okun, O., Valentini, G., Re, M. (eds) Ensembles in Machine Learning Applications. Studies in Computational Intelligence, vol 373. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-22910-7_3
DOI: https://doi.org/10.1007/978-3-642-22910-7_3
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-22909-1
Online ISBN: 978-3-642-22910-7