Novel weighting in single hidden layer feedforward neural networks for data classification. (English) Zbl 1252.62064

Summary: We propose a binary classifier based on the single hidden layer feedforward neural network (SLFN) using radial basis functions (RBFs) and sigmoid functions in the hidden layer. We use a modified attribute-class correlation measure to determine the weights of attributes in the networks. Moreover, we propose new weights, called influence weights, to use as the weights connecting the input layer and the hidden layer nodes (hidden weights) of the network with sigmoid hidden nodes. These weights are calculated as the sum of conditional probabilities of attribute values given class labels. The networks are trained with the extreme learning machine (ELM) approach, in which the parameters of the hidden nodes are first calculated and then the weights connecting the hidden nodes and output nodes (output weights) are found. The results of the networks with the proposed weights on some benchmark data sets show improvements over those of the conventional networks.
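The ELM-style training described above can be sketched as follows. This is a minimal illustration under assumptions, not the paper's exact procedure: the hidden-layer weights are fixed in advance (here drawn at random, whereas the paper derives them from attribute-class statistics such as the influence weights), and the output weights are then obtained in closed form via the Moore-Penrose pseudoinverse. The function names and the choice of sigmoid activation ranges are hypothetical.

```python
import numpy as np

def elm_train(X, T, n_hidden=20, rng=None):
    """Sketch of ELM training: fixed hidden weights, closed-form output weights.

    X : (n_samples, n_features) inputs; T : (n_samples,) targets.
    The random hidden weights stand in for the paper's data-derived weights.
    """
    rng = np.random.default_rng(rng)
    W = rng.uniform(-1, 1, size=(X.shape[1], n_hidden))  # hidden weights (assumed range)
    b = rng.uniform(-1, 1, size=n_hidden)                # hidden biases
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))               # sigmoid hidden-layer outputs
    beta = np.linalg.pinv(H) @ T                         # output weights via pseudoinverse
    return W, b, beta

def elm_predict(X, W, b, beta):
    """Forward pass through the trained SLFN."""
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta
```

For binary classification, the sign of the network output can serve as the class label; the closed-form least-squares solve is what makes ELM training fast compared with iterative backpropagation.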

MSC:

62H30 Classification and discrimination; cluster analysis (statistical aspects)
68T05 Learning and adaptive systems in artificial intelligence

Software:

UCI-ml
Full Text: DOI

References:

[1] Huang, G.-B.; Chen, L.; Siew, C.-K., Universal approximation using incremental constructive feedforward networks with random hidden nodes, IEEE Transactions on Neural Networks, 17, 4 (2006)
[2] Broomhead, D. S.; Lowe, D., Multivariable functional interpolation and adaptive networks, Complex Systems, 2, 321-355 (1988) · Zbl 0657.68085
[3] Park, J.; Sandberg, I., Universal approximation using radial-basis function networks, Neural Computation, 3, 246-257 (1991)
[4] Park, J.; Sandberg, I., Universal approximation and radial basis function network, Neural Computation, 5, 305-316 (1993)
[5] Billings, S. A.; Wei, H.-L.; Balikhin, M. A., Generalized multiscale radial basis function networks, Neural Networks, 20, 1081-1094 (2007) · Zbl 1254.68197
[6] Bishop, C. M., Neural Networks for Pattern Recognition (1995), Clarendon Press: Clarendon Press Oxford
[7] Looney, C. G., Pattern Recognition Using Neural Networks: Theory and Algorithms for Engineers and Scientists (1997), Oxford University Press: Oxford University Press New York
[8] Pérez-Godoy, M.; Fernández, A.; Rivera, A.; del Jesus, M., Analysis of an evolutionary RBFN design algorithm, \(C O^2\) RBFN, for imbalanced data sets, Pattern Recognition Letters, 31, 2375-2388 (2010)
[9] Schwenker, F.; Kesler, H. A.; Palm, G., Three learning phases for radial-basis-function networks, Neural Networks, 14, 4-5, 439-458 (2001)
[10] Golbabai, A.; Mammadov, M.; Seifollahi, S., Solving a system of nonlinear integral equations by an RBF network, Computers & Mathematics with Applications, 57, 1651-1658 (2009) · Zbl 1186.45009
[11] Haykin, S., Neural Networks: A Comprehensive Foundation (1999), Prentice-Hall: Prentice-Hall Englewood Cliffs, NJ · Zbl 0934.68076
[12] Huan, H. X.; Hien, D. T.T.; Huynh, H. T., A novel efficient two-phase algorithm for training interpolation radial basis function networks, Signal Processing, 87, 2708-2717 (2007) · Zbl 1186.94150
[13] Masters, T., Advanced Algorithms for Neural Networks: A C++ Sourcebook (1995), Wiley: Wiley New York
[14] Huang, G.-B.; Zhu, Q.-Y.; Siew, C.-K., Extreme learning machine: theory and applications, Neurocomputing, 70, 489-501 (2006)
[15] Rao, C. R.; Mitra, S. K., Generalized Inverse of Matrices and its Applications (1971), Wiley: Wiley New York · Zbl 0236.15004
[16] Widrow, B.; Lehr, M., 30 years of adaptive neural networks: perceptron, madaline and backpropagation, Proceedings of the IEEE, 78, 9, 1415-1442 (1990)
[17] Fu, X.; Wang, L., Data dimensionality reduction with application to simplifying RBF network structure and improving classification performance, IEEE Transactions on Systems, Man, and Cybernetics, Part B, 33, 3 (2003)
[18] Quinn, A.; Stranieri, A.; Yearwood, J.; Hafen, G., A classification algorithm that derives weighted sum scores for insight into disease, (Proc. of the 3rd Australasian Workshop on Health Informatics and Knowledge Management, HIKM 2009, Wellington, New Zealand (2009))
[19] Rojas, I.; Valenzuela, O.; Prieto, A., Statistical analysis of the main parameters in the definition of radial basis function networks, Lecture Notes in Computer Science, 1240, 882-891 (1997)
[20] Golub, G.; Van Loan, C., Matrix Computations (1996), Johns Hopkins University Press · Zbl 0865.65009
[21] Moody, J.; Darken, C. J., Fast learning in networks of locally-tuned processing units, Neural Computation, 1, 281-294 (1989)
[22] Kohonen, T., The self-organizing map, Proceedings of the IEEE, 78, 1464-1480 (1990)
[23] Abdel Hady, M. F.; Schwenker, F.; Palm, G., Semi-supervised learning of tree-structured RBF networks using co-training, (ICANN 2008, Part I. ICANN 2008, Part I, LNCS, vol. 5163 (2008)), 79-88
[24] Liang, N. Y.; Huang, G. B.; Saratchandran, P.; Sundararajan, N., A fast and accurate online sequential learning algorithm for feedforward networks, IEEE Transactions on Neural Networks, 17, 1411-1423 (2006)
[25] Asuncion, A.; Newman, D., UCI Machine Learning Repository (2007), School of Information and Computer Science, University of California: Irvine. http://www.ics.uci.edu/mlearn/MLRepository.html
This reference list is based on information provided by the publisher or from digital mathematics libraries. Its items are heuristically matched to zbMATH identifiers and may contain data conversion errors. In some cases that data have been complemented/enhanced by data from zbMATH Open. This attempts to reflect the references listed in the original paper as accurately as possible without claiming completeness or a perfect matching.