
Evolutionary feature selection applied to artificial neural networks for wood-veneer classification. (English) Zbl 1141.68555

Summary: This paper presents the application of FeaSANNT, an evolutionary algorithm for the optimization of artificial neural networks, to the training of a multi-layer perceptron for the identification of defects in wood veneer. Given a fixed artificial neural network structure, FeaSANNT concurrently evolves the input feature vector and the network weights. The novelty of the method lies in its implementation of the embedded approach within an evolutionary feature-selection paradigm. Experimental tests show that the proposed algorithm produces high-performing solutions with robust learning results, and significantly reduces the set of veneer features. Experimental comparisons are made with a previous method based on statistical filtering of the input features and with a standard genetic wrapper algorithm. In the first case, FeaSANNT greatly reduces the feature set with no degradation of the neural network's accuracy; moreover, it entails lower design costs, since feature selection is fully automated. In the second case, the proposed algorithm achieves superior identification accuracy and a greater reduction of the feature set. FeaSANNT also incurs lower computational costs than the standard evolutionary wrapper approach and eases the algorithm design effort. Limited overlap is observed among the patterns of features selected by the three algorithms, suggesting that the full feature set contains mainly redundant attributes.
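
To make the embedded approach concrete, the sketch below co-evolves a binary feature mask and the weight vector of a fixed two-layer perceptron under a single fitness measure (classification accuracy), in the spirit of the method summarized above. It is an illustrative assumption, not the authors' implementation: the data are synthetic, and all parameter choices (population size 40, truncation selection, uniform mask crossover, arithmetic weight crossover, bit-flip and Gaussian mutation) are placeholders.

    # Minimal sketch of embedded evolutionary feature selection: each
    # individual encodes BOTH a binary input mask and the weights of a
    # fixed-topology MLP; one fitness value drives the evolution of both.
    # All names, parameters, and the data below are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic stand-in data: n samples, d candidate features. The labels
    # depend on only a few features, so most inputs are redundant.
    n, d, n_classes, n_hidden = 300, 17, 3, 8
    X = rng.normal(size=(n, d))
    y = (X[:, 0] + X[:, 3] - X[:, 7] > 0).astype(int) + (X[:, 1] > 1).astype(int)

    n_w = d * n_hidden + n_hidden * n_classes  # weight count of the fixed MLP

    def decode(w):
        """Split the flat weight vector into the two layer matrices."""
        W1 = w[:d * n_hidden].reshape(d, n_hidden)
        W2 = w[d * n_hidden:].reshape(n_hidden, n_classes)
        return W1, W2

    def fitness(mask, w):
        """Accuracy of the fixed-structure MLP with unselected inputs zeroed."""
        W1, W2 = decode(w)
        hidden = np.tanh((X * mask) @ W1)
        return np.mean((hidden @ W2).argmax(axis=1) == y)

    # Each individual is a (feature mask, weight vector) pair.
    pop = [(rng.integers(0, 2, size=d), rng.normal(scale=0.5, size=n_w))
           for _ in range(40)]

    for gen in range(100):
        pop.sort(key=lambda ind: fitness(*ind), reverse=True)
        parents = pop[:len(pop) // 2]                     # truncation selection
        children = []
        while len(parents) + len(children) < 40:
            i, j = rng.choice(len(parents), size=2, replace=False)
            (m1, w1), (m2, w2) = parents[i], parents[j]
            m = np.where(rng.random(d) < 0.5, m1, m2)     # uniform mask crossover
            w = 0.5 * (w1 + w2)                           # arithmetic weight crossover
            m = np.where(rng.random(d) < 0.05, 1 - m, m)  # bit-flip mutation
            w = w + rng.normal(scale=0.1, size=n_w) * (rng.random(n_w) < 0.1)
            children.append((m, w))
        pop = parents + children

    best_m, best_w = max(pop, key=lambda ind: fitness(*ind))
    print(f"accuracy={fitness(best_m, best_w):.3f}, "
          f"features kept={int(best_m.sum())}/{d}")

Because the mask and the weights share one genome and one fitness evaluation, feature selection is a by-product of training rather than a separate search loop; this is what distinguishes the embedded scheme from a wrapper, which would retrain a network from scratch for every candidate feature subset.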

MSC:

68T05 Learning and adaptive systems in artificial intelligence
68U10 Computing methodologies for image processing

Software:

DistAl
