Linear programming, recurrent associative memories, and feed-forward neural networks. (English) Zbl 0776.90050

Summary: Many optimization procedures presume the availability of an initial approximation in the neighborhood of a local or global optimum. Unfortunately, finding a set of good starting conditions is itself a nontrivial proposition. Our previous papers [R. Kalaba, M. Kim and J. E. Moore II, Appl. Math. Comput. 40, No. 3, 203-214 (1990; Zbl 0719.65051); Int. J. Gen. Syst. 20, No. 2, 177-194 (1992; Zbl 0745.90073)] describe procedures that use simple and recurrent associative memories to identify approximate solutions to closely related linear programs. In this paper, we compare the performance of a recurrent associative memory to that of a feedforward neural network trained with the same data. The neural network’s performance is much less promising than that of the associative memory. Modest infeasibilities exist in the estimated solutions provided by the associative memory, but the basic variables defining the optimal solutions to the linear programs are readily apparent.
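The associative-memory idea summarized above can be illustrated with a minimal sketch. A linear associative memory of the Kohonen type [4, 5] stores training pairs via the Moore-Penrose pseudoinverse and recalls an estimate with a single matrix-vector product; in the papers cited, the inputs would be descriptions of closely related linear programs and the outputs their known optimal solutions. The data below are synthetic stand-ins, not the authors' test problems:

```python
import numpy as np

# Training pairs: columns of X describe linear programs (e.g. stacked cost
# and right-hand-side vectors); columns of Y hold their known optimal
# solutions.  Here both are synthetic, generated from a hidden linear map.
rng = np.random.default_rng(0)
true_map = rng.normal(size=(3, 5))    # hidden linear relation (illustration only)
X = rng.normal(size=(5, 20))          # 20 training problems
Y = true_map @ X                      # their "optimal solutions"

# Store the pairs in a linear associative memory: M = Y X^+,
# where X^+ is the Moore-Penrose pseudoinverse of X.
M = Y @ np.linalg.pinv(X)

# Recall: estimate the solution of a new, closely related problem
# with one matrix-vector product.
x_new = rng.normal(size=(5, 1))
y_est = M @ x_new
```

When the problem-to-solution relation is only approximately linear, as for real families of linear programs, the recalled `y_est` is an approximate (possibly slightly infeasible) solution, which matches the paper's observation that the estimates carry modest infeasibilities while still revealing the optimal basis.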

MSC:

90C05 Linear programming
90-08 Computational methods for problems pertaining to operations research and mathematical programming
92B20 Neural networks for/in biological studies, artificial life and related topics
Full Text: DOI

References:

[1] Kalaba, R.; Kim, M.; Moore, J. E., Linear programming and simple associative memories, Applied Mathematics and Computation, 40, 3, 203-214 (1990) · Zbl 0719.65051
[2] Kalaba, R.; Kim, M.; Moore, J. E., Linear programming and recurrent associative memories, General Systems (1991), (to appear) · Zbl 0776.90050
[3] Kalaba, R.; Tesfatsion, L., Obtaining initial parameter estimates for nonlinear systems using multicriteria associative memories, Computer Science in Economics and Management (1991), (to appear) · Zbl 0768.93034
[4] Kohonen, T., Self-Organization and Associative Memory (1989), Springer-Verlag: Springer-Verlag New York · Zbl 0659.68100
[5] Murakami, K.; Aibara, T., An improvement on the Moore-Penrose generalized inverse associative memory, IEEE Transactions on Systems, Man, and Cybernetics, 17, 4, 699-706 (1987)
[6] Hopfield, J. J.; Tank, D. W., “Neural” computation of decisions in optimization problems, Biological Cybernetics, 52, 3, 141-152 (1985) · Zbl 0572.68041
[7] Rumelhart, D. E.; Hinton, G. E.; Williams, R. J., Learning internal representations by error propagation, (Rumelhart, D. E.; McClelland, J. L., Parallel Distributed Processing: Explorations in the Microstructure of Cognition, Vol. 1 (1986), MIT Press: MIT Press Cambridge), 318-362
[8] Rumelhart, D. E.; Hinton, G. E.; Williams, R. J., Learning representations by back-propagating errors, Nature, 323, 6088, 533-536 (1986) · Zbl 1369.68284
[9] Kalaba, R.; Lichtenstein, Z.; Simchnoy, T.; Tesfatsion, L., Linear and nonlinear associative memories for parameter estimation, Information Sciences (1991), (to appear)
[10] Poggio, T., On optimal nonlinear associative recall, Biological Cybernetics, 19, 4, 201-209 (1975) · Zbl 0321.94001
[11] Wasserman, P. D., Neural Computing: Theory and Practice (1989), Van Nostrand Reinhold: Van Nostrand Reinhold New York
[12] Kosko, B., Bidirectional associative memories, IEEE Transactions on Systems, Man, and Cybernetics, 18, 1, 49-60 (1988)
[13] Cohen, M. A.; Grossberg, S. G., Absolute stability of global pattern formation and parallel memory storage by competitive neural networks, IEEE Transactions on Systems, Man, and Cybernetics, 13, 5, 815-826 (1983) · Zbl 0553.92009
[14] Kolmogorov, A. N., On the representation of continuous functions of several variables by superpositions of continuous functions of a smaller number of variables, American Mathematical Society Translation Series 2, 17, 369-373 (1961) · Zbl 0128.27803
[15] Kolmogorov, A. N., On the representation of continuous functions of many variables by superposition of continuous functions of one variable and addition, American Mathematical Society Translation Series 2, 28, 55-59 (1963) · Zbl 0125.30803
[16] Minsky, M.; Papert, S., Perceptrons: An Introduction to Computational Geometry (1988), MIT Press: MIT Press Cambridge, Mass · Zbl 0197.43702
[17] Dantzig, G., Linear Programming and Extensions (1963), Princeton University Press: Princeton University Press Princeton · Zbl 0108.33103
[18] Kalaba, R.; Udwadia, F., An adaptive learning approach to the identification of structural and mechanical systems, Computers and Mathematics with Applications, 22, 1, 67-76 (1991) · Zbl 0743.93011
[19] Josin, G.; White, D.; Charney, D., \(GENESIS^{TM}\): Neural Network Application Implementation Environment (1990), Neural Systems Incorporated: Neural Systems Incorporated 2827 West 43rd Ave., Vancouver, BC V6H 3H9, Version 1.01
[20] Kalaba, R.; Kim, G-Y.; Kim, M.; Moore, J. E.; Park, H., Using associative memories to generate almost feasible/almost optimal solutions to linear programs, (School of Urban and Regional Planning Working Paper (1991), University of Southern California)
[21] Kohonen, T.; Barna, G.; Chrisley, R., Statistical pattern recognition with neural networks: Benchmarking studies, (Proceedings of the IEEE International Conference on Neural Networks I (1988)), I61-I68, San Diego
[22] Moore, J. E.; Kim, M.; Seo, J.-G.; Kalaba, R., Obtaining initial parameter estimates for nonlinear systems: Comparing associative memory and neural network approaches, Modelling and Scientific Computing, 1, 1 (1991)
[23] Moore, J. E., Time series and turning point forecasts: A comparison of associative memories and LeSage’s econometric techniques, Journal of Regional Science (1991), (submitted)