
Improving particle swarm optimization performance with local search for high-dimensional function optimization. (English) Zbl 1205.90273

Summary: Particle swarm optimization (PSO) is a recently proposed population-based stochastic optimization technique, while gradient-based descent methods are efficient local optimization techniques often used as an ingredient of hybrid algorithms for global optimization problems (GOPs). By examining the properties of the two methods, a two-stage hybrid algorithm for global optimization is proposed. In this algorithm, the gradient descent technique efficiently finds a local minimum of the objective function, and a PSO method with latent parallel search capability helps the algorithm escape from the previously found local minimum to a better point, which then serves as the starting point for a new gradient-based local search. This procedure is repeated until a global minimum is found (when the global minimum is known in advance) or the maximum number of function evaluations is reached. In addition, a repulsion technique and a partial population reinitialization method are incorporated to increase the algorithm's ability to jump out of local minima. Simulation results on 15 test problems, including five large-scale ones with dimensions up to 1000, demonstrate that the proposed method is more stable and efficient than several existing methods.
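The two-stage procedure described above can be sketched as follows. This is a minimal illustration, not the paper's actual implementation: the Rastrigin test function, learning rate, swarm size, and inertia/acceleration coefficients are illustrative assumptions, a numerical gradient stands in for an analytic one, and the repulsion and partial-reinitialization components are omitted for brevity.

```python
import math
import random

def rastrigin(x):
    # Standard multimodal test function; global minimum 0 at the origin.
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi) for xi in x)

def num_grad(f, x, h=1e-6):
    # Central-difference numerical gradient (stand-in for an analytic gradient).
    g = []
    for i in range(len(x)):
        xp, xm = x[:], x[:]
        xp[i] += h
        xm[i] -= h
        g.append((f(xp) - f(xm)) / (2 * h))
    return g

def gradient_descent(f, x, lr=0.001, steps=300):
    # Stage 1: local search, converging to a nearby local minimum.
    for _ in range(steps):
        g = num_grad(f, x)
        x = [xi - lr * gi for xi, gi in zip(x, g)]
    return x

def pso_escape(f, best, n=20, iters=50, lo=-5.12, hi=5.12):
    # Stage 2: a small PSO swarm searches globally for a point better
    # than the previously converged local minimum.
    dim = len(best)
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=f)[:]
    w, c1, c2 = 0.7, 1.5, 1.5  # illustrative inertia and acceleration weights
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i][:]
                if f(pos[i]) < f(gbest):
                    gbest = pos[i][:]
    return gbest if f(gbest) < f(best) else best

def hybrid(f, dim=2, rounds=3):
    # Alternate local descent and PSO escape, keeping the best point found.
    x = [random.uniform(-5.12, 5.12) for _ in range(dim)]
    best = gradient_descent(f, x)
    for _ in range(rounds):
        cand = gradient_descent(f, pso_escape(f, best))
        if f(cand) < f(best):
            best = cand
    return best
```

In the paper the loop additionally applies repulsion and partial population reinitialization to push the swarm away from already-visited minima; here the escape relies on PSO's stochastic global search alone.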

MSC:

90C30 Nonlinear programming
90C52 Methods of reduced gradient type
90C26 Nonconvex programming, global optimization
65K05 Numerical mathematical programming methods
49M37 Numerical methods based on nonlinear programming
