Abstract
The derivative-free global optimization algorithms developed in Part I of this study, for linearly constrained problems, are extended to nonconvex n-dimensional problems with convex constraints. The initial \(n+1\) feasible datapoints are chosen such that the volume of the simplex they generate is maximized; as the algorithm proceeds, additional feasible datapoints are added such that the convex hull of the available datapoints expands efficiently toward the boundaries of the feasible domain. As in the algorithms developed in Part I of this study, at each step of the algorithm a search function is defined, based on an interpolating function which passes through all available datapoints and a synthetic uncertainty function which characterizes the distance to the nearest datapoints. This uncertainty function is, in turn, built on the framework of a Delaunay triangulation of all available datapoints together with the (infeasible) vertices of an exterior simplex which completely contains the feasible domain. The search function is minimized within those simplices of this Delaunay triangulation that do not include any vertex of the exterior simplex. If the outcome of this minimization is contained within the circumsphere of a simplex which includes a vertex of the exterior simplex, this new point is projected out to the boundary of the feasible domain. For problems in which the feasible domain includes edges (due to the intersection of multiple twice-differentiable constraints), a modified search function is used in the vicinity of these edges to ensure convergence.
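As a concrete, heavily simplified illustration of the synthetic uncertainty function described above, the sketch below uses SciPy's Delaunay triangulation to evaluate \(e(x)=r_i^2-\Vert x-z_i\Vert ^2\) within the simplex containing \(x\), where \(z_i\) and \(r_i\) denote that simplex's circumcenter and circumradius. The function names are ours, and the exterior simplex, the interpolant, and the search-function minimization are omitted; this is a sketch of the underlying geometric construction, not the authors' implementation.

```python
import numpy as np
from scipy.spatial import Delaunay

def circumsphere(verts):
    """Circumcenter z and circumradius r of a nondegenerate d-simplex.

    verts is a (d+1) x d array of vertices; z is found by solving the
    linear system 2 (p_i - p_0) . z = |p_i|^2 - |p_0|^2, i = 1..d.
    """
    A = 2.0 * (verts[1:] - verts[0])
    b = np.sum(verts[1:] ** 2 - verts[0] ** 2, axis=1)
    z = np.linalg.solve(A, b)
    return z, np.linalg.norm(z - verts[0])

def uncertainty(x, tri, pts):
    """Synthetic uncertainty e(x) = r^2 - |x - z|^2 in the simplex containing x.

    e vanishes at every datapoint (each lies on its simplex's circumsphere)
    and is positive in the interior of each simplex.
    """
    s = int(tri.find_simplex(x[None, :])[0])
    z, r = circumsphere(pts[tri.simplices[s]])
    return r ** 2 - np.linalg.norm(x - z) ** 2
```

A Delaunay-based search in this spirit would then minimize \(s(x)=p(x)-K\,e(x)\) over the interior simplices, where \(p\) is an interpolant of the function values and \(K>0\) trades off exploitation against exploration.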
Notes
The representation of a convex feasible domain as stated in (1) is the standard form used, e.g., in [15], but is not completely general. Certain convex constraints of interest, such as those implied by linear matrix inequalities (LMIs), cannot be represented in this form. The extension of the present algorithm to feasible domains bounded by LMIs will be considered in future work.
A simplex is said to be regular if all of its edges have equal length.
The problem of globally maximizing the volume of a feasible simplex inside a convex domain is, in general, a nonconvex problem. We do not attempt to solve this global maximization, which is unnecessary in our algorithm.
This is a maximization problem for a convex function in a convex compact domain. This problem is studied in [49].
The maximum eigenvalue is denoted \(\lambda _\text {max}(.)\).
If \(x\le {a}/{b}\), \(x\le {c}/{d}\), and \(b,d{>}0\) then \(x\le (a+c)/(b+d)\).
If \(A,B,C{>}0\), and \(A^2 \le C+BA\) then \(A\le \sqrt{C}+B\).
If \(x,y{>}0\), then \(\sqrt{x+y} \le \sqrt{x}+\sqrt{y}\).
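The three elementary inequalities above are stated without proof; for completeness, each admits a one-line verification (the derivations below are ours). The third: \((\sqrt{x}+\sqrt{y})^2 = x+y+2\sqrt{xy} \ge x+y\), and taking square roots gives \(\sqrt{x+y}\le \sqrt{x}+\sqrt{y}\). The second, completing the square in \(A^2-BA\le C\):
\[
\left(A-\tfrac{B}{2}\right)^2 \le C+\tfrac{B^2}{4}
\quad\Longrightarrow\quad
A \le \tfrac{B}{2}+\sqrt{C+\tfrac{B^2}{4}} \le \tfrac{B}{2}+\sqrt{C}+\tfrac{B}{2} = \sqrt{C}+B,
\]
where the last step uses the third inequality. The first: since \(b,d>0\), the hypotheses give \(xb\le a\) and \(xd\le c\); adding these yields \(x(b+d)\le a+c\), i.e., \(x\le (a+c)/(b+d)\).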
The point \(\hat{x}\) is an \(\omega \)-limit point (see [45]) of the sequence \(x_k\), if a subsequence of \(x_k\) converges to \(\hat{x}\).
This is simply the classical Rosenbrock function with its unconstrained global minimizer shifted to the origin.
Though this code is not available online, we have obtained a copy of it by personal communication from Prof. Marsden.
References
Abramson, M.A., Audet, C., Dennis Jr., J.E., Le Digabel, S.: OrthoMADS: a deterministic MADS instance with orthogonal directions. SIAM J. Optim. 20(2), 948–966 (2009)
Audet, C., Dennis Jr., J.E.: A pattern search filter method for nonlinear programming without derivatives. SIAM J. Optim. 14(4), 980–1010 (2004)
Audet, C., Dennis Jr., J.E.: Mesh adaptive direct search algorithms for constrained optimization. SIAM J. Optim. 17(1), 188–217 (2006)
Audet, C., Dennis Jr., J.E.: A progressive barrier for derivative-free nonlinear programming. SIAM J. Optim. 20(1), 445–472 (2009)
Barvinok, A.: A Course in Convexity, vol. 50. American Mathematical Society, Providence (2002)
Bauschke, H.H., Borwein, J.M.: On projection algorithms for solving convex feasibility problems. SIAM Rev. 38(3), 367–426 (1996)
Beyhaghi, P., Cavaglieri, D., Bewley, T.: Delaunay-based derivative-free optimization via global surrogates, part I: linear constraints. J. Glob. Optim. 63(3), 1–52 (2015)
Booker, A.J., Dennis Jr., J.E., Frank, P.D., Serafini, D.B., Torczon, V., Trosset, M.W.: A rigorous framework for optimization of expensive functions by surrogates. Struct. Optim. 17(1), 1–13 (1999)
Belitz, P., Bewley, T.: New horizons in sphere-packing theory, part II: lattice-based derivative-free optimization via global surrogates. J. Glob. Optim. 56(1), 61–91 (2013)
Bertsekas, D.P.: On penalty and multiplier methods for constrained minimization. SIAM J. Control Optim. 14(2), 216–235 (1976)
Bertsimas, D., Tsitsiklis, J.N.: Introduction to Linear Optimization, vol. 6. Athena Scientific, Belmont (1997)
Boissonnat, J.D., Devillers, O., Hornus, S.: Incremental construction of the Delaunay triangulation and the Delaunay graph in medium dimension. In: Proceedings of the Twenty-Fifth Annual Symposium on Computational Geometry, pp. 208–216. ACM (2009)
Coxeter, H.S.M.: Regular Polytopes. Courier Corporation, Chelmsford (1973)
Combettes, P.L.: Hilbertian convex feasibility problem: convergence of projection methods. Appl. Math. Optim. 35(3), 311–330 (1997)
Boyd, S., Vandenberghe, L.: Convex Optimization. Cambridge University Press, Cambridge (2004)
Dwyer, R.A.: A faster divide-and-conquer algorithm for constructing Delaunay triangulations. Algorithmica 2(1–4), 137–151 (1987)
Dwyer, R.A.: Higher-dimensional Voronoi diagrams in linear expected time. Discrete Comput. Geom. 6(3), 343–367 (1991)
Dwyer, R.A.: The expected number of k-faces of a Voronoi diagram. Comput. Math. Appl. 26(5), 13–19 (1993)
Facchinei, F., Pang, J.S.: Finite-Dimensional Variational Inequalities and Complementarity Problems. Springer, Berlin (2007)
Floudas, C.A., Pardalos, P.M.: A Collection of Test Problems for Constrained Global Optimization Algorithms, vol. 455. Springer, Berlin (1990)
Forsgren, A., Gill, P.E., Wright, M.H.: Interior methods for nonlinear optimization. SIAM Rev. 44(4), 525–597 (2002)
Forsgren, A., Gill, P.E.: Primal-dual interior methods for nonconvex nonlinear programming. SIAM J. Optim. 8(4), 1132–1152 (1998)
Gill, P.E., Murray, W.: Newton-type methods for unconstrained and linearly constrained optimization. Math. Program. 7(1), 311–350 (1974)
Gill, P.E., Murray, W., Saunders, M.A., Wright, M.H.: A Schur-Complement Method for Sparse Quadratic Programming (No. SOL-87-12). Stanford University CA Systems Optimization Lab (1987)
Gill, P.E., Murray, W., Saunders, M.A., Tomlin, J.A., Wright, M.H.: On projected Newton barrier methods for linear programming and an equivalence to Karmarkar's projective method. Math. Program. 36(2), 183–209 (1986)
Heinonen, J.: Lectures on Lipschitz Analysis. Report 100, Department of Mathematics and Statistics, University of Jyväskylä, Jyväskylä, pp. 1–77 (2005)
Horst, R., Pardalos, P.M. (eds.): Handbook of Global Optimization, vol. 2. Springer, Berlin (2013)
Jones, D.R., Perttunen, C.D., Stuckman, B.E.: Lipschitzian optimization without the Lipschitz constant. J. Optim. Theory Appl. 79(1), 157–181 (1993)
Jones, D.R.: A taxonomy of global optimization methods based on response surfaces. J. Glob. Optim. 21(4), 345–383 (2001)
Kort, B.W., Bertsekas, D.P.: A new penalty function method for constrained minimization. In: Proceedings of the 1972 IEEE Conference on Decision and Control, 1972 and 11th Symposium on Adaptive Processes, pp. 162–166. IEEE (1972)
Kort, B.W., Bertsekas, D.P.: Combined primal-dual and penalty methods for convex programming. SIAM J. Control Optim. 14(2), 268–294 (1976)
Le Digabel, S.: Algorithm 909: NOMAD: nonlinear optimization with the MADS algorithm. ACM Trans. Math. Softw. 37(4), 44 (2011)
Lewis, R.M., Torczon, V.: Pattern search algorithms for bound constrained minimization. SIAM J. Optim. 9(4), 1082–1099 (1999)
Lewis, R.M., Torczon, V.: Pattern search methods for linearly constrained minimization. SIAM J. Optim. 10(3), 917–941 (2000)
Lewis, R.M., Torczon, V.: A globally convergent augmented Lagrangian pattern search algorithm for optimization with general constraints and simple bounds. SIAM J. Optim. 12(4), 1075–1089 (2002)
Mangasarian, O.L., Fromovitz, S.: The Fritz John necessary optimality conditions in the presence of equality and inequality constraints. J. Math. Anal. Appl. 17(1), 37–47 (1967)
Marsden, A.L., Wang, M., Dennis Jr., J.E., Moin, P.: Optimal aeroacoustic shape design using the surrogate management framework. Optim. Eng. 5(2), 235–262 (2004)
McMullen, P.: The maximum numbers of faces of a convex polytope. Mathematika 17(02), 179–184 (1970)
Nocedal, J., Wright, S.: Numerical Optimization. Springer, New York (2006)
Paulavicius, R., Zilinskas, J.: Simplicial Global Optimization. Springer, New York (2014)
Preparata, F.P., Shamos, M.: Computational Geometry: An Introduction. Springer, Berlin (2012)
Rasmussen, C.E., Williams, C.K.I.: Gaussian Processes for Machine Learning. MIT Press, Cambridge (2006)
Schonlau, M., Welch, W.J., Jones, D.R.: A data-analytic approach to Bayesian global optimization. In: Department of Statistics and Actuarial Science and The Institute for Improvement in Quality and Productivity, ASA Conference (1997)
Shubert, B.O.: A sequential method seeking the global maximum of a function. SIAM J. Numer. Anal. 9(3), 379–388 (1972)
Smith, H.L., Waltman, P.: The Theory of the Chemostat: Dynamics of Microbial Competition, vol. 13. Cambridge University Press, Cambridge (1995)
Susca, S., Bullo, F., Martínez, S.: Gradient algorithms for polygonal approximation of convex contours. Automatica 45(2), 510–516 (2009)
Torczon, V.: On the convergence of pattern search algorithms. SIAM J. Optim. 7(1), 1–25 (1997)
Zhigljavsky, A., Zilinskas, A.: Stochastic Global Optimization, vol. 9. Springer, Berlin (2007)
Zwart, P.B.: Global maximization of a convex function with linear inequality constraints. Oper. Res. 22(3), 602–609 (1974)
Acknowledgments
The authors gratefully acknowledge Prof. Philip Gill and Prof. Alison Marsden for their collaboration, as well as funding from AFOSR grant FA 9550-12-1-0046, the Cymer Center for Control Systems & Dynamics, and Leidos Corporation in support of this work.
Beyhaghi, P., Bewley, T.R. Delaunay-based derivative-free optimization via global surrogates, part II: convex constraints. J Glob Optim 66, 383–415 (2016). https://doi.org/10.1007/s10898-016-0433-5