Optimizing costly functions with simple constraints: A limited-memory projected quasi-Newton algorithm

M Schmidt, E Berg, M Friedlander, et al. - Artificial Intelligence and Statistics, 2009 - proceedings.mlr.press
Abstract
An optimization algorithm for minimizing a smooth function over a convex set is described. Each iteration of the method computes a descent direction by minimizing, over the original constraints, a diagonal plus low-rank quadratic approximation to the function. The quadratic approximation is constructed using a limited-memory quasi-Newton update. The method is suitable for large-scale problems where evaluation of the function is substantially more expensive than projection onto the constraint set. Numerical experiments on one-norm regularized test problems indicate that the proposed method is competitive with state-of-the-art methods such as bound-constrained L-BFGS and orthant-wise descent. We further show that the method generalizes to a wide class of problems, and substantially improves on state-of-the-art methods for problems such as learning the structure of Gaussian graphical models and Markov random fields.
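For illustration, the following is a minimal sketch of the setting the abstract describes: an objective that is expensive to evaluate paired with a constraint set onto which projection is cheap. It uses box constraints, a Barzilai-Borwein diagonal scaling in place of the paper's diagonal-plus-low-rank limited-memory quasi-Newton model, and an Armijo backtracking line search. The function names and parameters are illustrative assumptions, not the authors' implementation.

import numpy as np

def project_box(x, lower, upper):
    # Euclidean projection onto box constraints; assumed much cheaper than f.
    return np.clip(x, lower, upper)

def projected_descent(f, grad, x0, lower, upper, max_iter=100, tol=1e-6):
    # Sketch of a projected, diagonally scaled descent iteration. The scaling
    # is a memoryless Barzilai-Borwein step, standing in for the paper's
    # diagonal-plus-low-rank L-BFGS quadratic model.
    x = project_box(np.asarray(x0, dtype=float), lower, upper)
    g = grad(x)
    alpha = 1.0
    for _ in range(max_iter):
        # Feasible descent direction from the scaled projected-gradient step.
        d = project_box(x - alpha * g, lower, upper) - x
        if np.linalg.norm(d, np.inf) < tol:
            break
        # Armijo backtracking along the feasible direction d.
        t, fx, gd = 1.0, f(x), g.dot(d)
        for _ in range(30):
            if f(x + t * d) <= fx + 1e-4 * t * gd:
                break
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        # Barzilai-Borwein update of the diagonal scaling.
        s, y = x_new - x, g_new - g
        sy = s.dot(y)
        alpha = float(np.clip(s.dot(s) / sy, 1e-10, 1e10)) if sy > 0 else 1.0
        x, g = x_new, g_new
    return x

if __name__ == "__main__":
    # Small bound-constrained quadratic as a stand-in for an expensive objective.
    A = np.array([[3.0, 1.0], [1.0, 2.0]])
    b = np.array([1.0, -1.0])
    f = lambda x: 0.5 * x @ A @ x - b @ x
    grad = lambda x: A @ x - b
    print(projected_descent(f, grad, x0=[0.5, 0.5], lower=0.0, upper=1.0))

On this toy problem the iterates converge to roughly [1/3, 0], the minimizer of the quadratic over the unit box; in the regime the paper targets, each call to f would dominate the cost of the projections and the quadratic subproblem.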