
appropriate observations about its Matlab implementation. Then we discuss several basic regression tests that have been implemented to ensure the L-BFGS-B Matlab implementation solves bound-constrained optimization problems appropriately.

2 The L-BFGS-B algorithm


At a high level, the L-BFGS-B algorithm iterates over quasi-Newton steps. For a given iteration k, the objective function is approximated by a quadratic model at the point xk as:
mk(x) = fk + gk^T (x − xk) + (1/2) (x − xk)^T Bk (x − xk),        (2)
where fk represents the objective function f evaluated at xk, gk denotes the objective gradient evaluated at xk, and Bk represents a limited-memory BFGS approximation to the Hessian at xk. Using this quadratic model, the L-BFGS-B algorithm can be outlined using the following steps:
1. Check whether the inf-norm of the gradient gk, projected onto the feasible design space, is less than a user-specified tolerance. If it is, return successfully. (This projected-gradient test is sketched in the code following this list.)
2. Find the Cauchy point xc that minimizes mk along the steepest-descent direction −gk projected onto the feasible design space. Once found, the Cauchy point xc is used to identify the active design variables A(x) (those fixed at either an upper or lower bound) and the free variables F(x) (those lying strictly inside the feasible design space). Conceptually, this is like the algorithm ‘peeking ahead’ one iteration to determine which variables are likely to be active or inactive.
3. The quadratic model mk is then minimized over the set of free variables F(x) in an unconstrained manner, and the result is backtracked into the feasible design space to obtain x̄k.
4. The new search direction is computed as dk = x̄k − xk, and a line-search method is used to find a step length αk that satisfies the strong Wolfe conditions, giving the new design variables xk+1.
5. The L-BFGS Hessian approximation Bk+1 is updated using the new iterate xk+1 and its gradient, and a new iteration is started.
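As a concrete illustration of steps 1 and 2, the following is a minimal Matlab sketch, assuming column vectors x and g and bound vectors l and u; the function names and signatures are hypothetical and are not taken from the implementation documented in this report.

function converged = projgrad_converged(x, g, l, u, tol)
    % Step 1 sketch: take a unit step along -g, project it back onto the
    % box [l, u], and measure how far the projected point moved from x.
    pg = min(max(x - g, l), u) - x;    % projected-gradient displacement
    converged = norm(pg, inf) < tol;   % inf-norm test against the tolerance
end

function [active, free] = split_at_cauchy_point(xc, l, u)
    % Step 2 sketch: variables whose Cauchy-point value lies on a bound are
    % treated as active (held fixed); the remainder are free for the
    % subspace minimization of step 3.
    active = find(xc <= l | xc >= u);
    free   = setdiff((1:numel(xc)).', active);
end

Note that this only shows how the Cauchy point xc is consumed once it is known; in the algorithm itself, xc is found by a piecewise minimization of mk along the projected steepest-descent path.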
These steps are now discussed at a high level, with references to their implementation.
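To make the overall flow of steps 1–5 concrete, a minimal, hypothetical driver sketch in Matlab is given below. It assumes an objective handle fun that returns [f, g] at a point, column vectors x, l, u, and placeholder helpers cauchy_point, subspace_min, wolfe_linesearch, and lbfgs_update standing in for steps 2–5; none of these names come from the implementation this report documents.

function [x, f] = lbfgsb_sketch(fun, x, l, u, tol, maxit)
    % Illustrative outline of the L-BFGS-B iteration; not the report's code.
    [f, g] = fun(x);
    mem = struct('S', [], 'Y', []);          % limited-memory (s, y) pairs
    for k = 1:maxit
        % Step 1: projected-gradient convergence test.
        pg = min(max(x - g, l), u) - x;
        if norm(pg, inf) < tol
            return;
        end

        % Step 2: Cauchy point and the resulting active/free variable split.
        [xc, free] = cauchy_point(x, g, mem, l, u);      % placeholder helper

        % Step 3: minimize the quadratic model over the free variables and
        % backtrack the result into the feasible box to obtain xbar.
        xbar = subspace_min(x, g, mem, xc, free, l, u);  % placeholder helper

        % Step 4: line search along d = xbar - x for a strong-Wolfe step.
        d = xbar - x;
        alpha = wolfe_linesearch(fun, x, f, g, d);       % placeholder helper
        xnew = x + alpha * d;
        [fnew, gnew] = fun(xnew);

        % Step 5: update the limited-memory pairs with s = xnew - x and
        % y = gnew - g, then start the next iteration.
        mem = lbfgs_update(mem, xnew - x, gnew - g);     % placeholder helper
        x = xnew; f = fnew; g = gnew;
    end
end

Note that the quadratic model (2) is never formed with an explicit Bk; the limited-memory pairs stored in mem define it implicitly.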
