Line search methods

From optimization
Revision as of 11:00, 24 May 2015 by Epc257


Author names: Elizabeth Conger
Steward: Dajun Yue and Fengqi You



An algorithm is a line search method if it seeks a minimum of a nonlinear function by iteratively choosing a reasonable direction vector and a reasonable step size along that direction, so that each iterate lowers the function value toward a minimum. Varying these two choices changes the "tightness" of the optimization. For example, given the function f(x), an initial point x_k is chosen. To find a lower value of f(x), the point x_{k+1} is computed by the following iteration scheme

x_{k+1} = x_k + \alpha_k p_k

in which \alpha_k is a positive scalar known as the step length and p_k defines the step direction.
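The iteration scheme above can be sketched as a generic loop in which the particular method supplies the direction p_k and the step length \alpha_k. This is a minimal illustration, not from the article; all function and parameter names are assumptions.

```python
import numpy as np

def line_search(f, grad, x0, direction, step_length, tol=1e-8, max_iter=1000):
    """Skeleton of a line search method: x_{k+1} = x_k + alpha_k * p_k.

    `direction` and `step_length` are supplied by the particular method
    (steepest descent, Newton, ...); names here are illustrative only.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:   # gradient small enough: stop
            break
        p = direction(x, g)           # step direction p_k
        a = step_length(f, x, g, p)   # step length alpha_k > 0
        x = x + a * p                 # the iteration scheme above
    return x

# Steepest descent with a fixed step on a simple quadratic,
# f(x, y) = (x - 1)^2 + (y + 2)^2, whose minimizer is (1, -2):
f = lambda x: (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2
grad = lambda x: np.array([2.0 * (x[0] - 1.0), 2.0 * (x[1] + 2.0)])
x_star = line_search(f, grad, [0.0, 0.0],
                     direction=lambda x, g: -g,
                     step_length=lambda f, x, g, p: 0.1)
```

Plugging in a different `direction` or `step_length` yields the specific methods discussed below.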

Step Length

Choosing an appropriate step length has a large impact on the robustness of a line search method. If the step length is too large, the line search will not produce an accurate or substantial minimization of the target function; if it is too small, the method may be too computationally expensive to complete in a reasonable amount of time. To select the correct step length, the following one-dimensional function is minimized:

\phi(\alpha) = f(x_k + \alpha p_k), \quad \alpha > 0
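In practice \phi is rarely minimized exactly; a common inexact strategy (standard in the literature, not specific to this article) is backtracking: start from a trial step and shrink it until the Armijo sufficient-decrease condition holds. A minimal sketch, with illustrative names and default constants:

```python
import numpy as np

def backtracking_step_length(f, grad_fk, xk, pk, alpha0=1.0, rho=0.5, c=1e-4):
    """Approximately minimize phi(alpha) = f(xk + alpha * pk) by backtracking.

    Shrink alpha by the factor rho until the Armijo sufficient-decrease
    condition f(xk + alpha*pk) <= f(xk) + c*alpha*grad_fk.dot(pk) holds.
    """
    alpha = alpha0
    fk = f(xk)
    slope = grad_fk.dot(pk)   # phi'(0); negative for a descent direction
    while f(xk + alpha * pk) > fk + c * alpha * slope:
        alpha *= rho
    return alpha

# Example: f(x) = x^2 at xk = 1 with the descent direction pk = -f'(1) = -2.
xk = np.array([1.0])
pk = np.array([-2.0])
alpha = backtracking_step_length(lambda x: float(x[0] ** 2), np.array([2.0]), xk, pk)
```

Because the slope at \alpha = 0 is negative for any descent direction, the loop is guaranteed to terminate.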

Step Direction

Steepest Descent Method
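Steepest descent chooses the direction of most rapid local decrease, p_k = -\nabla f(x_k). A minimal sketch combining it with Armijo backtracking for the step length (function names are illustrative):

```python
import numpy as np

def steepest_descent(f, grad, x0, tol=1e-6, max_iter=5000):
    """Steepest descent: p_k = -grad f(x_k), with Armijo backtracking."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        p = -g                       # direction of most rapid local decrease
        a, fx = 1.0, f(x)
        while f(x + a * p) > fx + 1e-4 * a * g.dot(p):
            a *= 0.5                 # backtrack until sufficient decrease
        x = x + a * p
    return x

# Quadratic test problem f(x, y) = 2x^2 + y^2, minimizer at the origin:
x_star = steepest_descent(lambda x: 2 * x[0] ** 2 + x[1] ** 2,
                          lambda x: np.array([4 * x[0], 2 * x[1]]),
                          [3.0, -4.0])
```

Convergence is only linear, and can be slow when the problem is ill-conditioned, which motivates the Newton-type directions below.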

Newton Method
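Newton's method uses curvature information, obtaining p_k by solving \nabla^2 f(x_k) p_k = -\nabla f(x_k); near a minimizer the unit step length \alpha_k = 1 is typically acceptable. A minimal sketch (names are illustrative):

```python
import numpy as np

def newton_direction(grad_x, hess_x):
    """Newton step direction: solve (hessian of f at x) p = -(gradient of f at x)."""
    return np.linalg.solve(hess_x, -grad_x)

# For f(x, y) = x^2 + x*y + 2*y^2 the Hessian is constant, so one Newton
# step with alpha_k = 1 reaches the minimizer (the origin) exactly.
f_grad = lambda x: np.array([2 * x[0] + x[1], x[0] + 4 * x[1]])
f_hess = lambda x: np.array([[2.0, 1.0], [1.0, 4.0]])
x = np.array([3.0, -1.0])
x = x + 1.0 * newton_direction(f_grad(x), f_hess(x))  # x_{k+1} = x_k + alpha_k p_k
```

The price is forming and factorizing the Hessian at every iteration, which quasi-Newton methods avoid.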

Quasi-Newton Method
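Quasi-Newton methods replace the exact Hessian with an approximation built from gradient differences; the BFGS update is the most widely used. A hedged sketch combining the BFGS inverse-Hessian update with Armijo backtracking (all names and constants are illustrative, not from the article):

```python
import numpy as np

def bfgs(f, grad, x0, tol=1e-6, max_iter=1000):
    """Quasi-Newton (BFGS): maintain an approximation H of the inverse
    Hessian, updated from gradient differences; p_k = -H g_k."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    H = np.eye(n)                        # initial inverse-Hessian estimate
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                       # quasi-Newton direction
        a, fx = 1.0, f(x)
        while f(x + a * p) > fx + 1e-4 * a * g.dot(p):
            a *= 0.5                     # Armijo backtracking
        s = a * p                        # step taken
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g                    # change in gradient
        sy = s.dot(y)
        if sy > 1e-12:                   # curvature condition; else skip update
            rho = 1.0 / sy
            I = np.eye(n)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)   # BFGS inverse-Hessian update
        x, g = x_new, g_new
    return x

# Rosenbrock function, minimizer (1, 1):
f = lambda x: (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2),
                           200 * (x[1] - x[0] ** 2)])
x_star = bfgs(f, grad, [-1.2, 1.0])
```

Skipping the update when the curvature condition s^T y > 0 fails keeps H positive definite, so p_k remains a descent direction.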

[Figure: Solution to 48 States Traveling Salesman Problem]


\begin{bmatrix} G(x,y) & 0 & -A(x)^T \\ 0 & Y & W \\ A(x) & -I & 0 \end{bmatrix} \begin{bmatrix} \Delta x \\ \Delta s \\ \Delta y \end{bmatrix} = \begin{bmatrix} -\nabla f(x) + A(x)^T y \\ \mu e - W Y e \\ -g(x) + s \end{bmatrix}

