# Line search methods


## Revision as of 11:00, 24 May 2015

Author names: Elizabeth Conger

Steward: Dajun Yue and Fengqi You


# Introduction

An algorithm is a line search method if it seeks the minimum of a nonlinear function by iteratively selecting a direction vector and a step size along that direction, so that each iterate brings the function value closer to the absolute minimum. Varying these two choices changes the "tightness" of the optimization. For example, given the function <math>f(x)</math>, an initial point <math>x_k</math> is chosen. To find a lower value of <math>f(x)</math>, the next iterate <math>x_{k+1}</math> is computed by the following iteration scheme

<math>x_{k+1} = x_k + \alpha_k p_k,</math>

in which <math>\alpha_k</math> is a positive scalar known as the step length and <math>p_k</math> defines the step direction.
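The iteration scheme above can be sketched in code. This is an illustrative sketch, not a specific method from this article: it uses the negative gradient as a stand-in descent direction, and treats the step-length rule as a pluggable function (a fixed step is shown in the usage example).

```python
import numpy as np

def line_search_minimize(f, grad_f, x0, step_length, max_iter=100, tol=1e-8):
    """Generic line search iteration: x_{k+1} = x_k + alpha_k * p_k.

    Illustrative sketch: uses the negative gradient as the descent
    direction p_k; the step-length rule is passed in as a callable.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) < tol:           # gradient nearly zero: stop
            break
        p = -g                                # descent direction (steepest descent)
        alpha = step_length(f, grad_f, x, p)  # positive scalar step length
        x = x + alpha * p                     # the iteration scheme from the text
    return x
```

For example, minimizing <math>f(x) = \|x\|^2</math> from <math>x_0 = (3, -4)</math> with a fixed step length of 0.25 drives the iterate to the origin.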

# Step Length

Choosing an appropriate step length has a large impact on the robustness of a line search method. If the step length is too large, the line search will not produce an accurate or substantial minimization of the target function. If the step length is too small, the method becomes too computationally expensive to complete in a reasonable amount of time. To select the correct step length, the following function is minimized:

<math>\phi(\alpha) = f(x_k + \alpha p_k), \quad \alpha > 0</math>
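Minimizing <math>\phi(\alpha)</math> exactly is usually too expensive in practice. One common compromise (an assumption here, not prescribed by this article) is backtracking: start from a trial step and shrink it geometrically until a sufficient-decrease (Armijo) condition holds. A minimal sketch:

```python
import numpy as np

def backtracking_step(f, grad_f, x, p, alpha0=1.0, rho=0.5, c=1e-4):
    """Backtracking line search with the Armijo sufficient-decrease rule.

    Shrinks alpha until f(x + alpha*p) <= f(x) + c*alpha*grad_f(x)^T p.
    Assumes p is a descent direction, so the directional derivative is negative.
    x and p are NumPy arrays.
    """
    alpha = alpha0
    fx = f(x)
    slope = float(grad_f(x) @ p)   # phi'(0); negative for a descent direction
    while f(x + alpha * p) > fx + c * alpha * slope:
        alpha *= rho               # trial step too long: shrink it
    return alpha
```

For <math>f(x) = \|x\|^2</math> at <math>x = (3, 0)</math> with <math>p = -\nabla f(x)</math>, the first shrink to <math>\alpha = 0.5</math> already satisfies the condition and lands exactly at the minimizer.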

# Step Direction

## Steepest Descent Method

## Newton Method

## Quasi-Newton Method

# Conclusion

# References

1. Sun, W. & Yuan, Y.-X. (2006) Optimization Theory and Methods: Nonlinear Programming (Springer US) p. 688.

2. Anonymous (2014) Line Search. (Wikipedia). http://en.wikipedia.org/wiki/Line_search.

3. Nocedal, J. & Wright, S. (2006) Numerical Optimization, 2nd Ed. (Springer-Verlag New York, New York) p. 664.