Line search methods

Revision as of 08:48, 24 May 2015

Author names: Elizabeth Conger
Steward: Dajun Yue and Fengqi You


Introduction

An algorithm is a line search method if it seeks the minimum of a defined nonlinear function by selecting a reasonable search direction and step size that, applied iteratively, move the current point closer to the minimum of the function. Varying these choices changes the "tightness" of the optimization. For example, given a function f(x) and an initial point x_0, each iteration k selects a descent direction p_k and a step length α_k, and updates the point as x_{k+1} = x_k + α_k p_k.
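As a minimal sketch of this iteration, the following uses steepest descent (p_k = -∇f(x_k)) with a backtracking line search enforcing the Armijo sufficient-decrease condition; the function names and parameter values are illustrative assumptions, not prescribed by the text:

```python
import numpy as np

def backtracking_line_search(f, grad, x, p, alpha0=1.0, rho=0.5, c=1e-4):
    """Shrink alpha until the Armijo (sufficient decrease) condition
    f(x + alpha*p) <= f(x) + c * alpha * grad(x).p holds."""
    alpha = alpha0
    fx, gx = f(x), grad(x)
    while f(x + alpha * p) > fx + c * alpha * np.dot(gx, p):
        alpha *= rho
    return alpha

def line_search_minimize(f, grad, x0, tol=1e-6, max_iter=1000):
    """Iterate x_{k+1} = x_k + alpha_k * p_k with the steepest-descent
    direction p_k = -grad f(x_k), stopping when the gradient is small."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        p = -g                                   # descent direction
        alpha = backtracking_line_search(f, grad, x, p)
        x = x + alpha * p
    return x

# Example: minimize f(x, y) = (x - 3)^2 + (y + 1)^2 from the origin.
f = lambda x: (x[0] - 3.0) ** 2 + (x[1] + 1.0) ** 2
grad = lambda x: np.array([2.0 * (x[0] - 3.0), 2.0 * (x[1] + 1.0)])
x_star = line_search_minimize(f, grad, [0.0, 0.0])
```

Here rho controls how aggressively the step is shrunk and c how much decrease is demanded; tightening either changes the "tightness" of the search in the sense described above.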


Section 1.1

Section 1.2


Section 2

[Image: Solution to 48 States Traveling Salesman Problem]

Section 3

E=mc^2

Conclusion

\begin{bmatrix} G(x,y) & 0 & -A(x)^T \\ 0 & Y & W \\ A(x) & -I & 0 \end{bmatrix} \begin{bmatrix} \Delta x \\ \Delta s \\ \Delta y \end{bmatrix} = \begin{bmatrix} -\nabla f(x) + A(x)^T y \\ \mu e - W Y e \\ -g(x) + s \end{bmatrix}
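The block system above has the form of a primal-dual Newton step. As a hedged illustration of how such a system can be assembled and solved with dense linear algebra, the following uses small made-up placeholder values for every block (G, A, Y, W, μ, and the residual vectors are not taken from any specific problem):

```python
import numpy as np

# Illustrative sizes: n primal variables, m inequality constraints.
n, m = 2, 2

# Hypothetical placeholder data for each block of the system.
G = np.array([[4.0, 1.0], [1.0, 3.0]])   # G(x, y) block (positive definite)
A = np.array([[1.0, 0.0], [0.0, 1.0]])   # A(x) block
Y = np.diag([2.0, 1.5])                  # Y block (diagonal)
W = np.diag([0.5, 0.8])                  # W block (diagonal)
mu = 0.1
e = np.ones(m)

grad_f = np.array([1.0, -2.0])           # gradient of f at x
y = np.array([0.3, 0.4])
g = np.array([0.2, -0.1])                # g(x)
s = np.array([0.5, 0.8])                 # slacks

# Assemble the block coefficient matrix and right-hand side
# exactly as written in the equation above.
K = np.block([
    [G,                np.zeros((n, m)), -A.T             ],
    [np.zeros((m, n)), Y,                W                ],
    [A,                -np.eye(m),       np.zeros((m, m))],
])
rhs = np.concatenate([-grad_f + A.T @ y, mu * e - W @ Y @ e, -g + s])

# Solve for the step (Delta x, Delta s, Delta y).
delta = np.linalg.solve(K, rhs)
dx, ds, dy = delta[:n], delta[n:n + m], delta[n + m:]
```

In practice the block structure would be exploited (e.g. by eliminating Δs and Δy) rather than solving the full dense system, but the sketch shows the correspondence between the equation and code.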

[25 20 15]


References

1. Sun, W. & Yuan, Y-X. (2006) Optimization Theory and Methods: Nonlinear Programming (Springer US) p 688.

2. Anonymous (2014) Line Search. (Wikipedia). http://en.wikipedia.org/wiki/Line_search.