Line search methods

Author names: Elizabeth Conger
Steward: Dajun Yue and Fengqi You

=Introduction=
An algorithm is a line search method if it seeks the minimum of a nonlinear objective function by iteratively selecting a search direction and a step size along that direction, so that each step brings the function value closer to a minimum of the function (in general a local minimum, not necessarily the absolute minimum). Varying the direction and the step size changes the "tightness" of the optimization. For example, given the function <math>f(x)</math> and an initial point <math>x_k</math>, each iteration produces the next point as <math>x_{k+1} = x_k + \alpha_k p_k</math>, where <math>p_k</math> is the search direction and <math>\alpha_k > 0</math> is the step length.
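To make the iteration concrete, here is a minimal sketch (not part of the original article): a generic driver in which the two choices the paragraph mentions, the direction and the step size, are supplied as functions. All names, the test function, and the parameter values are our own illustrative choices.

<pre>
import numpy as np

def line_search_minimize(f, grad, x0, direction, step_length,
                         tol=1e-8, max_iter=1000):
    """Iterate x_{k+1} = x_k + alpha_k * p_k until the gradient is small."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:         # stationary point (to tolerance)
            break
        p = direction(x, g)                 # choose the search direction p_k
        alpha = step_length(f, grad, x, p)  # choose the step length alpha_k
        x = x + alpha * p                   # the line search update
    return x

# Illustration on f(x) = ||x||^2 with the steepest-descent direction
# and a constant step length (values chosen only for demonstration).
f = lambda x: float(x @ x)
grad = lambda x: 2.0 * x
x_min = line_search_minimize(f, grad, [3.0, -2.0],
                             direction=lambda x, g: -g,
                             step_length=lambda f, grad, x, p: 0.25)
print(x_min)  # close to [0. 0.]
</pre>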
  
  
=Step Length=
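An exact minimization of <math>f(x_k + \alpha p_k)</math> over <math>\alpha</math> is usually unnecessary; in practice the step length is only required to produce a sufficient decrease in <math>f</math>. A standard requirement is the Armijo (sufficient decrease) condition (Nocedal & Wright, 2006):

<math>f(x_k + \alpha p_k) \le f(x_k) + c_1 \alpha \nabla f(x_k)^T p_k, \qquad 0 < c_1 < 1.</math>

Below is a minimal backtracking sketch that enforces this condition; the function name and the parameter values <math>\alpha_0 = 1</math>, <math>\rho = 0.5</math>, <math>c_1 = 10^{-4}</math> are illustrative choices, not prescribed by this article.

<pre>
import numpy as np

def backtracking_step_length(f, grad, x, p, alpha0=1.0, rho=0.5, c1=1e-4):
    """Shrink alpha until the Armijo sufficient-decrease condition holds."""
    alpha = alpha0
    fx = f(x)
    slope = float(grad(x) @ p)  # directional derivative, negative for descent
    while f(x + alpha * p) > fx + c1 * alpha * slope:
        alpha *= rho            # backtrack: try a smaller step
    return alpha

# Quick check on f(x) = ||x||^2 with the steepest-descent direction.
f = lambda x: float(x @ x)
grad = lambda x: 2.0 * x
x = np.array([3.0, -2.0])
print(backtracking_step_length(f, grad, x, -grad(x)))  # 0.5 for this x
</pre>

This routine can be passed as the step_length argument of the skeleton sketched in the Introduction.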
=Step Direction=
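The search direction <math>p_k</math> should be a descent direction, meaning the function decreases locally along it:

<math>\nabla f(x_k)^T p_k < 0.</math>

The subsections below list standard ways of generating such a direction; the formulas given are the textbook forms (see Nocedal & Wright, 2006), stated here for completeness.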
 
  
==Steepest Descent Method==
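The steepest descent direction is the negative gradient,

<math>p_k = -\nabla f(x_k),</math>

which is always a descent direction when <math>\nabla f(x_k) \neq 0</math>, though it can converge slowly on badly scaled problems.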
==Newton Method==
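Newton's method minimizes the local quadratic model of <math>f</math> at <math>x_k</math>, giving the direction

<math>p_k = -\nabla^2 f(x_k)^{-1} \nabla f(x_k),</math>

which is a descent direction whenever the Hessian <math>\nabla^2 f(x_k)</math> is positive definite.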
==Quasi-Newton Method==
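Quasi-Newton methods avoid computing the exact Hessian by maintaining an approximation <math>B_k</math> built from gradient differences, so that <math>p_k = -B_k^{-1} \nabla f(x_k)</math>. A widely used update for <math>B_k</math> is the BFGS formula:

<math>B_{k+1} = B_k - \frac{B_k s_k s_k^T B_k}{s_k^T B_k s_k} + \frac{y_k y_k^T}{y_k^T s_k}, \qquad s_k = x_{k+1} - x_k, \quad y_k = \nabla f(x_{k+1}) - \nabla f(x_k).</math>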
 
 
 
=Conclusion=
 
Line search also appears inside larger algorithms. For example, a primal-dual interior point method for constrained optimization solves, at each iteration, a Newton-type system of the form

<math>\begin{bmatrix} G(x,y) & 0 & -A(x)^T \\ 0 & Y & W \\ A(x) & -I & 0 \end{bmatrix} \begin{bmatrix} \Delta x \\ \Delta s \\ \Delta y \end{bmatrix} = \begin{bmatrix} -\nabla f(x) + A(x)^T y \\ \mu e - W Y e \\ -g(x) + s \end{bmatrix}</math>

and then performs a line search along the resulting step <math>(\Delta x, \Delta s, \Delta y)</math> to decide how far to move before the next iteration.
 
  
  
  
 
=References=

1. Sun, W. & Yuan, Y-X. (2006) Optimization Theory and Methods: Nonlinear Programming (Springer US) p 688.

2. Anonymous (2014) Line Search. (Wikipedia). http://en.wikipedia.org/wiki/Line_search.

3. Nocedal, J. & Wright, S. (2006) Numerical Optimization, 2nd ed. (Springer-Verlag New York, New York) p 664.
