Sequential quadratic programming

Authored by: Ben Goodman (ChE 345 Spring 2016)
Steward: Dajun Yue and Fenqi You

Introduction

Sequential quadratic programming (SQP) combines two fundamental algorithms for solving non-linear optimization problems: an active set method and Newton's method. The approach uses the active set method to build a quadratic sub-problem around the current guess, and that sub-problem determines the best improvement step to take from the guess. Because the sub-problem is quadratic, Newton's method can solve it in a single iteration. The efficiency gained from this dual approach makes SQP well suited to large non-linear problems and to problems whose constraints are highly non-linear.

The abstracted, general problem below will be used for the remainder of this page to explain and discuss SQP:
<math>\min_x \, f(x)</math>
<math>\text{s.t. } h(x) = 0</math>
<math>g(x) \le 0</math>

with f(x), h(x), and g(x) each potentially non-linear.
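
For a concrete sense of what a problem of this form looks like in practice, the sketch below solves one made-up instance with SciPy's SLSQP routine, which is an SQP implementation. The particular choices of f, h, g, and the starting point x0 are illustrative assumptions, not taken from this page.

<pre>
# A minimal sketch: solving one instance of min f(x) s.t. h(x) = 0, g(x) <= 0
# with SciPy's SLSQP solver (an SQP implementation). The functions below are
# illustrative choices, not part of the original article.
import numpy as np
from scipy.optimize import minimize

def f(x):
    # Non-linear objective: a Rosenbrock-style function
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

def h(x):
    # Equality constraint h(x) = 0: the point must lie on the unit circle
    return x[0]**2 + x[1]**2 - 1

def g(x):
    # Inequality constraint g(x) <= 0: x[0] + x[1] <= 1.5
    return x[0] + x[1] - 1.5

constraints = [
    {"type": "eq", "fun": h},
    # SciPy expects inequality constraints written as c(x) >= 0, so negate g
    {"type": "ineq", "fun": lambda x: -g(x)},
]

x0 = np.array([0.5, 0.5])  # arbitrary starting guess
result = minimize(f, x0, method="SLSQP", constraints=constraints)
print(result.x, result.fun)
</pre>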

Background

Previous knowledge of the component methods is helpful in understanding sequential quadratic programming. Briefly, the active set method confines the search for optimal solutions to regions where the objective function is increasing significantly with respect to the constraint functions. Lagrange multipliers and the KKT conditions provide the framework for finding these regions and converging to the optimal solution. Newton's method is an algorithm for converging to critical points of any twice-differentiable function, with improvement steps of the form below:
<math>x_{k+1} = x_k - \frac{\nabla f(x_k)}{\nabla^2 f(x_k)}</math>
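
As a rough illustration of this update rule, the snippet below applies it to a simple one-dimensional function; the function and starting point are illustrative assumptions, not part of the original article.

<pre>
# A minimal sketch of Newton's method in one dimension: each step takes
# x_{k+1} = x_k - f'(x_k) / f''(x_k), driving x toward a critical point of f.
# The example function is an illustrative choice, not from the original text.
def newton_step(x, grad, hess):
    return x - grad(x) / hess(x)

# Example: f(x) = x**4 - 3*x**2 + 2
grad = lambda x: 4 * x**3 - 6 * x   # f'(x)
hess = lambda x: 12 * x**2 - 6      # f''(x)

x = 2.0                             # starting guess
for _ in range(20):
    x = newton_step(x, grad, hess)
print(x)                            # converges to a critical point of f (about 1.2247)
</pre>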

SQP Algorithms

Convergence Analysis

Example