Non-Linear Programming
© 2011 Daniel Kirschen and University of Washington
Motivation
• Method of Lagrange multipliers
– Very useful insight into solutions
– Analytical solution practical only for small problems
– Direct application not practical for real-life problems
because these problems are too large
– Difficulties when problem is non-convex
• Often need to search for the solution of practical optimization problems using:
– Objective function only or
– Objective function and its first derivative or
– Objective function and its first and second derivatives
Naïve One-Dimensional Search
• Suppose:
– That we want to find the value of x that minimizes f(x)
– That the only thing we can do is calculate the value of f(x) for any value of x
• We could calculate f(x) for a range of values of x and choose the one that minimizes f(x)
[Figure: f(x) sampled at many points over a range of x]
• Requires a considerable amount of computing time if the range is large and good accuracy is needed
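This brute-force scan is easy to sketch in code. A minimal Python version, where the objective f(x) = (x - 2)^2 and the search range are illustrative assumptions:

```python
# Naive one-dimensional search: evaluate f(x) on a grid of points and keep
# the best one. The objective and the search range are illustrative.
def naive_search(f, lo, hi, n_points):
    step = (hi - lo) / (n_points - 1)
    best_x = lo
    best_f = f(lo)
    for i in range(1, n_points):
        x = lo + i * step
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x

# Example: f(x) = (x - 2)^2 has its minimum at x = 2.
f = lambda x: (x - 2.0) ** 2
x_star = naive_search(f, 0.0, 4.0, 401)   # 401 points, i.e. a step of 0.01
```

The cost is one function evaluation per grid point, which is exactly why a large range combined with a tight accuracy requirement makes this approach expensive.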
One-Dimensional Search
[Figure sequence: f(x) evaluated at successive points x0, x1, x2, x3, x4, narrowing the current search range]
• If the function is convex, we have bracketed the optimum
• The optimum is between x1 and x2, so we do not need to consider x0 anymore
• Repeat the process until the range is sufficiently small
• The procedure is valid only if the function is convex!
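This bracketing procedure is the basis of golden-section search, which shrinks the bracket by a fixed ratio at each step while reusing one interior evaluation. A minimal sketch, assuming f is convex (unimodal) on the initial interval; the objective and interval are illustrative:

```python
def bracket_search(f, lo, hi, tol=1e-6):
    """Shrink [lo, hi] around the minimum of a unimodal f.

    Golden-section search: each iteration discards the part of the
    bracket that cannot contain the optimum and reuses one of the two
    interior evaluations."""
    inv_phi = (5 ** 0.5 - 1) / 2          # 1 / golden ratio, ~0.618
    a, b = lo, hi
    c = b - inv_phi * (b - a)             # two interior points, c < d
    d = a + inv_phi * (b - a)
    fc, fd = f(c), f(d)
    while b - a > tol:
        if fc < fd:                       # optimum is in [a, d]: drop [d, b]
            b, d, fd = d, c, fc
            c = b - inv_phi * (b - a)
            fc = f(c)
        else:                             # optimum is in [c, b]: drop [a, c]
            a, c, fc = c, d, fd
            d = a + inv_phi * (b - a)
            fd = f(d)
    return (a + b) / 2

x_star = bracket_search(lambda x: (x - 2.0) ** 2, 0.0, 5.0)
```

Each iteration shrinks the bracket by a factor of about 0.618 at the cost of a single new function evaluation, which is far cheaper than the naïve grid scan.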
Multi-Dimensional Search
• Unidirectional search not applicable
• Naïve search becomes totally impossible as the dimension of the problem increases
• If we can calculate the first derivatives of the objective function, much more efficient searches can be developed
• The gradient of a function gives the direction in which it increases/decreases fastest
Steepest Ascent/Descent Algorithm
[Figure sequence: contours of the objective function in the (x1, x2) plane; at each iteration the algorithm moves in the gradient direction and performs a unidirectional search along that direction]
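The iterations illustrated above can be sketched as follows. The two-variable quadratic objective, its gradient, and the closed-form unidirectional search are illustrative assumptions:

```python
# Steepest descent: move in the direction of the negative gradient and use a
# unidirectional search to decide how far to go. The quadratic objective
# f(x1, x2) = (x1 - 1)^2 + 4*(x2 + 2)^2 and its gradient are illustrative.
def steepest_descent(grad, x0, line_min, n_iter=50):
    x = list(x0)
    for _ in range(n_iter):
        g = grad(x)
        alpha = line_min(x, g)                        # unidirectional search
        x = [xi - alpha * gi for xi, gi in zip(x, g)]
    return x

def grad(x):
    # Gradient of the illustrative quadratic; its minimum is at (1, -2).
    return [2.0 * (x[0] - 1.0), 8.0 * (x[1] + 2.0)]

def exact_line_min(x, g):
    # For a quadratic with Hessian H = diag(2, 8), the step that minimizes
    # f(x - alpha*g) has the closed form alpha = (g.g) / (g.H.g).
    num = g[0] ** 2 + g[1] ** 2
    den = 2.0 * g[0] ** 2 + 8.0 * g[1] ** 2
    return num / den if den else 0.0

x_star = steepest_descent(grad, [5.0, 5.0], exact_line_min)   # near (1, -2)
```

The zig-zag path visible in the slides comes from successive search directions being orthogonal to each other when each unidirectional search is exact.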
Choosing a Direction
• Direction of steepest ascent/descent is not always the best choice
• Other techniques have been used with varying degrees of success
• In particular, the direction chosen must be consistent with the equality constraints
How far to go in that direction?
• Unidirectional searches can be time-consuming
• Second-order techniques that use information about the second derivative of the objective function can be used to speed up the process
• Problem with the inequality constraints:
– There may be a lot of inequality constraints
– Checking all of them every time we move in one direction can take an enormous amount of computing time
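The second-order idea can be illustrated with Newton's method in one dimension, which uses the second derivative to choose the step directly instead of performing a unidirectional search. The objective here is an illustrative assumption:

```python
# Newton's method in one dimension: the step -f'(x)/f''(x) uses second-
# derivative information to jump toward the point where f'(x) = 0.
# The objective f(x) = x^4 - 8x (minimum at x = 2^(1/3)) is illustrative.
def newton_1d(dfdx, d2fdx2, x0, n_iter=20):
    x = x0
    for _ in range(n_iter):
        x -= dfdx(x) / d2fdx2(x)
    return x

x_star = newton_1d(lambda x: 4.0 * x ** 3 - 8.0,   # f'(x)
                   lambda x: 12.0 * x ** 2,        # f''(x)
                   x0=1.0)
```

Near the optimum the error roughly squares at each iteration, which is why second-order methods need far fewer iterations than a plain unidirectional search.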
Handling of inequality constraints
[Figure: moving in the direction of the gradient in the (x1, x2) plane; at the constraint boundary the question is: how do I know that I have to stop here?]
Penalty functions
• Replace enforcement of inequality constraints by the addition of penalty terms to the objective function
[Figure: penalty function that is zero between xmin and xmax and grows as K(x - xmax)^2 beyond the upper limit]
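A minimal sketch of the penalty approach, assuming an illustrative objective f(x) = (x - 3)^2 and an upper limit x <= 2; as K grows, the minimizer of the penalized problem approaches the limit:

```python
# Penalty-function sketch: instead of enforcing x <= xmax explicitly, add
# K * (x - xmax)^2 to the objective whenever the limit is violated.
# The objective f(x) = (x - 3)^2 and the limit xmax = 2 are illustrative.
def penalized(f, xmax, K):
    def fp(x):
        violation = max(0.0, x - xmax)
        return f(x) + K * violation ** 2
    return fp

def grid_min(f, lo, hi, n=40001):
    # Simple grid minimization, just to keep the sketch self-contained.
    step = (hi - lo) / (n - 1)
    return min((lo + i * step for i in range(n)), key=f)

f = lambda x: (x - 3.0) ** 2          # unconstrained minimum at x = 3
xmax = 2.0                            # constraint: x <= 2

# The minimizer only approaches xmax as K grows (analytically it sits at
# (3 + 2K) / (1 + K)), so the stiffness K must be increased progressively.
results = [grid_min(penalized(f, xmax, K), 0.0, 4.0)
           for K in (1.0, 10.0, 100.0)]
```

Note that for any finite K the solution still violates the constraint slightly, which is the weakness discussed on the next slide.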
Problem with penalty functions
• The stiffness of the penalty function must be increased progressively to enforce the constraints tightly enough
• Not a very efficient method
[Figure: penalty functions of increasing stiffness around the limits xmin and xmax]
Barrier functions
• The barrier must be made progressively closer to the limit
• Works better than penalty functions
• This is the basis of interior point methods
[Figure: barrier cost rising steeply as x approaches the limits xmin and xmax from inside the feasible range]
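A minimal sketch of the barrier approach with a logarithmic barrier (the form used by interior point methods); the objective, limit, and schedule for mu are illustrative assumptions. As mu shrinks, the solution approaches the limit from inside the feasible range:

```python
import math

# Barrier-function sketch: add a logarithmic barrier -mu * ln(xmax - x)
# that blows up at the limit, and shrink mu progressively. The objective
# f(x) = (x - 3)^2, the limit xmax = 2, and the mu values are illustrative.
def with_barrier(f, xmax, mu):
    def fb(x):
        return f(x) - mu * math.log(xmax - x)   # defined only for x < xmax
    return fb

def grid_min(f, lo, hi, n=40001):
    # Simple grid minimization, just to keep the sketch self-contained.
    step = (hi - lo) / (n - 1)
    return min((lo + i * step for i in range(n)), key=f)

f = lambda x: (x - 3.0) ** 2    # unconstrained minimum at x = 3
xmax = 2.0                      # constraint: x <= 2

# Unlike the penalty method, every iterate stays strictly inside the
# feasible range; as mu -> 0 the solution approaches the limit from inside.
results = [grid_min(with_barrier(f, xmax, mu), 0.0, xmax - 1e-6)
           for mu in (1.0, 0.1, 0.01)]
```

Because the barrier is infinite at the boundary, the iterates never violate the constraint, which is why this works better than the penalty approach.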
Non-Robustness
Different starting points may lead to different solutions if the problem is not convex.
[Figure: contours in the (x1, x2) plane with two distinct local optima, X and Y, reached from different starting points]
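This sensitivity to the starting point can be demonstrated with a fixed-step steepest descent on a simple non-convex function (both the function and the step size are illustrative assumptions):

```python
# Non-convexity demo: fixed-step steepest descent on f(x) = x^4 - 2x^2,
# which has two local minima, at x = -1 and x = +1. The function and
# step size are illustrative.
def descend(dfdx, x0, alpha=0.01, n_iter=2000):
    x = x0
    for _ in range(n_iter):
        x -= alpha * dfdx(x)
    return x

dfdx = lambda x: 4.0 * x ** 3 - 4.0 * x   # f'(x)

x_from_left  = descend(dfdx, -2.0)   # settles near the local minimum x = -1
x_from_right = descend(dfdx, +2.0)   # settles near the local minimum x = +1
```

The algorithm simply rolls downhill into whichever basin of attraction contains the starting point; nothing in the method itself detects that a better optimum exists elsewhere.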
Conclusions
• Very sophisticated non-linear programming methods have been developed
• They can be difficult to use:
– Different starting points may lead to different solutions
– Some problems will require a lot of iterations
– They may require a lot of “tuning”