Sensitivity Analysis and Duality
Part I – Sensitivity Analysis
Based on
Chapter 6
Introduction to Mathematical Programming: Operations Research, Volume 1
4th edition, by Wayne L. Winston and Munirpallam Venkataramanan
Lewis Ntaimo
6.1 – Introduction
• Sensitivity analysis (SA) and duality are two of the most important topics in linear programming.
• Sensitivity analysis is concerned with how changes in an LP's parameters affect the LP's optimal solution. These parameters are the objective function coefficients, the right-hand sides, and the technology matrix.
6.1 – Introduction
Sensitivity analysis is important for several reasons:
• In many applications, the values of an LP's parameters may change, e.g. prices and demand. Remember the certainty assumption of linear programming?
• If a parameter changes, sensitivity analysis often makes it unnecessary to solve the problem again. Re-solving an LP with thousands of variables and constraints would be a chore!
• Knowledge of sensitivity analysis often enables the analyst (you ☺) to determine from the original solution how changes in an LP's parameters change the optimal solution.
6.1 – Introduction
We need matrix algebra to express simplex tableaus in matrix form before we can perform sensitivity analysis on an arbitrary LP.
You already know this!
6.2 – A Graphical Illustration of Sensitivity Analysis
Reconsider the Giapetto problem from Chapter 3:
Decision Variables: x1 = number of soldiers produced each week
                    x2 = number of trains produced each week
Max z = 3x1 + 2x2
s.t.  2x1 + x2 ≤ 100   (finishing constraint)
      x1 + x2 ≤ 80     (carpentry constraint)
      x1 ≤ 40          (demand constraint)
      x1, x2 ≥ 0
Standard Form:
Max z = 3x1 + 2x2
s.t.  2x1 + x2 + s1 = 100
      x1 + x2 + s2 = 80
      x1 + s3 = 40
      x1, x2, s1, s2, s3 ≥ 0
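As a quick numerical companion to this example, the sketch below re-solves the Giapetto LP with SciPy's linprog (my own illustration; the slides do not use a solver, and any LP package would do). Because linprog minimizes, the objective is negated.

```python
# Sketch only: verify the Giapetto optimum (z = 180 at x1 = 20, x2 = 60) with SciPy.
from scipy.optimize import linprog

res = linprog(c=[-3, -2],                       # negate to maximize z = 3x1 + 2x2
              A_ub=[[2, 1], [1, 1], [1, 0]],    # finishing, carpentry, demand rows
              b_ub=[100, 80, 40],
              bounds=[(0, None)] * 2,
              method="highs")
print(-res.fun, res.x)                          # expected: 180.0 [20. 60.]
```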
6.2 – A Graphical Illustration of Sensitivity Analysis
[Figure: Giapetto feasible region in the (x1, x2) plane, showing the finishing constraint (slope -2), the carpentry constraint (slope -1), the demand constraint, the isoprofit line z = 120 (slope -3/2), and the corner points A, B, C, D.]

The optimal solution to the Giapetto problem is z = 180, x1 = 20, x2 = 60 (Point B in the figure), with x1, x2, and s3 as basic variables.

How would changes in the problem's objective function coefficients or the constraints' right-hand sides change this optimal solution?
6.2 – A Graphical Illustration of Sensitivity Analysis
Graphical Analysis of the Effect of a Change in an Objective Function Coefficient

[Figure: Giapetto feasible region with the isoprofit line z = 120 (slope -3/2), the finishing constraint (slope -2), and the carpentry constraint (slope -1).]

• Since a typical isoprofit line is c1x1 + 2x2 = k, the slope of the isoprofit line is just -c1/2.
• If the isoprofit line is flatter than the carpentry constraint, Point A (0, 80) is optimal.
• Point B (20, 60) is optimal if the isoprofit line is steeper than the carpentry constraint but flatter than the finishing constraint.
• Finally, Point C (40, 20) is optimal if the slope of the isoprofit line is steeper than the slope of the finishing constraint.
6.2 – A Graphical Illustration of Sensitivity Analysis
In summary:
[Figure: Giapetto feasible region with the isoprofit line (slope -c1/2), the finishing constraint (slope -2), and the carpentry constraint (slope -1).]

1. Point A is optimal if -c1/2 ≥ -1, i.e. 0 ≤ c1 ≤ 2 (-1 is the carpentry constraint slope).
2. Point B is optimal if -2 ≤ -c1/2 ≤ -1, i.e. 2 ≤ c1 ≤ 4 (between the finishing and carpentry constraint slopes).
3. Point C is optimal if -c1/2 ≤ -2, i.e. c1 ≥ 4 (-2 is the finishing constraint slope).
6.2 – A Graphical Illustration of Sensitivity Analysis
A graphical analysis can also be used to determine whether a
change in the RHS of a constraint will make the basis no longer
optimal.
[Figure: Giapetto feasible region with the finishing constraint drawn for b1 = 80, 100, and 120, together with the carpentry constraint, the demand constraint, and the isoprofit line z = 120.]

Let b1 = number of finishing hours in the Giapetto LP:
  2x1 + x2 ≤ b1 (finishing constraint)

A change in b1 shifts the finishing constraint parallel to its current position. The current optimal point (Point B) is where the carpentry and finishing constraints are binding.
6.2 – A Graphical Illustration of Sensitivity Analysis
[Figure: Giapetto feasible region with the finishing constraint drawn for b1 = 80, 100, and 120.]

If we change the value of b1, then as long as the point where the finishing and carpentry constraints intersect remains feasible, the optimal solution will still occur at that intersection.

We see that if b1 > 120, x1 will be greater than 40 and will violate the demand constraint. Also, if b1 < 80, x1 will be less than 0 and the nonnegativity constraint for x1 will be violated.

Therefore: 80 ≤ b1 ≤ 120. The current basis remains optimal for 80 ≤ b1 ≤ 120, but the decision variable values and z-value will change.
6.2 – A Graphical Illustration of Sensitivity Analysis
In Summary:
If a constraint has positive slack (or positive excess) in an LP's optimal solution, and we change the RHS of that constraint to a value in the range where the current basis remains optimal, the optimal solution to the LP remains the same.
6.3 – Shadow Prices
It is important to determine how a change in a constraint's RHS changes the optimal z-value. We define:
The shadow price for the i th constraint of an LP is the amount by
which the optimal z-value is improved (increased in a max problem or
decreased in a min problem) if the RHS of the i th constraint is
increased by 1. This definition applies only if the change in the RHS
of constraint i leaves the current basis optimal.
6.3 – Shadow Prices
Giapetto Example:
Max z = 3x1 + 2x2
s.t.  2x1 + x2 ≤ 100   (finishing constraint)
      x1 + x2 ≤ 80     (carpentry constraint)
      x1 ≤ 40          (demand constraint)
      x1, x2 ≥ 0

Using the finishing constraint as an example, suppose 100 + ∆ finishing hours are available (and assume the current basis remains optimal).
The LP's optimal solution is then x1 = 20 + ∆ and x2 = 60 - ∆ with z = 3x1 + 2x2 = 3(20 + ∆) + 2(60 - ∆) = 180 + ∆.
Thus, as long as the current basis remains optimal, a one-unit increase in the number of finishing hours increases the optimal z-value by $1. So, the shadow price of the first (finishing hours) constraint is $1.
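The shadow price can also be checked numerically. The sketch below is my own illustration (it uses SciPy's linprog, which the slides do not): re-solve the Giapetto LP with 100 and then 101 finishing hours and compare the optimal z-values.

```python
# Sketch: the finishing-hours shadow price as a finite difference of optimal z-values.
from scipy.optimize import linprog

def giapetto_max_z(b1):
    """Maximize 3x1 + 2x2 for a given number of finishing hours b1."""
    res = linprog(c=[-3, -2],
                  A_ub=[[2, 1], [1, 1], [1, 0]],
                  b_ub=[b1, 80, 40],
                  bounds=[(0, None)] * 2,
                  method="highs")
    return -res.fun

print(giapetto_max_z(101) - giapetto_max_z(100))   # expected: 1.0 (the shadow price)
```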
6.4 – Some Important Formulas
An LP’s optimal tableau can be expressed in terms of the LP’s
parameters. The formulas we will develop in this section are used
in the study of sensitivity analysis, duality, and advanced LP topics.
Assume that we are solving a max problem that has been prepared for
solution by the Big M method with the LP having m constraints and n
variables. Although some of the variables may be slack, excess, or
artificial, we will label them x1, x2, …, xn. The LP may then be written as
follows:
Max z = c1x1 + c2x2 + … + cnxn
s.t.  a11x1 + a12x2 + … + a1nxn = b1
      a21x1 + a22x2 + … + a2nxn = b2
      ...
      am1x1 + am2x2 + … + amnxn = bm
      xi ≥ 0 (i = 1, 2, …, n)
6.4 – Some Important Formulas
• Definitions:
BV = {BV1, BV2, …, BVm} = the set of basic variables in the optimal tableau.
NBV = {NBV1, NBV2, …, NBVn-m} = the set of nonbasic variables in the optimal tableau.
xBV = vector listing the basic variables in the optimal tableau.
xNBV = vector listing the nonbasic variables in the optimal tableau.
cBV = row vector of the initial objective coefficients for the optimal tableau's basic variables.
cNBV = row vector of the initial objective coefficients for the optimal tableau's nonbasic variables.
6.4 – Some Important Formulas
• Definitions (continued):
B = the m x m matrix whose jth column is the column for BVj in the initial tableau.
aj = the column (in the constraints) for the variable xj in the initial tableau.
N = the m x (n-m) matrix whose columns are the columns for the nonbasic variables (in NBV order) in the initial tableau.
b = the m x 1 column vector of the right-hand sides of the constraints in the initial tableau.
6.4 – Some Important Formulas
As an example, consider the Dakota Furniture problem without the
x2 ≤ 5 constraint (initial tableau):
z - 60x1 - 30x2 - 20x3 + 0s1 + 0s2 + 0s3 = 0
    8x1 +   6x2 +    x3 + s1            = 48
    4x1 +   2x2 + 1.5x3      + s2       = 20
    2x1 + 1.5x2 + 0.5x3           + s3  =  8
Suppose we have found the optimal solution to the LP with the
following optimal tableau:
z +    5x2                + 10s2 +  10s3 = 280
  -    2x2        + s1 +    2s2 -   8s3 =  24
  -    2x2 + x3        +    2s2 -   4s3 =   8
x1 + 1.25x2            -  0.5s2 + 1.5s3 =   2
6.4 – Some Important Formulas
Dakota Problem Optimal Tableau:
z +    5x2                + 10s2 +  10s3 = 280
  -    2x2        + s1 +    2s2 -   8s3 =  24
  -    2x2 + x3        +    2s2 -   4s3 =   8
x1 + 1.25x2            -  0.5s2 + 1.5s3 =   2

BV = {s1, x3, x1},  NBV = {x2, s2, s3}
Original objective: Max z = 60x1 + 30x2 + 20x3

xBV = (s1, x3, x1)^T and xNBV = (x2, s2, s3)^T
Since BV = {s1, x3, x1}, cBV = [0 20 60]
Since NBV = {x2, s2, s3}, cNBV = [30 0 0]
6.4 – Some Important Formulas
Dakota Problem Initial Tableau:
z - 60x1 - 30x2 - 20x3 + 0s1 + 0s2 + 0s3 = 0
    8x1 +   6x2 +    x3 + s1            = 48
    4x1 +   2x2 + 1.5x3      + s2       = 20
    2x1 + 1.5x2 + 0.5x3           + s3  =  8

Since BV = {s1, x3, x1}:
    B = [1  1    8]
        [0  1.5  4]
        [0  0.5  2]

Since NBV = {x2, s2, s3}:
    N = [6    0  0]
        [2    1  0]
        [1.5  0  1]

a1 = (8, 4, 2)^T, a2 = (6, 2, 1.5)^T, …
b = (48, 20, 8)^T, so b1 = 48, b2 = 20, b3 = 8.
6.4 – Some Important Formulas
We are now in a position to use matrix algebra to determine how
an LP’s optimal tableau (with the set of basic variables BV) is
related to the original LP.
We observe that an LP may be written as follows:
Max z = cBVxBV + cNBVxNBV
s.t.  BxBV + NxNBV = b
      xBV, xNBV ≥ 0
6.4 – Some Important Formulas
• Using this format, the Dakota problem can be written as:

Max z = [0 20 60]·xBV + [30 0 0]·xNBV

s.t.  [1  1    8]         [6    0  0]          [48]
      [0  1.5  4]·xBV  +  [2    1  0]·xNBV  =  [20]
      [0  0.5  2]         [1.5  0  1]          [ 8]

      xBV = (s1, x3, x1)^T ≥ 0,  xNBV = (x2, s2, s3)^T ≥ 0
6.4 – Some Important Formulas
• Multiplying the constraints through by B-1 yields:
  B-1BxBV + B-1NxNBV = B-1b  =>  xBV + B-1NxNBV = B-1b

Using the Gauss-Jordan method (or your calculator!) for the Dakota problem we know:

    B-1 = [1   2    -8 ]
          [0   2    -4 ]
          [0  -0.5   1.5]

Substituting into xBV + B-1NxNBV = B-1b yields:

    [s1]   [-2     2    -8 ] [x2]   [24]
    [x3] + [-2     2    -4 ]·[s2] = [ 8]
    [x1]   [1.25  -0.5   1.5] [s3]   [ 2]
6.4 – Some Important Formulas
• Conclusions:
  • Column for xj in the optimal tableau's constraints = B-1aj
  • RHS of the optimal tableau's constraints: xBV = B-1b

Dakota initial tableau (for reference):
z - 60x1 - 30x2 - 20x3 + 0s1 + 0s2 + 0s3 = 0
    8x1 +   6x2 +    x3 + s1            = 48
    4x1 +   2x2 + 1.5x3      + s2       = 20
    2x1 + 1.5x2 + 0.5x3           + s3  =  8

e.g. the x2 column in the Dakota optimal tableau = B-1a2:

    [1   2    -8 ] [6  ]   [-2  ]
    [0   2    -4 ]·[2  ] = [-2  ]
    [0  -0.5   1.5] [1.5]   [1.25]

e.g. the RHS in the Dakota optimal tableau = B-1b:

    [s1]   [1   2    -8 ] [48]   [24]
    [x3] = [0   2    -4 ]·[20] = [ 8]
    [x1]   [0  -0.5   1.5] [ 8]   [ 2]
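These two formulas are easy to verify numerically. The sketch below is my own illustration with NumPy (not part of the slides); it recomputes B-1a2 and B-1b for the Dakota basis {s1, x3, x1}.

```python
# Sketch: verify Column for x_j = B^-1 a_j and RHS = B^-1 b for the Dakota problem.
import numpy as np

B = np.array([[1, 1, 8],          # initial columns of s1, x3, x1 (the optimal basis)
              [0, 1.5, 4],
              [0, 0.5, 2]])
B_inv = np.linalg.inv(B)

a2 = np.array([6, 2, 1.5])        # initial column of x2
b = np.array([48, 20, 8])

print(B_inv @ a2)                 # expected: [-2, -2, 1.25], the x2 column in the optimal tableau
print(B_inv @ b)                  # expected: [24, 8, 2], i.e. s1 = 24, x3 = 8, x1 = 2
```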
6.4 – Some Important Formulas
• Determining the Optimal Tableau's Row 0 in Terms of BV and the Initial LP

We multiply the constraints BxBV + NxNBV = b through by the vector cBVB-1:
  cBVxBV + cBVB-1NxNBV = cBVB-1b
We know the original objective function:
  z - cBVxBV - cNBVxNBV = 0
Adding the two equations together and eliminating the optimal tableau's basic variables, we obtain:
  z + (cBVB-1N - cNBV)xNBV = cBVB-1b
6.4 – Some Important Formulas
From z + (cBVB-1N - cNBV)xNBV = cBVB-1b, the coefficient of a nonbasic variable xj in row 0 is:
  cBVB-1(column of N for xj) - (coefficient of xj in cNBV) = cBVB-1aj - cj
The RHS of row 0 is cBVB-1b, which is the optimal objective function value (z-value).

Let c̄j be the coefficient of xj in the optimal tableau's row 0. Then c̄j = cBVB-1aj - cj is the so-called reduced cost of the nonbasic variable xj.

Reduced Cost: For any nonbasic variable, the reduced cost is the amount by which its objective function coefficient must be improved before that variable will be a basic variable in some optimal solution to the LP.
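To make the formula concrete, here is a small NumPy sketch (mine, not from the text) that prices out each nonbasic variable of the Dakota problem; it reproduces the row-0 coefficients 5, 10, and 10 seen in the optimal tableau.

```python
# Sketch: reduced costs c_BV B^-1 a_j - c_j for the Dakota nonbasic variables x2, s2, s3.
import numpy as np

B_inv = np.linalg.inv(np.array([[1, 1, 8], [0, 1.5, 4], [0, 0.5, 2]]))
y = np.array([0, 20, 60]) @ B_inv                 # c_BV B^-1 = [0, 10, 10]

nonbasic = {"x2": (np.array([6, 2, 1.5]), 30),    # (initial column a_j, objective coefficient c_j)
            "s2": (np.array([0, 1, 0]), 0),
            "s3": (np.array([0, 0, 1]), 0)}

for name, (a_j, c_j) in nonbasic.items():
    print(name, y @ a_j - c_j)                    # expected: x2 5.0, s2 10.0, s3 10.0
```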
6.4 – Summary of Formulas for Computing the Optimal Tableau
• xj column in the optimal tableau's constraints = B-1aj
• RHS of the optimal tableau's constraints = B-1b
• Reduced cost of a nonbasic variable xj: c̄j = cBVB-1aj - cj
• Coefficient of slack variable si in optimal row 0 = ith element of cBVB-1
• Coefficient of excess variable ei in optimal row 0 = -(ith element of cBVB-1)
• Coefficient of artificial variable ai in optimal row 0 = (ith element of cBVB-1) + M (max problem)
• RHS of optimal row 0 = cBVB-1b
6.4 – Summary of Formulas for Computing the Optimal Tableau
Summary of a Simplex Tableau (this is a MUST know!):

  Row 0:        cBVB-1A - c   |  cBVB-1b
  Rows 1 to m:  B-1A          |  B-1b

In more detail, the row-0 entries are c̄1, …, c̄n, the constraint columns are B-1a1, …, B-1an, the RHS of row 0 is cBVxBV = cBVB-1b, and the RHS of rows 1 to m is B-1b, which gives the values of the basic variables xBV1, …, xBVm.
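The whole summary fits in a few lines of code. The helper below is my own sketch (the name tableau_for_basis is hypothetical, not from the text): given the initial data and a choice of basic columns, it assembles row 0 and the constraint rows exactly as in the box above and reproduces the Dakota optimal tableau.

```python
# Sketch: build [c_BV B^-1 A - c | c_BV B^-1 b ; B^-1 A | B^-1 b] for a chosen basis.
import numpy as np

def tableau_for_basis(A, b, c, basis):
    """Return (row0, z, body, rhs) for the basis given by a list of column indices."""
    B_inv = np.linalg.inv(A[:, basis])
    c_BV = c[basis]
    return c_BV @ B_inv @ A - c, c_BV @ B_inv @ b, B_inv @ A, B_inv @ b

# Dakota data; columns ordered x1, x2, x3, s1, s2, s3; basis {s1, x3, x1} -> indices [3, 2, 0]
A = np.array([[8, 6, 1, 1, 0, 0],
              [4, 2, 1.5, 0, 1, 0],
              [2, 1.5, 0.5, 0, 0, 1]])
b = np.array([48, 20, 8])
c = np.array([60, 30, 20, 0, 0, 0])

row0, z, body, rhs = tableau_for_basis(A, b, c, basis=[3, 2, 0])
print(row0)      # expected: [0, 5, 0, 0, 10, 10]
print(z, rhs)    # expected: 280.0 [24. 8. 2.]
```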
6.5 – Important Observation
The mechanics of sensitivity analysis hinge on the following
important observation:
We know that a simplex tableau (for a max problem) for a set of
basic variables BV is optimal iff (if and only if) each constraint
has a nonnegative RHS and each variable has a nonnegative
coefficient in row 0.
• Mathematically, we have two conditions, which I refer to as follows:

  "Optimality condition":  c̄j = cBVB-1aj - cj ≥ 0 for all j ∈ NBV
  "Feasibility condition": B-1b ≥ 0   (Note: xBV = B-1b, so xBV ≥ 0)
6.6 – Sensitivity Analysis
Suppose we have solved an LP and have found that BV is an optimal basis. A procedure to determine whether a change in the LP causes BV to no longer be optimal (to become suboptimal) is as follows:

Step 1: Use the formulas derived in the previous slides to determine how changes in the LP's parameters change the right-hand side and row 0 of the optimal tableau (the tableau having BV as the set of basic variables).

Step 2: If each variable in row 0 has a nonnegative coefficient (optimality condition) and each constraint has a nonnegative RHS (feasibility condition), BV is still optimal. Otherwise, BV is no longer optimal.

If BV is no longer optimal, find the new optimal solution by using the derived formulas to recreate the entire tableau for BV and then continuing the simplex algorithm with the BV tableau as your starting tableau.
6.6 – Sensitivity Analysis
There are two reasons why a change in an LP's parameters may cause BV to no longer be optimal:

1. A variable (or variables) in row 0 may have a negative coefficient. In this case, a better (larger z-value) bfs can be obtained by pivoting in a nonbasic variable with a negative coefficient in row 0. If this occurs, BV is now a suboptimal basis. (This is a violation of the optimality condition.)

2. A constraint (or constraints) may now have a negative RHS. In this case, at least one member of BV will now be negative and BV will no longer yield a bfs. If this occurs, we say that BV is now an infeasible basis. (This is a violation of the feasibility condition.)
6.6 – Sensitivity Analysis
Six types of changes in an LP's parameters may change the optimal solution:
1. Changing the objective function coefficient of a nonbasic
variable.
2. Changing the objective function coefficient of a basic
variable.
3. Changing the right-hand side of a constraint.
4. Changing the column of a nonbasic variable.
5. Adding a new variable or activity.
6. Adding a new constraint.
6.6 – Sensitivity Analysis
1. Changing the Objective Function Coefficient of a Nonbasic Variable.
• Since B (hence B-1) and b are unchanged, feasibility is not affected.
• However, while cBV is unchanged, cj (and hence c̄j) may change. Thus optimality is affected in this case.
• When is BV still optimal?

If the objective function coefficient of a nonbasic variable xj is changed, the current basis remains optimal if c̄j ≥ 0. If c̄j < 0, then the current basis is no longer optimal, and xj will be a basic variable in the new optimal solution.

Note: The process of computing the new coefficient of a nonbasic variable xj using c̄j = cBVB-1aj - cj is called pricing out xj.
6.6 – Sensitivity Analysis
2. Changing the Objective Function Coefficient of a Basic Variable.
• Since B (hence B-1) and b are unchanged, feasibility is not affected.
• However, cBV is changed, and hence c̄j may change for every variable xj. Thus optimality is affected in this case.
• When is BV still optimal?

If the objective function coefficient of a basic variable xj is changed, the current basis remains optimal if c̄j ≥ 0 for all j ∈ BV and NBV. If c̄j < 0 for any variable xj, then the current basis is no longer optimal.

Note: If the current basis remains optimal, then the values of the decision variables do not change, because xBV = B-1b remains unchanged. However, the optimal z-value (z = cBVB-1b) does change.
6.6 – Sensitivity Analysis
3. Changing the Right-Hand Side (RHS) of a Constraint.
• Since b does not appear in the optimality condition, changing the RHS of a constraint does not affect optimality, but it does affect feasibility.
• Feasibility requires xBV = B-1b ≥ 0.
• When is BV still optimal?

If the RHS of a constraint is changed, then the current basis (BV) remains optimal if the RHS of each constraint in the tableau remains nonnegative. If the RHS of any constraint becomes negative, then the current basis is infeasible, and a new optimal solution must be found (use the Dual Simplex Method).

Note: Changing b will change the values of the basic variables and the optimal z-value.
6.6 – Sensitivity Analysis
4. Changing the Column of a Nonbasic Variable.
• Does not affect feasibility (xBV = B-1b ≥ 0) but affects optimality.

If the column of a nonbasic variable xj is changed, the current basis remains optimal if c̄j ≥ 0. If c̄j < 0, then the current basis is no longer optimal and xj will be a basic variable in the new optimal tableau.

Note: If the column of a basic variable is changed, then it is usually difficult to determine whether the current basis remains optimal (both the optimality and feasibility conditions are affected). As always, the current basis remains optimal only if the optimality and feasibility conditions are both satisfied.
6.6 – Sensitivity Analysis
5. Adding a New Activity (Addition of a New Variable).
• In many practical situations, opportunities arise to undertake new activities.
• Does not affect feasibility (xBV = B-1b ≥ 0) but affects optimality.
• To determine whether a new activity xj will cause the current basis to no longer be optimal, price out the new activity!

If a new column (corresponding to a variable xj) is added to an LP, then the current basis remains optimal if c̄j ≥ 0. If c̄j < 0, then the current basis is no longer optimal and xj will be a basic variable in the new optimal solution.
6.6 – Sensitivity Analysis
6. Adding a New Constraint
Case 1. The current optimal solution satisfies the new constraint.
Then the current basis is still optimal.
Case 2. The current optimal solution does not satisfy the new
constraint but the LP is still feasible. In this case, use the Dual
Simplex Method to determine the new optimal solution.
Case 3. The additional constraint causes the LP to have no feasible
solutions. In this case the Dual Simplex Method will show that the
LP has become infeasible.
6.7 – Sensitivity Analysis Example
Dakota Furniture Problem (Standard Form):
Max z = 60x1 + 30x2 + 20x3   (total revenue)
s.t.  8x1 +   6x2 +    x3 + s1 = 48  (lumber constraint)
      4x1 +   2x2 + 1.5x3 + s2 = 20  (finishing constraint)
      2x1 + 1.5x2 + 0.5x3 + s3 =  8  (carpentry constraint)
      x1, x2, x3, s1, s2, s3 ≥ 0

x1 = # of desks manufactured
x2 = # of tables manufactured
x3 = # of chairs manufactured
Optimal Dakota Tableau:

Row   z   x1   x2     x3   s1   s2     s3    RHS   BV
0     1   0    5      0    0    10     10    280   z = 280
1     0   0    -2     0    1    2      -8    24    s1 = 24
2     0   0    -2     1    0    2      -4    8     x3 = 8
3     0   1    1.25   0    0    -0.5   1.5   2     x1 = 2
6.7 – Sensitivity Analysis Example
1. Changing the Objective Function Coefficient of a Nonbasic Variable.
• Currently c2 = 30.
• For what values of c2 would BV = {s1, x3, x1} remain optimal? (Affects only the optimality condition.)

Let ∆ denote the amount by which we change c2, so c2 = 30 + ∆.

Since BV = {s1, x3, x1},
    B = [1  1    8]            [1   2    -8 ]
        [0  1.5  4]   so B-1 = [0   2    -4 ]
        [0  0.5  2]            [0  -0.5   1.5]

Note: For any simplex tableau, B-1 is the m x m matrix consisting of the columns in the current tableau that correspond to the initial tableau's set of basic variables, taken in the same order.

Let y = cBVB-1 = [0 20 60] B-1 = [0 10 10].
Then c̄2 = cBVB-1a2 - c2 = ya2 - c2 = [0 10 10]·(6, 2, 1.5)^T - (30 + ∆) = 35 - (30 + ∆) = 5 - ∆.

Thus c̄2 ≥ 0 holds, and BV remains optimal, if 5 - ∆ ≥ 0, i.e. ∆ ≤ 5.

Conclusion: If the price of tables is increased by $5 or less (or decreased by any amount), i.e. for c2 ≤ 30 + 5 = 35, BV remains optimal. The z-value also remains the same ($280).
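As an aside, the same pricing-out calculation can be run for several candidate table prices to see where the basis stops being optimal. This is my own sketch, not part of the slides.

```python
# Sketch: price out x2 for a few table prices c2; the basis stays optimal while c2 <= 35.
import numpy as np

B_inv = np.linalg.inv(np.array([[1, 1, 8], [0, 1.5, 4], [0, 0.5, 2]]))
y = np.array([0, 20, 60]) @ B_inv            # c_BV B^-1 = [0, 10, 10]
a2 = np.array([6, 2, 1.5])                   # initial column of x2

for c2 in (30, 35, 40):
    cbar2 = y @ a2 - c2                      # reduced cost of x2, equal to 35 - c2
    print(c2, cbar2, "still optimal" if cbar2 >= 0 else "x2 should enter the basis")
# expected: 30 -> 5.0, 35 -> 0.0 (still optimal), 40 -> -5.0 (x2 should enter)
```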
6.7 – Sensitivity Analysis Example
1. Changing the Objective Function Coefficient of a Nonbasic Variable.
• Consider the case when c2 > 35, e.g. c2 = 40.
• We know that BV will now be suboptimal.

Then c̄2 = cBVB-1a2 - c2 = ya2 - c2 = [0 10 10]·(6, 2, 1.5)^T - 40 = -5.

Since c̄2 < 0, x2 can now enter the basis! Perform an iteration of the simplex method.
6.7 – Sensitivity Analysis Example
1. Changing the Objective Function Coefficient of a Nonbasic Variable.
Suboptimal Tableau (c2 = 40):

Row   z   x1   x2     x3   s1   s2     s3    RHS   BV         MRT
0     1   0    -5     0    0    10     10    280   z = 280
1     0   0    -2     0    1    2      -8    24    s1 = 24    none
2     0   0    -2     1    0    2      -4    8     x3 = 8     none
3     0   1    1.25   0    0    -0.5   1.5   2     x1 = 2     1.6*

Optimal Tableau:

Row   z   x1    x2   x3   s1   s2     s3     RHS    BV
0     1   4     0    0    0    8      16     288    z = 288
1     0   1.6   0    0    1    1.2    -5.6   27.2   s1 = 27.2
2     0   1.6   0    1    0    1.2    -1.6   11.2   x3 = 11.2
3     0   0.8   1    0    0    -0.4   1.2    1.6    x2 = 1.6

• Thus, if c2 = 40, the optimal solution to the Dakota problem changes to x1 = 0, x2 = 1.6, x3 = 11.2, and z = 288.
• The increase in the price of tables causes Dakota Furniture to manufacture tables instead of desks!
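For readers who want an independent check, the sketch below (mine; it uses SciPy's linprog, which the slides do not) re-solves the Dakota LP with the table price raised to $40 and confirms the new optimum read off the tableau above.

```python
# Sketch: re-solve the Dakota LP with c2 = 40 and confirm z = 288 at (x1, x2, x3) = (0, 1.6, 11.2).
from scipy.optimize import linprog

res = linprog(c=[-60, -40, -20],                        # negated objective for maximization
              A_ub=[[8, 6, 1], [4, 2, 1.5], [2, 1.5, 0.5]],
              b_ub=[48, 20, 8],
              bounds=[(0, None)] * 3,
              method="highs")
print(-res.fun, res.x)                                  # expected: 288.0 [0. 1.6 11.2]
```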
6.7 – Sensitivity Analysis Example
2. Changing the Objective Function Coefficient of a Basic Variable.
• Consider changing c1 from its current value of c1 = 60.
• For what values of c1 would BV = {s1, x3, x1} remain optimal? (Since B and b are unchanged, only the optimality condition is affected.)

Let ∆ denote the amount by which we change c1, so c1 = 60 + ∆.

Since BV = {s1, x3, x1}, let
  y = cBVB-1 = [0 20 60 + ∆] B-1 = [0  10 - 0.5∆  10 + 1.5∆]

We can now compute the new row 0. Since s1, x3, and x1 are basic variables, their coefficients in row 0 must still be zero.
6.7 – Sensitivity Analysis Example
2. Changing the Objective Function Coefficient of a Basic Variable.

For the nonbasic variables we have:
  x2:  c̄2 = cBVB-1a2 - c2 = ya2 - c2 = [0  10 - 0.5∆  10 + 1.5∆]·(6, 2, 1.5)^T - 30 = 5 + 1.25∆
  s2:  c̄5 = ya5 - c5 = [0  10 - 0.5∆  10 + 1.5∆]·(0, 1, 0)^T - 0 = 10 - 0.5∆
  s3:  c̄6 = ya6 - c6 = [0  10 - 0.5∆  10 + 1.5∆]·(0, 0, 1)^T - 0 = 10 + 1.5∆

BV remains optimal iff the following hold:
  c̄2 ≥ 0  ⇒  5 + 1.25∆ ≥ 0  (true iff ∆ ≥ -4)
  c̄5 ≥ 0  ⇒  10 - 0.5∆ ≥ 0  (true iff ∆ ≤ 20)
  c̄6 ≥ 0  ⇒  10 + 1.5∆ ≥ 0  (true iff ∆ ≥ -20/3)

Therefore BV remains optimal if -4 ≤ ∆ ≤ 20, i.e. the range is 56 ≤ c1 ≤ 80.

If BV remains optimal, the values of the basic variables are unchanged. For example, c1 = 70 satisfies 56 ≤ 70 ≤ 80, so BV remains optimal. However, the optimal z-value may change:
  c1 = 70  ⇒  z = 70x1 + 30x2 + 20x3 = 70(2) + 30(0) + 20(8) = $300.
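The range 56 ≤ c1 ≤ 80 can also be recovered by brute force. The sketch below (my own, not from the text) recomputes y = cBVB-1 with the new desk price and checks the nonbasic row-0 coefficients over a grid of integer prices.

```python
# Sketch: find the desk prices c1 for which the basis {s1, x3, x1} keeps all row-0 entries >= 0.
import numpy as np

B_inv = np.linalg.inv(np.array([[1, 1, 8], [0, 1.5, 4], [0, 0.5, 2]]))
nonbasic = {"x2": (np.array([6, 2, 1.5]), 30),     # (initial column, objective coefficient)
            "s2": (np.array([0, 1, 0]), 0),
            "s3": (np.array([0, 0, 1]), 0)}

def basis_still_optimal(c1):
    y = np.array([0, 20, c1]) @ B_inv              # c_BV B^-1 with the new c1
    return all(y @ a_j - c_j >= 0 for a_j, c_j in nonbasic.values())

ok = [c1 for c1 in range(40, 101) if basis_still_optimal(c1)]
print(ok[0], ok[-1])                               # expected: 56 80
```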
6.7 – Sensitivity Analysis Example
2. Changing the Objective Function Coefficient of a Basic Variable.
• When the current BV is no longer optimal.

Consider c1 = 100; then ∆ = 100 - 60 = 40 > 20.

Compute the reduced costs, that is, row 0:
  y = cBVB-1 = [0  10 - 0.5∆  10 + 1.5∆] = [0 -10 70]
  x1: c̄1 = 0 (bv),  x3: c̄3 = 0 (bv),  s1: c̄4 = 0 (bv)
  x2: c̄2 = 5 + 1.25∆ = 5 + 1.25(40) = 55
  s2: c̄5 = 10 - 0.5∆ = 10 - 0.5(40) = -10
  s3: c̄6 = 10 + 1.5∆ = 10 + 1.5(40) = 70

RHS of row 0 = cBVB-1b = yb = [0 -10 70]·(48, 20, 8)^T = 360.

If c1 = 100, BV = {s1, x3, x1} is now suboptimal. Enter s2 into the basis to get the new optimal solution!
6.7 – Sensitivity Analysis Example
2. Changing the Objective Function Coefficient of a Basic Variable.
• When the current BV is no longer optimal: c1 = 100.

Suboptimal Tableau:

Row   z   x1   x2     x3   s1   s2     s3    RHS   BV         MRT
0     1   0    55     0    0    -10    70    360   z = 360
1     0   0    -2     0    1    2      -8    24    s1 = 24    12
2     0   0    -2     1    0    2      -4    8     x3 = 8     4*
3     0   1    1.25   0    0    -0.5   1.5   2     x1 = 2     none

Optimal Tableau:

Row   z   x1   x2     x3     s1   s2   s3    RHS   BV
0     1   0    45     5      0    0    50    400   z = 400
1     0   0    0      -1     1    0    -4    16    s1 = 16
2     0   0    -1     0.5    0    1    -2    4     s2 = 4
3     0   1    0.75   0.25   0    0    0.5   4     x1 = 4

• Thus, if c1 = 100, the optimal solution to the Dakota problem changes to x1 = 4, x2 = x3 = 0, and z = 400.
• The increase in the price of desks causes Dakota Furniture to stop making chairs and to make desks only!
6.7 – Sensitivity Analysis Example
3. Changing the RHS of a constraint.
• Consider changing b2 from its current value of b2 = 20.
• For what values of b2 would BV = {s1, x3, x1} remain optimal? (Affects the feasibility condition only.)

Let b2 = 20 + ∆. Then the RHS of the constraints becomes:

  B-1b = [1   2    -8 ] [48    ]   [24 + 2∆ ]
         [0   2    -4 ]·[20 + ∆] = [ 8 + 2∆ ]
         [0  -0.5   1.5] [ 8    ]   [2 - 0.5∆]

For BV to remain optimal, B-1b ≥ 0:
  24 + 2∆ ≥ 0   (true iff ∆ ≥ -12)
   8 + 2∆ ≥ 0   (true iff ∆ ≥ -4)
  2 - 0.5∆ ≥ 0  (true iff ∆ ≤ 4)
  ⇒ Range: -4 ≤ ∆ ≤ 4, i.e. 16 ≤ b2 ≤ 24.

Even if BV remains optimal, the values of the basic variables and z may change: xBV = B-1(new b).
e.g. b2 = 22 gives xBV = (s1, x3, x1)^T = B-1·(48, 22, 8)^T = (28, 12, 1)^T and
z = cBVB-1(new b) = [0 10 10]·(48, 22, 8)^T = 300.
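A quick numerical companion (my own NumPy sketch, not part of the slides): scan candidate finishing-hour values b2, keep those with B-1b ≥ 0, and recompute the basic-variable values and z for b2 = 22.

```python
# Sketch: feasibility range for b2 and the new basic solution when b2 = 22.
import numpy as np

B_inv = np.linalg.inv(np.array([[1, 1, 8], [0, 1.5, 4], [0, 0.5, 2]]))
c_BV = np.array([0, 20, 60])

def x_BV(b2):
    """x_BV = B^-1 b = (s1, x3, x1) when the finishing-hours RHS is b2."""
    return B_inv @ np.array([48, b2, 8])

feasible = [b2 for b2 in range(10, 31) if (x_BV(b2) >= 0).all()]
print(feasible[0], feasible[-1])           # expected: 16 24
print(x_BV(22), c_BV @ x_BV(22))           # expected: [28. 12. 1.] 300.0
```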
6.7 – Sensitivity Analysis Example
3. Changing the RHS of a constraint.
• Remember that the shadow prices (dual multipliers) are y = cBVB-1.
• By now it should be apparent how the dual multipliers (shadow prices) can be used to determine how changes in the RHS change the optimal z-value.
• What if BV is no longer optimal?

Let b2 = 30 > 24. Then

  B-1b = [1   2    -8 ] [48]   [44]
         [0   2    -4 ]·[30] = [28]
         [0  -0.5   1.5] [ 8]   [-3]

  RHS of row 0 = cBVB-1b = yb = [0 10 10]·(48, 30, 8)^T = 380.

Since we have an infeasible tableau, use the Dual Simplex Algorithm to get the new optimal solution.
6.7 – Sensitivity Analysis Example
4. Changing the Column of a Nonbasic Variable.
• Suppose the price of tables increases from $30 to $43 and that, due to changes in technology, a table now requires 5 board ft of lumber, 2 finishing hours, and 2 carpentry hours.
• Would this change the optimal solution to the Dakota Furniture problem?

Now c2 = 43 and a2 = (5, 2, 2)^T; before, c2 = 30 and a2 = (6, 2, 1.5)^T.

Simply use c̄2 = cBVB-1a2 - c2 to compute the new coefficient of x2 in row 0 (i.e., price out x2):
  c̄2 = ya2 - c2 = [0 10 10]·(5, 2, 2)^T - 43 = 40 - 43 = -3.

Since c̄2 < 0, BV = {s1, x3, x1} is no longer optimal. Recreate the tableau for BV = {s1, x3, x1} and then apply the simplex algorithm (see Tables 7 and 8 on page 286).
6.7 – Sensitivity Analysis Example
5. Adding a New Activity.
• Suppose Dakota Furniture decides to start making stools. The price of a stool is $15, and a stool requires 1 board ft of lumber, 1 finishing hour, and 1 carpentry hour.
• Should Dakota Furniture manufacture stools?
• Price out the new activity!

Let x4 = # of stools manufactured. Then a4 = (1, 1, 1)^T and c4 = 15.
  c̄4 = cBVB-1a4 - c4 = [0 10 10]·(1, 1, 1)^T - 15 = 20 - 15 = 5.

Since c̄4 ≥ 0, BV = {s1, x3, x1} is still optimal.
The reduced cost is $5, meaning that each stool produced would decrease revenue by $5. Therefore, Dakota Furniture should not manufacture any stools.
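To close, here is the same pricing-out step in code (my own sketch): compute the shadow prices y = cBVB-1 and price out the proposed stool column.

```python
# Sketch: price out the proposed stool activity x4 with a4 = (1, 1, 1) and c4 = 15.
import numpy as np

B_inv = np.linalg.inv(np.array([[1, 1, 8], [0, 1.5, 4], [0, 0.5, 2]]))
y = np.array([0, 20, 60]) @ B_inv     # shadow prices [0, 10, 10]

a4, c4 = np.array([1, 1, 1]), 15      # resource usage and price of a stool
print(y @ a4 - c4)                    # expected: 5.0 -> positive reduced cost, do not make stools
```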