
Homework 3 - optimization

DS52X0
5/21/2020
Problem 1: Interval Bisection Search and Golden Section Search - 1-D Optimization
We have learned from lecture that the Interval Bisection Search (IBS) is a derivative-based search method for a global minimum $x_{opt}$ in a box constraint. Additionally, both R and Python implement 1-D direct searches (no gradient) using Golden Section Search (GSS). GSS chooses an interior point $x_{GSS}$ such that the larger subinterval is the same fraction of $U - L$ as the smaller subinterval is of the larger subinterval.
Wikipedia summarizes further details of GSS in a figure (not reproduced here).
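To see where the golden ratio enters: write the larger subinterval as a fraction $r$ of the whole bracket $U - L$, so the smaller subinterval has fraction $1 - r$. The defining property above then reads

$$\frac{r(U-L)}{U-L} = \frac{(1-r)(U-L)}{r(U-L)} \;\Longrightarrow\; r^2 + r - 1 = 0 \;\Longrightarrow\; r = \frac{\sqrt{5}-1}{2} \approx 0.618 = \frac{1}{\varphi},$$

where $\varphi$ is the golden ratio. Each GSS iteration therefore shrinks the bracket by the factor $r \approx 0.618$, versus $0.5$ per iteration for IBS.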
In this exercise, you will compare the number of iterations for IBS and GSS to reach a certain tolerance condition.
Does GSS reach the tolerance condition with fewer iterations, on average?
We will consider minimizing the convex function $f(x) = x^2$ in this problem. Of course, the minimum occurs at $x = 0$, but we focus here on the smallest number of iterations $T$ until $U_T - L_T < \text{tol}$, where tol is a constant. The initial search interval $[L_0, U_0]$ will be randomly produced as follows: $L_0 \sim U[-10, 0]$ and $U_0 \sim U[0, 10]$ ($U$ stands for the uniform distribution). Let's terminate each algorithm using the above tolerance condition, for the same value of tol.
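A minimal Python sketch of both searches is given below; the names `ibs` and `gss` and the bracket-update details are one possible reading of the algorithms, not required code. IBS halves the bracket using the sign of $f'$ at the midpoint, while GSS reuses one interior function value per iteration:

```python
import numpy as np

def ibs(fprime, L, U, tol=1e-6, max_iter=1000):
    """Interval Bisection Search (sketch): halve [L, U] using the sign of f'."""
    t = 0
    while U - L >= tol and t < max_iter:
        m = 0.5 * (L + U)
        if fprime(m) > 0:   # f increasing at m => minimum lies to the left
            U = m
        else:               # f decreasing (or flat) at m => minimum to the right
            L = m
        t += 1
    return 0.5 * (L + U), t

def gss(f, L, U, tol=1e-6, max_iter=1000, r=(np.sqrt(5) - 1) / 2):
    """Golden Section Search (sketch): shrink the bracket by factor r each step,
    reusing one interior evaluation per iteration."""
    x1, x2 = U - r * (U - L), L + r * (U - L)
    f1, f2 = f(x1), f(x2)
    t = 0
    while U - L >= tol and t < max_iter:
        if f1 < f2:                      # minimum is in [L, x2]
            U, x2, f2 = x2, x1, f1       # old x1 becomes the new right point
            x1 = U - r * (U - L)
            f1 = f(x1)
        else:                            # minimum is in [x1, U]
            L, x1, f1 = x1, x2, f2       # old x2 becomes the new left point
            x2 = L + r * (U - L)
            f2 = f(x2)
        t += 1
    return 0.5 * (L + U), t

# f(x) = x^2 with a random bracket, as specified above (seed is arbitrary)
rng = np.random.default_rng(0)
L0, U0 = rng.uniform(-10, 0), rng.uniform(0, 10)
print(ibs(lambda x: 2 * x, L0, U0))      # IBS uses f'(x) = 2x
print(gss(lambda x: x * x, L0, U0))      # GSS uses f itself
```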
Plot the $\log_{10}$ of the average value of $T$ for IBS (blue lines and points) and GSS (red lines and points) as a function of $\log_{10}(\text{tol})$. Consider 50 trials of each algorithm to compute this average. To prevent long run times, set the maximum number of iterations to 1000. Let the value of tol vary in the interval $[10^{-8}, 10^{0}]$. Which algorithm performs best, and why?
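A possible shape for this experiment, reusing the `ibs` and `gss` sketches above (the seed and the nine tol values are illustrative choices):

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
tols = np.logspace(-8, 0, 9)              # tol from 1e-8 to 1e0
avg_ibs, avg_gss = [], []
for tol in tols:
    T_ibs, T_gss = [], []
    for _ in range(50):                   # 50 random brackets per tol value
        L0, U0 = rng.uniform(-10, 0), rng.uniform(0, 10)
        T_ibs.append(ibs(lambda x: 2 * x, L0, U0, tol=tol)[1])
        T_gss.append(gss(lambda x: x * x, L0, U0, tol=tol)[1])
    avg_ibs.append(np.mean(T_ibs))
    avg_gss.append(np.mean(T_gss))

plt.plot(np.log10(tols), np.log10(avg_ibs), "b.-", label="IBS")
plt.plot(np.log10(tols), np.log10(avg_gss), "r.-", label="GSS")
plt.xlabel("log10(tol)")
plt.ylabel("log10(average T)")
plt.legend()
plt.show()
```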
Additional credit: Can you improve the GSS algorithm in this case? Consider modifying GSS to reduce the average number of iterations: change the golden ratio by +10% and -10%, and compare the average number of iterations to that of standard GSS as above. Plot the results for GSS in red, +10% in blue, and -10% in magenta, as sketched below. Which of these three methods performs best in this case, and why?
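One way to sketch this variant is a bracketing search with an arbitrary shrink factor `r`; the point-reuse trick in `gss` above is exact only for the golden ratio, so this version recomputes both interior points each step:

```python
import numpy as np

def gss_ratio(f, L, U, r, tol=1e-6, max_iter=1000):
    """Bracketing search with an arbitrary shrink factor r in (0.5, 1).
    Both interior points are recomputed each iteration."""
    t = 0
    while U - L >= tol and t < max_iter:
        x1, x2 = U - r * (U - L), L + r * (U - L)
        if f(x1) < f(x2):
            U = x2
        else:
            L = x1
        t += 1
    return 0.5 * (L + U), t

r0 = (np.sqrt(5) - 1) / 2                 # the standard golden-section factor
rng = np.random.default_rng(2)
for label, r in [("GSS", r0), ("+10%", 1.1 * r0), ("-10%", 0.9 * r0)]:
    Ts = []
    for _ in range(50):
        L0, U0 = rng.uniform(-10, 0), rng.uniform(0, 10)
        Ts.append(gss_ratio(lambda x: x * x, L0, U0, r=r)[1])
    print(label, np.mean(Ts))             # then plot versus log10(tol) as before
```

Both perturbed factors stay inside $(0.5, 1)$, so the bracketing logic remains valid; plot the three resulting curves in red, blue, and magenta as specified.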
Problem 2: Convexity in 1- and 2-Dimensions
1. Consider the objective function $f(v) = (1 - v)^2$ for a scalar variable $v$. Using first and second derivatives, confirm or reject that $f$ is convex. If you reject convexity everywhere, please specify for which values of $v$ $f(v)$ is not convex. (A symbolic sketch of these derivative checks appears after this list.)
2. Suppose that two functions $g(v)$ and $h(v)$ are both convex functions. Confirm or reject, by the method above, that their sum is convex.
3. Let $g(v) = (a - v^2)^2$, where $a$ is a fixed real number. Confirm or reject the claim that $g(v)$ is convex for any constant $a$. If you reject convexity everywhere, please specify for which values of $v$ (as a function of $a$) $g(v)$ is not convex.
4. For a vector $v = [v_1 \ v_2]^T$, let $p(v) = (1 - v_1)^2 + 100(v_2 - v_1^2)^2$. Calculate the gradient and Hessian of $p$, and confirm or reject that $p$ is convex everywhere in $\mathbb{R}^2$. If the function is not convex everywhere, please specify with a plot the region where convexity does not hold. What minimization challenges arise for functions which are not convex?
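A symbolic sketch of the derivative checks for parts 1-3, using sympy (the use of sympy, and the example value $a = 3$, are illustrative choices):

```python
import sympy as sp

v, a = sp.symbols("v a", real=True)

# Part 1: f(v) = (1 - v)^2; the second derivative is a positive constant.
f = (1 - v) ** 2
print(sp.diff(f, v, 2))                   # 2  => convex everywhere

# Part 2 follows the same way: (g + h)'' = g'' + h'' >= 0
# whenever g'' >= 0 and h'' >= 0.

# Part 3: g(v) = (a - v^2)^2; the sign of g'' depends on v and a.
g = (a - v ** 2) ** 2
g2 = sp.expand(sp.diff(g, v, 2))
print(g2)                                  # 12*v**2 - 4*a
# Example with a = 3: convexity fails where 12*v**2 - 12 < 0, i.e. |v| < 1.
print(sp.solve(g2.subs(a, 3) < 0, v))      # (-1 < v) & (v < 1)
```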
Suggested Approach: Compute the Hessian $H_p(v)$. The function $p(v)$ is convex near a point $v$ if both eigenvalues of $H_p$ are positive at $v$. On a coarse $[v_1, v_2]$ grid, compute the minimum eigenvalue, and look for regions where the minimum eigenvalue is less than 0. If you notice such a region, produce a contour plot of the minimum eigenvalue there. Carefully label your axes.
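A Python sketch of this approach follows. The Hessian entries hard-coded below come from differentiating $p$ twice by hand; verifying them (and computing the gradient) is part of the exercise, and the grid ranges are illustrative choices:

```python
import numpy as np
import matplotlib.pyplot as plt

def min_eig(v1, v2):
    """Minimum eigenvalue of the Hessian of p at (v1, v2), using
    H = [[2 + 1200*v1^2 - 400*v2, -400*v1], [-400*v1, 200]]."""
    H = np.array([[2 + 1200 * v1 ** 2 - 400 * v2, -400 * v1],
                  [-400 * v1, 200.0]])
    return np.linalg.eigvalsh(H)[0]        # eigvalsh sorts eigenvalues ascending

v1 = np.linspace(-2, 2, 201)
v2 = np.linspace(-1, 3, 201)
V1, V2 = np.meshgrid(v1, v2)
lam = np.vectorize(min_eig)(V1, V2)

cs = plt.contour(V1, V2, lam, levels=10)
plt.clabel(cs)
plt.contour(V1, V2, lam, levels=[0.0], colors="k", linewidths=2)  # convexity boundary
plt.xlabel("v1")
plt.ylabel("v2")
plt.title("Minimum eigenvalue of the Hessian of p")
plt.show()
```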
Problem 3: Nelder-Mead Search and Gradient Descent
The Nelder-Mead Search is a direct search (no gradient) method for finding optimizers of functions. It has implementations in both R and Python. You will also consider a conjugate gradient (CG) descent algorithm, common to both languages' libraries. In this problem you will compare the performance of both algorithms on the function $p(v)$ above. Note that $p(v)$ has a unique global minimum at $v = [1, 1]^T$.
1. Please research the Nelder-Mead Algorithm (NMA) using Wikipedia and other sources. Explain how it works in 2 dimensions. In particular, what is a simplex in 2 dimensions, and how does NMA evolve the simplex on each iteration? Please provide a brief description of the algorithm, using a graphic if helpful. Do not provide a listing of an implementation.
2. Run NMA and CG to search for the minimizer of $p$, as follows. Set the maximum number of iterations to 5 for both algorithms. Consider starting guesses on the grid $v_1 \in [-3, 12]$ (spacing 0.1) and $v_2 \in [-6, 60]$ (spacing 1). At the end of each trial, record the squared distance between the output guess $v_g$ and the optimizer $[1, 1]^T$ for each algorithm. Present your results in a 5-level contour map for each algorithm, with $\log_{10}$(squared distance) as the altitude versus the starting position. Discuss your findings. (A possible skeleton is sketched below.)
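A possible skeleton for this experiment using scipy.optimize.minimize. The small epsilon inside the log (to avoid log of zero) and the figure layout are illustrative choices; note that with no jac supplied, scipy's CG uses a numerically approximated gradient:

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.optimize import minimize

def p(v):
    return (1 - v[0]) ** 2 + 100 * (v[1] - v[0] ** 2) ** 2

opt = np.array([1.0, 1.0])

def log10_sq_dist(method, v0):
    """log10 of the squared distance to [1, 1] after at most 5 iterations."""
    res = minimize(p, v0, method=method, options={"maxiter": 5})
    return np.log10(np.sum((res.x - opt) ** 2) + 1e-16)  # epsilon avoids log(0)

v1_grid = np.arange(-3.0, 12.0 + 1e-9, 0.1)   # 151 starting values for v1
v2_grid = np.arange(-6.0, 60.0 + 1e-9, 1.0)   # 67 starting values for v2

for method in ("Nelder-Mead", "CG"):
    # Coarsen the grids while testing; the full sweep is ~20,000 minimize calls.
    Z = np.array([[log10_sq_dist(method, np.array([x1, x2]))
                   for x1 in v1_grid] for x2 in v2_grid])
    plt.figure()
    plt.contourf(v1_grid, v2_grid, Z, levels=5)
    plt.colorbar(label="log10(squared distance)")
    plt.xlabel("v1")
    plt.ylabel("v2")
    plt.title(method)
plt.show()
```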