
Final Exam 2023
Full marks: 20
Time: 2 hours
In class we have learned two techniques for sparse recovery – a greedy approach and an optimization-based approach – for modern inverse problems in imaging. The final will be an extension of that.
Consider the constrained optimization problem –
min_x ||x||_2^2 subject to y = Ax
It is solved via the Lagrangian method. The unconstrained Lagrangian for the above is defined as –
L(x, λ) = x^T x + λ^T (Ax − y), where λ is the Lagrange multiplier.
The approach is simple. To minimize, one differentiates with respect to x and λ and equates the derivatives to zero (the standard minimization approach learned in Plus 2).
∇_x L = 2x + A^T λ = 0    (1)
∇_λ L = Ax − y = 0    (2)
We are not interested in λ per se, so we will eliminate it.
From (1) we get x = −(1/2) A^T λ    (3)
Substituting x from (3) into (2) we get λ = −2 (A A^T)^{-1} y    (4)
Putting the value of λ from (4) back into (3) gives us the solution for x: x = A^T (A A^T)^{-1} y.
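The closed-form solution above can be checked numerically. The following is a minimal NumPy sketch (not part of the exam; the random test system and variable names are illustrative assumptions) verifying that x = A^T (A A^T)^{-1} y satisfies the constraint and has the smallest l2 norm among feasible points:

```python
import numpy as np

# Illustrative check of the minimum-norm solution x = A^T (A A^T)^{-1} y
# on a random underdetermined system (assumed setup, not from the exam).
rng = np.random.default_rng(0)
m, n = 3, 8                      # fewer equations than unknowns
A = rng.standard_normal((m, n))  # full row rank with probability 1
y = rng.standard_normal(m)

# Solve A A^T lambda' = y instead of forming the inverse explicitly.
x = A.T @ np.linalg.solve(A @ A.T, y)

# The constraint y = Ax holds to machine precision.
assert np.allclose(A @ x, y)

# Any other feasible point differs from x by a null-space direction of A,
# and such a perturbation can only increase the l2 norm.
null_dir = np.linalg.svd(A)[2][m:].T @ rng.standard_normal(n - m)
x_other = x + null_dir
assert np.allclose(A @ x_other, y)
assert np.linalg.norm(x) <= np.linalg.norm(x_other)
```

Because x lies in the row space of A and the perturbation lies in its null space, the two are orthogonal, which is why the norm can only grow.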
Your task is to solve a very similar problem using the Lagrangian approach just described.
min_x ||Wx||_2^2 subject to y = Ax. Here the matrix W is square and full rank.
1. Find the value of x in terms of W, A and y.
2. What should be the expression of W such that the aforementioned problem approximates the l1-norm minimization problem, i.e. min_x ||x||_1 subject to y = Ax?
Upload your answer sheet with the derivation for part 1 and reasoning for part 2.
(16+4)