
MAT-60456 Optimization Methods - 11.12.2019

MATH.APP 320

Optimization Methods Exam, 11.12.2019 Henri Hansen

 

Please answer four out of the following five questions, on a separate paper. Note that there
are two pages in the exam. No books; non-programmable calculators are allowed.
A collection of formulas will be handed out.

1. Consider the optimization problem:
min f(x, y) = x² + y²
s.t.
x² + (y − 2)² ≤ 1
(a) Is the feasible set convex?
(b) Is the cost function convex in the feasible region?
(c) You can deduce the optimum of this problem relatively easily. Demonstrate that
it is a KKT point and check the constraint qualifications.
2. Assume that we have the following linear problem:
min cᵀx
s.t.
Ax ≤ b
x ≥ 0
(a) Explain how this is transformed so that the constraint becomes an equality constraint.

(b) Suppose that we wish to use the Simplex algorithm. To do so, we need one basic
feasible solution to the LP. Explain how to find one, assuming b above contains
only positive values.

(c) Consider the reduced cost c̃ (formula given in the formula sheet). Show (using
algebra) that its component i is equal to the change in the cost function if we
increase the value of the null variable x_i by one.

3. Consider the unconstrained problem
min f(x, y) = (1 − x)² + 10(y − x²)²

(a) Find the minimum of this function analytically. (Hint: It is not hard if you really
look at when it can be a minimum. Or solve for the zero of the gradient, but that
is going to be hard.)

(b) Consider finding the minimum using one of the methods. Start from any point
(other than the optimum) and calculate one iteration. If you use the gradient
method, be sure to formulate the minimization problem that solves for the step length.

(c) Demonstrate that the function is not convex. (By any means you like.)

4. Let f, g : X → R be two strictly convex functions. Which of the following claims hold?
Prove or disprove (i.e., give an example where the claim is false).
(a) The function min(f(x), g(x)) is convex.
(b) The set {x ∈ X : f(x) = b} is convex for every b ∈ R.
(c) If f is continuously differentiable and lim_{‖x‖→∞} f(x) = ∞, then f has a unique minimum.
5. Suppose we have a constrained problem
min ay?
s.t.
mn -yc=4
(a) Formulate the KKT conditions for this problem.

(b) Solve the problem either by solving the KKT equations or by using Lagrange
multipliers. (Hint: it's not too hard; consider the factors.)

(c) What is the minimum of the function?
Optimization methods
Important formulas for the exam
Henri Hansen

Algorithm           | Direction d_k                                       | Step length t_k        | Update rule
Gradient Descent    | d_k = −∇f(x_k)                                      | min_t f(x_k + t d_k)   | x_{k+1} = x_k + t_k d_k
Newton Method       | d_k = −[∇²f(x_k)]⁻¹ ∇f(x_k)                         | 1                      | x_{k+1} = x_k − [∇²f(x_k)]⁻¹ ∇f(x_k)
Quasi-Newton        | d_k = −H_k ∇f(x_k),  H_{k+1} = H_k + ΔH_k           | min_t f(x_k + t d_k)   | x_{k+1} = x_k + t_k d_k
Conjugate Gradient  | d_0 = −∇f(x_0),  d_{k+1} = −∇f(x_{k+1}) + β_k d_k   | min_t f(x_k + t d_k)   | x_{k+1} = x_k + t_k d_k

where β_k = ∇f(x_{k+1})ᵀ (∇f(x_{k+1}) − ∇f(x_k)) / ‖∇f(x_k)‖²
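As a concrete illustration of the gradient descent row in the table, the following is a minimal Python sketch of one iteration on an assumed example function f(x, y) = x² + 4y² (the function, the starting point, and the grid-based line search are assumptions made for illustration, not part of the exam material).

    import numpy as np

    # Assumed example function and its gradient (not taken from the exam).
    def f(z):
        x, y = z
        return x**2 + 4 * y**2

    def grad_f(z):
        x, y = z
        return np.array([2 * x, 8 * y])

    def gradient_step(z):
        # One iteration of the gradient method: d_k = -grad f(x_k), with t_k
        # approximately minimizing f(x_k + t d_k) over a grid of candidates.
        d = -grad_f(z)
        ts = np.linspace(0.0, 1.0, 1001)
        t = min(ts, key=lambda s: f(z + s * d))
        return z + t * d

    z0 = np.array([1.0, 1.0])   # arbitrary starting point
    z1 = gradient_step(z0)
    print(z0, f(z0))            # starting point and its cost
    print(z1, f(z1))            # cost decreases after one step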

 

1. The subset K ⊆ X of a vector space X is convex if and only if, for all x, y ∈ K and
λ ∈ [0, 1], λx + (1 − λ)y ∈ K.

2. Let K be a convex set. A function f : K → R is convex if and only if, for all x, y ∈ K
and λ ∈ [0, 1], f(λx + (1 − λ)y) ≤ λf(x) + (1 − λ)f(y). (A numerical check of this
inequality is sketched after these definitions.)

3. Let K be a convex set. A differentiable function f : K → R is pseudoconvex if and
only if, for all x, y ∈ K, if ∇f(x) · (y − x) ≥ 0 then f(y) ≥ f(x).

4. Let K be a convex set. A function f : K → R is quasiconvex if and only if, for all
x, y ∈ K and λ ∈ [0, 1], f(λx + (1 − λ)y) ≤ max(f(x), f(y)).
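The convexity inequality in definition 2 can be tested numerically at sampled points; the sketch below is one such check for an assumed example function (a violating pair of points proves non-convexity, while passing the test is only numerical evidence).

    import itertools
    import numpy as np

    def convexity_check(f, points, lambdas=(0.25, 0.5, 0.75), tol=1e-9):
        # Test f(l*x + (1-l)*y) <= l*f(x) + (1-l)*f(y) on sampled pairs.
        # Returning a violating triple proves non-convexity; returning None is
        # only numerical evidence of convexity, not a proof.
        for x, y in itertools.combinations(points, 2):
            for lam in lambdas:
                lhs = f(lam * x + (1 - lam) * y)
                rhs = lam * f(x) + (1 - lam) * f(y)
                if lhs > rhs + tol:
                    return (x, y, lam)
        return None

    # Assumed example: f(x, y) = x^3 + y^2 is not convex on the sampled box.
    f = lambda z: z[0]**3 + z[1]**2
    rng = np.random.default_rng(0)
    points = [rng.uniform(-2.0, 2.0, size=2) for _ in range(200)]
    print(convexity_check(f, points))   # prints a violating (x, y, lambda)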

Consider the optimization problem:

min f(x)
s.t.
g_i(x) ≤ 0, where i = 1, ..., m
h_j(x) = 0, where j = 1, ..., p

where f : X → R, g_i : X → R and h_j : X → R are functions.

1. The Karush-Kuhn-Tucker conditions are satisfied at a point x* ∈ X (KKT point) if and
only if there exist μ_i ≥ 0 and λ_j ∈ R such that

∇f(x*) + Σ_{i=1}^{m} μ_i ∇g_i(x*) + Σ_{j=1}^{p} λ_j ∇h_j(x*) = 0
μ_i g_i(x*) = 0, for all i = 1, ..., m
g_i(x*) ≤ 0 and h_j(x*) = 0, for all i = 1, ..., m and j = 1, ..., p
 
 

Theorem (regarding constraint qualifications): If x* is a feasible minimum point, then it
is a KKT point, provided one of the following holds:
1. The functions g_i are convex, the functions h_j are affine (linear), and there exists an
interior point x̄, i.e., a point such that g(x̄) < 0 and h(x̄) = 0.
2. The gradients ∇g_i(x*) and ∇h_j(x*) are linearly independent, for every i such that
g_i(x*) = 0.
(If some point does not satisfy the second qualification, it might be a minimum even if it is
not a KKT point; if the first holds for the problem, it should not have points that do not
satisfy the qualification.)
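As a small worked illustration of the KKT system above, the following sympy sketch solves the stationarity and feasibility equations for an assumed equality-constrained example, min x² + y² s.t. x + y = 1 (the problem data are assumptions for illustration; since the only constraint is affine and a feasible point exists, the first qualification above applies).

    import sympy as sp

    x, y, lam = sp.symbols('x y lambda', real=True)

    # Assumed example problem: min x^2 + y^2  s.t.  x + y - 1 = 0
    f = x**2 + y**2
    h = x + y - 1

    L = f + lam * h                            # Lagrangian
    stationarity = [sp.diff(L, v) for v in (x, y)]
    sols = sp.solve(stationarity + [h], [x, y, lam], dict=True)
    print(sols)                                # x = y = 1/2, lambda = -1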
Consider the LP problem

min cᵀx
s.t.
Ax = b
x ≥ 0
1. If A = [B N] and x = [x_Bᵀ x_Nᵀ]ᵀ, where x_N = 0 and x_B ≥ 0, then we say that x is a basic
feasible solution and B is a basis of the problem.
2. Given a basis B (i.e. A = [B N]) and a basic feasible solution x = [x_Bᵀ x_Nᵀ]ᵀ where
x_B ≥ 0, the reduced cost associated with B is

c̃ = c_Nᵀ − c_Bᵀ B⁻¹ N

where c = [c_Bᵀ c_Nᵀ]ᵀ. (A numerical sketch of this formula follows the dual below.)
3. The dual of the problem is

max bᵀu
s.t.
Aᵀu ≤ c
u free
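The reduced-cost formula in item 2 above can be evaluated directly with linear algebra; the sketch below is a minimal numpy illustration on an assumed toy LP (the data and the choice of the slack columns as the basis are assumptions made for illustration).

    import numpy as np

    # Assumed toy LP in standard form: min c^T x  s.t.  A x = b, x >= 0.
    A = np.array([[1.0, 1.0, 1.0, 0.0],
                  [1.0, 3.0, 0.0, 1.0]])
    b = np.array([4.0, 6.0])
    c = np.array([-2.0, -3.0, 0.0, 0.0])

    basic, nonbasic = [2, 3], [0, 1]     # slack columns chosen as the basis
    B, N = A[:, basic], A[:, nonbasic]

    x_B = np.linalg.solve(B, b)                                # x_B = B^{-1} b
    reduced = c[nonbasic] - c[basic] @ np.linalg.solve(B, N)   # c_N^T - c_B^T B^{-1} N
    print(x_B)       # [4. 6.], feasible since b > 0
    print(reduced)   # [-2. -3.]: each component is the cost change per unit
                     # increase of the corresponding nonbasic variable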
Let

A = [ a11  a12 ]
    [ a21  a22 ]

Then

A⁻¹ = (1 / det A) [  a22  −a12 ]
                  [ −a21   a11 ]
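As a quick numerical sanity check of the 2 × 2 inverse formula above, using an assumed example matrix:

    import numpy as np

    A = np.array([[3.0, 1.0],
                  [2.0, 4.0]])          # assumed example matrix
    det = A[0, 0] * A[1, 1] - A[0, 1] * A[1, 0]
    A_inv = (1.0 / det) * np.array([[ A[1, 1], -A[0, 1]],
                                    [-A[1, 0],  A[0, 0]]])
    print(np.allclose(A_inv, np.linalg.inv(A)))   # True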

