University of Wisconsin-Madison
Fall 2017 Qualifier Exam: OPTIMIZATION
September 18, 2017
Answer all 4 questions.
The Exam Committee tries to proofread the exam as carefully as possible. Nevertheless, the exam sometimes contains misprints and ambiguities. If you are convinced a problem has been stated incorrectly, mention this to the proctor. If necessary, the proctor can contact a representative of the area to resolve problems during the first hour of the exam. In any case, you should indicate your interpretation of the problem in your written answer. Your interpretation should be such that the problem is nontrivial.
Let P = {x ∈ R^n : Ax ≤ b} and Q = {x ∈ R^n : Cx ≤ d} be two non-empty polyhedra.
(a) Write a linear programming formulation that solves the problem
    min{ ‖x − y‖_1 : x ∈ P, y ∈ Q },
where ‖z‖_1 = ∑_{i=1}^{n} |z_i| is the 1-norm.
(b) Write the dual of the formulation you wrote in part (a).
(c) Justify that both the primal and dual problems have an optimal solution (you may use the strong duality theorem).
(d) Using the above primal/dual pair of linear programs, show that if P ∩ Q = ∅, then there exists a vector p ∈ R^n such that p^T x < p^T y for all x ∈ P and y ∈ Q. [Hint: the vector p can be defined using an optimal dual solution.]
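A minimal numerical sketch of the linearization used in part (a), assuming small illustrative data (the boxes P = [0,1]^2 and Q = [2,3]^2, which are not part of the exam): the 1-norm is replaced by auxiliary variables t with x − y ≤ t and −(x − y) ≤ t, and the resulting LP is handed to scipy.optimize.linprog.

import numpy as np
from scipy.optimize import linprog

# Assumed illustrative data: P = [0,1]^2 as Ax <= b, Q = [2,3]^2 as Cy <= d.
A = np.array([[ 1.0,  0.0], [ 0.0,  1.0], [-1.0,  0.0], [ 0.0, -1.0]])
b = np.array([1.0, 1.0, 0.0, 0.0])
C = np.array([[-1.0,  0.0], [ 0.0, -1.0], [ 1.0,  0.0], [ 0.0,  1.0]])
d = np.array([-2.0, -2.0, 3.0, 3.0])

n = A.shape[1]
I = np.eye(n)
# Variable vector z = (x, y, t); minimize sum(t) subject to
# Ax <= b,  Cy <= d,  x - y - t <= 0,  -(x - y) - t <= 0.
c = np.concatenate([np.zeros(n), np.zeros(n), np.ones(n)])
A_ub = np.block([
    [A, np.zeros((A.shape[0], n)), np.zeros((A.shape[0], n))],
    [np.zeros((C.shape[0], n)), C, np.zeros((C.shape[0], n))],
    [ I, -I, -I],
    [-I,  I, -I],
])
b_ub = np.concatenate([b, d, np.zeros(n), np.zeros(n)])
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(None, None)] * (3 * n))
print(res.fun)  # minimum 1-norm distance between P and Q; 2.0 for these boxes

The dual asked for in part (b) can then be written down from this inequality-form LP in the usual way.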
It is common knowledge that words/objects/entities have color associations. For exam- ple, anger is often associated with the color red. These associations are not one-to-one mappings, e.g. strawberry is also associated with the color red. The associations are not unique either; apple can be associated with red or green, and if we’re talking about the company Apple Inc., the associations will be different still! You are given a bar graph where each bar represents a different entity and your task is to choose colors to use for each of the bars. For example, the graph might look like the one below:
[Figure: bar graph with bars labeled kiwi, tomato, raspberry, apple, plum; vertical axis shows sales ($M).]
Your task is to choose colors for the bars in the graph so that each chosen color has a strong association with the category it represents. Suppose the labels for the bars in the graph are {b_1, ..., b_m} and the colors at your disposal are {c_1, ..., c_n}. You have access to a dataset of color-category association strengths. The data is in the form of a table:

Category \ Color |  c_1    c_2    ...   c_n
b_1              |  a_11   a_12   ...   a_1n
...              |  ...    ...    ...   ...
b_m              |  a_m1   a_m2   ...   a_mn
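As a concrete illustration (not part of the exam statement): if each bar must receive a distinct color, maximizing the total association strength is the assignment special case of the matching problem considered below. A minimal sketch with made-up strengths, using scipy.optimize.linear_sum_assignment:

import numpy as np
from scipy.optimize import linear_sum_assignment

bars   = ["kiwi", "tomato", "raspberry", "apple", "plum"]
colors = ["green", "red", "pink", "yellow", "purple"]
# a[i, j] = assumed association strength between bar i and color j.
a = np.array([
    [0.9, 0.1, 0.1, 0.3, 0.1],
    [0.1, 0.9, 0.3, 0.2, 0.1],
    [0.1, 0.6, 0.8, 0.1, 0.2],
    [0.7, 0.8, 0.1, 0.2, 0.1],
    [0.1, 0.1, 0.2, 0.1, 0.9],
])
rows, cols = linear_sum_assignment(a, maximize=True)  # Hungarian-type method
for i, j in zip(rows, cols):
    print(bars[i], "->", colors[j])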
The task can be modeled as a maximum weight matching problem: given a graph G = (V, E) with edge weights, find a matching M ⊆ E (a set of edges no two of which share an endpoint) of maximum total weight.
(a) Explain how the maximum weight matching problem can be solved in polynomial time if G is bipartite.
The greedy algorithm for the maximum weight matching problem proceeds as follows: start with M = ∅ and let A be the set of edges of G; while A is non-empty, choose an edge e ∈ A of maximum weight, add e to M, and remove from A the edge e together with every edge sharing an endpoint with e; when A is empty, return M.
(b) Show an example in which the greedy algorithm does not find a matching with maximum weight.
(c) We now consider a restricted version of the maximum weight matching problem in which the weights of all edges are 1, hence a maximum matching M is simply a matching with maximum cardinality |M|. Notice that the greedy algorithm in this case chooses an arbitrary edge from A in every iteration. Let OPT be the cardinality of the optimal solution and let M_g be the output of the greedy algorithm. Show that |M_g|/OPT ≥ 0.5 (in other words, the matching that the greedy algorithm finds is at least half the size of an optimum one). (Hint: Consider the relationship between the edges in a greedy matching and those in an optimal matching.)
(d) For every n ∈ Z_+, give a graph with at least n vertices for which the greedy algorithm could possibly yield a matching with |M_g|/OPT = 0.5.
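A minimal sketch (on an assumed small graph) of the greedy algorithm above, illustrating the kind of examples asked for in parts (b) and (d):

def greedy_matching(edges):
    # edges: list of (u, v, w), assumed already ordered by nonincreasing weight,
    # with ties broken arbitrarily; returns the greedily constructed matching.
    matched, matching = set(), []
    for u, v, w in edges:
        if u not in matched and v not in matched:
            matching.append((u, v, w))
            matched.update((u, v))
    return matching

# Path a-b-c-d with weights 2, 3, 2: greedy keeps only the middle edge (weight 3),
# while the matching {ab, cd} has weight 4, so greedy is not optimal (part (b)).
print(greedy_matching([('b', 'c', 3), ('a', 'b', 2), ('c', 'd', 2)]))

# Unit weights on the same path: one admissible tie-break considers bc first,
# giving |M_g| = 1 while OPT = 2, i.e. the ratio 0.5 in part (d).
print(greedy_matching([('b', 'c', 1), ('a', 'b', 1), ('c', 'd', 1)]))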
(a) Let f : R^n → R be a twice continuously differentiable function such that f(x) ≥ 0 for all x ∈ R^n. Define the function g(x) = (1/2) f(x)^2, and consider the following two problems:

    min_x f(x),   (F)
    min_x g(x).   (G)

Verify that the first-order necessary conditions for these two problems are equivalent, that is, x* satisfies the first-order necessary conditions for (F) if and only if x* satisfies the first-order necessary conditions for (G). (Hint: Consider the case of f(x*) = 0 carefully.)
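A minimal numerical check, on an assumed smooth nonnegative f (not from the exam), of the chain-rule identity ∇g(x) = f(x) ∇f(x) that underlies part (a):

import numpy as np

def f(x):
    return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2   # nonnegative everywhere

def grad_f(x):
    return np.array([2.0 * (x[0] - 1.0), 2.0 * (x[1] + 2.0)])

def g(x):
    return 0.5 * f(x) ** 2

def num_grad(h, x, eps=1e-6):
    # central finite differences, coordinate by coordinate
    e = np.eye(len(x))
    return np.array([(h(x + eps * e[i]) - h(x - eps * e[i])) / (2 * eps)
                     for i in range(len(x))])

x = np.array([0.3, 0.7])
print(num_grad(g, x))       # numerical gradient of g
print(f(x) * grad_f(x))     # chain rule f(x) * grad f(x); the two should agree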
(b) Suppose that f : R^n → R has Lipschitz continuous gradient, that is, there is L > 0 such that ‖∇f(y) − ∇f(z)‖_2 ≤ L‖y − z‖_2 for all y, z ∈ R^n. Suppose in addition that f(x) ≥ f̄ for all x, for some f̄ > −∞. Consider the following short-step steepest descent method:
    x^{k+1} = x^k − (1/L) ∇f(x^k),   k = 0, 1, 2, ....
Show that the following three inequalities hold:
    f(x^{k+1}) ≤ f(x^k) − (1/(2L)) ‖∇f(x^k)‖_2^2,   k = 0, 1, 2, ...,

    ∑_{k=0}^{T−1} ‖∇f(x^k)‖_2^2 ≤ 2L [ f(x^0) − f̄ ],   for all T ≥ 1,

    min_{k=0,1,...,T−1} ‖∇f(x^k)‖_2 ≤ √( 2L [ f(x^0) − f̄ ] / T ),   for all T ≥ 1.
Cite explicitly any theorems you use in proving these results. (Hint: Prove these three inequalities in sequence, using each one to prove the next in the sequence.)
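A minimal sketch, on an assumed two-dimensional quadratic whose gradient has Lipschitz constant L equal to the largest eigenvalue of its Hessian, of the short-step iteration and of the per-step decrease claimed in the first inequality:

import numpy as np

Q = np.array([[3.0, 0.0], [0.0, 1.0]])   # f(x) = 0.5 x^T Q x
L = 3.0                                   # largest eigenvalue of Q

def f(x):
    return 0.5 * x @ Q @ x

def grad_f(x):
    return Q @ x

x = np.array([2.0, -1.5])
for k in range(5):
    g = grad_f(x)
    x_next = x - g / L   # short-step update x_{k+1} = x_k - (1/L) grad f(x_k)
    # per-step decrease: f(x_{k+1}) <= f(x_k) - ||grad f(x_k)||_2^2 / (2L)
    assert f(x_next) <= f(x) - g @ g / (2 * L) + 1e-12
    x = x_next
print(f(x))   # function value after 5 steps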