ECE566 Computational Inference and Learning
Homework #5
Due December 3, 2020

Problem 1. (Exponential distributions). Show that the feasible set of mean parameters is convex.

Problem 2. (Maxent). Let K be a symmetric, positive definite matrix. Find the maxent distribution of a random variable X ∈ R^n subject to the constraint that E_p[XX^T] ≤ K (i.e., the matrix K − E_p[XX^T] is nonnegative definite).

Problem 3. (Naive Mean Field). Derive the naive mean field approximation q to the distribution p(x1, x2, x3) ∝ 1{x1 ⊕ x2 = x3} over {0, 1}^3. Give the KL divergence D(q||p).

Problem 4. (Variational approximation). Consider the class Q of distributions of the form q(x1, x2, x3) = q1(x1) q2(x2|x1) q3(x3|x2) over {0, 1}^3. Find the q ∈ Q that minimizes D(q||p) for p(x1, x2, x3) ∝ 1{x1 ⊕ x2 = x3}.

Problem 5. (Ising Model). Consider the Ising model on a 2×2 lattice, with parameter β = 0.1.
(a) Show that the graphical model is a 4-ring.
(b) Since there are only 16 possible configurations, you can compute the correlation of any two neighboring sites exactly. Do so for β = 0.1 and for β = 1.
(c) Use your results from Part (b) to evaluate the mutual information between any two adjacent sites.
(d) Evaluate the Bethe entropy and compare it with the true entropy (which you can compute exactly, since there are only 16 possible configurations).

Problem 6. (ℓ1-penalized least squares). Let

    y = [2; 1]   and   A = [3 1; 1 3].

We wish to solve the minimization problem

    min_{x ∈ R^2} ‖y − Ax‖_2^2 + 9‖x‖_1.

1. First we are going to guess that the solution is 1-sparse. Fix x1 = 0 and find the minimizing x2 and the corresponding value of the objective function. Then fix x2 = 0 and find the minimizing x1 and the corresponding value of the objective function. Comparing these two values gives the minimum of the objective function over the set of 1-sparse x.

2. Next, write the update equations for iterative soft thresholding and solve the minimization problem iteratively, using x = 0 to initialize the algorithm. Do not use a computer. Is your solution 1-sparse?

Problem 7. (Compressive sensing). Assume the measurement matrix A has spark equal to 3. Show that two measurements suffice to recover any 1-sparse signal x.
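The exact computations in Problem 5(b)-(d) can be sanity-checked by brute-force enumeration of the 16 configurations. The sketch below assumes the common convention of spins in {−1, +1} with p(σ) ∝ exp(β Σ_{(i,j)∈E} σ_i σ_j) over the edges of the 4-ring, and reports entropies in nats; the assignment does not fix these conventions, so adjust as needed. (The Bethe entropy comparison in part (d) is left to the reader.)

```python
import itertools
import numpy as np

# 2x2 Ising lattice = 4-ring with edges (0,1),(1,2),(2,3),(3,0); spins in {-1,+1}
EDGES = [(0, 1), (1, 2), (2, 3), (3, 0)]

def ising_stats(beta):
    """Enumerate all 16 configurations exactly; return the neighbor correlation,
    the mutual information between two adjacent sites, and the true joint
    entropy (in nats)."""
    configs = list(itertools.product([-1, 1], repeat=4))
    weights = np.array([np.exp(beta * sum(s[i] * s[j] for i, j in EDGES))
                        for s in configs])
    p = weights / weights.sum()
    # correlation E[s0 s1] for one neighboring pair (all pairs equivalent by symmetry)
    corr = sum(pk * s[0] * s[1] for pk, s in zip(p, configs))
    # pairwise marginal of sites (0, 1), needed for the mutual information
    pair = {}
    for pk, s in zip(p, configs):
        pair[(s[0], s[1])] = pair.get((s[0], s[1]), 0.0) + pk
    # single-site marginals are uniform by spin-flip symmetry, so H(s0) = H(s1) = log 2
    H_pair = -sum(q * np.log(q) for q in pair.values())
    mi = 2 * np.log(2) - H_pair
    H_true = -np.sum(p * np.log(p))
    return corr, mi, H_true

for beta in (0.1, 1.0):
    print(beta, ising_stats(beta))
```

As a quick consistency check, the correlation and the mutual information should both increase with β, and the true entropy should decrease from its maximum value 4 log 2 as the coupling strengthens.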
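The hand iteration of Problem 6.2 can also be checked numerically. The sketch below assumes the data-fit term is the squared Euclidean norm ‖y − Ax‖_2^2 (the form iterative soft thresholding targets); the step size 1/32 is a sketch choice derived from the gradient Lipschitz constant 2 λ_max(AᵀA) = 32 for this A.

```python
import numpy as np

def soft(z, tau):
    """Elementwise soft-thresholding operator: sign(z) * max(|z| - tau, 0)."""
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def ista(A, y, lam, step, n_iter=300):
    """ISTA for min_x ||y - A x||_2^2 + lam * ||x||_1."""
    x = np.zeros(A.shape[1])             # initialize at x = 0, as the problem asks
    for _ in range(n_iter):
        grad = 2.0 * A.T @ (A @ x - y)   # gradient of the smooth data-fit term
        x = soft(x - step * grad, step * lam)
    return x

A = np.array([[3.0, 1.0], [1.0, 3.0]])
y = np.array([2.0, 1.0])
# 2 * lambda_max(A^T A) = 2 * 16 = 32, so step = 1/32 guarantees convergence
x_hat = ista(A, y, lam=9.0, step=1.0 / 32.0)
print(x_hat)
```

Running this lets you confirm whether the limit of the iteration matches the 1-sparse candidate found in Problem 6.1.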