CSE 569 Homework #1
Total 3 points

Due Friday, Sept. 18, 11:59 PM.

Notes:
1. The exam questions will be similar to these homework problems (and in particular, Q2 through Q5
were taken from past exams). Therefore, no sample exams will be posted, since these problems serve
as samples already.
2. Submission of homework must be electronic. Most problems can be solved by hand: you can write
your solutions on paper, take a picture of the hand-written sheets, and upload your work to
Canvas. The due date/time will be strictly enforced; late submissions will not be accepted by the
system.
3. If you have any questions about the homework problems, post them on the Canvas discussion
board (under Homework 1 Q & A) instead of emailing the instructor or TA. Questions will be
answered there to avoid repetition; this also helps the entire class stay on the same page
whenever a clarification/correction is made.
4. You will receive 0.5 points for attempting each of Q1 through Q6 (for a maximum of 3 points). Q7 is
optional, and no points will be given for it.


Q1. (From the textbook)



Q2. Consider a 1-dimensional, two-category classification problem, with prior probabilities P(ω1) = 1/3
and P(ω2) = 2/3. The class-conditional PDFs for the two classes are the normal densities N(μ1, σ²) and
N(μ2, σ²), respectively. Note that these two PDFs have the same variance σ². N(μ, σ²) denotes the
normal density defined by

    p(x) = (1/(√(2π)·σ)) · exp(−(x − μ)² / (2σ²))

We further assume that the losses λ11 = λ22 = 0, but λ12 and λ21 are some positive values.
Find the optimal decision rule for classifying any feature point x.
[You need to present your rule in the form of “Deciding on ω1 if x ∈ R1; otherwise deciding on ω2”,
where R1 needs to be explicitly defined in terms of the given parameters μ1, μ2, σ, λ12, and λ21.]
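
[Hint: using the textbook convention λij = λ(αi | ωj), the assumption λ11 = λ22 = 0 reduces the two
conditional risks to R(α1 | x) = λ12 P(ω2 | x) and R(α2 | x) = λ21 P(ω1 | x), so the minimum-risk rule
becomes a likelihood-ratio test:

\[
\text{decide } \omega_1 \quad \text{if} \quad
\frac{p(x \mid \omega_1)}{p(x \mid \omega_2)} > \frac{\lambda_{12}\, P(\omega_2)}{\lambda_{21}\, P(\omega_1)},
\]

which can then be specialized to the two Gaussian densities above.]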


Q3. Consider a two-class classification problem with 1-dimensional class-conditionals given as




Suppose the priors are P(ω1) = 2/3 and P(ω2) = 1/3. Find the optimal decision rule for this
classification problem. What is the Bayes error in this case?


Q4. Consider the following simple Bayesian network, where all the nodes are assumed to be binary
random variables, i.e., X = x0 or X = x1 with certain probabilities; similar notation is used for Y, Z,
and W.

[Figure: the network is the chain X → Y → Z → W.]

This Bayesian network is fully specified if we are given the following (conditional) probabilities:
(for notational simplicity, we write P(x1) to mean P(X=x1), and so on)
P(x1) = 0.60;
P(y1 | x1) = 0.40, P(y1 | x0) = 0.30;
P(z1 | y1) = 0.25, P(z1 | y0) = 0.60;
P(w1 | z1) = 0.45, P(w1 | z0) = 0.30;

(a) Suppose that X is measured and its value is x1; compute the probability that we will observe W having
the value w0, i.e., P(w0 | x1).
(b) Suppose that W is measured and its value is w1; compute the probability that we will observe X having
the value x0, i.e., P(x0 | w1).
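
[A small program can serve as a numerical sanity check for your hand computations. Below is a
minimal sketch in Python; the table names (p_y1_given_x, etc.) and helpers bern/joint are
illustrative, not part of the assignment:]

    # Sanity check for Q4 on the chain X -> Y -> Z -> W.
    # Each table maps the parent's value (0 or 1) to P(child = 1 | parent).
    p_x1 = 0.60
    p_y1_given_x = {1: 0.40, 0: 0.30}
    p_z1_given_y = {1: 0.25, 0: 0.60}
    p_w1_given_z = {1: 0.45, 0: 0.30}

    def bern(p1, v):
        # P(variable = v) given that P(variable = 1) = p1.
        return p1 if v == 1 else 1.0 - p1

    def joint(x, y, z, w):
        # Chain factorization: P(x) P(y | x) P(z | y) P(w | z).
        return (bern(p_x1, x) * bern(p_y1_given_x[x], y)
                * bern(p_z1_given_y[y], z) * bern(p_w1_given_z[z], w))

    # (a) P(w0 | x1): sum out Y and Z, then divide by P(x1).
    p_x1_w0 = sum(joint(1, y, z, 0) for y in (0, 1) for z in (0, 1))
    print("P(w0 | x1) =", p_x1_w0 / p_x1)

    # (b) P(x0 | w1) by Bayes' rule: P(x0, w1) / P(w1).
    p_x0_w1 = sum(joint(0, y, z, 1) for y in (0, 1) for z in (0, 1))
    p_w1 = p_x0_w1 + sum(joint(1, y, z, 1) for y in (0, 1) for z in (0, 1))
    print("P(x0 | w1) =", p_x0_w1 / p_w1)

[Summing the joint over the unobserved variables is exactly the marginalization you would carry
out by hand.]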


Q5. True or False: For a two-class classification problem using the minimum-error-rate rule, in general
the decision boundary can take any form. However, if the underlying class-conditionals are Gaussian
densities, then the decision boundary is linear (a hyperplane).


[ ] True [ ] False
Brief explanation of your answer:


Q6. (From the textbook)



(Optional) Q7. Part A. Consider the following game: Someone shows you three hats and tells you that
there is a prize under one of them. He asks you to choose one of the hats. You choose one hat and tell
him which one you chose. He then lifts one of the hats you didn’t choose, and there is nothing under that
hat. He then tells you that you can either stay with the hat you originally chose or switch to the
other remaining hat. What should you do? Explain your answer.

Part B. (Use this to help ensure your Part A answer is correct.) Design a computer-based experiment (i.e., write
a computer program) to simulate the above game and verify your answer, playing the game many times
to obtain an average performance.
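
[One possible shape for such a program, sketched in Python; play_round and n_trials are
illustrative names. Comparing the empirical win rates of the two strategies over many rounds
should confirm your Part A answer:]

    import random

    def play_round(switch):
        # One round of the hat game; returns True if the player wins the prize.
        prize = random.randrange(3)
        choice = random.randrange(3)
        # The host lifts an empty hat that the player did not choose.
        revealed = next(h for h in range(3) if h != choice and h != prize)
        if switch:
            # Move to the one hat that is neither the original choice
            # nor the revealed empty hat.
            choice = next(h for h in range(3) if h != choice and h != revealed)
        return choice == prize

    n_trials = 100_000
    for switch in (False, True):
        wins = sum(play_round(switch) for _ in range(n_trials))
        print("switch" if switch else "stay", "win rate:", wins / n_trials)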
