
T-61.3050 PROBLEMS 3/2010

You should attend only one problem session (Wednesday or Friday) during a
week.

The problems are divided into demonstrations and home assignments. The
deadline for the assignments is 4 PM on the Monday following the corresponding
exercise session. Please note that late submissions will not be graded. Please
submit your answer either to the course e-mail address (t613050@james.hut.fi)
as a PDF, or on paper to the T-61.3050 box, located between the 3rd-floor notice
board and door B303a. Your answers will be graded and returned to you at
later exercise sessions.

See https://noppa.tkk.fi/noppa/kurssi/t-61.3050/etusivu
for up-to-date information.

Demonstration

1. Jack is feeling tired in the office. He is considering whether to continue
working (decision α0 , utility U = 0) or go to the coffee room (α1 ), where he
might either be disappointed to find no coffee (U = −1) or be happy
to get some (U = 10). Jack notices a colleague walking from the coffee
room but feels embarrassed to ask whether there is coffee (U = −0.5).
Assume that the colleague knows and will tell Jack if asked. The
probability that there is coffee is P (z = 1) = 0.7. Compute the expected
utility in the case where Jack asks and then decides, and in the case where he
makes the decision α without getting the information z from his colleague.
What will happen, assuming Jack makes rational decisions?
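One way to set up the computation is to compare the two expected utilities directly; the sketch below is an illustrative reading of the problem (how the asking cost and the "decide after observing z" step are combined is our interpretation, not part of the statement):

```python
P = 0.7  # P(z = 1): probability that there is coffee
# Utilities from the problem statement
U_work, U_no_coffee, U_coffee, U_ask = 0.0, -1.0, 10.0, -0.5

# Without asking: Jack picks the action with higher expected utility.
eu_go = P * U_coffee + (1 - P) * U_no_coffee
eu_no_ask = max(U_work, eu_go)

# Asking first: pay the embarrassment cost, then act optimally
# given the colleague's answer z.
eu_ask = U_ask + P * max(U_work, U_coffee) + (1 - P) * max(U_work, U_no_coffee)

print(eu_no_ask, eu_ask)
```

Comparing the two printed values then answers the rationality question.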
2. Show that it is possible to use Bayes' rule to update the current knowledge
of the model parameters θ by taking one sample at a time and using the
obtained posterior as the prior distribution for the next iteration. Given a
sample X, show that the resulting distribution p(θ|X) is the same regardless
of whether the “batch” or “iterative” update procedure is used. Demonstrate this with
a Gaussian data set and model parameter θ = µ (you can assume that σ
is known).
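The equivalence can be checked numerically with a conjugate Gaussian prior on µ (the prior N(µ0, τ0²) and the synthetic data below are our illustrative choices, not given in the problem):

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 2.0                       # known data standard deviation
x = rng.normal(1.5, sigma, 50)    # synthetic Gaussian data set

mu0, tau0 = 0.0, 10.0             # assumed prior: mu ~ N(mu0, tau0^2)

# Batch update: use all N samples at once.
prec_batch = 1 / tau0**2 + len(x) / sigma**2
mu_batch = (mu0 / tau0**2 + x.sum() / sigma**2) / prec_batch

# Iterative update: today's posterior becomes tomorrow's prior.
m, t2 = mu0, tau0**2
for xi in x:
    p = 1 / t2 + 1 / sigma**2
    m = (m / t2 + xi / sigma**2) / p
    t2 = 1 / p

print(mu_batch, m)  # the two posterior means coincide
```

The posterior precisions agree as well (1/t2 equals prec_batch), which is exactly the claim the problem asks you to prove analytically.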

Home assignment

1. If we consider loss instead of utility, we can define a loss function that corre-
sponds to a negative utility function. In a two-class, two-action problem,
if the loss function (or negative utility) is λ11 = λ22 = 0 (the classification
result is correct), λ12 = 10 (class 2 is classified as class 1), and λ21 = 1
(class 1 is classified as class 2), write the optimal decision rule.
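A generic way to sanity-check a written decision rule is to compute the expected risk R(αi | x) = Σk λik P(Ck | x) for each action and pick the minimum; the helper below is only an illustrative check (the function name and the example posteriors are ours), not the requested written rule:

```python
# Loss matrix from the problem: rows = actions, columns = true classes.
lam = [[0.0, 10.0],   # lambda_11, lambda_12
       [1.0,  0.0]]   # lambda_21, lambda_22

def decide(p1):
    """Return the chosen class (1 or 2) given P(C1|x) = p1 (illustrative helper)."""
    p = [p1, 1 - p1]
    risks = [sum(lam[i][k] * p[k] for k in range(2)) for i in range(2)]
    return 1 + min(range(2), key=lambda i: risks[i])

print(decide(0.95), decide(0.5))
```

Trying a few posteriors this way shows how strongly the asymmetric losses bias the decision toward class 2.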
2. Assume that you have N samples from a multinomial distribution where
P (y^t = k) = θ_k for each t = 1, . . . , N . Here, k ∈ {1, . . . , K} in the case of
K classes and indicates the class to which sample t belongs. Use auxiliary
indicator variables x^t_k, where x^t_k = 1 if y^t = k and 0 otherwise
(e.g., if x^1 belongs to class 2 and there are 4
classes, x^1 = ( 0 1 0 0 )). Write the log likelihood and show Equa-
tion 4.6, that is, that the maximum likelihood estimator for the parameters
is θ̂_k^{ML} = p̂_k = (1/N) Σ_t x^t_k.
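The estimator can be verified empirically before deriving it: draw samples with known θ, build the indicator variables, and check that the relative frequencies recover θ (the true parameter values and sample size below are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(1)
K, N = 4, 10000
theta = np.array([0.1, 0.2, 0.3, 0.4])  # true parameters (illustrative)
y = rng.choice(K, size=N, p=theta)      # class labels y^t in {0, ..., K-1}

x = np.eye(K)[y]                        # indicator variables x^t_k as rows
theta_hat = x.sum(axis=0) / N           # ML estimate: (1/N) sum_t x^t_k

print(theta_hat)
```

The estimate is just the vector of empirical class frequencies, which is what the analytical derivation of Equation 4.6 should confirm.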
