Detection Theory

The methods are based on estimation theory and attempt to decide whether a signal of interest is present in noisy measurements.
The area is closely related to hypothesis testing, which is widely used, e.g., in medicine: is the response in patients due to the new drug or due to random fluctuations?
In our case, the hypotheses could be
H1 : x[n] = A cos(2πf₀n + φ) + w[n]
H0 : x[n] = w[n]
This example corresponds to the detection of a noisy sinusoid.
The hypothesis H1 corresponds to the case that the sinusoid is present and is called the alternative hypothesis.
The hypothesis H0 corresponds to the case that the measurements consist of noise only and is called the null hypothesis.
Introductory Example
[Figure: the likelihoods p(x; H0) and p(x; H1) as functions of x[0].]
Suppose we observe a single sample x[0] and the hypotheses concern its mean:

H1 : A = 1,
H0 : A = 0.

An obvious approach is to decide in favor of the density under which the observed data is more likely.
More specifically, study the likelihoods and choose the hypothesis with the larger likelihood:

H1 : p(x[0] | A = 1) = (1/√(2π)) exp(−(x[0] − 1)²/2),
H0 : p(x[0] | A = 0) = (1/√(2π)) exp(−(x[0])²/2).
Now, one should select H1 if p(x[0] | A = 1) > p(x[0] | A = 0).
Let's state this in terms of x[0]:

p(x[0] | A = 1) / p(x[0] | A = 0) > 1

⇔ [ (1/√(2π)) exp(−(x[0] − 1)²/2) ] / [ (1/√(2π)) exp(−(x[0])²/2) ] > 1

⇔ exp(−((x[0] − 1)² − (x[0])²)/2) > 1

⇔ (x[0])² − (x[0] − 1)² > 0

⇔ 2x[0] − 1 > 0

⇔ x[0] > 1/2.
In other words, choose H1 if x[0] > 0.5 and H0 if x[0] < 0.5.
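The decision rule and its error behavior can be checked numerically. A minimal Monte Carlo sketch (the sample count and seed are arbitrary choices, not from the lecture):

```python
import random
from statistics import NormalDist

random.seed(0)
N = 100_000

# Under H0: x[0] ~ N(0, 1); an error (false alarm) occurs when x[0] > 0.5.
false_alarms = sum(random.gauss(0.0, 1.0) > 0.5 for _ in range(N)) / N

# Under H1: x[0] ~ N(1, 1); an error (miss) occurs when x[0] < 0.5.
misses = sum(random.gauss(1.0, 1.0) < 0.5 for _ in range(N)) / N

# By symmetry both error probabilities equal Q(0.5) = 1 - Phi(0.5).
q_half = 1.0 - NormalDist().cdf(0.5)
print(false_alarms, misses, q_half)
```

Both empirical rates should be close to Q(0.5) ≈ 0.3085, illustrating that the threshold 0.5 treats the two error types symmetrically.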
Studying the ratio of likelihoods as on the second row of the derivation above leads to the likelihood ratio test (LRT). If prior probabilities P(H0) and P(H1) are available, we select H1 if

p(x | H1) / p(x | H0) > P(H0) / P(H1),

i.e., the only effect of using posterior probabilities is on the threshold of the LRT.
Error Types

Two types of error are possible: a false alarm (deciding H1 when H0 is true) and a missed detection (deciding H0 when H1 is true).
[Figure: the likelihoods as functions of x[0] with a decision threshold, illustrating the two error types.]
The probability of false alarm for the threshold γ = 1.5 is

PFA = P(x[0] > γ | A = 0) = ∫_{1.5}^{∞} (1/√(2π)) exp(−x²/2) dx ≈ 0.0668.

The probability of missed detection is

PM = P(x[0] < γ | A = 1) = ∫_{−∞}^{1.5} (1/√(2π)) exp(−(x − 1)²/2) dx ≈ 0.6915.

Correspondingly, the probability of detection is

PD = P(x[0] > γ | A = 1) = ∫_{1.5}^{∞} (1/√(2π)) exp(−(x − 1)²/2) dx ≈ 0.3085.
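These values follow directly from the Gaussian tail function Q; a quick check using only the standard library:

```python
from math import erfc, sqrt

def Q(x):
    # Gaussian tail probability Q(x) = P(Z > x) for Z ~ N(0, 1).
    return 0.5 * erfc(x / sqrt(2.0))

gamma = 1.5  # threshold from the example above

P_FA = Q(gamma)          # P(x[0] > 1.5 | A = 0)
P_M = 1 - Q(gamma - 1)   # P(x[0] < 1.5 | A = 1)
P_D = Q(gamma - 1)       # P(x[0] > 1.5 | A = 1)
print(round(P_FA, 4), round(P_M, 4), round(P_D, 4))  # 0.0668 0.6915 0.3085
```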
Neyman-Pearson Theorem

Since PFA and PD depend on each other, we would like to maximize PD while keeping PFA below a prescribed level. The Neyman-Pearson theorem states that, for a fixed PFA, the probability of detection is maximized by deciding H1 if

p(x; H1) / p(x; H0) > γ,

where the threshold γ is chosen such that the PFA constraint holds with equality.
As an example, suppose we want to find the best detector for the above problem with the fixed false alarm rate PFA = 0.1.

Select H1 if

p(x | A = 1) / p(x | A = 0) > γ,

which, as shown earlier, reduces to comparing x[0] against a threshold γ′. The only thing to find out now is the threshold γ′ such that

∫_{γ′}^{∞} p(x | A = 0) dx = 0.1.

The inverse cumulative distribution function (e.g., Matlab's icdf) solves the γ′ for which this holds, giving γ′ ≈ 1.2816. The resulting probability of detection is

PD = ∫_{1.2816}^{∞} p(x | A = 1) dx ≈ 0.3891.
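The same computation in Python's standard library, using statistics.NormalDist in place of Matlab's icdf:

```python
from statistics import NormalDist

std = NormalDist(0.0, 1.0)

# Threshold with P(x[0] > g | A = 0) = 0.1, i.e. g = Q^{-1}(0.1).
g = std.inv_cdf(1.0 - 0.1)
print(round(g, 4))  # 1.2816

# Resulting detection probability P(x[0] > g | A = 1).
P_D = 1.0 - NormalDist(1.0, 1.0).cdf(g)
print(round(P_D, 4))  # 0.3891
```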
This requires the likelihoods under both hypotheses to be available.

An important special case is that of a known waveform s[n] in white Gaussian noise:

H1 : x[n] = s[n] + w[n]
H0 : x[n] = w[n].
The likelihoods are

p(x | H1) = ∏_{n=0}^{N−1} (1/√(2πσ²)) exp(−(x[n] − s[n])²/(2σ²)),

p(x | H0) = ∏_{n=0}^{N−1} (1/√(2πσ²)) exp(−(x[n])²/(2σ²)).
The likelihood ratio becomes

p(x | H1) / p(x | H0) = exp[ −(1/(2σ²)) ( Σ_{n=0}^{N−1} (x[n] − s[n])² − Σ_{n=0}^{N−1} (x[n])² ) ] > γ.

Taking the logarithm of both sides:

−(1/(2σ²)) ( Σ_{n=0}^{N−1} (x[n] − s[n])² − Σ_{n=0}^{N−1} (x[n])² ) > ln γ.

Expanding the squares and simplifying gives

Σ_{n=0}^{N−1} x[n]s[n] > σ² ln γ + (1/2) Σ_{n=0}^{N−1} (s[n])²,

i.e., the detector correlates the data with the known signal and compares the result to a constant:

Σ_{n=0}^{N−1} x[n]s[n] > γ′.
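A minimal numerical sketch of this correlator statistic (the waveform, noise level, and trial counts are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)

N, M = 100, 2000
s = np.sin(2 * np.pi * 0.05 * np.arange(N))  # known waveform (arbitrary choice)
energy = np.sum(s**2)

# Correlator statistic T(x) = sum_n x[n] s[n] under both hypotheses,
# evaluated over M independent trials.
X0 = rng.standard_normal((M, N))      # H0: noise only
X1 = s + rng.standard_normal((M, N))  # H1: signal + noise
T0 = X0 @ s
T1 = X1 @ s

# T is Gaussian with mean 0 under H0 and mean sum(s^2) under H1, so the
# two hypotheses separate as the signal energy grows.
print(T0.mean(), T1.mean(), energy)
```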
Examples

For a DC level, s[n] = A with A > 0, the detector becomes

Σ_{n=0}^{N−1} x[n]A > γ′  ⇔  Σ_{n=0}^{N−1} x[n] > γ′/A,

i.e., the sum of the samples is compared to a threshold.
The detector for a sinusoid in WGN is

Σ_{n=0}^{N−1} x[n] A cos(2πf₀n + φ) > γ′  ⇔  Σ_{n=0}^{N−1} x[n] cos(2πf₀n + φ) > γ′/A.
Note that the amplitude A does not affect the test statistic, only the threshold, which is in any case selected according to the fixed PFA rate.
As an example, the picture below shows the detection result for a noisy signal.

[Figure: top panel "Noisy signal", bottom panel "Detection result", over samples 0-1000.]
If the signal itself is random, the detection problem can be formulated as follows. For a Gaussian signal s ~ N(0, Cs) in white Gaussian noise, the test becomes:

Decide H1 , if x^T ŝ > γ′,

where

ŝ = Cs (Cs + σ²I)⁻¹ x

is the MMSE estimate of the signal; the detector is therefore called the estimator-correlator. An important special case is the Bayesian linear model.
The Bayesian linear model assumes linearity x = Hθ + w with θ ~ N(0, C_θ):

H0 : x = w
H1 : x = Hθ + w,

which can be written as

H0 : x ~ N(0, σ²I)
H1 : x ~ N(0, Cs + σ²I)

with Cs = H C_θ H^T.

The assumption s ~ N(0, H C_θ H^T) states that the exact signal realization is unknown; only its covariance structure is known.
Decide H1 , if x^T ŝ > γ′,

where

ŝ = Cs (Cs + σ²I)⁻¹ x
  = H C_θ H^T (H C_θ H^T + σ²I)⁻¹ x.
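A small numpy sketch of the estimator-correlator above (the model matrix, sizes, and noise variance are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)

N, sigma2 = 8, 0.5
H = rng.standard_normal((N, 3))  # model matrix of the linear model
C_theta = np.eye(3)              # prior covariance of theta
Cs = H @ C_theta @ H.T           # signal covariance Cs = H C_theta H^T

x = rng.standard_normal(N)       # some observed data

# MMSE signal estimate s_hat = Cs (Cs + sigma^2 I)^{-1} x.
s_hat = Cs @ np.linalg.solve(Cs + sigma2 * np.eye(N), x)

# Estimator-correlator statistic: decide H1 if x^T s_hat > threshold.
T = x @ s_hat
print(T)
```

Since the weighting matrix Cs (Cs + σ²I)⁻¹ is positive semidefinite, the statistic is always nonnegative here.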
An example of applying the linear model is in Kay: detection of a sinusoid with unknown amplitude and phase leads to the test

| Σ_{n=0}^{N−1} x[n] exp(−2πi f₀ n) | > γ′.

A sliding version of the statistic can be computed in Matlab as

h = exp(-2*pi*sqrt(-1)*f0*n);
y = abs(conv(h,x));

where n is a vector of sample indices over the correlation window.
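A Python/numpy equivalent of the Matlab snippet, with an illustrative test signal (f₀, the window length, and the signal location are arbitrary choices, not from the lecture):

```python
import numpy as np

rng = np.random.default_rng(0)

# Test signal: unit-variance noise with a sinusoid of frequency f0
# present only in samples 400..599.
f0, N = 0.1, 1000
x = rng.standard_normal(N)
x[400:600] += np.cos(2 * np.pi * f0 * np.arange(400, 600))

# Sliding correlation with a length-100 complex exponential; taking the
# magnitude makes the statistic insensitive to the unknown phase,
# mirroring y = abs(conv(h, x)) in Matlab.
n = np.arange(100)
h = np.exp(-2j * np.pi * f0 * n)
y = np.abs(np.convolve(x, h))

# The statistic peaks where the window overlaps the sinusoid.
peak = int(np.argmax(y))
print(peak)
```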
[Figure: top panel "Noisy signal", bottom panel "Detection result".]
The performance of the detector can be analyzed through

PD(γ) = ∫_{γ}^{∞} (1/√(2π)) exp(−(x − 1)²/2) dx,

PFA(γ) = ∫_{γ}^{∞} (1/√(2π)) exp(−x²/2) dx.

A change of variables gives

PD(γ) = ∫_{γ−1}^{∞} (1/√(2π)) exp(−x²/2) dx = PFA(γ − 1).
Plotting PD(γ) against PFA(γ) for all thresholds γ produces the receiver operating characteristic (ROC) curve.
[Figure: ROC curve, probability of detection PD versus probability of false alarm PFA; a small threshold corresponds to the upper right of the curve and a large threshold to the lower left.]
[Figure: ROC curves for noise variances 0.2, 0.4, 0.6, 0.8 and 1, together with the random-guess diagonal PD = PFA; a smaller noise variance yields a better curve.]
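ROC points like those in the figure can be computed in closed form for this problem; a sketch, assuming the threshold grid and the two variances shown are representative:

```python
from math import erfc, sqrt

def Q(x):
    # Gaussian tail probability Q(x) = P(Z > x), Z ~ N(0, 1).
    return 0.5 * erfc(x / sqrt(2.0))

# ROC points for the single-sample problem: for threshold g,
# PFA(g) = Q(g / sigma) and PD(g) = Q((g - 1) / sigma).
thresholds = [i / 10 for i in range(-30, 41)]
for var in (0.2, 1.0):
    sigma = sqrt(var)
    roc = [(Q(g / sigma), Q((g - 1) / sigma)) for g in thresholds]
    # Every point lies above the random-guess diagonal PD = PFA.
    print(var, min(pd - pfa for pfa, pd in roc) > 0)
```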
If the parameter value is unknown, the hypotheses become composite, e.g.,

H1 : A ≠ 0
H0 : A = 0.

A test can then be stated as follows: what is the probability of observing the data x[n] if H0 were to hold? If that probability is small (e.g., all x[n] ∈ [0.5, 1.5] would be very unlikely under H0), we reject the null hypothesis.
An example

Consider a coin tossing experiment. If we get 19 heads out of 20 tosses, it seems rather likely that the coin is biased.
Additionally, let's say we want 99% confidence for the test. Thus, we can state the hypothesis test as: what is the probability of getting at least 19 heads out of 20 tosses with a fair coin?

P(19 or 20 heads) = (20 choose 19) · 0.5¹⁹ · 0.5 + 0.5²⁰ ≈ 0.00002,

where the first term accounts for 19 heads and the second for 20 heads.
Since 0.00002 < 1%, we can reject the null hypothesis and conclude that the coin is most likely biased.
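The tail probability is easy to verify with Python's math.comb:

```python
from math import comb

# P(at least 19 heads out of 20 tosses of a fair coin).
p_tail = (comb(20, 19) + comb(20, 20)) * 0.5**20
print(p_tail)

# Reject H0 at the 99% confidence level if the tail probability is < 1%.
reject = p_tail < 0.01
print(reject)
```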