Classification: Alternative Techniques
Tan, Steinbach, Kumar (4/18/2004)
Instance-Based Classifiers
[Figure: a training set of records with attributes Atr1 ... AtrN and a Class label (A, B, C); unseen records with the same attributes are classified by comparing them with the stored training records.]
- Rote-learner: memorizes the entire training data and performs classification only if the attributes of a record match one of the training examples exactly
- Nearest neighbor: uses the k closest points (nearest neighbors) to perform classification
Basic idea: if it walks like a duck and quacks like a duck, then it's probably a duck.
[Figure: to classify a test record, compute its distance to the training records, then choose the k nearest records.]
Nearest-Neighbor Classifiers
1 nearest-neighbor: the decision regions form a Voronoi diagram of the training points.
Compute the distance between two points, e.g., Euclidean distance:

  d(p, q) = sqrt( Σ_i (p_i − q_i)² )

Then determine the class from the nearest-neighbor list, e.g., by majority vote of their class labels.

Choosing the value of k:
- if k is too small, the classifier is sensitive to noise points
- if k is too large, the neighborhood may include points from other classes
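The procedure above (compute distances, take the k nearest, majority-vote) can be sketched in a few lines of Python; this is an illustrative sketch, not the authors' code, and the toy data is invented:

```python
import math
from collections import Counter

def euclidean(p, q):
    # d(p, q) = sqrt(sum_i (p_i - q_i)^2)
    return math.sqrt(sum((pi - qi) ** 2 for pi, qi in zip(p, q)))

def knn_classify(train, test_point, k=3):
    """train: list of (attributes, class_label) pairs."""
    # Sort training records by distance to the test point, keep the k nearest
    neighbors = sorted(train, key=lambda rec: euclidean(rec[0], test_point))[:k]
    # Majority vote among the k nearest records
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]

train = [((1.0, 1.0), "A"), ((1.2, 0.8), "A"),
         ((5.0, 5.0), "B"), ((4.8, 5.2), "B")]
print(knn_classify(train, (1.1, 0.9), k=3))  # A
```

With k=3, the two nearby "A" records outvote the single "B" neighbor.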
Scaling issues:
- Attributes may have to be scaled to prevent distance measures from being dominated by one of the attributes. Example:
  - height of a person may vary from 1.5 m to 1.8 m
  - weight of a person may vary from 90 lb to 300 lb
  - income of a person may vary from $10K to $1M
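One common fix is min-max scaling of each attribute to [0, 1]; a minimal sketch (function name and data are illustrative):

```python
def min_max_scale(column):
    """Rescale a list of numeric values to [0, 1] so that no single
    attribute (e.g., income in dollars vs. height in meters) dominates
    the Euclidean distance."""
    lo, hi = min(column), max(column)
    return [(v - lo) / (hi - lo) for v in column]

heights = [1.5, 1.6, 1.8]  # meters: raw range is tiny
scaled = min_max_scale(heights)
print([round(v, 3) for v in scaled])  # [0.0, 0.333, 1.0]

incomes = [10_000, 500_000, 1_000_000]  # dollars: raw range is huge
print([round(v, 3) for v in min_max_scale(incomes)])
```

After scaling, a 0.1 m height difference and a $100K income difference contribute on comparable scales.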
Curse of dimensionality: in high-dimensional data, the Euclidean measure can produce counter-intuitive results:

  111111111110  vs  011111111111:  d = 1.4142
  100000000000  vs  000000000001:  d = 1.4142

The first pair is nearly identical and the second pair shares no 1-bits, yet both are the same distance apart. One solution: normalize the vectors to unit length.
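The effect can be checked numerically; a sketch assuming the 12-bit vectors from the example above, including the unit-length normalization fix:

```python
import math

def euclidean(p, q):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def normalize(v):
    # Scale a vector to unit length
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

a = [1] * 11 + [0]   # 111111111110
b = [0] + [1] * 11   # 011111111111
c = [1] + [0] * 11   # 100000000000
d = [0] * 11 + [1]   # 000000000001

print(round(euclidean(a, b), 4))  # 1.4142 -- nearly identical vectors
print(round(euclidean(c, d), 4))  # 1.4142 -- completely different vectors

# After normalizing to unit length, the distances finally differ:
print(round(euclidean(normalize(a), normalize(b)), 4))  # 0.4264
print(round(euclidean(normalize(c), normalize(d)), 4))  # 1.4142
```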
Bayes Classifier

A probabilistic framework for solving classification problems.
Conditional probability:

  P(C | A) = P(A, C) / P(A)
  P(A | C) = P(A, C) / P(C)

Bayes theorem:

  P(C | A) = P(A | C) P(C) / P(A)
Example (M = meningitis, S = stiff neck):

  P(M | S) = P(S | M) P(M) / P(S) = (0.5 × 1/50000) / (1/20) = 0.0002
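The arithmetic is easy to verify; a minimal sketch with the numbers from the example:

```python
# Bayes theorem: P(M|S) = P(S|M) * P(M) / P(S)
p_s_given_m = 0.5        # P(S|M)
p_m = 1 / 50_000         # prior P(M)
p_s = 1 / 20             # prior P(S)

p_m_given_s = p_s_given_m * p_m / p_s
print(p_m_given_s)  # 0.0002
```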
Bayesian Classifiers
Approach: compute the posterior probability P(C | A1, A2, …, An) for all values of C using Bayes theorem:

  P(C | A1 A2 … An) = P(A1 A2 … An | C) P(C) / P(A1 A2 … An)

Then choose the value of C that maximizes the posterior.
For continuous attributes:
- Discretize the range into bins
  - one ordinal attribute per bin
  - violates the independence assumption
Normal distribution:

  P(A_i | c_j) = 1 / sqrt(2π σ_ij²) · exp( −(A_i − μ_ij)² / (2 σ_ij²) )

For (Income, Class = No): sample mean μ = 110, sample variance σ² = 2975, so σ = 54.54:

  P(Income = 120 | No) = 1 / (sqrt(2π) · 54.54) · exp( −(120 − 110)² / (2 · 2975) ) = 0.0072
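The density above can be reproduced directly; a minimal sketch (function name is illustrative):

```python
import math

def gaussian_likelihood(x, mean, var):
    """Class-conditional density P(A_i | c_j) under a normal distribution
    with the given sample mean and sample variance."""
    return (1.0 / math.sqrt(2 * math.pi * var)) * \
        math.exp(-(x - mean) ** 2 / (2 * var))

# P(Income = 120 | No): sample mean 110, sample variance 2975
print(round(gaussian_likelihood(120, 110, 2975), 4))  # 0.0072
```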
  P(X | Class=No) = P(Refund=No | Class=No)
                  × P(Married | Class=No)
                  × P(Income=120K | Class=No)
                  = 4/7 × 4/7 × 0.0072 = 0.0024

  => Class = No
Probability estimation (to avoid zero conditional probabilities):

  Original:   P(A_i | C) = N_ic / N_c
  Laplace:    P(A_i | C) = (N_ic + 1) / (N_c + c)
  m-estimate: P(A_i | C) = (N_ic + m p) / (N_c + m)

  c: number of classes
  p: prior probability
  m: parameter
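A sketch of the three estimators, following the legend above (function names and counts are illustrative):

```python
def original_estimate(n_ic, n_c):
    # P(A_i | C) = N_ic / N_c  -- zero whenever the count N_ic is zero
    return n_ic / n_c

def laplace_estimate(n_ic, n_c, c):
    # (N_ic + 1) / (N_c + c), with c the number of classes
    return (n_ic + 1) / (n_c + c)

def m_estimate(n_ic, n_c, m, p):
    # (N_ic + m*p) / (N_c + m), with p a prior probability and m a parameter
    return (n_ic + m * p) / (n_c + m)

# A zero count no longer zeroes out the whole product of probabilities:
print(original_estimate(0, 7))        # 0.0
print(laplace_estimate(0, 7, 2))      # ~0.111
print(m_estimate(0, 7, m=3, p=0.5))   # 0.15
```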
Example (test record X: Give Birth = yes, Can Fly = no, Live in Water = yes, Have Legs = no, Class = ?):

  Name           Give Birth  Can Fly  Live in Water  Have Legs  Class
  human          yes         no       no             yes        mammals
  python         no          no       no             no         non-mammals
  salmon         no          no       yes            no         non-mammals
  whale          yes         no       yes            no         mammals
  frog           no          no       sometimes      yes        non-mammals
  komodo         no          no       no             yes        non-mammals
  bat            yes         yes      no             yes        mammals
  pigeon         no          yes      no             yes        non-mammals
  cat            yes         no       no             yes        mammals
  leopard shark  yes         no       yes            no         non-mammals
  turtle         no          no       sometimes      yes        non-mammals
  penguin        no          no       sometimes      yes        non-mammals
  porcupine      yes         no       no             yes        mammals
  eel            no          no       yes            no         non-mammals
  salamander     no          no       sometimes      yes        non-mammals
  gila monster   no          no       no             yes        non-mammals
  platypus       no          no       no             yes        mammals
  owl            no          yes      no             yes        non-mammals
  dolphin        yes         no       yes            no         mammals
  eagle          no          yes      no             yes        non-mammals
A: attributes, M: mammals, N: non-mammals

  P(A | M) = 6/7 × 6/7 × 2/7 × 2/7 = 0.06
  P(A | N) = 1/13 × 10/13 × 3/13 × 4/13 = 0.0042

  P(A | M) P(M) = 0.06 × 7/20 = 0.021
  P(A | N) P(N) = 0.0042 × 13/20 = 0.0027

  P(A | M) P(M) > P(A | N) P(N) => Mammals
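The worked example can be reproduced with a short script (counts taken from the calculation above):

```python
# 7 mammals (M) and 13 non-mammals (N) in the training table.
# Test record: Give Birth=yes, Can Fly=no, Live in Water=yes, Have Legs=no.
p_a_given_m = (6/7) * (6/7) * (2/7) * (2/7)          # P(A|M)
p_a_given_n = (1/13) * (10/13) * (3/13) * (4/13)     # P(A|N)

score_m = p_a_given_m * (7/20)   # P(A|M) P(M)
score_n = p_a_given_n * (13/20)  # P(A|N) P(N)

print(round(score_m, 3))   # 0.021
print(round(score_n, 4))   # 0.0027
print("mammals" if score_m > score_n else "non-mammals")  # mammals
```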
Naïve Bayes summary:
- Handles missing values by ignoring the instance during probability estimation
- Robust to irrelevant attributes
- The independence assumption may not hold for some attributes
  - use other techniques such as Bayesian Belief Networks (BBN)
  Y = I( 0.3 X1 + 0.3 X2 + 0.3 X3 − 0.4 > 0 )

  where I(z) = 1 if z is true, 0 otherwise
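As a sketch, the indicator function and output rule in code:

```python
def I(z):
    # I(z) = 1 if z is true, 0 otherwise
    return 1 if z else 0

def Y(x1, x2, x3):
    return I(0.3 * x1 + 0.3 * x2 + 0.3 * x3 - 0.4 > 0)

# The output is 1 exactly when at least two of the three inputs are 1:
print(Y(1, 1, 0))  # 1  (0.3 + 0.3 - 0.4 = 0.2 > 0)
print(Y(1, 0, 0))  # 0  (0.3 - 0.4 = -0.1)
```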
The model is an assembly of inter-connected nodes and weighted links.

Perceptron model:

  Y = I( Σ_i w_i X_i − t )   or   Y = sign( Σ_i w_i X_i − t )
Learning: adjust the weights so that the output of the network is consistent with the class labels of the training examples.
Support Vector Machines

Find a linear hyperplane (decision boundary) that will separate the data.
  w · x + b = 0
  w · x + b = +1
  w · x + b = −1

  f(x) = +1 if w · x + b ≥ 1
         −1 if w · x + b ≤ −1

  Margin = 2 / ||w||²
We want to maximize:

  Margin = 2 / ||w||²

which is equivalent to minimizing:

  L(w) = ||w||² / 2

subject to:

  f(x_i) = +1 if w · x_i + b ≥ 1
           −1 if w · x_i + b ≤ −1
If the problem is not linearly separable, introduce slack variables ξ_i. Subject to:

  f(x_i) = +1 if w · x_i + b ≥ 1 − ξ_i
           −1 if w · x_i + b ≤ −1 + ξ_i
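One numerical approach to this constrained problem is subgradient descent on the soft-margin objective ||w||²/2 + C Σ ξ_i, with ξ_i = max(0, 1 − y_i (w·x_i + b)); a sketch with an illustrative toy dataset (the solver and data are not from the slides):

```python
def svm_sgd(data, C=1.0, lr=0.01, epochs=200):
    """data: list of (x, y) with y in {-1, +1}.
    Minimizes ||w||^2/2 + C * sum_i max(0, 1 - y_i (w.x_i + b))."""
    n = len(data[0][0])
    w = [0.0] * n
    b = 0.0
    for _ in range(epochs):
        for x, y in data:
            margin = y * (sum(wi * xi for wi, xi in zip(w, x)) + b)
            if margin < 1:  # inside the margin or misclassified: hinge active
                w = [wi - lr * (wi - C * y * xi) for wi, xi in zip(w, x)]
                b += lr * C * y
            else:           # only the regularizer ||w||^2/2 contributes
                w = [wi - lr * wi for wi in w]
    return w, b

data = [((0.0, 0.0), -1), ((0.0, 1.0), -1), ((2.0, 2.0), 1), ((2.0, 3.0), 1)]
w, b = svm_sgd(data)
preds = [1 if sum(wi * xi for wi, xi in zip(w, x)) + b >= 0 else -1
         for x, _ in data]
print(preds)  # [-1, -1, 1, 1]
```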
Ensemble Methods

- Construct a set of classifiers from the training data
- Predict the class label of unseen records by aggregating the predictions of multiple classifiers
General Idea
Why does it work? Suppose there are 25 base classifiers, each with error rate ε = 0.35, and assume the classifiers are independent. The probability that the ensemble (majority vote) makes a wrong prediction:

  Σ_{i=13}^{25} C(25, i) ε^i (1 − ε)^(25−i) = 0.06
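The sum can be evaluated directly:

```python
from math import comb

eps = 0.35  # error rate of each of the 25 independent base classifiers
# Majority vote is wrong when 13 or more of the 25 classifiers err:
ensemble_error = sum(
    comb(25, i) * eps**i * (1 - eps) ** (25 - i) for i in range(13, 26)
)
print(round(ensemble_error, 2))  # 0.06
```

So 25 weak classifiers at 35% error, if independent, combine to roughly 6% error.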
Bagging

- Sampling with replacement:

  Original Data:       1   2   3   4   5   6   7   8   9   10
  Bagging (Round 1):   7   8   10  8   2   5   10  10  5   9
  Bagging (Round 2):   1   4   9   1   2   3   2   7   3   2
  Bagging (Round 3):   1   8   5   10  5   5   9   6   3   7

- Build a classifier on each bootstrap sample
- Each record has probability 1 − (1 − 1/n)^n of being included in a given bootstrap sample (about 0.63 for large n)
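Bootstrap sampling is easy to sketch (the seed and data here are illustrative, so the sampled rounds will differ from the table above):

```python
import random

def bootstrap_sample(records, rng):
    # Sampling with replacement: same size as the original data,
    # so some records are repeated and some are left out
    return [rng.choice(records) for _ in records]

rng = random.Random(0)
original = list(range(1, 11))
rounds = [bootstrap_sample(original, rng) for _ in range(3)]
for i, sample in enumerate(rounds, 1):
    print(f"Bagging (Round {i}): {sample}")

# A record misses one draw with probability 1 - 1/n, hence misses the
# whole sample with probability (1 - 1/n)^n (-> 1/e for large n):
print(round((1 - 1/10) ** 10, 3))  # 0.349
```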
Boosting

An iterative procedure that adaptively changes the distribution of the training data, focusing more on previously misclassified records.
Boosting

- Records that are wrongly classified will have their weights increased
- Records that are classified correctly will have their weights decreased

  Original Data:       1   2   3   4   5   6   7   8   9   10
  Boosting (Round 1):  7   3   2   8   7   9   4   10  6   3
  Boosting (Round 2):  5   4   9   4   2   5   1   7   4   2
  Boosting (Round 3):  4   4   8   10  4   5   4   6   3   4

- Example 4 is hard to classify: its weight is increased, so it is more likely to be chosen again in subsequent rounds
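The weight-update step can be sketched as follows (an AdaBoost-style update; the specific formula is an assumption here, not taken from the slides):

```python
import math

def update_weights(weights, correct, error):
    """Increase the weights of misclassified records, decrease the weights
    of correctly classified ones, then renormalize (AdaBoost-style).
    weights: record weights; correct: per-record booleans; error:
    weighted error rate of the current round's classifier."""
    alpha = 0.5 * math.log((1 - error) / error)  # classifier importance
    new = [w * math.exp(-alpha if ok else alpha)
           for w, ok in zip(weights, correct)]
    total = sum(new)
    return [w / total for w in new]

weights = [0.1] * 10        # initially all 10 records weighted equally
correct = [True] * 10
correct[3] = False          # record 4 is misclassified this round
weights = update_weights(weights, correct, error=0.1)
print(round(weights[3], 3))  # 0.5   -- record 4's weight grows
print(round(weights[0], 3))  # 0.056 -- correct records shrink
```

After one round, the hard record carries half the total weight, so the next classifier focuses on it.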