Decision Trees
Petr Pošík
© 2013 P. Pošík, Artificial Intelligence
Decision Trees

Outline:
✔ What is a decision tree?
✔ Attribute description
✔ Expressiveness of decision trees
✔ Generalization and Overfitting
✔ Broadening the Applicability of Decision Trees
✔ Summary
What is a decision tree?

A decision tree
✔ is a function that
  ✘ takes a vector of attribute values as its input, and
  ✘ returns a “decision” as its output.
  ✘ Both input and output values can be measured on nominal, ordinal, interval, and ratio scales, and can be discrete or continuous.
✔ The decision is formed via a sequence of tests:
  ✘ each internal node of the tree represents a test,
  ✘ the branches are labeled with the possible outcomes of the test, and
  ✘ each leaf node represents a decision to be returned by the tree.
Attribute description

Example: A computer game.
The main character of the game meets various robots along his way. Some behave like allies, others like enemies.

head   | body   | smile | neck | holds   | class
circle | circle | yes   | tie  | nothing | ally
circle | square | no    | tie  | sword   | enemy
...    | ...    | ...   | ...  | ...     | ...

The game engine may use e.g. the following tree to assign the ally or enemy attitude to the generated robots:

[Tree figure: the root tests neck; for neck = tie, test smile (yes → ally, no → enemy); otherwise test body (triangle → ally, other → enemy).]
Expressiveness of decision trees

The tree on the previous slide is a Boolean decision tree:
✔ the decision is a binary variable (true, false), and
✔ the attributes are discrete.
✔ It returns ally iff the input attributes satisfy one of the paths leading to an ally leaf:

  ally ⇔ (neck = tie ∧ smile = yes) ∨ (neck ≠ tie ∧ body = triangle),

i.e. in general
  ✘ Goal ⇔ (Path1 ∨ Path2 ∨ . . .), where
  ✘ each Path is a conjunction of attribute-value tests, i.e.
  ✘ the tree is equivalent to a function in disjunctive normal form (DNF).
✔ We need a clever algorithm to find good hypotheses (trees) in such a large space.
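The path-to-DNF equivalence can be checked mechanically. A minimal sketch (the dictionary encoding and function names are assumptions, not from the slides):

```python
def tree_ally(robot):
    """Evaluate the example tree: test neck first, then smile or body."""
    if robot["neck"] == "tie":
        return robot["smile"] == "yes"
    return robot["body"] == "triangle"

def dnf_ally(robot):
    """The same decision as a DNF: one conjunction per path to an ally leaf."""
    return ((robot["neck"] == "tie" and robot["smile"] == "yes")
            or (robot["neck"] != "tie" and robot["body"] == "triangle"))
```

Enumerating all attribute combinations confirms the two functions agree everywhere.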
Learning a Decision Tree
A computer game

Example 1: Can you distinguish between allies and enemies after seeing a few of them?

Allies | Enemies
[Pictures of example robots.]
A computer game

Example 2: Some robots changed their attitudes:

Allies | Enemies
[Pictures of example robots.]
Alternative hypotheses

Alternative hypotheses (suggested by an oracle, for now): Which of the trees is the best (right) one?

[Figure: three candidate decision trees built from tests on neck, smile, body, and holds, with leaves labeled ally/enemy; details not recoverable from the extraction.]
How to choose the best tree?

We want a tree that
✔ is consistent with the data,
✔ is as small as possible, and
✔ also works for new data.

Consistent with data?
✔ All 3 trees are consistent.

Small?
✔ The right-hand one is the simplest one:

             left  middle  right
  depth        2      2      2
  leaves       4      4      3
  conditions   3      2      2
Learning a Decision Tree

It is an intractable problem to find the smallest consistent tree among more than 2^(2^n) trees (for n Boolean attributes). We can find an approximate solution: a small (but not the smallest) consistent tree.

Top-Down Induction of Decision Trees (TDIDT):
✔ A greedy divide-and-conquer strategy.
✔ Procedure:
  1. Test the most important attribute.
  2. Divide the data set using the attribute values.
  3. For each subset, build an independent tree (recursion).
✔ “Most important attribute”: the attribute that makes the most difference to the classification.
✔ All paths in the tree will be short, i.e. the tree will be shallow.
Attribute importance

✔ A perfect attribute divides the examples into sets each of which contains only a single class. (Do you remember the easily constructed perfect attribute from Example 1?)
✔ A useless attribute divides the examples into sets each of which contains the same distribution of classes as the set before splitting.
✔ None of the above attributes is perfect or useless; some are more useful than others.
Choosing the test attribute

Information gain:
✔ A formalization of the terms “useless”, “perfect”, “more useful”.
✔ Based on entropy, a measure of the uncertainty of a random variable V with possible values v_i:

  H(V) = −∑_i p(v_i) log₂ p(v_i)

✔ Entropy of the target class C measured on a data set S (a finite-sample estimate of the true entropy):

  H(C, S) = −∑_i p(c_i) log₂ p(c_i),

  where p(c_i) = N_S(c_i) / |S|, and N_S(c_i) is the number of examples in S that belong to class c_i.
✔ The entropy of the target class C remaining in the data set S after splitting into subsets S_k using the values of attribute A (a weighted average of the entropies of the individual subsets):

  H(C, S, A) = ∑_k p(S_k) H(C, S_k), where p(S_k) = |S_k| / |S|

Choose the attribute with the highest information gain, i.e. the attribute with the lowest H(C, S, A).
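The three quantities defined above translate directly into code. A sketch assuming examples stored as dictionaries; the helper names are mine, not the lecture's:

```python
from collections import Counter
from math import log2

def entropy(labels):
    """H(C, S): entropy of the class labels of data set S."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def remaining_entropy(rows, labels, attr):
    """H(C, S, A): weighted average entropy of the subsets S_k induced by attr."""
    n = len(labels)
    subsets = {}
    for row, y in zip(rows, labels):
        subsets.setdefault(row[attr], []).append(y)
    return sum(len(ys) / n * entropy(ys) for ys in subsets.values())

def information_gain(rows, labels, attr):
    """Gain(A, S) = H(C, S) - H(C, S, A)."""
    return entropy(labels) - remaining_entropy(rows, labels, attr)
```

A perfect attribute has gain equal to H(C, S); a useless one has gain 0, matching the definitions on the previous slide.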
Choosing the test attribute (special case: binary classification)

✔ For a Boolean random variable V which is true with probability q, we can define:

  H_B(q) = −q log₂ q − (1 − q) log₂(1 − q)

✔ Entropy of the target class C measured on a data set S with N_p positive and N_n negative examples:

  H(C, S) = H_B(N_p / (N_p + N_n)) = H_B(N_p / |S|)
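H_B fits in a few lines; a sketch (the limit convention 0 · log₂ 0 = 0 is handled explicitly):

```python
from math import log2

def h_b(q):
    """Entropy of a Boolean variable that is true with probability q."""
    if q in (0.0, 1.0):   # 0 * log2(0) is taken as 0
        return 0.0
    return -q * log2(q) - (1 - q) * log2(1 - q)
```

For instance, h_b(0.5) == 1.0 and h_b(2/3) ≈ 0.92, the values that appear in the example on the next slide.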
Choosing the test attribute (example)

head:  p(S_head=tri) = 3/8;   H(C, S_head=tri) = H_B(2/(2+1)) = 0.92
neck:  p(S_neck=tie) = 4/8;   H(C, S_neck=tie) = H_B(2/(2+2)) = 1
body:  p(S_body=tri) = 2/8;   H(C, S_body=tri) = H_B(2/(2+0)) = 0
holds: p(S_holds=ball) = 2/8; H(C, S_holds=ball) = H_B(1/(1+1)) = 1
Choosing subsequent test attribute

No further tests are needed for robots with triangular and square bodies.
Data set for robots with circular bodies:
[Table of the remaining examples is not recoverable from the extraction.]
Decision tree building procedure

[Slide content (the recursive tree-building procedure) is not recoverable from the extraction.]
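The building procedure (greedy TDIDT, as described two slides earlier) can be sketched as follows; this is a simplified illustration with assumed names and majority-class leaves, not the lecture's exact pseudocode:

```python
from collections import Counter
from math import log2

def entropy(labels):
    n = len(labels)
    return -sum(c / n * log2(c / n) for c in Counter(labels).values())

def best_attribute(rows, labels, attrs):
    """Pick the attribute with the lowest H(C, S, A), i.e. the highest gain."""
    def remaining(attr):
        subsets = {}
        for row, y in zip(rows, labels):
            subsets.setdefault(row[attr], []).append(y)
        return sum(len(ys) / len(labels) * entropy(ys)
                   for ys in subsets.values())
    return min(attrs, key=remaining)

def build_tree(rows, labels, attrs):
    if len(set(labels)) == 1:                 # pure node -> leaf
        return labels[0]
    if not attrs:                             # no tests left -> majority leaf
        return Counter(labels).most_common(1)[0][0]
    a = best_attribute(rows, labels, attrs)
    node = {"test": a, "branches": {}}
    for v in set(row[a] for row in rows):     # one branch per attribute value
        idx = [i for i, row in enumerate(rows) if row[a] == v]
        node["branches"][v] = build_tree([rows[i] for i in idx],
                                         [labels[i] for i in idx],
                                         [x for x in attrs if x != a])
    return node
```

On a toy data set where body alone separates allies from enemies, the builder puts the body test at the root and stops there.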
Algorithm characteristics

✔ There are many hypotheses (trees) consistent with the data set S; the algorithm will return any of them, unless there is some bias in choosing the tests.
✔ The current set of considered hypotheses always has only 1 member (greedy selection of the successor). The algorithm cannot answer the question of how many hypotheses consistent with the data exist.
✔ The algorithm does not use backtracking; it can get stuck in a local optimum.
✔ The algorithm uses batch learning, not incremental learning.
Generalization and Overfitting
Overfitting

Definition of overfitting:
✔ Let H be a hypothesis space.
✔ Let h ∈ H and h′ ∈ H be 2 different hypotheses from this space.
✔ Let Err_Tr(h) be the error of hypothesis h measured on the training data set (training error).
✔ Let Err_Tst(h) be the error of hypothesis h measured on the testing data set (testing error).
✔ We say that h is overfitted if there is another h′ for which

  Err_Tr(h) < Err_Tr(h′) ∧ Err_Tst(h) > Err_Tst(h′)

✔ “Overfitting is a situation when the model works well for the training data, but fails for new (testing) data.”
✔ Overfitting is a general phenomenon related to all kinds of inductive learning (i.e. it applies to all models, not only trees).

[Figure: model error vs. model flexibility for training and testing data.]

We want models and learning algorithms with a good generalization ability, i.e.
✔ we want models that encode only the patterns valid in the whole domain, not the specifics of the training data,
✔ we want algorithms able to find only the patterns valid in the whole domain and to ignore the specifics of the training data.
How to prevent overfitting for trees?

Tree pruning:
✔ Take a fully grown tree T.
✔ Choose a test node having only leaf nodes as descendants.
✔ If the test appears to be irrelevant, remove the test and replace it with a leaf node with the majority class.
✔ Repeat until all tests seem to be relevant.

Early stopping:
✔ Hmm, if we grow the tree fully and then prune it, why can't we just stop the tree building when there is no good attribute to split on?
✔ Because that prevents us from recognizing situations when
  ✘ there is no single good attribute to split on, but
  ✘ there are combinations of attributes that lead to a good tree!
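The pruning loop can be sketched for a dict-based tree encoding (my own representation; the relevance test, e.g. a χ² significance check, is left as a caller-supplied function):

```python
from collections import Counter

def prune(node, is_relevant):
    """node: either a class label (leaf) or a dict
    {"test": attr, "labels": [training labels], "branches": {value: child}}.
    is_relevant(node) decides whether the test is worth keeping."""
    if not isinstance(node, dict):
        return node                            # leaves stay as they are
    node["branches"] = {v: prune(ch, is_relevant)
                        for v, ch in node["branches"].items()}
    only_leaves = all(not isinstance(ch, dict)
                      for ch in node["branches"].values())
    if only_leaves and not is_relevant(node):
        # replace the irrelevant test by a majority-class leaf
        return Counter(node["labels"]).most_common(1)[0][0]
    return node
```

The recursion is bottom-up, so a pruned subtree can make its parent prunable on the next pass, as in the description above.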
Broadening the Applicability of Decision Trees
Missing data

Decision trees are one of the rare model types able to handle missing attribute values.

1. Given a complete tree, how do we classify an example with a missing attribute value needed for a test?
   ✔ Pretend that the object has all possible values of this attribute.
   ✔ Track all possible paths to the leaves.
   ✔ Weight the leaf decisions by the number of training examples in the leaves.
2. How do we build a tree if the training set contains examples with missing attribute values?
   ✔ Introduce a new attribute value: “Missing” (or N/A).
   ✔ Build the tree in the normal way.
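Point 1 can be sketched as follows (the per-leaf example counts are an assumed bookkeeping field kept from training; the encoding is mine):

```python
from collections import Counter

def classify(node, example):
    """Return a Counter of class -> weight.
    A leaf is a (label, training_count) pair; a test node is a dict."""
    if not isinstance(node, dict):
        label, count = node
        return Counter({label: count})
    value = example.get(node["test"])
    if value is not None:
        return classify(node["branches"][value], example)
    votes = Counter()                      # missing value: follow all branches
    for child in node["branches"].values():
        votes += classify(child, example)
    return votes
```

The final decision is the class with the largest accumulated weight.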
Multivalued attributes

What if the training set contains, e.g., a name, a social insurance number, or another ID?
✔ When each example has a unique value of an attribute A, the information gain of A is equal to the entropy of the whole data set!
✔ Attribute A is chosen for the tree root; yet such a tree is useless (overfitted).

Solutions:
1. Allow only Boolean tests of the form A = v_k and allow the remaining values to be tested later in the tree.
2. Use a different split importance measure instead of Gain, e.g. GainRatio:
   ✔ Normalize the information gain by the maximal amount of information the split can have:

     GainRatio(A, S) = Gain(A, S) / H(A, S)
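GainRatio as defined above, in a short sketch (helper names assumed; H(A, S) is the entropy of the attribute's own value distribution):

```python
from collections import Counter
from math import log2

def entropy(values):
    n = len(values)
    return -sum(c / n * log2(c / n) for c in Counter(values).values())

def gain_ratio(rows, labels, attr):
    n = len(labels)
    subsets = {}
    for row, y in zip(rows, labels):
        subsets.setdefault(row[attr], []).append(y)
    gain = entropy(labels) - sum(len(ys) / n * entropy(ys)
                                 for ys in subsets.values())
    split_info = entropy([row[attr] for row in rows])   # H(A, S)
    return gain / split_info if split_info else 0.0
```

On a toy set an id-like attribute and a perfect two-valued attribute both have maximal gain, but GainRatio halves the id's score, so the genuine attribute wins.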
Attributes with different prices

What if the tests in the tree also cost something?
✔ Then we would like to have the cheap tests close to the root.
✔ If we have Cost(A) ∈ ⟨0, 1⟩, then we can use e.g.

  Gain²(A, S) / Cost(A),

or

  (2^Gain(A,S) − 1) / (Cost(A) + 1)^w

to bias the preference toward cheaper tests.
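Both cost-sensitive measures fit in two lines; a sketch (function names are mine, with Gain and Cost assumed already computed and Cost(A) ∈ ⟨0, 1⟩):

```python
def cost_weighted_gain_quadratic(gain, cost):
    """Gain^2(A, S) / Cost(A): strongly prefers cheap, informative tests."""
    return gain ** 2 / cost

def cost_weighted_gain_exponential(gain, cost, w=1.0):
    """(2^Gain(A, S) - 1) / (Cost(A) + 1)^w; w tunes the strength of the bias."""
    return (2 ** gain - 1) / (cost + 1) ** w
```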
Continuous input attributes

Continuous or integer-valued input attributes:
✔ have an infinite set of possible values. (Infinitely many branches?)
✔ Use the binary split with the highest information gain.
  ✘ Sort the values of the attribute.
  ✘ Consider only split points lying between 2 examples with different classifications.

  Temperature  -20  -9  -2    5   16   26   32  35
  Go out?       No  No  Yes  Yes  Yes  Yes  No  No

✔ Previously used attributes can be used again in subsequent tests!
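The threshold-search recipe above, applied to the slide's Temperature data: sort, consider only midpoints between neighbours with different labels, pick the split with the highest information gain. A sketch with assumed helper names:

```python
from math import log2

def h_b(q):
    """Binary entropy, with the convention 0 * log2(0) = 0."""
    return 0.0 if q in (0.0, 1.0) else -q * log2(q) - (1 - q) * log2(1 - q)

def best_threshold(xs, ys):
    """Return (threshold, information gain) of the best binary split x <= t."""
    pairs = sorted(zip(xs, ys))
    n = len(pairs)
    base = h_b(sum(y for _, y in pairs) / n)
    best_t, best_gain = None, 0.0
    for i in range(n - 1):
        if pairs[i][1] == pairs[i + 1][1]:
            continue              # only midpoints between different classes
        t = (pairs[i][0] + pairs[i + 1][0]) / 2
        left = [y for x, y in pairs if x <= t]
        right = [y for x, y in pairs if x > t]
        rem = (len(left) / n) * h_b(sum(left) / len(left)) \
            + (len(right) / n) * h_b(sum(right) / len(right))
        if base - rem > best_gain:
            best_t, best_gain = t, base - rem
    return best_t, best_gain

temps = [-20, -9, -2, 5, 16, 26, 32, 35]
go_out = [0, 0, 1, 1, 1, 1, 0, 0]   # the slide's "Go out?" row (1 = Yes)
```

On this data only two candidate thresholds exist (−5.5 and 29) and both tie on gain, so a second test on Temperature deeper in the tree is needed, illustrating the last bullet.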
Continuous output variable

Regression tree:
✔ In each leaf, it can have
  ✘ a constant value (usually the average of the output variable over the training set), or
  ✘ a linear function of some subset of the numerical input attributes.
✔ The learning algorithm must decide when to stop splitting and begin applying linear regression.
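The constant-leaf variant can be sketched with a variance-reduction split criterion (a common choice for regression trees; the slides do not fix the criterion, so this is an assumption):

```python
def variance(ys):
    m = sum(ys) / len(ys)
    return sum((y - m) ** 2 for y in ys) / len(ys)

def leaf_value(ys):
    """Constant leaf prediction: the mean of the training outputs in the leaf."""
    return sum(ys) / len(ys)

def best_split(xs, ys):
    """Threshold minimizing the weighted variance of the two sides."""
    pairs = sorted(zip(xs, ys))
    best_t, best_score = None, float("inf")
    for i in range(len(pairs) - 1):
        t = (pairs[i][0] + pairs[i + 1][0]) / 2
        left = [y for x, y in pairs if x <= t]
        right = [y for x, y in pairs if x > t]
        score = len(left) * variance(left) + len(right) * variance(right)
        if score < best_score:
            best_t, best_score = t, score
    return best_t
```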
Summary
Summary

✔ Decision trees are one of the simplest, most universal, and most widely used prediction models.
✔ They are not suitable for all modeling problems (relations, etc.).
✔ TDIDT is the most widely used technique to build a tree from data.
✔ It uses a greedy divide-and-conquer approach.
✔ Individual variants differ mainly in
  ✘ what types of attributes they are able to handle,
  ✘ the attribute importance measure,
  ✘ whether they make enumerative or just binary splits,
  ✘ whether and how they handle missing data,
  ✘ etc.