
# Bayesian Networks

## Recap of Last Lecture

- Probability: a precise representation of uncertainty.
- Probability theory: optimal updating of knowledge based on new information (Bayesian inference).
- Conditional probability: P(x | y) = P(x, y) / P(y)
- Product rule: P(x, y) = P(x | y) P(y)
- Chain rule: P(x1, ..., xn) = P(x1) P(x2 | x1) P(x3 | x1, x2) ... = ∏ᵢ P(xᵢ | x1, ..., xᵢ₋₁)
- x and y are independent iff P(x, y) = P(x) P(y)

## x and y are conditionally independent given z iff P(x, y | z) = P(x | z) P(y | z)
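The definition above can be checked numerically. The joint table below is a hypothetical example built so that x and y are conditionally independent given z (the probabilities are invented for illustration):

```python
from itertools import product

# Hypothetical joint P(x, y, z) over three binary variables, constructed
# as P(z) * P(x|z) * P(y|z) so that x and y are independent given z.
p_z = {0: 0.4, 1: 0.6}
p_x1_given_z = {0: 0.2, 1: 0.7}  # P(x=1 | z)
p_y1_given_z = {0: 0.5, 1: 0.9}  # P(y=1 | z)

joint = {}
for x, y, z in product([0, 1], repeat=3):
    px = p_x1_given_z[z] if x == 1 else 1 - p_x1_given_z[z]
    py = p_y1_given_z[z] if y == 1 else 1 - p_y1_given_z[z]
    joint[(x, y, z)] = px * py * p_z[z]

def cond_indep(joint, tol=1e-12):
    """True iff P(x, y | z) = P(x | z) P(y | z) for every x, y, z."""
    for z in [0, 1]:
        pz = sum(p for (_, _, zz), p in joint.items() if zz == z)
        for x, y in product([0, 1], repeat=2):
            p_xy_z = joint[(x, y, z)] / pz
            p_x_z = sum(joint[(x, yy, z)] for yy in [0, 1]) / pz
            p_y_z = sum(joint[(xx, y, z)] for xx in [0, 1]) / pz
            if abs(p_xy_z - p_x_z * p_y_z) > tol:
                return False
    return True

print(cond_indep(joint))  # → True by construction
```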

## Probabilistic Models

Models describe how (a portion of) the world works.

- Models are always simplifications:
  - They may not account for every variable.
  - They may not account for all interactions between variables.

What do we do with probabilistic models? We (or our agents) need to reason about the unknown variables, given evidence. Examples:

a) Explanation (diagnostic reasoning)
b) Prediction (causal reasoning)
c) Value of information

## Probabilistic Reasoning

Suppose we go to my house and see that the door is open. What's the cause? Is it a burglar? Should we go in? Call the police? Then again, it could just be my wife; maybe she came home early. How should we represent these relationships?

## Bayes Nets: Big Picture

There are two problems with using the full joint distribution table as our probabilistic model:

- Unless there are only a few variables, the joint is way too big to represent explicitly. For n variables with domain size d, the joint table has dⁿ entries.
- It is hard to learn (estimate) anything empirically about more than a few variables at a time.

Bayes nets: a technique for describing complex joint distributions (models) using simple, local distributions (conditional probabilities).

- More properly called graphical models.
- We describe how variables interact locally.
- Local interactions chain together to give global, indirect interactions.
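The scale gap can be made concrete with a quick count. The full joint needs dⁿ entries, while a Bayes net in which each node has at most k parents needs only about n·d^(k+1) local table entries (this bound is the standard counting argument, not taken from the slide):

```python
# Size of the full joint table vs. a Bayes net's local tables.

def joint_table_size(n, d):
    """Entries in the full joint over n variables with domain size d."""
    return d ** n

def bayes_net_size(n, d, k):
    """Upper bound on CPT entries for n nodes with at most k parents each:
    each node stores a table over itself plus its parents."""
    return n * d ** (k + 1)

print(joint_table_size(30, 2))   # 2**30 = 1073741824 entries
print(bayes_net_size(30, 2, 3))  # 30 * 2**4 = 480 entries
```

Thirty binary variables already push the full joint past a billion entries, while a sparsely connected Bayes net stays in the hundreds.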

## Graphical Model Notation

Causal relationships are represented in a directed acyclic graph. Arrows (arcs) indicate relationships between nodes. For now, imagine that arrows mean direct causation (in general, they don't).

Example: both *wife* and *burglar* are parents of *open door*:

wife → open door ← burglar
## Types of Probabilistic Relationships

1. Independent: A and B, no edge between them.
   P(A | B) = P(A), P(B | A) = P(B), so P(A, B) = P(A) P(B)

2. Direct cause: A → B.
   B's local distribution is P(B | A).

3. Indirect cause: A → B → C.
   P(C | A, B) = P(C | B): C is conditionally independent of A given B.

4. Common cause: B ← A → C.
   P(B, C | A) = P(B | A) P(C | A): B and C are conditionally independent given A.

5. Common effect: A → C ← B.
   A and B are independent, but become dependent once C is observed; C's local distribution is P(C | A, B).

## Let's Build a Causal Graphical Model

Example 1:
- T: Traffic
- R: It rains
- L: Low pressure
- D: Roof drips
- B: Ballgame
- C: Cavity

Example 2:
- B: Burglary
- A: Alarm goes off
- M: Mary calls
- J: John calls
- E: Earthquake!
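Example 2's structure can be written down as a parent map. The slide lists only the variables; the edge directions below follow the usual burglary-alarm network and are an assumption:

```python
# Sketch of Example 2 as a parent map (edge directions assumed: burglary
# and earthquake cause the alarm; the alarm causes John and Mary to call).
parents = {
    "Burglary": [],
    "Earthquake": [],
    "Alarm": ["Burglary", "Earthquake"],
    "JohnCalls": ["Alarm"],
    "MaryCalls": ["Alarm"],
}

# The joint then factors as one local conditional per node:
# P(B, E, A, J, M) = P(B) P(E) P(A | B, E) P(J | A) P(M | A)
for node, ps in parents.items():
    cond = f" | {', '.join(ps)}" if ps else ""
    print(f"P({node}{cond})")
```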

http://www.aispace.org/bayes/index.shtml

## Inference

Inference: calculating some useful quantity from a joint probability distribution. Examples:

- Posterior probability, P(Q | e1, ..., ek): belief networks.
- Most likely explanation, argmax_q P(Q = q | e1, ..., ek): decision networks.

## Variable Elimination

Inference by enumeration is slow:

- You join up the whole joint distribution before you sum out the hidden variables.

Idea: interleave joining and marginalizing! This is variable elimination.

Example: the traffic domain. Random variables: R (rainy), T (traffic), L (late for class), forming the chain R → T → L.

Given: P(R), P(T | R), and P(L | T), for example:

| T  | L  | P(L \| T) |
|----|----|-----------|
| +t | +l | 0.3       |
| +t | -l | 0.7       |
| -t | +l | 0.1       |
| -t | -l | 0.9       |

## Operation 1: Join Factors

(a) Joining on R combines P(R) and P(T | R) into a single factor over both variables:

P(R, T) = P(R) P(T | R)

with P(+r) = 0.1 and P(-r) = 0.9.

(b) Joining on T as well gives the full joint:

P(R, T, L) = P(R) P(T | R) P(L | T)
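A minimal sketch of the join on R. P(+r) = 0.1 follows the slide; the P(T | R) entries are illustrative stand-ins, since the slide's table for it is not specified here:

```python
# Join P(R) and P(T|R) into one factor P(R, T) by pointwise multiplication
# over the shared variable R.
p_r = {"+r": 0.1, "-r": 0.9}
p_t_given_r = {
    ("+r", "+t"): 0.8, ("+r", "-t"): 0.2,  # hypothetical values
    ("-r", "+t"): 0.1, ("-r", "-t"): 0.9,  # hypothetical values
}

p_rt = {(r, t): p_r[r] * p_t_given_r[(r, t)] for (r, t) in p_t_given_r}

for rt, p in sorted(p_rt.items()):
    print(rt, round(p, 3))
# A joint's entries sum to 1:
print(round(sum(p_rt.values()), 3))  # → 1.0
```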

## Operation 2: Eliminate

Marginalize a variable out of a factor by summing over its values. Example: P(T), P(L).

(a) P(T) = Σ_r P(r, T)

(b) P(L) = Σ_t P(t, L)

## Operation 3: Evidence

Example: compute P(L | +r).

- From P(R, T, L), select the entries consistent with the evidence R = +r, then sum out T to find P(+r, L).
- Normalize: P(L | +r) = P(+r, L) / P(+r).
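The three operations combine into an end-to-end sketch on the chain R → T → L, computing P(L | +r). P(+r) = 0.1 and the P(L | T) table follow the slides; the P(T | R) numbers are illustrative stand-ins:

```python
# Variable elimination on R -> T -> L for the query P(L | +r).
p_t_given_r = {("+r", "+t"): 0.8, ("+r", "-t"): 0.2,   # hypothetical
               ("-r", "+t"): 0.1, ("-r", "-t"): 0.9}   # hypothetical
p_l_given_t = {("+t", "+l"): 0.3, ("+t", "-l"): 0.7,
               ("-t", "+l"): 0.1, ("-t", "-l"): 0.9}

# Operation 3: fix the evidence R = +r (keep only matching rows, P(+r)=0.1).
# Operation 1: join the remaining factors over T.
# Operation 2: eliminate (sum out) the hidden variable T.
p_l_and_r = {}
for l in ["+l", "-l"]:
    p_l_and_r[l] = sum(0.1 * p_t_given_r[("+r", t)] * p_l_given_t[(t, l)]
                       for t in ["+t", "-t"])

# Normalize to turn P(+r, L) into P(L | +r).
z = sum(p_l_and_r.values())
p_l_given_pr = {l: p / z for l, p in p_l_and_r.items()}
print(p_l_given_pr)  # → {'+l': 0.26, '-l': 0.74}
```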

## Another Example

Nodes: Wife, Burglar, Car in garage, Opened door, Damaged door. Wife and Burglar are both causes of Opened door; Wife also causes Car in garage, and Burglar also causes Damaged door:

car in garage ← wife → opened door ← burglar → damaged door

Given: P(w), P(b), P(c | w), P(d | b), and P(o | w, b).

What is the probability that the door is open, it is my wife and not a burglar, the car is in the garage, and the door is not damaged? That is, P(o, w, ¬b, c, ¬d).

## One Solution!

We can repeatedly apply the product rule relating joint and conditional probabilities, then drop the conditioning variables that the network makes irrelevant:

P(o, w, ¬b, c, ¬d)
= P(o | w, ¬b, c, ¬d) P(w, ¬b, c, ¬d)
= P(o | w, ¬b) P(c | w, ¬b, ¬d) P(w, ¬b, ¬d)
= P(o | w, ¬b) P(c | w) P(¬d | w, ¬b) P(w, ¬b)
= P(o | w, ¬b) P(c | w) P(¬d | ¬b) P(w) P(¬b)

## Real-World BN Applications

"Microsoft's competitive advantage lies in its expertise in Bayesian networks." (Bill Gates, quoted in the LA Times, 1996)

- MS Answer Wizards, (printer) troubleshooters
- Medical diagnosis
- Genetic pedigree analysis
- Speech recognition (HMMs)
- Gene sequence/expression analysis
- Turbo codes (channel coding)
Microsofts competitive advantage lies in its expertise in Bayesian Networks o Bill Gates quoted in LA Times, 1996. MS Answer Wizards, (printer) troubleshooters Medical Diagnosis Genetic pedigree analysis Speech recognition (HMM) Gene sequence/expression analysis Turbocodes (channel coding)