
Bayesian Inference

Jérémie Mattout, PhD

Lyon Neuroscience Research Center, France


Brain Dynamics & Cognition Team (DYCOG)

With many thanks to Jean Daunizeau, Guillaume Flandin, Karl Friston & Will Penny

Introduction and general principles | key words and definitions | inference techniques and outcomes

Some prior belief to start with


The ubiquity of Bayesian inference


Bayesian inference to test biophysical models of neuronal activity (neuroimaging data)

Bayesian inference to test computational models of the mind (behavioural data)


A comparison of fixed-step-size and Bayesian staircases for sensory threshold estimation
Alcalá-Quintana, Rocío; García-Pérez, Miguel A.
Spatial Vision, 2007

Bayesian inference as a model of cognitive processes (sensory data)

Science 2011


What does inference mean?


Statistics: concerned with the collection, analysis and interpretation of data to
make decisions

Applied statistics

Descriptive statistics
summary statistics, graphics

Inferential statistics
Data interpretation, decision making
(modeling, accounting for randomness and uncertainty, hypothesis testing, inferring hidden parameters)

Probability


The logic of probability


The true logic for this world is the calculus of Probabilities, which takes account of the
magnitude of the probability which is, or ought to be, in a reasonable man's mind.
James Clerk Maxwell (1850)
Logic
You know that:

if A is true, then B is true


Binary values

Then if B is false, you deduce for sure that A is false

Plausibility
You know that:

if A is true, then B is true

What if A is false? Isn't B a little less likely to be true?

What if you observe B? What could be induced about A?
Real numbers


The logic of probability



P. de Fermat (1601-1665)

B. Pascal (1623-1662)
A.N. Kolmogorov (1903-1987)

Cox-Jaynes theorem:
1. Divisibility and comparability
2. Common sense
3. Consistency

Consequences: plausibilities must obey the rules of probability, e.g.
  P(false) = 0,  P(true) = 1,  P(A | B) = P(A, B) / P(B)


From consequences to causes

Given a bag with twice as many white balls as red balls, what is the probability of drawing 2 red balls?

Deductive reasoning

P(2 red balls) = ?

Given that we have drawn 5 red and 15 white balls, what is the proportion of red balls in the bag?

Inductive reasoning
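A minimal numerical sketch of both directions (assuming, since the slide does not say, draws with replacement for the deductive question, and a flat Beta(1, 1) prior for the inductive one):

```python
from scipy import stats

# Deductive: the bag holds twice as many white as red balls, so P(red) = 1/3.
# Assuming draws with replacement (independent draws), P(2 red) = (1/3)^2.
p_red = 1.0 / 3.0
print(f"P(2 red balls) = {p_red ** 2:.3f}")          # ~0.111

# Inductive: we observed 5 red and 15 white draws and ask about the proportion
# of red balls. With a flat Beta(1, 1) prior, conjugacy with the binomial
# likelihood gives the posterior Beta(1 + 5, 1 + 15).
posterior = stats.beta(1 + 5, 1 + 15)
print(f"Posterior mean of the red proportion = {posterior.mean():.3f}")
print(f"95% credible interval = {posterior.interval(0.95)}")
```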


Bayes' theorem

Reverend Thomas Bayes (~1748)

Pierre Simon Laplace (~1774)

prior belief + objective observations = updated belief

C: causes
F: facts (observations)

Joint probability:        p(F, C) = p(F | C) p(C)

Conditional probability:  p(C | F) = p(F | C) p(C) / p(F)

Marginal probability:     p(F) = Σ_i p(F | C_i) p(C_i)
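To make joint, marginal and conditional probabilities concrete, here is a small sketch with two hypothetical causes; the prior and likelihood values are invented for illustration:

```python
import numpy as np

# Two possible causes C with prior probabilities p(C)
p_C = np.array([0.7, 0.3])            # illustrative priors over two causes
# Likelihood of observing the fact F under each cause, p(F | C)
p_F_given_C = np.array([0.2, 0.9])

# Joint probability: p(F, C) = p(F | C) p(C)
p_joint = p_F_given_C * p_C

# Marginal probability (sum rule): p(F) = sum_i p(F | C_i) p(C_i)
p_F = p_joint.sum()

# Conditional probability (Bayes' theorem): p(C | F) = p(F | C) p(C) / p(F)
p_C_given_F = p_joint / p_F
print("p(F)     =", p_F)              # 0.41
print("p(C | F) =", p_C_given_F)      # [0.341..., 0.658...]
```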


Frequentist vs. Bayesian

Frequentist interpretation

- Probability = frequency of occurrence of an event, given an infinite number of trials
- Is only defined for random processes that can be observed many times
- Is meant to be Objective

Bayesian interpretation

- Probability = degree of belief, a measure of uncertainty
- Can be arbitrarily defined for any type of event
- Is considered as Subjective in essence


An example of Bayesian reasoning

ML (maximum likelihood) vs. MAP (maximum a posteriori)

Tenenbaum et al., Science, 2011


In the context of neuroimaging


What I cannot create, I do not understand.
Richard Feynman (1918-1988)

Deductive Reasoning / Predictions / Generative model / Forward problem

Likelihood: P(Y | θ, M)

Causes (θ)  →  Observations (Y)

Posterior distribution: P(θ | Y, M)

Inductive Reasoning / Estimations / Inference method / Inverse problem


Bayesian inference

Model/Hypothesis: M
To be inferred: θ

Bayes' rule:

  P(θ | Y, M) = P(Y | θ, M) P(θ | M) / P(Y | M)

  Posterior (or conditional): P(θ | Y, M)
  Likelihood: P(Y | θ, M)
  Prior: P(θ | M)
  Marginal likelihood (or evidence): P(Y | M)


Likelihood function
Assumption: e.g. a linear model, y = f(θ)

But data are noisy: y = f(θ) + ε

Likelihood: P(Y | θ, M)

Noise distribution: ε ~ N(0, σ²), i.e.

  p(ε) = (1 / √(2πσ²)) exp( -ε² / (2σ²) )

Data distribution, given the parameter value:

  p(y | θ) = (1 / √(2πσ²)) exp( -(y - f(θ))² / (2σ²) )
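A small numerical sketch of this likelihood; the linear form f(θ) = θ·x, the true parameter value and the noise level are all invented here:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Illustrative generative model: y = theta * x + noise, noise ~ N(0, sigma^2)
x = np.linspace(0, 1, 20)
theta_true, sigma = 2.0, 0.5
y = theta_true * x + rng.normal(0, sigma, size=x.shape)

def log_likelihood(theta):
    """log p(y | theta) under Gaussian noise of known variance sigma^2."""
    return stats.norm(loc=theta * x, scale=sigma).logpdf(y).sum()

# The likelihood peaks near the true parameter value
for theta in (1.0, 2.0, 3.0):
    print(f"theta = {theta}: log-likelihood = {log_likelihood(theta):.2f}")
```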


Incorporating priors

Likelihood and prior together define the generative model M:

Likelihood: P(Y | θ, M)
  y = f(θ) + ε,  ε ~ N(0, σ²)

Prior: P(θ | M)
  θ ~ N(η, Σ)
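In this linear-Gaussian case the prior and likelihood combine analytically into a Gaussian posterior. A minimal sketch, with illustrative numbers and a scalar parameter:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated data from y = theta * x + noise (illustrative values)
x = np.linspace(0, 1, 20)
theta_true, sigma = 2.0, 0.5
y = theta_true * x + rng.normal(0, sigma, size=x.shape)

# Gaussian prior on theta: theta ~ N(eta, s2)
eta, s2 = 0.0, 1.0

# For a linear-Gaussian model the posterior is Gaussian with
#   precision_post = 1/s2 + x'x / sigma^2
#   mean_post      = (eta/s2 + x'y / sigma^2) / precision_post
precision_post = 1.0 / s2 + (x @ x) / sigma**2
mean_post = (eta / s2 + (x @ y) / sigma**2) / precision_post

print(f"Posterior mean (MAP) of theta: {mean_post:.3f}")
print(f"Posterior std of theta:        {np.sqrt(1.0 / precision_post):.3f}")
```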


About priors

Shrinkage prior:  θ ~ N(0, Σ)

Uninformative (objective) prior:  θ ~ N(0, Σ) with large Σ


Ir-reverend Bayes

Conjugate prior:
when the prior and posterior distributions belong to the same family

Likelihood dist.    Conjugate prior dist.
Binomial            Beta
Multinomial         Dirichlet
Gaussian            Gaussian
Gamma               Gamma

Hierarchical models and empirical priors

Likelihood:  y = f(θ1) + ε,  ε ~ N(0, Σ)

Priors:  θ = {θ1, θ2, ..., θn}
  θ1 ~ N(θ2, Σ2)
  θ2 ~ N(θ3, Σ3)
  ...
  θn ~ N(η, Σn)

Causality flows down the hierarchy (from higher to lower levels); inference ascends it, yielding empirical priors at the intermediate levels.
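A minimal sketch of the empirical-prior idea for a two-level Gaussian hierarchy: the group level is estimated from the data themselves and then acts as a prior that shrinks each first-level estimate (the number of "subjects" and all variances are invented):

```python
import numpy as np

rng = np.random.default_rng(2)

# Two-level hierarchy (illustrative): theta_subject ~ N(theta_group, s2_group),
# and each subject's data point y ~ N(theta_subject, s2_noise)
n_sub, s2_group, s2_noise = 8, 1.0, 0.25
theta_group_true = 3.0
theta_sub = rng.normal(theta_group_true, np.sqrt(s2_group), n_sub)
y = rng.normal(theta_sub, np.sqrt(s2_noise))

# Empirical prior: the group mean is estimated from the data themselves
theta_group_hat = y.mean()

# Precision-weighted shrinkage of each subject estimate toward the group mean
w = (1 / s2_noise) / (1 / s2_noise + 1 / s2_group)
theta_sub_post = w * y + (1 - w) * theta_group_hat

print("Raw subject estimates:    ", np.round(y, 2))
print("Shrunk (empirical Bayes): ", np.round(theta_sub_post, 2))
```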


Hypothesis testing: classical vs. Bayesian

Classical approach:

- define the null, e.g.: H0: θ = 0
- estimate parameters (obtain a test statistic): t = t(Y), with null distribution p(t | H0)
- apply decision rule, i.e.: if P(t > t* | H0) ≤ α, then reject H0

→ classical SPM

Bayesian approach:

- invert the model (obtain the posterior pdf): p(θ | y)
- define the null, e.g.: H0: θ > 0
- apply decision rule, i.e.: if P(H0 | y) ≥ α, then accept H0

→ Bayesian PPM
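A toy comparison of the two decision rules for a one-sample test on a mean; the data, the (flat) prior and the thresholds are illustrative assumptions, not part of the slide:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
y = rng.normal(0.4, 1.0, size=30)            # illustrative data with a small true effect
n, mean, sem = y.size, y.mean(), y.std(ddof=1) / np.sqrt(y.size)

# Classical: H0: theta = 0; test statistic t = mean / sem; reject if p <= alpha
t_stat = mean / sem
p_value = 1 - stats.t.cdf(t_stat, df=n - 1)  # one-sided p-value
print(f"t = {t_stat:.2f}, p = {p_value:.4f}, reject H0: {p_value <= 0.05}")

# Bayesian: with a flat prior the posterior over theta is approx N(mean, sem^2);
# report the posterior probability that theta > 0
p_theta_pos = 1 - stats.norm(loc=mean, scale=sem).cdf(0.0)
print(f"P(theta > 0 | y) = {p_theta_pos:.4f}, accept theta > 0: {p_theta_pos >= 0.95}")
```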


Hypothesis testing : Bayesian model comparison


define the null and the alternative hypothesis in terms of priors, e.g.:

  H0: p(θ | H0) = 1 if θ = 0, 0 otherwise
  H1: p(θ | H1) = N(0, Σ)

apply decision rule, i.e.:

  if P(Y | H0) / P(Y | H1) < u, then reject H0

(figure: p(Y | H0) and p(Y | H1) plotted over Y, the space of all datasets)


Inference on models

if P(Y | M1) > P(Y | M2), select model 1

In practice, compute the Bayes factor

  BF12 = P(Y | M1) / P(Y | M2)

and apply the decision rule.
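Model evidences are usually handled on the log scale, so the Bayes factor comes from exponentiating a difference of log-evidences. A sketch with invented log-evidence values, using the common (but conventional) threshold of 20 for "strong" evidence:

```python
import numpy as np

# Illustrative log-evidences, e.g. free-energy approximations from two fitted models
log_evidence_m1 = -310.2
log_evidence_m2 = -313.7

# Bayes factor in favour of model 1
log_bf12 = log_evidence_m1 - log_evidence_m2
bf12 = np.exp(log_bf12)

# Posterior model probability, assuming equal priors over the two models
p_m1 = 1.0 / (1.0 + np.exp(-log_bf12))

print(f"BF12 = {bf12:.1f}, P(M1 | Y) = {p_m1:.3f}")
if bf12 > 20:                     # common rule of thumb for 'strong' evidence
    print("Select model 1")
```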


Hypothesis testing and principle of parsimony

Model evidence

Occam's razor: complex models should not be considered without necessity.

(figure: model evidence p(Y | M) plotted over data space, for models y = f(x) of differing complexity)

  p(Y | M) = ∫ p(Y | θ, M) p(θ | M) dθ

Usually no exact analytic solution!
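Because the integral rarely has a closed form, it is often approximated numerically. The sketch below uses plain Monte Carlo sampling from the prior (model, data and prior widths are all invented) and also illustrates the Occam effect: a needlessly broad prior lowers the evidence:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
x = np.linspace(0, 1, 20)
y = 2.0 * x + rng.normal(0, 0.5, size=x.shape)       # illustrative data

def log_evidence(prior_sd, n_samples=100_000):
    """Monte Carlo estimate of log p(Y | M) = log E_prior[ p(Y | theta) ]."""
    theta = rng.normal(0, prior_sd, n_samples)        # samples from the prior
    loglik = stats.norm(theta[:, None] * x, 0.5).logpdf(y).sum(axis=1)
    m = loglik.max()                                  # log-mean-exp for stability
    return m + np.log(np.exp(loglik - m).mean())

# A tight prior (simpler model) vs. a needlessly broad prior (more complex model)
print("log evidence, prior sd = 3  :", round(log_evidence(3.0), 2))
print("log evidence, prior sd = 100:", round(log_evidence(100.0), 2))
```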


Approximations to the model evidence

  BIC = 2 log [ sup_θ P(Y | θ, M1) / sup_θ P(Y | θ, M2) ] + (n2 - n1) log N

  AIC = 2 log [ sup_θ P(Y | θ, M1) / sup_θ P(Y | θ, M2) ] + 2 (n2 - n1)

where n1 and n2 are the numbers of free parameters of models 1 and 2, and N is the number of data points.
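A sketch of these two approximations in the comparison form written above, with invented maximized log-likelihoods and parameter counts for two hypothetical models:

```python
import numpy as np

# Illustrative maximized log-likelihoods and parameter counts for two models
loglik_m1, n1 = -120.5, 3
loglik_m2, n2 = -118.9, 8
N = 100                                   # number of data points

# Approximations to twice the log Bayes factor in favour of model 1
bic = 2 * (loglik_m1 - loglik_m2) + (n2 - n1) * np.log(N)
aic = 2 * (loglik_m1 - loglik_m2) + 2 * (n2 - n1)

print(f"BIC-based comparison: {bic:.2f}")  # > 0 favours the simpler model 1
print(f"AIC-based comparison: {aic:.2f}")
```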

Free energy F

Obtained from Variational Bayes inference

For non-linear models, F is used as a proxy for the model evidence


Variational Bayes Inference


Variational Bayes (VB) / Expectation Maximization (EM) / Restricted Maximum Likelihood (ReML)

Main features:
- Iterative optimization procedure
- Yields a twofold inference, on parameters and on models
- Uses a fixed-form approximate posterior q(θ)
- Makes use of approximations (e.g. mean field, Laplace) to approach the posterior p(θ | y, m) and the evidence p(y | m)
- The criterion to be maximized is the free energy F
- F is a lower bound to the log-evidence:

  ln p(y | m) = F + KL[ q(θ) ; p(θ | y, m) ]
  F = < ln p(y | θ, m) >_q - KL[ q(θ) ; p(θ | m) ]
  F = accuracy - complexity
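A minimal numerical check of this bound in a conjugate Gaussian toy model (known noise variance, Gaussian prior on the mean, invented numbers), where the exact log-evidence is available for comparison; F evaluated at the exact posterior recovers the log-evidence, and any other q gives a lower value:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# Conjugate toy model: y_i ~ N(theta, sigma2), prior theta ~ N(mu0, s0_2)
n, sigma2, mu0, s0_2 = 15, 1.0, 0.0, 4.0
y = rng.normal(1.5, np.sqrt(sigma2), n)

def free_energy(m, v):
    """F = <ln p(y | theta)>_q - KL[q(theta); p(theta)], with q = N(m, v)."""
    accuracy = np.sum(-0.5 * np.log(2 * np.pi * sigma2)
                      - ((y - m) ** 2 + v) / (2 * sigma2))
    complexity = 0.5 * (v / s0_2 + (m - mu0) ** 2 / s0_2 - 1 + np.log(s0_2 / v))
    return accuracy - complexity

# Exact posterior (conjugate case) and exact log-evidence, for reference
post_prec = 1 / s0_2 + n / sigma2
post_mean = (mu0 / s0_2 + y.sum() / sigma2) / post_prec
log_evidence = stats.multivariate_normal(
    mean=np.full(n, mu0), cov=sigma2 * np.eye(n) + s0_2).logpdf(y)

print("F at the exact posterior :", round(free_energy(post_mean, 1 / post_prec), 3))
print("Exact log-evidence       :", round(log_evidence, 3))
print("F at a poorer q          :", round(free_energy(0.0, 1.0), 3))
```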


Bayes' rule for models

For non-linear models, F is used as a proxy for the model evidence


Family level inference


Example of model posterior (8 models)

Similar models share probability mass (dilution).


The probability for any single model can become very small, especially for large model spaces.


Family level inference


If we can assign each model m to a family f, one can compare families based on their posterior probabilities, which are simply written as

  P(f | Y) = Σ_{m ∈ f} P(m | Y)


Within-family parameter estimation: Bayesian model averaging

Each DCM.mat file stores the posterior mean (DCM.Ep) and posterior covariance (DCM.Cp) for that particular model M. They define the posterior distribution over parameters:

  P(θ | Y, M)

This posterior can be combined with the posterior model probabilities P(M | Y) to compute a posterior over parameters that is independent of model assumptions (within the chosen set):

  P(θ | Y) = Σ_M P(θ | Y, M) P(M | Y)

We marginalize over model space (usually restricted to the winning family).
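A sketch of that averaging step for a single parameter, assuming each model supplies a Gaussian posterior (a mean and a variance, as DCM.Ep and DCM.Cp would) plus a posterior model probability; the numbers are invented and the result is the first two moments of the mixture:

```python
import numpy as np

# Illustrative per-model posteriors over one parameter and model probabilities
post_means = np.array([0.8, 1.2, 0.2])     # e.g. posterior means for three models
post_vars = np.array([0.10, 0.05, 0.20])   # e.g. posterior variances
p_model = np.array([0.6, 0.3, 0.1])        # posterior model probabilities P(M | Y)

# Bayesian model average: moments of the mixture sum_M P(theta | Y, M) P(M | Y)
bma_mean = np.sum(p_model * post_means)
bma_var = np.sum(p_model * (post_vars + post_means ** 2)) - bma_mean ** 2

print(f"BMA posterior mean     = {bma_mean:.3f}")
print(f"BMA posterior variance = {bma_var:.3f}")
```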


Group model comparison: fixed effect (FFX)


Group model comparison: random effect (RFX)


Overview

Suggestion for further reading

When the facts change, I change my mind. What do you do, sir?
John Maynard Keynes

References
