
Nonlinear Model-Based Estimation Algorithms: Tutorial and Recent Developments
Mark L. Psiaki
Sibley School of Mechanical & Aerospace Engr., Cornell University
Aero. Engr. & Engr. Mech., UT Austin
31 March 2011
UT Austin March 11 2 of 35
Acknowledgements
Collaborators
Paul Kintner, former Cornell ECE faculty member
Steve Powell, Cornell ECE research engineer
Hee Jung, Eric Klatt, Todd Humphreys, & Shan Mohiuddin,
Cornell GPS group Ph.D. alumni
Joanna Hinks, Ryan Dougherty, Ryan Mitch, & Karen Chiang,
Cornell GPS group Ph.D. candidates
Jon Schoenberg & Isaac Miller, Cornell Ph.D. candidate/alumnus
of Prof. M. Campbell's autonomous systems group
Prof. Yaakov Oshman, The Technion, Haifa, Israel, faculty of
Aerospace Engineering
Massaki Wada, Saila System Inc. of Tokyo, Japan
Sponsors
Boeing Integrated Defense Systems
NASA Goddard
NASA OSS
NSF
Goals:
Use sensor data from nonlinear systems to infer internal
states or hidden parameters
Enable navigation, autonomous control, etc. in
challenging environments (e.g., heavy GPS jamming) or
with limited/simplified sensor suites
Strategies:
Develop models of system dynamics & sensors that relate
internal states or hidden parameters to sensor outputs
Use nonlinear estimation to invert models & determine
states or parameters that are not directly measured
Nonlinear least-squares
Kalman filtering
Bayesian probability analysis
Outline
I. Related research
II. Example problem: Blind tricyclist w/bearings-only
measurements to uncertain target locations
III. Observability/minimum sensor suite
IV. Batch filter estimation
Math model of tricyclist problem
Linearized observability analysis
Nonlinear least-squares solution
V. Models w/process noise, batch filter limitations
VI. Nonlinear dynamic estimators: mechanizations & performance
Extended Kalman Filter (EKF)
Sigma-points filter/Unscented Kalman Filter (UKF)
Particle filter (PF)
Backwards-smoothing EKF (BSEKF)
VII. Introduction of Gaussian sum techniques
VIII. Summary & conclusions
Related Research
Nonlinear least squares batch estimation: Extensive
literature & textbooks, e.g., Gill, Murray, & Wright (1981)
Kalman filter & EKF: Extensive literature & textbooks, e.g.,
Brown & Hwang (1997) or Bar-Shalom, Li, & Kirubarajan (2001)
Sigma-points filter/UKF: Julier, Uhlmann, & Durrant-Whyte
(2000), Wan & van der Merwe (2001), etc.
Particle filter: Gordon, Salmond, & Smith (1993),
Arulampalam et al. tutorial (2002), etc.
Backwards-smoothing EKF: Psiaki (2005)
Gaussian mixture filter: Sorenson & Alspach (1971), van
der Merwe & Wan (2003), Psiaki, Schoenberg, & Miller
(2010), etc.
A Blind Tricyclist Measuring Relative
Bearing to a Friend on a Merry-Go-Round
Assumptions/constraints:
Tricyclist doesn't know initial x-y position or heading, but
can accurately accumulate changes in location & heading
via dead-reckoning
Friend of tricyclist rides a merry-go-round & periodically
calls out to him, giving him a relative bearing measurement
Tricyclist knows merry-go-round location & diameter, but
not its initial orientation or its constant rotation rate
Estimation problem: determine initial location &
heading plus merry-go-round initial orientation &
rotation rate
Example Tricycle Trajectory &
Relative Bearing Measurements
See 1st Matlab movie
Is the System Observable?
Observability is the condition of having unique internal
states/parameters that produce a given measurement
time history
Verify observability before designing an estimator
because estimation algorithms do not work for
unobservable systems
Linear system observability tested via matrix rank calculations
Nonlinear system observability tested via local linearization rank
calculations & global minimum considerations of associated least-
squares problem
Failed observability test implies need for additional
sensing
Observability Failure of Tricycle
Problem & a Fix
See 2nd Matlab movie for failure/non-uniqueness
See 3rd Matlab movie for fix via additional sensing
Geometry of Tricycle Dynamics &
Measurement Models
[Figure: geometry of the tricycle & merry-go-round; X East / Y North axes, tricycle heading θ & steer angle, merry-go-round rider angle ψ_m & speed V_m]
Tricycle Dynamics Model from Kinematics
Constant-turn-radius transition from t_k to t_{k+1} = t_k + Δt:

  X_{k+1} = X_k + V_k Δt [ cos(θ_k) sinc(V_k Δt tan(γ_k)/b_w) − sin(θ_k) cinc(V_k Δt tan(γ_k)/b_w) ]
  Y_{k+1} = Y_k + V_k Δt [ sin(θ_k) sinc(V_k Δt tan(γ_k)/b_w) + cos(θ_k) cinc(V_k Δt tan(γ_k)/b_w) ]
  θ_{k+1} = θ_k + (V_k Δt / b_w) tan(γ_k)
  ψ_{m,k+1} = ψ_{mk} + ψ̇_{mk} Δt,   ψ̇_{m,k+1} = ψ̇_{mk},   for m = 1, 2

where sinc(a) = sin(a)/a, cinc(a) = [1 − cos(a)]/a, γ_k is the steer angle, & b_w is the wheelbase

State & control vector definitions:

  x_k = [X_k, Y_k, θ_k, ψ_{1k}, ψ̇_{1k}, ψ_{2k}, ψ̇_{2k}]^T,   u_k = [V_k, γ_k]^T

Consistent with standard discrete-time state-vector dynamic model form:

  x_{k+1} = f_k(x_k, u_k)
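As a concrete sketch (not the original Matlab), the constant-turn-radius transition above can be coded directly; the wheelbase b_w and sample interval Δt values here are illustrative assumptions, not the values used in the slides:

```python
import math

B_W = 1.0  # assumed wheelbase b_w [m] -- illustrative only
DT = 0.5   # assumed sample interval Delta-t [s] -- illustrative only

def sinc(a):
    # sinc(a) = sin(a)/a with the a -> 0 limit equal to 1
    return 1.0 if abs(a) < 1e-9 else math.sin(a) / a

def cinc(a):
    # cinc(a) = (1 - cos(a))/a with the a -> 0 limit equal to 0
    return 0.0 if abs(a) < 1e-9 else (1.0 - math.cos(a)) / a

def f(x, u, dt=DT, b_w=B_W):
    """One step of the tricycle/merry-go-round dynamics x_{k+1} = f_k(x_k, u_k)."""
    X, Y, th, psi1, psid1, psi2, psid2 = x
    V, gam = u
    a = V * dt * math.tan(gam) / b_w          # heading change over the step
    Xn = X + V * dt * (math.cos(th) * sinc(a) - math.sin(th) * cinc(a))
    Yn = Y + V * dt * (math.sin(th) * sinc(a) + math.cos(th) * cinc(a))
    # merry-go-round angles advance at their (constant) rates
    return [Xn, Yn, th + a, psi1 + psid1 * dt, psid1, psi2 + psid2 * dt, psid2]
```

With zero steer angle the step reduces to straight-line motion, which is a quick sanity check on the sinc/cinc limits.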
Bearing Measurement Model
Trigonometry of bearing measurement to m-th merry-go-round rider:

  β_{mk} = atan2{ Y_m + ρ_m sin(ψ_{mk}) − Y_k − b_r sin(θ_k),  X_m + ρ_m cos(ψ_{mk}) − X_k − b_r cos(θ_k) } − θ_k

where (X_m, Y_m) is the m-th merry-go-round center, ρ_m is its rider radius, & b_r is the offset of the measurement reference point along the tricycle heading

Sample-dependent measurement vector definition:

  z_k = [β_{1k}]            if only rider 1 shouts
  z_k = [β_{2k}]            if only rider 2 shouts
  z_k = [β_{1k}, β_{2k}]^T  if both riders shout
  z_k = []                  if neither rider shouts

Consistent with standard discrete-time state-vector measurement model form:

  z_k = h_k(x_k) + ν_k
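A sketch of the bearing function h_k under assumed merry-go-round parameters; the center coordinates, rider radius ρ_m, and offset b_r below are hypothetical illustration values, not those of the slides:

```python
import math

# Assumed merry-go-round parameters: center (Xc, Yc) and rider radius rho.
# These numbers are illustrative only.
MGR = {1: dict(Xc=30.0, Yc=15.0, rho=4.0),
       2: dict(Xc=45.0, Yc=-10.0, rho=4.0)}
B_R = 0.5  # assumed offset b_r of the measurement point along the heading

def bearing(x, m):
    """Relative bearing beta_mk from the tricyclist to rider m (radians)."""
    X, Y, th = x[0], x[1], x[2]
    psi = x[3] if m == 1 else x[5]            # rider angle psi_mk
    p = MGR[m]
    dx = p['Xc'] + p['rho'] * math.cos(psi) - X - B_R * math.cos(th)
    dy = p['Yc'] + p['rho'] * math.sin(psi) - Y - B_R * math.sin(th)
    return math.atan2(dy, dx) - th            # line-of-sight angle minus heading
```

Subtracting the heading θ_k makes the measurement a relative bearing, which is all the blind tricyclist can sense.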
Nonlinear Batch Filter Model
Over-determined system of equations:

  z_big = h_big(x_0) + ν_big

Definitions of vectors & model function:

  z_big = [z_1; z_2; …; z_N],   ν_big = [ν_1; ν_2; …; ν_N]

  h_big(x_0) = [ h_1[f_0(x_0, u_0)];
                 h_2{f_1[f_0(x_0, u_0), u_1]};
                 …;
                 h_N{f_{N−1}(… f_1[f_0(x_0, u_0), u_1] …, u_{N−1})} ]
Batch Filter Observability & Estimation
Linearized local observability analysis:

  H_big = ∂h_big/∂x evaluated at x_0;   test whether rank[H_big(x_0)] = dim(x)

Batch filter nonlinear least-squares estimation problem:

  find x_0 to minimize  J(x_0) = ½ [z_big − h_big(x_0)]^T R_big^{−1} [z_big − h_big(x_0)]

Approximate estimation error covariance:

  P_{xx0} = E{(x_0 − x_{0opt})(x_0 − x_{0opt})^T} ≈ [∂²J/∂x_0²]^{−1} ≈ [H_big^T R_big^{−1} H_big]^{−1}
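The nonlinear least-squares problem above is typically solved with a Gauss-Newton iteration. This generic sketch uses a finite-difference Jacobian for simplicity (analytic derivatives work equally well) and returns both the estimate and the approximate covariance [H^T R^{-1} H]^{-1}:

```python
import numpy as np

def gauss_newton(h, z, R, x0, iters=20):
    """Minimize J(x) = 0.5*(z - h(x))^T R^{-1} (z - h(x)) by Gauss-Newton."""
    x = np.asarray(x0, float)
    Rinv = np.linalg.inv(R)
    for _ in range(iters):
        r = z - h(x)
        # finite-difference Jacobian H = dh/dx
        eps = 1e-6
        H = np.column_stack([(h(x + eps * np.eye(len(x))[:, j]) - h(x)) / eps
                             for j in range(len(x))])
        # normal-equations step: (H^T R^-1 H) dx = H^T R^-1 r
        dx = np.linalg.solve(H.T @ Rinv @ H, H.T @ Rinv @ r)
        x = x + dx
        if np.linalg.norm(dx) < 1e-10:
            break
    P = np.linalg.inv(H.T @ Rinv @ H)   # approximate error covariance
    return x, P
```

A practical batch filter would add step-size control (e.g., line search or Levenberg-Marquardt damping) to get the convergence guarantees mentioned later.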
Example Batch Filter Results
[Figure: tricycle trajectory, East Position (m) vs. North Position (m); legend: Truth, Batch Estimate]
Dynamic Models with Process Noise
Typical form driven by Gaussian white random process noise v_k:

  x_{k+1} = f_k(x_k, u_k, v_k),   E{v_k} = 0,   E{v_k v_j^T} = Q_k δ_{jk}

Tricycle problem dead-reckoning errors naturally modeled as process noise
Specific process noise terms:
  Random errors between true speed V & true steer angle and the measured values used for dead-reckoning
  Wheel slip that causes odometry errors or that occurs in the side-slip direction
Effect of Process Noise on Truth Trajectory
[Figure: East Position (m) vs. North Position (m); legend: No Process Noise, Process Noise Present]
Effect of Process Noise on Batch Filter
[Figure: East Position (m) vs. North Position (m); legend: Truth, Batch Estimate]
Dynamic Filtering based on Bayesian Conditional Probability Density

  p(x_k, v_{k−1}, …, v_0 | z_1, …, z_k) = C exp{−J}

where

  J = ½ (x_0 − x̄_0)^T P_{xx0}^{−1} (x_0 − x̄_0)
      + ½ Σ_{i=0}^{k−1} { v_i^T Q_i^{−1} v_i + [z_{i+1} − h_{i+1}(x_{i+1})]^T R_{i+1}^{−1} [z_{i+1} − h_{i+1}(x_{i+1})] }

subject to x_i for i = 0, …, k−1 determined as functions of x_k & v_0, …, v_{k−1} via inversion of the equations:

  x_{i+1} = f_i(x_i, u_i, v_i)   for i = 0, …, k−1
Uses Taylor series approximations of f_k(x_k, u_k, v_k) & h_k(x_k)
Taylor expansions about approximate x_k expectation values & about v_k = 0
Normally only first-order, i.e., linear, expansions are used, but sometimes quadratic terms are included
Gaussian statistics assumed
Allows complete probability density characterization in terms of
means & covariances
Allows closed-form mean & covariance propagations
Optimal for truly linear, truly Gaussian systems
Drawbacks
Requires encoding of analytic derivatives
Loses accuracy or even stability in the presence of severe
nonlinearities
EKF Approximation
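A minimal sketch of one EKF propagate-and-update cycle implementing the linearized approximation described above; it is generic (F and H supply the Jacobians of f and h), not tied to the tricycle model:

```python
import numpy as np

def ekf_step(x, P, u, z, f, h, F, H, Q, R):
    """One EKF cycle; F(x,u) and H(x) return the Jacobians of f and h."""
    # propagate mean & covariance through the linearized dynamics
    x_bar = f(x, u)
    Phi = F(x, u)
    P_bar = Phi @ P @ Phi.T + Q
    # measurement update
    Hk = H(x_bar)
    S = Hk @ P_bar @ Hk.T + R                  # innovation covariance
    K = P_bar @ Hk.T @ np.linalg.inv(S)        # Kalman gain
    x_new = x_bar + K @ (z - h(x_bar))
    P_new = (np.eye(len(x)) - K @ Hk) @ P_bar
    return x_new, P_new
```

For a truly linear, truly Gaussian system this cycle is the optimal Kalman filter; the EKF's errors come entirely from the linearization of f and h.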
EKF Performance, Moderate Initial Uncertainty
[Figure: East Position (m) vs. North Position (m); legend: Truth, EKF Estimate]
EKF Performance, Large Initial Uncertainty
[Figure: East Position (m) vs. North Position (m); legend: Truth, EKF Estimate]
Evaluate f_k(x_k, u_k, v_k) & h_k(x_k) at specially chosen sigma points &
compute statistics of results
Sigma points & weights yield pseudo-random approximate Monte-Carlo
calculations
Can be tuned to match statistical effects of more Taylor series terms than
EKF approximation
Gaussian statistics assumed, as in EKF
Mean & covariance assumed to fully characterize distribution
Sigma points provide a describing-function-type method for improving mean
& covariance propagations, which are performed via weighted averaging
over sigma points
No need for analytic derivatives of functions
Also optimal for truly linear, truly Gaussian systems
Drawback
Additional Taylor series approximation accuracy may not be sufficient for
severe nonlinearities
Extra parameters to tune
Singularities & discontinuities may hurt UKF more than other filters
Sigma-Points UKF Approximation
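The sigma-point construction can be sketched as follows, using the common scaled (α, β, κ) parameterization from the UKF literature; for this choice of points the weighted sums recover the prior mean & covariance exactly, which is the sense in which the points "match" the distribution:

```python
import numpy as np

def sigma_points(x, P, alpha=1e-3, beta=2.0, kappa=0.0):
    """Standard 2n+1 unscented sigma points & weights for mean x, covariance P."""
    n = len(x)
    lam = alpha**2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * P)      # matrix square-root columns
    pts = [x] + [x + S[:, j] for j in range(n)] + [x - S[:, j] for j in range(n)]
    wm = np.full(2 * n + 1, 0.5 / (n + lam))   # mean weights
    wm[0] = lam / (n + lam)
    wc = wm.copy()                             # covariance weights
    wc[0] += 1.0 - alpha**2 + beta
    return np.array(pts), wm, wc
```

Propagation then amounts to pushing each point through f_k or h_k and re-forming weighted means & covariances, with no analytic derivatives required.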
UKF Performance, Moderate Initial Uncertainty
[Figure: East Position (m) vs. North Position (m); legend: Truth, UKF A Estimate, UKF B Estimate]
UKF Performance, Large Initial Uncertainty
[Figure: East Position (m) vs. North Position (m); legend: Truth, UKF A Estimate, UKF B Estimate]
Approximate the conditional probability distribution using Monte-Carlo
techniques
Keep track of a large number of state samples & corresponding weights
Update weights based on relative goodness of their fits to measured data
Re-sample distribution if weights become overly skewed to a few points,
using regularization to avoid point degeneracy
Advantages
No need for Gaussian assumption
Evaluates f_k(x_k, u_k, v_k) & h_k(x_k) at many points, does not need analytic
derivatives
Theoretically exact in the limit of large numbers of points
Drawbacks
Point degeneracy due to skewed weights not fully compensated by
regularization
Too many points required for accuracy/convergence robustness for high-
dimensional problems
Particle Filter Approximation
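A minimal bootstrap particle-filter cycle illustrating the weight update & systematic resampling steps described above; `propagate` and `likelihood` are stand-ins for the problem's noisy dynamics f_k and measurement density, and the resampling threshold is an assumed tuning choice:

```python
import numpy as np

rng = np.random.default_rng(0)

def pf_step(particles, weights, z, propagate, likelihood):
    """One bootstrap particle-filter cycle: propagate, reweight, resample."""
    particles = np.array([propagate(p) for p in particles])
    # reweight by how well each particle explains the measurement z
    weights = weights * np.array([likelihood(z, p) for p in particles])
    weights /= weights.sum()
    # systematic resampling when the effective sample size gets too small
    n_eff = 1.0 / np.sum(weights**2)
    if n_eff < 0.5 * len(weights):
        edges = (np.arange(len(weights)) + rng.uniform()) / len(weights)
        idx = np.searchsorted(np.cumsum(weights), edges)
        idx = np.minimum(idx, len(weights) - 1)
        particles = particles[idx]
        weights = np.full(len(weights), 1.0 / len(weights))
    return particles, weights
```

A full implementation would also regularize (jitter) the resampled particles to combat the point-degeneracy drawback noted above.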
PF Performance, Moderate Initial Uncertainty
[Figure: East Position (m) vs. North Position (m); legend: Truth, Particle Filter Estimate]
PF Performance, Large Initial Uncertainty
[Figure: East Position (m) vs. North Position (m); legend: Truth, Particle Filter Estimate]
Maximizes probability density instead of trying to
approximate intractable integrals
Maximum a posteriori (MAP) estimation can be biased, but also can
be very near optimal
Standard numerical trajectory optimization-type techniques can be
used to form estimates
Performs explicit re-estimation of a number of past process noise
vectors & explicitly considers a number of past measurements in
addition to the current one, re-linearizing many f_i(x_i, u_i, v_i) & h_i(x_i) for
values of i <= k as part of a non-linear smoothing calculation
Drawbacks
Computationally intensive, though highly parallelizable
MAP not good for multi-modal distributions
Tuning parameters adjust span & solution accuracy of re-smoothing
problems
Backwards-Smoothing EKF Approximation
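The quantity a BSEKF-style estimator re-minimizes each step is the Bayesian MAP cost J from the conditional-density slide, restricted to a finite smoothing window. This sketch only evaluates that windowed cost; the trajectory-optimization solver over (x_0, v_0, …) that minimizes it is omitted:

```python
import numpy as np

def window_cost(x0, vs, us, zs, f, h, x0_bar, P0, Q, R):
    """MAP cost J over a smoothing window: prior + process-noise +
    measurement terms, with states chained through x_{i+1} = f(x_i, u_i, v_i)."""
    e0 = x0 - x0_bar
    J = 0.5 * e0 @ np.linalg.solve(P0, e0)      # prior term
    x = x0
    for v, u, z in zip(vs, us, zs):
        J += 0.5 * v @ np.linalg.solve(Q, v)    # process-noise penalty
        x = f(x, u, v)                          # propagate to next sample
        r = z - h(x)
        J += 0.5 * r @ np.linalg.solve(R, r)    # measurement residual penalty
    return J
```

Minimizing this cost with a damped Gauss-Newton method is what gives the BSEKF its batch-like insensitivity to poor initial guesses.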
Implicit Smoothing in a Kalman Filter
[Figure: x_1 vs. Sample Count k; legend: Filter Output, 1- through 5-Point Smoothers, Truth]
BSEKF Performance, Moderate Initial Uncertainty
[Figure: East Position (m) vs. North Position (m); legend: Truth, BSEKF A Estimate, BSEKF B Estimate]
BSEKF Performance, Large Initial Uncertainty
[Figure: East Position (m) vs. North Position (m); legend: Truth, BSEKF A Estimate, BSEKF B Estimate]
A PF Approximates the Probability Density
Function as a Sum of Dirac Delta Functions
[Figure, two panels: particle filter approximation of the original p_x(x) using 50 Dirac delta functions, and of the nonlinearly propagated p_f(f) using 50 Dirac delta functions]
A Gaussian Sum Spreads the Component
Functions & Can Achieve Better Accuracy
[Figure, two panels: 100-element re-sampled Gaussian approximation of the original p_x(x) with its 100 narrow weighted Gaussian components, and the EKF/100-narrow-element Gaussian mixture approximation of the propagated p_f(f)]
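The density form a Gaussian-sum filter maintains is just a weighted mixture of Gaussians; this 1-D sketch evaluates it. In a full filter, each component would be propagated by its own EKF (or UKF) and the mixture weights re-scaled by the components' measurement likelihoods:

```python
import math

def gm_pdf(x, weights, means, sigmas):
    """Evaluate a 1-D Gaussian-mixture density p(x) = sum_i w_i N(x; mu_i, sigma_i^2)."""
    return sum(w * math.exp(-0.5 * ((x - m) / s)**2) / (s * math.sqrt(2 * math.pi))
               for w, m, s in zip(weights, means, sigmas))
```

Using many narrow components is what lets the mixture track a nonlinearly propagated density more accurately than a single Gaussian.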
Summary & Conclusions
Developed novel navigation problem to illustrate
challenges & opportunities of nonlinear estimation
Reviewed estimation methods that extract/estimate
internal states from sensor data
Presented & evaluated 5 nonlinear estimation
algorithms
Examined Batch filter, EKF, UKF, PF, & BSEKF
EKF, PF, & BSEKF have good performance for moderate initial errors
Only BSEKF has good performance for large initial errors
BSEKF has batch-like properties of insensitivity to initial
estimates/guesses due to nonlinear least-squares optimization with
algorithmic convergence guarantees