$$\min_{\hat\theta} J(\theta,\hat\theta) = \min_{\hat\theta} \int_0^T \left(y(t,\theta) - \hat y(t,\hat\theta)\right)^2 dt$$
Identification (cont.)
Model structures:
- Regression models
- General (SISO) models
- State models
- "Black-box" models (e.g. impulse-response models such as the residence time distribution, neural network models; any input-output model can be placed in this class)
Identification (cont.)
Input signal:
The estimation result depends crucially on the characteristics of the input signal:
- Convergence of the estimate
- The signal must be rich enough to excite the dynamics ("persistently exciting")
If the model structure is too simple, changes in the output are explained by parameter variations, which is undesirable. An overly complex model, on the other hand, usually does not improve the input-output prediction much.
Least Squares Estimation
The model
$$y(t) = \varphi_1(t)\theta_1 + \varphi_2(t)\theta_2 + \cdots + \varphi_n(t)\theta_n = \varphi^T(t)\theta$$
where the $\varphi_i(t)$ are the regressors and the $\theta_i$ the parameters.
Least Squares (cont.)
$$Y(t) = \left[\, y(1)\;\; y(2)\;\cdots\; y(t) \,\right]^T$$
Residual = estimation error:
$$E(t) = \left[\, \varepsilon(1)\;\; \varepsilon(2)\;\cdots\; \varepsilon(t) \,\right]^T, \qquad \varepsilon(i) = y(i) - \hat y(i) = y(i) - \varphi^T(i)\theta$$
Least Squares (cont.)
$$\Phi(t) = \begin{bmatrix} \varphi^T(1) \\ \varphi^T(2) \\ \vdots \\ \varphi^T(t) \end{bmatrix}, \qquad
P(t) = \left(\Phi^T(t)\Phi(t)\right)^{-1} = \left(\sum_{i=1}^{t} \varphi(i)\varphi^T(i)\right)^{-1}$$
Loss function:
$$V(\theta,t) = \frac{1}{2}\sum_{i=1}^{t} \varepsilon(i)^2 = \frac{1}{2} E^T E$$
Least Squares (cont.)
in which
$$E = Y - \hat Y = Y - \Phi\theta$$
Solution:
$$2V(\theta,t) = E^T E = (Y - \Phi\theta)^T (Y - \Phi\theta) = Y^T Y - Y^T\Phi\theta - \theta^T\Phi^T Y + \theta^T\Phi^T\Phi\theta$$
But $\theta^T\Phi^T Y = \left(\theta^T\Phi^T Y\right)^T = Y^T\Phi\theta$ (a scalar), so
$$2V(\theta,t) = Y^T Y - 2\,Y^T\Phi\theta + \theta^T\Phi^T\Phi\theta$$
Least Squares (cont.)
Minimizing $V$ with respect to $\theta$ gives $\theta^T\Phi^T\Phi = Y^T\Phi$, i.e.
$$\Phi^T\Phi\,\hat\theta = \Phi^T Y \qquad \text{(normal equations)}$$
The solution is unique when $\Phi^T\Phi$ is nonsingular; note that $A^T A x = 0 \Leftrightarrow Ax = 0$.
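As a numerical sketch, the normal equations can be solved directly; the synthetic data and parameter values below are purely illustrative:

```python
import numpy as np

# Synthetic data: y(i) = phi^T(i) theta_true + noise (illustrative values)
rng = np.random.default_rng(0)
t, n = 200, 3
theta_true = np.array([1.5, -0.7, 0.3])
Phi = rng.normal(size=(t, n))                  # regressor matrix, rows phi^T(i)
Y = Phi @ theta_true + 0.01 * rng.normal(size=t)

# Normal equations: Phi^T Phi theta_hat = Phi^T Y
theta_hat = np.linalg.solve(Phi.T @ Phi, Phi.T @ Y)
print(theta_hat)  # close to theta_true
```

In practice `np.linalg.lstsq` is preferred over forming $\Phi^T\Phi$ explicitly, since it avoids squaring the condition number.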
$$P(t) = \left(\Phi^T(t)\Phi(t)\right)^{-1} = \left(\sum_{i=1}^{t} \varphi(i)\varphi^T(i)\right)^{-1}$$
so that
$$P(t)^{-1} = P(t-1)^{-1} + \varphi(t)\varphi^T(t)$$
and
$$\hat\theta(t) = P(t)\sum_{i=1}^{t} \varphi(i)y(i) = P(t)\left(\sum_{i=1}^{t-1} \varphi(i)y(i) + \varphi(t)y(t)\right)$$
Recursive Least Squares (cont.)
$$\sum_{i=1}^{t-1} \varphi(i)y(i) = P(t-1)^{-1}\hat\theta(t-1) = P(t)^{-1}\hat\theta(t-1) - \varphi(t)\varphi^T(t)\hat\theta(t-1)$$
so that
$$\hat\theta(t) = \hat\theta(t-1) + K(t)\varepsilon(t)$$
Recursive Least Squares (cont.)
where
$$K(t) = P(t)\varphi(t), \qquad \varepsilon(t) = y(t) - \varphi^T(t)\hat\theta(t-1)$$
i.e. the prediction error based on the previous estimate.
Matrix inversion lemma:
$$(A + BDC)^{-1} = A^{-1} - A^{-1}B\left(D^{-1} + CA^{-1}B\right)^{-1}CA^{-1}$$
Verification:
$$\begin{aligned}
(A + BDC)&\left[A^{-1} - A^{-1}B\left(D^{-1} + CA^{-1}B\right)^{-1}CA^{-1}\right] \\
&= I - B\left(D^{-1} + CA^{-1}B\right)^{-1}CA^{-1} + BDCA^{-1} - BDCA^{-1}B\left(D^{-1} + CA^{-1}B\right)^{-1}CA^{-1} \\
&= I + BDCA^{-1} - B\left(I + DCA^{-1}B\right)\left(D^{-1} + CA^{-1}B\right)^{-1}CA^{-1} \\
&= I + BDCA^{-1} - BD\left(D^{-1} + CA^{-1}B\right)\left(D^{-1} + CA^{-1}B\right)^{-1}CA^{-1} \\
&= I + BDCA^{-1} - BDCA^{-1} \\
&= I
\end{aligned}$$
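A quick numeric check of the lemma; the random, well-conditioned matrices below are illustrative only:

```python
import numpy as np

# Numeric check of the matrix inversion lemma with random matrices.
rng = np.random.default_rng(1)
n, m = 4, 2
A = rng.normal(size=(n, n)) + 5 * np.eye(n)   # shifted to keep A well conditioned
B = rng.normal(size=(n, m))
C = rng.normal(size=(m, n))
D = rng.normal(size=(m, m)) + 5 * np.eye(m)

lhs = np.linalg.inv(A + B @ D @ C)
Ainv = np.linalg.inv(A)
rhs = Ainv - Ainv @ B @ np.linalg.inv(np.linalg.inv(D) + C @ Ainv @ B) @ C @ Ainv
print(np.max(np.abs(lhs - rhs)))  # should be near zero
```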
Recursive Least Squares (cont.)
which gives, with $A = P(t-1)^{-1}$, $B = \varphi(t)$, $C = \varphi^T(t)$ and $D = I$,
$$P(t) = P(t-1) - P(t-1)\varphi(t)\left[I + \varphi^T(t)P(t-1)\varphi(t)\right]^{-1}\varphi^T(t)P(t-1)$$
It follows that
$$K(t) = P(t)\varphi(t) = P(t-1)\varphi(t)\left[1 - \frac{\varphi^T(t)P(t-1)\varphi(t)}{1 + \varphi^T(t)P(t-1)\varphi(t)}\right] = \frac{P(t-1)\varphi(t)}{1 + \varphi^T(t)P(t-1)\varphi(t)}$$
$$\underbrace{K(t)}_{n\times 1} = P(t)\varphi(t) = \frac{P(t-1)\varphi(t)}{1 + \varphi^T(t)P(t-1)\varphi(t)}$$
$$\underbrace{P(t)}_{n\times n} = P(t-1) - \frac{P(t-1)\varphi(t)\varphi^T(t)P(t-1)}{1 + \varphi^T(t)P(t-1)\varphi(t)} = \left[I - K(t)\varphi^T(t)\right]P(t-1)$$
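The resulting recursion can be sketched as follows; the synthetic data and parameter values are purely illustrative:

```python
import numpy as np

# Sketch of the recursive least-squares update derived above.
# Synthetic data: y(t) = phi^T(t) theta_true + e(t) (illustrative values).
rng = np.random.default_rng(2)
n = 2
theta_true = np.array([0.8, -0.4])

theta_hat = np.zeros(n)
P = 100.0 * np.eye(n)                          # P(0) = P0, large => weak prior

for _ in range(500):
    phi = rng.normal(size=n)
    y = phi @ theta_true + 0.01 * rng.normal()
    eps = y - phi @ theta_hat                  # prediction error eps(t)
    denom = 1.0 + phi @ P @ phi
    K = P @ phi / denom                        # gain K(t)
    theta_hat = theta_hat + K * eps            # parameter update
    P = P - np.outer(P @ phi, phi @ P) / denom # covariance update

print(theta_hat)  # close to theta_true
```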
The algorithm can be interpreted as estimating the constant parameters of the model
$$\theta(t+1) = \theta(t), \qquad y(t) = \varphi^T(t)\theta(t) + e(t)$$
With the initial condition $P(0) = P_0$, the recursion gives
$$P(t) = \left(P_0^{-1} + \Phi^T(t)\Phi(t)\right)^{-1}$$
Exponentially weighted (forgetting-factor) criterion:
$$V(\theta,t) = \frac{1}{2}\sum_{i=1}^{t} \lambda^{t-i}\left(y(i) - \varphi^T(i)\theta\right)^2$$
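The recursion changes only slightly for this criterion: in a commonly used form, the gain denominator becomes $\lambda + \varphi^T P\varphi$ and the $P$-update is divided by $\lambda$. A sketch with an assumed forgetting factor and illustrative data:

```python
import numpy as np

# Sketch of RLS with exponential forgetting. The factor lam and the
# data-generating model are assumed values for illustration.
rng = np.random.default_rng(3)
lam = 0.95
theta_true = np.array([1.0, 0.5])
theta_hat = np.zeros(2)
P = 100.0 * np.eye(2)

for _ in range(300):
    phi = rng.normal(size=2)
    y = phi @ theta_true + 0.01 * rng.normal()
    denom = lam + phi @ P @ phi
    K = P @ phi / denom                              # gain with forgetting
    theta_hat = theta_hat + K * (y - phi @ theta_hat)
    P = (P - np.outer(P @ phi, phi @ P) / denom) / lam
```

With $\lambda < 1$, old data are discounted, so the estimator can track slowly varying parameters at the price of a larger steady-state variance.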
Identification in Closed Loop
Consider the model
$$y(t) = a\,y(t-1) + b\,u(t-1) + e(t)$$
under the feedback law
$$u(t) = g\,y(t)$$
[Block diagram: the noise $e(t)$ enters through $\frac{1}{1-az^{-1}}$ and the input $u(t)$ through $\frac{bz^{-1}}{1-az^{-1}}$; their sum forms the output $y(t)$.]
$$\hat\theta(t) = \hat\theta(t-1) + \frac{\gamma\,\varphi(t)}{\alpha + \varphi^T(t)\varphi(t)}\left(y(t) - \varphi^T(t)\hat\theta(t-1)\right)$$
where $\alpha \ge 0$ and $0 < \gamma < 2$.
SA:
$$\hat\theta(t) = \hat\theta(t-1) + P(t)\varphi(t)\left(y(t) - \varphi^T(t)\hat\theta(t-1)\right)$$
where
$$P(t) = \left(\sum_{i=1}^{t} \varphi^T(i)\varphi(i)\right)^{-1}$$
is a scalar.
LMS:
$$\hat\theta(t) = \hat\theta(t-1) + \gamma\,\varphi(t)\left(y(t) - \varphi^T(t)\hat\theta(t-1)\right)$$
where $\gamma$ is a constant.
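A minimal LMS sketch; the step size $\gamma$ and the data-generating model are assumed for illustration:

```python
import numpy as np

# LMS update: theta(t) = theta(t-1) + gamma * phi(t) * prediction error.
# gamma and the synthetic data are illustrative values.
rng = np.random.default_rng(5)
gamma = 0.05
theta_true = np.array([0.9, -0.2])
theta_hat = np.zeros(2)

for _ in range(2000):
    phi = rng.normal(size=2)
    y = phi @ theta_true + 0.01 * rng.normal()
    theta_hat = theta_hat + gamma * phi * (y - phi @ theta_hat)
```

Unlike RLS, the gain does not shrink over time, so LMS keeps adapting but never converges exactly; the constant $\gamma$ trades tracking speed against steady-state noise.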
Continuous-Time Models
Model: $y(t) = \varphi^T(t)\theta$
Criterion:
$$V(\theta) = \int_0^t e^{-\alpha(t-\tau)}\left(y(\tau) - \varphi^T(\tau)\theta\right)^2 d\tau$$
Minimization gives
$$\left(\int_0^t e^{-\alpha(t-\tau)}\varphi(\tau)\varphi^T(\tau)\,d\tau\right)\hat\theta(t) = \int_0^t e^{-\alpha(t-\tau)}\varphi(\tau)y(\tau)\,d\tau$$
where $P(t) = R(t)^{-1}$, with $R(t)$ denoting the weighted matrix on the left-hand side.
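The weighted normal equations above can be approximated on a time grid; the signals and constants below are illustrative:

```python
import numpy as np

# Discretize the exponentially weighted continuous-time normal equations.
# alpha, the horizon T, and the regressor signals are assumed values.
alpha, T, dt = 1.0, 10.0, 0.001
tau = np.arange(0.0, T, dt)
theta_true = np.array([2.0, -1.0])
Phi = np.column_stack([np.sin(tau), np.cos(3 * tau)])   # rows phi^T(tau)
y = Phi @ theta_true                                    # noise-free output

w = np.exp(-alpha * (T - tau)) * dt                     # weight times d(tau)
R = (Phi * w[:, None]).T @ Phi    # approximates int e^{-a(T-tau)} phi phi^T dtau
f = (Phi * w[:, None]).T @ y      # approximates int e^{-a(T-tau)} phi y dtau
theta_hat = np.linalg.solve(R, f)
print(theta_hat)  # recovers theta_true
```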