
Recursive Least Squares Parameter Estimation for Linear Steady State and Dynamic Models

Thomas F. Edgar
Department of Chemical Engineering
University of Texas
Austin, TX 78712

Thomas F. Edgar (UT-Austin) RLS Linear Models Virtual Control Book 12/06 1
Outline
- Static model, sequential estimation
- Multivariate sequential estimation
- Example
- Dynamic discrete-time model
- Closed-loop estimation

Least Squares Parameter Estimation

Linear Time Series Models

ref: P. C. Young, Control Engr., p. 119, Oct. 1969

scalar example (no dynamics)

model:  y = a x
data:   y^* = a x + \varepsilon   (\varepsilon: error)

least squares estimate of a (denoted a_k):

\min_a \sum_{i=1}^{k} \left( y_i^* - a x_i \right)^2    (1)

Simple Example

The analytical solution for the minimum (least squares) estimate is

a_k = \underbrace{\left( \sum_{i=1}^{k} x_i^2 \right)^{-1}}_{p_k} \underbrace{\left( \sum_{i=1}^{k} x_i y_i^* \right)}_{b_k}    (2)

p_k and b_k are functions of the number of samples k.

This is the non-sequential (non-recursive) form.

Sequential or Recursive Form

To update a_k based on a new data point (y_k^*, x_k), in Eq. (2) let

p_k^{-1} = \sum_{i=1}^{k} x_i^2 = p_{k-1}^{-1} + x_k^2    (3)

and

b_k = \sum_{i=1}^{k} x_i y_i^* = b_{k-1} + x_k y_k^*    (4)

Recursive Form for Parameter Estimation

a_k = a_{k-1} - K_k \underbrace{\left( x_k a_{k-1} - y_k^* \right)}_{\text{estimation error}}    (5)

where

K_k = p_{k-1} x_k \left( 1 + p_{k-1} x_k^2 \right)^{-1}    (6)

To start the algorithm, we need initial estimates a_0 and p_0. To update p,

p_k = p_{k-1} - \frac{p_{k-1}^2 x_k^2}{1 + p_{k-1} x_k^2}    (7)

(Set p_0 to a large positive number.)

Eq. (7) shows that p_k decreases with k.
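The scalar recursion of Eqs. (5)-(7) is only a few lines of code. A minimal Python sketch (the data below are synthetic, chosen for illustration, not from the slides):

```python
def rls_scalar(xs, ys, a0=0.0, p0=1e6):
    """Recursive least squares for y = a*x, following Eqs. (5)-(7)."""
    a, p = a0, p0                   # initial estimates a_0, p_0 (p_0 large)
    for x, y in zip(xs, ys):
        K = p * x / (1.0 + p * x ** 2)                  # gain K_k, Eq. (6)
        a = a - K * (x * a - y)                         # parameter update, Eq. (5)
        p = p - (p ** 2 * x ** 2) / (1.0 + p * x ** 2)  # Eq. (7): p shrinks
    return a

# Noise-free data generated with a = 2; the estimate converges to 2.
print(rls_scalar([1.0, 2.0, 3.0], [2.0, 4.0, 6.0]))
```

Because p_0 is large, the very first data point already moves a close to the least-squares value; later points only refine it.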

Estimating Multiple Parameters
(Steady State Model)

y = \mathbf{x}^T \mathbf{a} = a_1 x_1 + a_2 x_2 + \cdots + a_n x_n    (8)

\min_{\mathbf{a}} \sum_{i=1}^{k} \left( \mathbf{x}_i^T \mathbf{a} - y_i^* \right)^2

(the non-sequential solution requires an n \times n matrix inverse)

To obtain a recursive form for a,

\mathbf{P}_k^{-1} = \mathbf{P}_{k-1}^{-1} + \mathbf{x}_k \mathbf{x}_k^T    (9)

\mathbf{B}_k = \mathbf{B}_{k-1} + \mathbf{x}_k y_k^*    (10)

Recursive Solution

\mathbf{a}_k = \mathbf{a}_{k-1} - \underbrace{\mathbf{P}_{k-1} \mathbf{x}_k \left( 1 + \mathbf{x}_k^T \mathbf{P}_{k-1} \mathbf{x}_k \right)^{-1}}_{\mathbf{K}_k} \underbrace{\left( \mathbf{x}_k^T \mathbf{a}_{k-1} - y_k^* \right)}_{\text{estimation error}}    (11)

\mathbf{P}_k = \mathbf{P}_{k-1} - \mathbf{P}_{k-1} \mathbf{x}_k \left( 1 + \mathbf{x}_k^T \mathbf{P}_{k-1} \mathbf{x}_k \right)^{-1} \mathbf{x}_k^T \mathbf{P}_{k-1}    (12)

Need to assume an initial \mathbf{a}_0 (vector) and

\mathbf{P}_0 = \mathrm{diag}(P_{11}, P_{22}, \ldots, P_{nn})   (diagonal matrix, each P_{ii} large)
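The recursion of Eqs. (11)-(12) maps directly to code. A minimal NumPy sketch (function and variable names are my own; the test data are synthetic):

```python
import numpy as np

def rls_update(a, P, x, y):
    """One step of Eqs. (11)-(12) for the model y = x^T a."""
    Px = P @ x
    denom = 1.0 + x @ Px                 # scalar 1 + x^T P x
    a = a - (Px / denom) * (x @ a - y)   # Eq. (11): gain K_k times the error
    P = P - np.outer(Px, Px) / denom     # Eq. (12)
    return a, P

# Recover a = [1, 2, 3] from noise-free data; P0 diagonal and large.
rng = np.random.default_rng(0)
a_true = np.array([1.0, 2.0, 3.0])
a, P = np.zeros(3), 1e6 * np.eye(3)
for _ in range(20):
    x = rng.standard_normal(3)
    a, P = rls_update(a, P, x, x @ a_true)
print(np.round(a, 3))
```

Note that only matrix-vector products appear: the rank-one form of Eq. (12) avoids the n × n inverse that the non-sequential solution requires.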

Simple Example (Estimate Slope & Intercept)
Linear parametric model:  y = a_1 + a_2 u

input,  u:  0     1   2   3   4   10  12  18
output, y:  5.71  9   15  19  20  45  55  78
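As a check, running the recursion of Eqs. (11)-(12) on these eight points reproduces the batch least-squares fit (a sketch; the slides show the results graphically):

```python
import numpy as np

# Data from the example: y = a1 + a2*u
u = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 10.0, 12.0, 18.0])
y = np.array([5.71, 9.0, 15.0, 19.0, 20.0, 45.0, 55.0, 78.0])

a = np.zeros(2)              # a0 = [a1, a2] = [0, 0]
P = 1e6 * np.eye(2)          # P0: diagonal with large entries
for uk, yk in zip(u, y):
    x = np.array([1.0, uk])                  # regressor: intercept + slope
    Px = P @ x
    denom = 1.0 + x @ Px
    a = a - (Px / denom) * (x @ a - yk)      # Eq. (11)
    P = P - np.outer(Px, Px) / denom         # Eq. (12)

print(np.round(a, 3))        # intercept a1 and slope a2
```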

[Figure: Sequential vs. Non-sequential Estimation of a2 Only (a1 = 0)]

[Figure: Covariance Matrix vs. Number of Samples Using Eq. (12)]
[Figure: Sequential Estimation of a1 and a2 Using Eq. (11)]

Application to Digital Model and Feedback Control

Linear Discrete Model with Time Delay:

y(t) = a_1 y(t-1) + a_2 y(t-2) + \cdots + a_n y(t-n)
     + b_1 u(t-1-N) + b_2 u(t-2-N) + \cdots + b_r u(t-r-N) + d    (13)

y: output,  u: input,  d: disturbance,  N: time delay

Recursive Least Squares Solution

y(t) = \varphi^T(t-1)\,\theta(t-1) + \varepsilon(t)    (14)

where

\varphi^T(t-1) = [\,y(t-1),\ y(t-2),\ \ldots,\ y(t-n),\ u(t-1-N),\ \ldots,\ u(t-r-N),\ 1\,]

\theta^T(t-1) = [\,a_1,\ a_2,\ \ldots,\ a_n,\ b_1,\ b_2,\ \ldots,\ b_r,\ d\,]

\min_{\theta} \sum_{i=1}^{t} \underbrace{\left[ \varphi^T(i-1)\,\theta(i) - y(i) \right]^2}_{\text{"least squares"}}    (15)

(\varphi^T \theta is the predicted value of y)

\theta(t) = \theta(t-1) + P(t)\,\varphi(t-1)\left[ y(t) - \varphi^T(t-1)\,\theta(t-1) \right]    (16)

Recursive Least Squares Solution

P(t) = P(t-1) - P(t-1)\,\varphi(t-1)\left[ \varphi^T(t-1)\,P(t-1)\,\varphi(t-1) + 1 \right]^{-1} \varphi^T(t-1)\,P(t-1)    (17)

K(t) = \frac{P(t-1)\,\varphi(t-1)}{1 + \varphi^T(t-1)\,P(t-1)\,\varphi(t-1)}    (18)

P(t) = \left[ I - K(t)\,\varphi^T(t-1) \right] P(t-1)    (19)

\theta(t) = \theta(t-1) + K(t)\left[ y(t) - \hat{y}(t) \right]    (20)

K: Kalman filter gain
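Applying Eqs. (18)-(20) to the dynamic model only requires assembling the regressor φ(t−1) of Eq. (14) from past outputs and delayed inputs. A minimal sketch for a first-order case (n = r = 1, N = 0; the model coefficients here are hypothetical):

```python
import numpy as np

# Simulate y(t) = a1*y(t-1) + b1*u(t-1) + d, then identify [a1, b1, d].
a1, b1, d = 0.8, 0.5, 1.0
rng = np.random.default_rng(1)
u = rng.standard_normal(200)          # persistently exciting input
y = np.zeros(201)
for t in range(1, 201):
    y[t] = a1 * y[t - 1] + b1 * u[t - 1] + d

theta = np.zeros(3)                   # estimates of [a1, b1, d]
P = 1e4 * np.eye(3)
for t in range(1, 201):
    phi = np.array([y[t - 1], u[t - 1], 1.0])   # phi(t-1), Eq. (14)
    K = P @ phi / (1.0 + phi @ P @ phi)         # Eq. (18)
    theta = theta + K * (y[t] - phi @ theta)    # Eq. (20), yhat = phi^T theta
    P = (np.eye(3) - np.outer(K, phi)) @ P      # Eq. (19)
print(np.round(theta, 3))
```

With a rich input and no noise, the estimates converge to the true coefficients; higher-order models and time delays only change how phi is filled in.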

Closed-Loop RLS Estimation

There are three practical considerations in the implementation of parameter estimation algorithms:

- covariance resetting

- variable forgetting factor

- use of perturbation signal

Enhance sensitivity of least squares estimation algorithms with a forgetting factor \lambda:

J(\theta(t)) = \sum_{i=1}^{t} \lambda^{t-i} \left[ \varphi^T(i-1)\,\theta(i) - y(i) \right]^2    (21)

P(t) = \frac{1}{\lambda} \left[ P(t-1) - P(t-1)\,\varphi(t-1)\left[ \varphi^T(t-1)\,P(t-1)\,\varphi(t-1) + \lambda \right]^{-1} \varphi^T(t-1)\,P(t-1) \right]    (22)

\theta(t) = \theta(t-1) + P(t)\,\varphi(t-1)\left[ y(t) - \varphi^T(t-1)\,\theta(t-1) \right]    (23)

The forgetting factor prevents elements of P from becoming too small (improves sensitivity), but noise may lead to incorrect parameter estimates.

0 < \lambda \le 1.0;   \lambda = 1.0: all data weighted equally;   \lambda \approx 0.98 is typical
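In code, the forgetting factor changes only where λ enters the covariance update. A sketch of Eqs. (22)-(23) (function and variable names are my own):

```python
import numpy as np

def rls_forget(theta, P, phi, y, lam=0.98):
    """One RLS update with forgetting factor lam, per Eqs. (22)-(23)."""
    Pphi = P @ phi
    denom = phi @ Pphi + lam                   # phi^T P phi + lambda
    # Eq. (22): dividing by lam keeps P (and hence the gain) from
    # shrinking to zero, so old data are gradually discounted.
    P = (P - np.outer(Pphi, Pphi) / denom) / lam
    # Eq. (23): correct theta using the prediction error and the new P
    theta = theta + P @ phi * (y - phi @ theta)
    return theta, P
```

With lam = 1.0 this reduces exactly to the standard update of Eqs. (16)-(17).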

Closed-Loop Estimation (RLS)
A perturbation signal is added to the process input (via set point) to excite the process dynamics.

large signal: good parameter estimates, but large errors in the process output
small signal: better control, but more sensitivity to noise

Guidelines: Vogel and Edgar, Comp. Chem. Engr., Vol. 12, pp. 15-26 (1988)

1. Set the forgetting factor \lambda = 1.0.

2. Use covariance resetting (add a diagonal matrix D to P when tr(P) becomes small).

3. Use a PRBS perturbation signal only when the estimation error is large and P is not small. Vary the PRBS amplitude in proportion to tr(P).

4. Set P(0) = 10^4 I.

5. Filter new parameter estimates:

\theta_c(k) = \alpha\,\theta_c(k-1) + (1-\alpha)\,\theta(k)

\alpha: tuning parameter   (\theta_c is the estimate used by the controller)

6. Use other diagnostic checks, such as the sign of the computed process gain.
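Guidelines 2 and 5 can be sketched as small helpers (a hypothetical illustration; the threshold and reset magnitude are my own choices, not values from the reference):

```python
import numpy as np

def reset_covariance(P, threshold=0.1, reset=100.0):
    """Guideline 2: when tr(P) gets small, add a diagonal matrix D so the
    estimator stays responsive to future parameter changes."""
    if np.trace(P) < threshold:
        P = P + reset * np.eye(P.shape[0])   # D = reset * I
    return P

def filter_estimates(theta_c, theta, alpha=0.9):
    """Guideline 5: first-order filter on the estimates sent to the
    controller (alpha is a tuning parameter)."""
    return alpha * theta_c + (1.0 - alpha) * theta
```

The filter keeps the controller from reacting to every jump in the raw estimates, while covariance resetting keeps the estimator itself alert.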

