
Dr. Bob John
Adaptive Network Based Fuzzy Inference Systems (ANFIS)
As we have already seen, fuzzy systems present particular problems to a developer:
Rules. The if-then rules have to be determined somehow. This is usually done by
'knowledge acquisition' from an expert. It is a time consuming process that is fraught
with problems.
Membership functions. A fuzzy set is fully determined by its membership function. This
has to be determined. If it's Gaussian, then what are the parameters?
The ANFIS approach learns the rules and membership functions from data.
ANFIS is an adaptive network. An adaptive network is a network of nodes and directional links.
Associated with the network is a learning rule, for example back propagation. It's called adaptive
because some, or all, of the nodes have parameters which affect the output of the node. These
networks learn a relationship between inputs and outputs.
Adaptive networks cover a number of different approaches, but for our purposes we will investigate
in some detail the method proposed by Jang known as ANFIS.
The ANFIS architecture is shown below. The circular nodes represent nodes that are fixed whereas
the square nodes are nodes that have parameters to be learnt.
[Figure: the five-layer ANFIS network, Layer 1 through Layer 5. Inputs x and y feed the membership nodes A1, A2 and B1, B2; product nodes output the firing strengths w1, w2; normalisation nodes output the normalised strengths w̄1, w̄2; consequent nodes output w̄1f1, w̄2f2; a final summation node gives the overall output f.]
An ANFIS architecture for a two rule Sugeno system
A Two Rule Sugeno ANFIS has rules of the form:

If x is A1 and y is B1 THEN f1 = p1x + q1y + r1
If x is A2 and y is B2 THEN f2 = p2x + q2y + r2
For the training of the network, there is a forward pass and a backward pass. We now look at each
layer in turn for the forward pass. The forward pass propagates the input vector through the
network layer by layer. In the backward pass, the error is sent back through the network in a similar
manner to backpropagation.
Layer 1
The output of each node is:

O1,i = μAi(x)    for i = 1, 2
O1,i = μBi−2(y)  for i = 3, 4

So, O1,i is essentially the membership grade for x and y.
The membership functions could be anything, but for illustration purposes we will use
the bell shaped function given by:

μAi(x) = 1 / (1 + |(x − ci)/ai|^(2bi))

where ai, bi, ci are parameters to be learnt. These are the premise parameters.
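As a rough illustration (not part of the original notes), the bell shaped function above can be coded directly; the parameter names a, b, c follow the definition, with a controlling the width, b the slope of the sides and c the centre:

```python
import numpy as np

def bell_mf(x, a, b, c):
    """Generalised bell membership function: 1 / (1 + |(x - c)/a|^(2b))."""
    return 1.0 / (1.0 + np.abs((x - c) / a) ** (2 * b))

# At the centre x = c the membership grade is exactly 1;
# at x = c + a it falls to 0.5, whatever the value of b.
print(bell_mf(0.0, a=2.0, b=2.0, c=0.0))
print(bell_mf(2.0, a=2.0, b=2.0, c=0.0))
```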
Layer 2
Every node in this layer is fixed. This is where the t-norm is used to 'AND' the
membership grades, for example the product:

O2,i = wi = μAi(x) μBi(y),   i = 1, 2

Layer 3
Layer 3 contains fixed nodes which calculate the ratio of the firing strengths of the
rules:

O3,i = w̄i = wi / (w1 + w2)
Layer 4
The nodes in this layer are adaptive and perform the consequent of the rules:

O4,i = w̄i fi = w̄i (pi x + qi y + ri)

The parameters in this layer (pi, qi, ri) are to be determined and are referred to as the
consequent parameters.
Layer 5
There is a single node here that computes the overall output:

O5,1 = Σi w̄i fi = (Σi wi fi) / (Σi wi)
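Putting the five layers together, the forward pass for the two rule system can be sketched as follows (a minimal illustration; the parameter values and the bell membership function are assumptions for the example, not values from the notes):

```python
import numpy as np

def bell_mf(x, a, b, c):
    # Generalised bell membership function (Layer 1 nodes).
    return 1.0 / (1.0 + np.abs((x - c) / a) ** (2 * b))

def anfis_forward(x, y, premise, consequent):
    """Forward pass of the two rule Sugeno ANFIS.

    premise:    dict of (a, b, c) triples for A1, A2, B1, B2
    consequent: list of (p_i, q_i, r_i) for each rule i
    """
    # Layer 1: membership grades.
    muA = [bell_mf(x, *premise['A1']), bell_mf(x, *premise['A2'])]
    muB = [bell_mf(y, *premise['B1']), bell_mf(y, *premise['B2'])]
    # Layer 2: firing strengths via the product t-norm.
    w = [muA[0] * muB[0], muA[1] * muB[1]]
    # Layer 3: normalised firing strengths.
    wbar = [wi / (w[0] + w[1]) for wi in w]
    # Layer 4: weighted rule consequents f_i = p_i*x + q_i*y + r_i.
    f = [p * x + q * y + r for (p, q, r) in consequent]
    # Layer 5: overall output.
    return wbar[0] * f[0] + wbar[1] * f[1]

# Illustrative premise and consequent parameters.
premise = {'A1': (2, 2, 0), 'A2': (2, 2, 4),
           'B1': (2, 2, 0), 'B2': (2, 2, 4)}
consequent = [(1, 1, 0), (2, 0, 1)]
print(anfis_forward(1.0, 1.0, premise, consequent))
```

Because the output is a convex combination of the rule consequents, it always lies between the smallest and largest f_i for the given input.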
This then is how, typically, the input vector is fed through the network layer by layer.
We now consider how the ANFIS learns the premise and consequent parameters for
the membership functions and the rules.
There are a number of possible approaches, but we will discuss the hybrid learning
algorithm proposed by Jang, Sun and Mizutani (Neuro-Fuzzy and Soft Computing,
Prentice Hall, 1997) which uses a combination of Steepest Descent and Least Squares
Estimation (LSE). This can get very complicated (!) so here I will provide a very
high level description of how the algorithm operates.
It can be shown that for the network described, if the premise parameters are fixed, the
output is linear in the consequent parameters.
We split the total parameter set into three:

S  = set of total parameters
S1 = set of premise (nonlinear) parameters
S2 = set of consequent (linear) parameters

So, ANFIS uses a two pass learning algorithm:

Forward Pass. Here S1 is unmodified and S2 is computed using a LSE
algorithm.
Backward Pass. Here S2 is unmodified and S1 is computed using a gradient
descent algorithm such as back propagation.

So, the hybrid learning algorithm uses a combination of steepest descent and least
squares to adapt the parameters in the adaptive network.
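The least squares step of the forward pass can be sketched as follows (an illustration of the idea, not Jang's exact formulation): with the premise parameters fixed, each training pair contributes one row of a linear system A·θ = y in the consequent parameters θ = (p1, q1, r1, p2, q2, r2), which is then solved by least squares. The firing strengths and targets below are made-up illustrative numbers:

```python
import numpy as np

def consequent_row(x, y, wbar):
    # One row of the design matrix A: the network output is linear in
    # (p1, q1, r1, p2, q2, r2) once the normalised firing strengths
    # wbar are fixed by the premise parameters.
    return [wbar[0] * x, wbar[0] * y, wbar[0],
            wbar[1] * x, wbar[1] * y, wbar[1]]

# Illustrative training data: (inputs, normalised firing strengths, target).
data = [((1.0, 2.0), (0.7, 0.3), 3.0),
        ((2.0, 1.0), (0.4, 0.6), 5.0),
        ((0.5, 0.5), (0.9, 0.1), 1.0),
        ((3.0, 2.0), (0.2, 0.8), 7.0)]

A = np.array([consequent_row(x, y, wb) for (x, y), wb, _ in data])
t = np.array([target for _, _, target in data])

# Least squares estimate of the consequent parameters.
theta, *_ = np.linalg.lstsq(A, t, rcond=None)
print(theta)  # (p1, q1, r1, p2, q2, r2)
```

In the full algorithm this solve happens every forward pass, after the node outputs up to Layer 3 have been computed for the whole training set.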
The summary of the process is given below:
The Forward Pass
Present the input vector
Calculate the node outputs layer by layer
Repeat for all data
A and y formed
Identify parameters in S2 using Least Squares
Compute the error measure for each training pair
Backward Pass
Use steepest descent algorithm to update parameters in S1 (backpropagation)

For given fixed values of S1, the parameters in S2 found by this approach are
guaranteed to be the global optimum point.
