
Analysis and Applications

Piero P. Bonissone
GE CRD, Schenectady, NY USA
Email: Bonissone@crd.ge.com

Outline

• Objective
• Fuzzy Control
  – Background, Technology & Typology
• ANFIS:
  – as a Type III Fuzzy Control
  – as a fuzzification of CART
  – Characteristics
  – Pros and Cons
  – Opportunities
  – Applications
  – References

©Copyright 1997 by Piero P. Bonissone

ANFIS Objective

• To integrate the best features of Fuzzy Systems and Neural Networks:
  – From FS: representation of prior knowledge as a set of constraints (network topology) to reduce the optimization search space
  – From NN: adaptation of backpropagation to the structured network to automate FC parametric tuning
• ANFIS is applied to synthesize:
  – controllers (automated FC tuning)
  – models (to explain past data and predict future behavior)

FC Technology & Typology

• Fuzzy Control
  – A high-level representation language with local semantics and an interpreter/compiler to synthesize non-linear (control) surfaces
  – A universal functional approximator
• FC Types
  – Type I: RHS is a monotonic function
  – Type II: RHS is a fuzzy set
  – Type III: RHS is a (linear) function of the state

FC Technology (Background)

• Fuzzy KB representation
  – Scaling factors, termsets, rules
• Rule inference (generalized modus ponens)
• Development & deployment
  – Interpreters, tuners, compilers, run-time
  – Synthesis of the control surface
• FC Types I, II, III

FC of Type II, III, and ANFIS

• Type III Fuzzy Controllers (TSK variety) have automatic RHS tuning
• ANFIS provides both RHS and LHS tuning

ANFIS Neurons: Clarification note

• Note that the neurons in ANFIS have different structures:
  – Values [membership function defined by parameterized soft trapezoids]
  – Rules [differentiable T-norm, usually product]
  – Normalization [arithmetic division]
  – Functions [linear regressions and multiplication with the normalized λ]
  – Output [algebraic sum]
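The value-layer neurons above can be sketched concretely. Below is a minimal, hypothetical parameterization of a differentiable "soft" trapezoid as the product of two sigmoids; the slides do not fix an exact form, so the steepness constant is an assumption:

```python
import math

def soft_trapezoid(x, a, b, c, d):
    """Differentiable 'soft' trapezoid membership function.

    Rises over [a, b], stays near 1 on [b, c], falls over [c, d].
    Built as a product of two sigmoids so that Backprop can tune the
    parameters a..d; the steepness factor 8 is an assumption.
    """
    rise = 1.0 / (1.0 + math.exp(-8.0 * (x - (a + b) / 2.0) / max(b - a, 1e-9)))
    fall = 1.0 / (1.0 + math.exp(8.0 * (x - (c + d) / 2.0) / max(d - c, 1e-9)))
    return rise * fall
```

On the plateau the membership stays close to 1, and it decays smoothly to 0 outside [a, d]; unlike a crisp trapezoid, there are no non-differentiable corners to break gradient descent.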

ANFIS as a Type III FC

• (L0): FC state variables are nodes in the ANFIS inputs layer
• (L1): termsets of each state variable are nodes in the ANFIS values layer, computing the membership value
• (L2): each rule in the FC is a node in the ANFIS rules layer, using soft-min or product to compute the rule matching factor λi
• (L3): each λi is scaled into λ′i in the normalization layer
• (L4): each λ′i weighs the result of its linear regression fi in the function layer, generating the rule output
• (L5): each rule output is added in the output layer

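The layer-by-layer walk above can be condensed into one forward-pass sketch. This is an illustrative reconstruction, not Jang's implementation: it assumes Gaussian membership values (rather than soft trapezoids), a full grid of p^n rules, and the product T-norm:

```python
from itertools import product

import numpy as np

def anfis_forward(x, mf_params, coeffs):
    """One forward pass through the five ANFIS layers.

    x: state vector of length n; mf_params: per variable, a list of
    (center, sigma) pairs for its Gaussian termsets; coeffs: array of
    shape (p**n, n+1) with the TSK coefficients [c0, c1, ..., cn] per rule.
    """
    x = np.asarray(x, dtype=float)
    # L1 (values): membership degree of each input in each of its fuzzy sets
    mu = [[np.exp(-((xi - c) / s) ** 2) for (c, s) in sets]
          for xi, sets in zip(x, mf_params)]
    # L2 (rules): one rule per combination of labels; product T-norm
    lam = np.array([np.prod(combo) for combo in product(*mu)])
    # L3 (normalization): scale each matching factor into lambda'
    lam_n = lam / lam.sum()
    # L4 (functions): first-order TSK linear regression per rule
    fi = coeffs[:, 0] + coeffs[:, 1:] @ x
    # L5 (output): algebraic sum of the weighted rule outputs
    return float(lam_n @ fi)
```

With two inputs and two labels each, this yields the 4-rule Type III structure described above.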

ANFIS as a generalization of CART

• Classification and Regression Tree (CART)
  – Algorithm defined by Breiman et al. in 1984
  – Creates a binary decision tree to classify the data into one of 2^n linear regression models so as to minimize the Gini index for the current node c:

    Gini(c) = 1 − Σj pj²

    where:
    • pj is the probability of class j in node c
    • Gini(c) measures the amount of “impurity” (incorrect classification) in node c

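The Gini index above is straightforward to compute; a small sketch:

```python
from collections import Counter

def gini(labels):
    """Gini impurity Gini(c) = 1 - sum_j p_j**2 for the labels in node c."""
    n = len(labels)
    return 1.0 - sum((count / n) ** 2 for count in Counter(labels).values())
```

A pure node scores 0; a node split evenly between two classes scores 0.5, the worst case for two classes.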

CART Problems

• Discontinuity

• Lack of locality (sign of coefficients)

CART Binary Partition Tree and Rule Table Representation

[Figure: binary partition tree splitting first on x1 at threshold a1, then on x2 at threshold a2]

x1        x2        y
x1 ≤ a1   x2 ≤ a2   f1(x1, x2)
x1 > a1   x2 ≤ a2   f2(x1, x2)
x1 ≤ a1   x2 > a2   f3(x1, x2)
x1 > a1   x2 > a2   f4(x1, x2)

Discontinuities due to small input perturbations

Let us assume two inputs I1 = (x11, x12) and I2 = (x21, x22) such that:
  x11 = a1 − ε
  x21 = a1 + ε
  x12 = x22 < a2

Thus I1 is assigned f1(x1, x2) while I2 is assigned f2(x1, x2): an arbitrarily small perturbation of x1 around the threshold a1 switches the regression model, so y jumps between f1(x11, ·) and f2(x21, ·).

[Figure: crisp membership function μ(x1) of the predicate (a1 ≤ x1), with x11, a1, and x21 marked on the X1 axis]
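The discontinuity can be demonstrated directly. A minimal sketch of the crisp CART routing (the thresholds a1, a2 and the leaf names f1..f4 are placeholders matching the rule table above):

```python
def cart_leaf(x1, x2, a1=0.5, a2=0.5):
    """Crisp CART routing: returns which regression model handles (x1, x2)."""
    if x2 <= a2:
        return "f1" if x1 <= a1 else "f2"
    return "f3" if x1 <= a1 else "f4"

# I1 = (a1 - eps, x2) and I2 = (a1 + eps, x2), with x2 < a2:
# an arbitrarily small perturbation of x1 switches the regression model.
eps = 1e-6
```

Here `cart_leaf(0.5 - eps, 0.2)` routes to f1 while `cart_leaf(0.5 + eps, 0.2)` routes to f2, so the output y can jump even though the input barely moved; ANFIS removes this jump by blending the fi with fuzzy membership weights.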

ANFIS Characteristics

• Adaptive Neural Fuzzy Inference System (ANFIS)
  – Algorithm defined by J.-S. Roger Jang in 1992
  – Creates a fuzzy decision tree to classify the data into one of 2^n (or p^n) linear regression models so as to minimize the sum of squared errors (SSE):

    SSE = Σj ej²

    where:
    • ej is the error between the desired and the actual output
    • p is the number of fuzzy partitions of each variable
    • n is the number of input variables


ANFIS Computational Complexity

Layer            # Nodes   # Parameters
L0  Inputs       n         0
L1  Values       p·n       3·(p·n) = |S1|
L2  Rules        p^n       0
L3  Normalize    p^n       0
L4  Lin. Funct.  p^n       (n+1)·p^n = |S2|
L5  Sum          1         0
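The table above translates directly into code; a small helper to reproduce the counts:

```python
def anfis_sizes(n, p):
    """Node and parameter counts per ANFIS layer, for n inputs with
    p fuzzy partitions each. Returns {layer: (num_nodes, num_params)}."""
    return {
        "L0 inputs":    (n,      0),
        "L1 values":    (p * n,  3 * p * n),         # |S1|: 3 params per membership fn
        "L2 rules":     (p ** n, 0),
        "L3 normalize": (p ** n, 0),
        "L4 lin funcs": (p ** n, (n + 1) * p ** n),  # |S2|: n+1 coefficients per rule
        "L5 sum":       (1,      0),
    }
```

Note how |S1| grows linearly in n while the rule layers and |S2| grow exponentially, which is the complexity restriction discussed later.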

ANFIS Parametric Representation

• ANFIS uses two sets of parameters: S1 and S2
  – S1 represents the fuzzy partitions used in the rules LHS:

    S1 = {{a11, b11, c11}, {a12, b12, c12}, ..., {a1p, b1p, c1p}, ..., {anp, bnp, cnp}}

  – S2 represents the coefficients of the linear functions in the rules RHS:

    S2 = {{c10, c11, ..., c1n}, ..., {c(p^n)0, c(p^n)1, ..., c(p^n)n}}

ANFIS Learning Algorithms

• ANFIS uses a two-pass learning cycle
  – Forward pass: S1 is unmodified and S2 is computed using a Least Squared Error (LSE) algorithm (off-line learning)
  – Backward pass: S2 is unmodified and S1 is computed using a gradient descent algorithm (usually Backprop)

ANFIS LSE Batch Algorithm

• LSE used in the forward pass:
  – S = S1 ∪ S2, with S1 ∩ S2 = ∅
  – H(Output) = H ∘ F(I, S), where H ∘ F is linear in S2
  – For given values of S1, using K training data points, we can transform the above equation into B = AX, where X contains the elements of S2
  – This is solved by X* = (A^T A)^-1 A^T B, where (A^T A)^-1 A^T is the pseudo-inverse of A (if A^T A is nonsingular)
  – The LSE minimizes the error ||AX − B||² by approximating X with X*

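A sketch of the batch LSE step with NumPy; `np.linalg.lstsq` computes the same minimizer of ||AX − B||² while also coping with a singular A^T A:

```python
import numpy as np

def lse_batch(A, B):
    """Least-squares estimate X* of the S2 coefficients: minimizes ||AX - B||^2.

    Equivalent to the pseudo-inverse solution (A^T A)^-1 A^T B when
    A^T A is nonsingular, but numerically safer.
    """
    X, *_ = np.linalg.lstsq(A, B, rcond=None)
    return X
```

Each row of A collects the normalized rule weights multiplied by [1, x1, ..., xn] for one training point, and B holds the desired outputs.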

ANFIS LSE Batch Algorithm (cont.)

• Rather than solving (A^T A)^-1 A^T B = X* directly, we compute it iteratively:

  S(i+1) = S(i) − [S(i) a(i+1) a(i+1)^T S(i)] / [1 + a(i+1)^T S(i) a(i+1)],  for i = 0, 1, ..., K−1
  X(i+1) = X(i) + S(i+1) a(i+1) (b(i+1) − a(i+1)^T X(i))

  where:
  X(0) = 0
  S(0) = γI (where γ is a large number)
  a_i^T = i-th row of matrix A
  b_i = i-th element of vector B
  X* = X(K)
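The same recursion in NumPy: a standard recursive least-squares sketch matching the formulas above, processing one training row at a time (γ = 1e6 is an arbitrary large number, as the slide allows):

```python
import numpy as np

def lse_recursive(A, B, gamma=1e6):
    """Iterative LSE: one rank-1 update of the covariance-like matrix S
    per training row, avoiding the explicit inverse (A^T A)^-1."""
    K, m = A.shape
    X = np.zeros(m)          # X(0) = 0
    S = gamma * np.eye(m)    # S(0) = gamma * I
    for i in range(K):
        a, b = A[i], B[i]
        # S(i+1) = S(i) - S(i) a a^T S(i) / (1 + a^T S(i) a)
        S = S - np.outer(S @ a, a @ S) / (1.0 + a @ S @ a)
        # X(i+1) = X(i) + S(i+1) a (b - a^T X(i))
        X = X + S @ a * (b - a @ X)
    return X
```

After all K rows the result approximates the batch solution X*, with the approximation improving as γ grows.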

ANFIS Backprop

• Error measure Ek (for the k-th (1 ≤ k ≤ K) entry of the training data):

  Ek = Σ(i=1..N(L)) (di − xL,i)²

  where:
  N(L) = number of nodes in layer L
  di = i-th component of the desired output vector
  xL,i = i-th component of the actual output vector

• Total error: E = Σ(k=1..K) Ek

ANFIS Backprop (cont.)

• For each parameter αi the update formula is:

  Δαi = −η (∂⁺E / ∂αi)

  where:
  η = κ / sqrt(Σi (∂E/∂αi)²) is the learning rate
  κ is the step size
  ∂⁺E / ∂αi is the ordered derivative
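A sketch of one update step with this normalized learning rate; by construction, the Euclidean length of the resulting parameter step is exactly κ:

```python
import math

def update_params(params, grads, kappa=0.1):
    """One gradient-descent step with the normalized learning rate
    eta = kappa / sqrt(sum_i (dE/dalpha_i)^2), so the step taken in
    parameter space always has Euclidean length kappa."""
    norm = math.sqrt(sum(g * g for g in grads))
    if norm == 0.0:
        return list(params)  # at a stationary point: no update
    eta = kappa / norm
    return [p - eta * g for p, g in zip(params, grads)]
```

This makes the step size independent of the gradient's magnitude, which stabilizes training when gradients through the normalization layer vary widely.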

ANFIS Pros and Cons

• ANFIS offers one of the best tradeoffs between neural and fuzzy systems, providing:
  – smoothness, due to the FC interpolation
  – adaptability, due to the NN backpropagation
• ANFIS, however, has strong computational complexity restrictions

ANFIS Pros

Characteristic: Translates prior knowledge into network topology & initial fuzzy partitions (three layers, not fully connected: inputs-values-rules)
Advantages:
  – smaller fan-out for Backprop and faster convergence than a typical feedforward NN (++)
  – smaller training set (+++)
  – the induced partial order is usually preserved

Characteristic: Rules RHS implement a Takagi-Sugeno-Kang (TSK) model FLC
Advantage:
  – compactness (smaller number of rules than using labels) (+)

ANFIS Cons

Characteristic: Translates prior knowledge into network topology & initial fuzzy partitions
Disadvantages:
  – sensitivity to the initial number of partitions "P"
  – surface oscillations around points (caused by a high partition number) (−−)

Characteristic: Number of input variables "n"
Disadvantage:
  – complexity: # rules = P^n (−−)

Characteristic: Rules RHS (TSK model)
Disadvantage:
  – partial loss of rule "locality": not always consistent with the underlying monotonic relations (−)

ANFIS Pros (cont.)

Characteristic: Uses the LMS algorithm to compute the polynomial coefficients
Advantages:
  – automatic FLC parametric tuning (+++)
  – uses fuzzy partitions to discount outlier effects

Characteristic: Uses Backprop to tune the fuzzy partitions
Advantages:
  – automatic FLC parametric tuning (+++)
  – modifies only the "values" layer

Characteristic: Uses a convex sum as the mechanism to interpolate among rules
Advantage:
  – smoothness guaranteed by interpolation (++)

ANFIS Cons (cont.)

Characteristic: Uses the LMS algorithm to compute the polynomial coefficients
Disadvantages:
  – batch process disregards the previous state (or IC)
  – not possible to represent known monotonic relations (−)

Characteristic: Uses Backprop to tune the fuzzy partitions
Disadvantages:
  – requires differentiability of the fuzzy partitions and of the T-norms used by the FLC: cannot use trapezoids nor "Min" (−)
  – quadratic treatment of the error cost gives outliers a great influence (−−)

Characteristic: Uses the convex sum Σ λi fi(X) / Σ λi as the mechanism to interpolate among rules
Disadvantage:
  – interpolation between slopes of different sign (−)

ANFIS Opportunities

• Changes to decrease ANFIS complexity
  – Use "don't care" values in rules (no connection between a node of the values layer and the rules layer)
  – Use a reduced subset of the state vector in the partition tree, while evaluating the linear functions on the complete state: X = (Xr, X(n−r))
  – Use heterogeneous partition granularity (different partitions pi for each state variable, instead of a single "p"):

    # RULES = Π(i=1..n) pi
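The saving from heterogeneous granularity is easy to quantify:

```python
from math import prod

def rule_count(partitions):
    """# RULES = product of the per-variable partition counts p_i.
    A uniform granularity p over n variables gives the usual p**n."""
    return prod(partitions)
```

For n = 3 with uniform p = 3, `rule_count([3, 3, 3])` gives 27 rules, while a mixed granularity such as `[3, 2, 2]` cuts this to 12.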

ANFIS Opportunities (cont.)

• Changes to extend ANFIS applicability
  – Use another cost function (rather than SSE) to represent the user's utility values of the error (error asymmetry, saturation effects of outliers, etc.)
  – Use another interpolation mechanism (rather than the convex sum) to better handle slopes of different signs

ANFIS Applications at GE

• Voltage Instability Predictor (Smart Relay)

• Collateral Evaluation for Mortgage Approval

ANFIS References

• "ANFIS: Adaptive-Network-Based Fuzzy Inference System", J.-S. R. Jang, IEEE Trans. Systems, Man, and Cybernetics, 23(5/6):665-685, 1993.
• "Neuro-Fuzzy Modeling and Control", J.-S. R. Jang and C.-T. Sun, Proceedings of the IEEE, 83(3):378-406, 1995.
• "Industrial Applications of Fuzzy Logic at General Electric", Bonissone, Badami, Chiang, Khedkar, Marcelle, and Schutten, Proceedings of the IEEE, 83(3):450-465, 1995.
• The Fuzzy Logic Toolbox for Use with MATLAB, J.-S. R. Jang and N. Gulley, Natick, MA: The MathWorks Inc., 1995.
• Machine Learning, Neural and Statistical Classification, Michie, Spiegelhalter & Taylor (Eds.), NY: Ellis Horwood, 1994.
• Classification and Regression Trees, Breiman, Friedman, Olshen & Stone, Monterey, CA: Wadsworth and Brooks, 1985.
