Library of Congress Cataloging-in-Publication Data

Lin, C. T. (Chin-Teng)
Neural fuzzy systems : a neuro-fuzzy synergism to intelligent systems / Chin-Teng Lin, C. S. George Lee.
p. cm.
Includes bibliographical references and index.
ISBN 0-13-235169-2
1. Intelligent control systems. 2. Fuzzy systems. 3. Neural networks (Computer science) I. Lee, C. S. G. (C. S. George) II. Title.
TJ217.25.L57 1995
629.8'9—dc20
96-170
CIP
Editorial/production supervision: bookworks / Karen Fortgang
Cover design director: Jerry Votta
Cover design: Amane Altiparmakian
Manufacturing manager: Alexis R. Heydt
Acquisitions editor: Bernard Goodwin
© 1996 by Prentice Hall PTR
Prentice-Hall, Inc.
A Simon & Schuster Company
Upper Saddle River, NJ 07458
MATLAB is a registered trademark of The MathWorks, Inc.
The publisher offers discounts on this book when ordered in bulk quantities. For more information, contact
Corporate Sales Department, Prentice Hall PTR, One Lake Street, Upper Saddle River, New Jersey 07458
Phone: 800-383-3419; FAX: 201-236-7141; Email: corpsales@prenhall.com
All rights reserved. No part of this book may be reproduced, in any form or by any means, without permission in writing from the publisher.
Printed in the United States of America
10 9 8 7 6 5 4 3 2 1
ISBN 0-13-235169-2
Prentice-Hall International (UK) Limited, London
Prentice-Hall of Australia Pty. Limited, Sydney
Prentice-Hall Canada Inc., Toronto
Prentice-Hall Hispanoamericana, S.A., Mexico
Prentice-Hall of India Private Limited, New Delhi
Prentice-Hall of Japan, Inc., Tokyo
Simon & Schuster Asia Pte. Ltd., Singapore
Editora Prentice-Hall do Brasil, Ltda., Rio de Janeiro
This book is dedicated to the Chiao-Tung University Centennial
(1896–1996)
Contents
PREFACE xiii
1 INTRODUCTION 1
1.1 Motivation 1
1.2 Fuzzy Systems 3
1.3 Neural Networks 4
1.4 Fuzzy Neural Integrated Systems 7
1.5 Organization of the Book 8
1.6 References 9
PART I
FUZZY SYSTEMS
2 BASICS OF FUZZY SETS 10
2.1 Fuzzy Sets and Operation on Fuzzy Sets 10
2.2 Extensions of Fuzzy Set Concepts 22
2.2.1 Other Kinds of Fuzzy Sets, 22
2.2.2 Further Operations on Fuzzy Sets, 23
2.3 Extension Principle and Its Applications 29
2.3.1 Operations of Type-2 Fuzzy Sets, 31
2.3.2 Consistency Degree of Two Fuzzy Sets, 32
2.4 Concluding Remarks 33
2.5 Problems 34
3 FUZZY RELATIONS 37
3.1 Basics of Fuzzy Relations 37
3.2 Operations on Fuzzy Relations 41
3.3 Various Types of Binary Fuzzy Relations 47
3.3.1 Similarity Relations, 49
3.3.2 Resemblance Relations, 51
3.3.3 Fuzzy Partial Ordering, 52
3.4 Fuzzy Relation Equations 54
3.5 Concluding Remarks 59
3.6 Problems 60
4 FUZZY MEASURES 63
4.1 Fuzzy Measures 64
4.1.1 Belief and Plausibility Measures, 65
4.1.2 Probability Measures, 72
4.1.3 Possibility and Necessity Measures, 74
4.2 Fuzzy Integrals 80
4.3 Measures of Fuzziness 83
4.4 Concluding Remarks 85
4.5 Problems 86
5 POSSIBILITY THEORY AND FUZZY ARITHMETIC 89
5.1 Basics of Possibility Theory 89
5.2 Fuzzy Arithmetic 93
5.2.1 Interval Representation of Uncertain Values, 94
5.2.2 Operations and Properties of Fuzzy Numbers, 97
5.2.3 Ordering of Fuzzy Numbers, 108
5.3 Concluding Remarks 111
5.4 Problems 111
6 FUZZY LOGIC AND APPROXIMATE REASONING 114
6.1 Linguistic Variables 114
6.2 Fuzzy Logic 118
6.2.1 Truth Values and Truth Tables in Fuzzy Logic, 119
6.2.2 Fuzzy Propositions, 121
6.3 Approximate Reasoning 123
6.3.1 Categorical Reasoning, 123
6.3.2 Qualitative Reasoning, 126
6.3.3 Syllogistic Reasoning, 127
6.3.4 Dispositional Reasoning, 129
6.4 Fuzzy Expert Systems 131
6.4.1 MILORD, 132
6.4.2 Z-II, 136
6.5 Concluding Remarks 139
6.6 Problems 140
7 FUZZY LOGIC CONTROL SYSTEMS 142
7.1 Basic Structure and Operation of Fuzzy Logic Control Systems 142
7.1.1 Input-Output Spaces, 143
7.1.2 Fuzzifier, 145
7.1.3 Fuzzy Rule Base, 145
7.1.4 Inference Engine, 145
7.1.5 Defuzzifier, 156
7.2 Design Methodology of Fuzzy Control Systems 159
7.3 Stability Analysis of Fuzzy Control Systems 166
7.4 Applications of Fuzzy Controllers 172
7.5 Concluding Remarks 175
7.6 Problems 177

8 APPLICATIONS OF FUZZY THEORY 180
8.1 Fuzzy Pattern Recognition 180
8.1.1 Classification Methods Based on Fuzzy Relations, 182
8.1.2 Fuzzy Clustering, 186
8.2 Fuzzy Mathematical Programming 190
8.3 Fuzzy Databases 193
8.3.1 Fuzzy Relational Databases, 194
8.3.2 Fuzzy Object-Oriented Databases, 196
8.4 Human-Machine Interactions 199
8.5 Concluding Remarks 202
8.6 Problems 203

PART II
ARTIFICIAL NEURAL NETWORKS 

9 INTRODUCTION TO ARTIFICIAL NEURAL NETWORKS 205
9.1 Fundamental Concepts of Artificial Neural Networks 205
9.2 Basic Models and Learning Rules of ANNs 207
9.2.1 Processing Elements, 207
9.2.2 Connections, 211
9.2.3 Learning Rules, 212
9.3 Distributed Representations 217
9.4 Concluding Remarks 221
9.5 Problems 221

10 FEEDFORWARD NETWORKS AND SUPERVISED LEARNING 224
10.1 Single-Layer Perceptron Networks 224
10.1.1 Perceptron Learning Rule, 225
10.1.2 Adaline, 231
10.2 Multilayer Feedforward Networks 235
10.2.1 Back Propagation, 236
10.2.2 Learning Factors of Back Propagation, 244
10.2.3 Time-Delay Neural Networks, 250
10.3 Other Feedforward Networks 253
10.3.1 Functional-Link Networks, 253
10.3.2 Tree Neural Networks, 254
10.3.3 Wavelet Neural Networks, 255
10.4 Concluding Remarks 256
10.5 Problems 257
11 SINGLE-LAYER FEEDBACK NETWORKS AND ASSOCIATIVE MEMORIES 263
11.1 Hopfield Networks 263
11.1.1 Discrete Hopfield Networks, 263
11.1.2 Continuous Hopfield Networks, 267
11.2 Associative Memories 272
11.2.1 Recurrent Autoassociative Memory—Hopfield Memory, 273
11.2.2 Bidirectional Associative Memory, 277
11.2.3 Temporal Associative Memory, 282
11.3 Optimization Problems 284
11.3.1 Hopfield Networks for Optimization Problems, 284
11.3.2 Boltzmann Machines, 291
11.4 Concluding Remarks 296
11.5 Problems 298

12 UNSUPERVISED LEARNING NETWORKS 301
12.1 Unsupervised Learning Rules 301
12.1.1 Signal Hebbian Learning Rule, 302
12.1.2 Competitive Learning Rule, 304
12.1.3 Differential Hebbian Learning Rule, 308
12.1.4 Differential Competitive Learning Rule, 309
12.2 Hamming Networks 309
12.3 Self-Organizing Feature Maps 311
12.4 Adaptive Resonance Theory 314
12.4.1 The Instar-Outstar Model—Shunting Activation Equations, 315
12.4.2 Adaptive Resonance Theory, 321
12.5 Counterpropagation Networks 326
12.6 Radial Basis Function Networks 328
12.7 Adaptive Bidirectional Associative Memories 331
12.8 Hierarchical Networks—Neocognitron 332
12.9 Concluding Remarks 336
12.10 Problems 337
13 RECURRENT NEURAL NETWORKS 340
13.1 Feedback Backpropagation Networks 341
13.1.1 Recurrent Backpropagation Networks, 341
13.1.2 Partially Recurrent Networks, 345
13.2 Fully Recurrent Networks 349
13.2.1 Real-Time Recurrent Learning, 350
13.2.2 Time-Dependent Recurrent Backpropagation, 353
13.2.3 Second-Order Recurrent Networks, 357
13.2.4 The Extended Kalman Filter, 363
13.3 Reinforcement Learning 367
13.3.1 Associative Reward-Penalty, 368
13.3.2 REINFORCE Algorithms, 371
13.3.3 Temporal Difference Methods, 373
13.4 Concluding Remarks 379
13.5 Problems 380

14 GENETIC ALGORITHMS 382
14.1 Basics of Genetic Algorithms 382
14.2 Further Evolution of Genetic Algorithms 393
14.2.1 Improved Selection Schemes, 393
14.2.2 Advanced Operators, 394
14.3 Hybrid Genetic Algorithms 398
14.4 Applications of Genetic Algorithms 399
14.4.1 Genetic Algorithms for Neural Network Parameter Learning, 399
14.4.2 Genetic Algorithms for Path Planning, 404
14.4.3 Genetic Algorithms for System Identification and Controls, 405
14.5 Genetic Programming 406
14.6 Concluding Remarks 411
14.7 Problems 411
15 STRUCTURE-ADAPTIVE NEURAL NETWORKS 414
15.1 Simulated Evolution for Neural Network Structure Learning 414
15.1.1 Genetic Algorithms for Designing Neural Networks, 414
15.1.2 Evolutionary Programming for Designing Neural Networks, 420
15.2 Pruning Neural Networks 424
15.2.1 Weight Decay, 424
15.2.2 Connection and Node Pruning, 425
15.3 Growing Neural Networks 427
15.3.1 Input Space Partitioning, 427
15.3.2 Prototype Selection, 433
15.4 Growing and Pruning Neural Networks 437
15.4.1 Activity-Based Structural Adaptation, 437
15.4.2 Function Networks, 439
15.5 Concluding Remarks 442
15.6 Problems 443

16 APPLICATIONS OF NEURAL NETWORKS 445
16.1 Neural Networks in Control Systems 445
16.1.1 Dynamic Backpropagation for System Identification and Control, 448
16.1.2 Cerebellar Model Articulation Controller, 457
16.2 Neural Networks in Sensor Processing 464
16.3 Neural Networks in Communications 468
16.4 Neural Knowledge-Based Systems 470
16.5 Concluding Remarks 474
16.6 Problems 475
PART III
FUZZY NEURAL INTEGRATED SYSTEMS
17 INTEGRATING FUZZY SYSTEMS AND NEURAL NETWORKS 478
17.1 Basic Concept of Integrating Fuzzy Systems and Neural Networks 478
17.1.1 General Comparisons of Fuzzy Systems and Neural Networks, 478
17.1.2 Choice of Fuzzy Systems or Neural Networks, 480
17.1.3 Reasons for Integrating Fuzzy Systems and Neural Networks, 481
17.2 The Equivalence of Fuzzy Inference Systems and Neural Networks 482
17.2.1 Fuzzy Inference Systems as Universal Approximators, 483
17.2.2 Equivalence of Simplified Fuzzy Inference Systems and Radial Basis Function Networks, 487
17.2.3 Stability Analysis of Neural Networks Using Stability Conditions of Fuzzy Systems, 489
17.3 Concluding Remarks 494
17.4 Problems 494
18 NEURAL-NETWORK-BASED FUZZY SYSTEMS 496
18.1 Neural Realization of Basic Fuzzy Logic Operations 496
18.2 Neural Network-Based Fuzzy Logic Inference 498
18.2.1 Fuzzy Inference Networks, 498
18.2.2 Fuzzy Aggregation Networks, 504
18.2.3 Neural Network-Driven Fuzzy Reasoning, 507
18.3 Neural Network-Based Fuzzy Modeling 511
18.3.1 Rule-Based Neural Fuzzy Modeling, 511
18.3.2 Neural Fuzzy Regression Models, 517
18.3.3 Neural Fuzzy Relational Systems, 523
18.4 Concluding Remarks 530
18.5 Problems 531
19 NEURAL FUZZY CONTROLLERS 533
19.1 Types of Neural Fuzzy Controllers 534
19.2 Neural Fuzzy Controllers with Hybrid Structure-Parameter Learning 535
19.2.1 Fuzzy Adaptive Learning Control Network, 535
19.2.2 Fuzzy Basis Function Network with Orthogonal Least Squares Learning, 545
19.3 Parameter Learning for Neural Fuzzy Controllers 551
19.3.1 Neural Fuzzy Controllers with Fuzzy Singleton Rules, 551
19.3.2 Neural Fuzzy Controllers with TSK Fuzzy Rules, 556
19.3.3 Fuzzy Operator Tuning, 559
19.4 Structure Learning for Neural Fuzzy Controllers 561
19.4.1 Fuzzy Logic Rule Extraction from Numerical Training Data, 562
19.4.2 Genetic Algorithms for Fuzzy Partition of Input Space, 567
19.5 On-Line Structure Adaptive Neural Fuzzy Controllers 573
19.5.1 FALCON with On-Line Supervised Structure and Parameter Learning, 573
19.5.2 FALCON with ART Neural Learning, 579
19.6 Neural Fuzzy Controllers with Reinforcement Learning 592
19.6.1 FALCON with Reinforcement Learning, 592
19.6.2 Generalized Approximate Reasoning-Based Intelligent Controller, 600
19.7 Concluding Remarks 604
19.8 Problems 605

20 FUZZY LOGIC-BASED NEURAL NETWORK MODELS 609
20.1 Fuzzy Neurons 609
20.1.1 Fuzzy Neuron of Type I, 610
20.1.2 Fuzzy Neuron of Type II, 612
20.1.3 Fuzzy Neuron of Type III, 613
20.2 Fuzzification of Neural Network Models 614
20.2.1 Fuzzy Perceptron, 614
20.2.2 Fuzzy Classification with the Backpropagation Network, 618
20.2.3 Fuzzy Associative Memories, 620
20.2.4 Fuzzy ART Models, 626
20.2.5 Fuzzy Kohonen Clustering Network, 635
20.2.6 Fuzzy RCE Neural Network, 639
20.2.7 Fuzzy Cerebellar Model Articulation Controller, 641
20.3 Neural Networks with Fuzzy Training 643
20.3.1 Neural Networks with Fuzzy Teaching Input, 643
20.3.2 Neural Networks with Fuzzy Parameters, 648
20.3.3 Fuzzy Control for Learning Parameter Adaptation, 654
20.4 Concluding Remarks 657
20.5 Problems 658

21 FUZZY NEURAL SYSTEMS FOR PATTERN RECOGNITION 661
21.1 Fuzzy Neural Classification 661
21.1.1 Uncertainties with Two-Class Fuzzy Neural Classification Boundaries, 661
21.1.2 Multilayer Fuzzy Neural Classification Networks, 667
21.1.3 Genetic Algorithms for Fuzzy Classification Using Fuzzy Rules, 674
21.2 Fuzzy Neural Clustering 678
21.2.1 Fuzzy Competitive Learning for Fuzzy Clustering, 678
21.2.2 Adaptive Fuzzy Leader Clustering, 680
21.3 Fuzzy Neural Models for Image Processing 684
21.3.1 Fuzzy Self-Supervised Multilayer Network for Object Extraction, 684
21.3.2 Genetic Algorithms with Fuzzy Fitness Function for Image Enhancement, 690
21.4 Fuzzy Neural Networks for Speech Recognition 692
21.5 Fuzzy-Neural Hybrid Systems for System Diagnosis 696
21.6 Concluding Remarks 700
21.7 Problems 702
A MATLAB FUZZY LOGIC TOOLBOX 704
A.1 Demonstration Cases 706
A.1.1 A Simple Fuzzy Logic Controller, 706
A.1.2 A Neural Fuzzy System—ANFIS, 707
A.1.3 Fuzzy c-means Clustering, 708
A.2 Demonstration of Fuzzy Logic Applications 709
A.2.1 A Fuzzy Controller for Water Bath Temperature Control, 709
A.2.2 A Neural Fuzzy Controller for Water Bath Temperature Control, 712

B MATLAB NEURAL NETWORK TOOLBOX 715
B.1 Demonstration of Various Neural Network Models 716
B.1.1 Hebbian Learning Rule, 716
B.1.2 Perceptron Learning Rule, 717
B.1.3 Adaline, 718
B.1.4 Back-Propagation, 719
B.1.5 Hopfield Network, 722
B.1.6 Instar Learning Rule, 723
B.1.7 Competitive Learning Rule, 724
B.1.8 LVQ Learning Rule (Supervised Competitive Learning Rule), 726
B.1.9 Self-Organizing Feature Map, 726
B.1.10 Radial Basis Function Network, 727
B.1.11 Elman Network, 729
B.2 Demonstration of Neural Network Applications 730
B.2.1 Adaptive Noise Cancelation Using the Adaline Network, 730
B.2.2 Nonlinear System Identification, 731

BIBLIOGRAPHY 734

INDEX 783

Preface
Fuzzy systems and neural networks have attracted the growing interest of researchers, scientists, engineers, practitioners, and students in various scientific and engineering areas. Fuzzy sets and fuzzy logic are based on the way the brain deals with inexact information, while neural networks (or artificial neural networks) are modeled after the physical architecture of the brain. Although the fundamental inspirations for these two fields are quite different, there are a number of parallels that point out their similarities. The intriguing differences and similarities between these two fields have prompted the writing of this book to examine the basic concepts of fuzzy set theory, fuzzy logic, fuzzy logic control systems, and neural networks; to explore their applications separately and in combination; and to explore the synergism of integrating these two techniques for the realization of intelligent systems for various applications.

Since the publication of Lotfi Zadeh's seminal work, "Fuzzy Sets," in 1965, the number and variety of applications of fuzzy logic have been growing. The performance of fuzzy logic control and decision systems critically depends on the input and output membership functions, the fuzzy logic control rules, and the fuzzy inference mechanism. Although a great amount of literature has been published dealing with the applications and the theoretical issues of fuzzy logic control and decision systems, a unified and systematic design methodology has yet to be developed. The use of neural networks to automate and synthesize the design of a general fuzzy logic control or decision system presents a novel and innovative approach and a viable design solution to this difficult design and realization problem. The publication of J. J. Hopfield's seminal work on a single-layer feedback neural network with symmetric weights in 1982 revived neural network research activity from its doldrums in the 1960s and 1970s because of the ability of neural networks to classify, store,
recall, and associate information or patterns. The performance of neural networks depends on the computational function of the neurons in the network, the structure and topology of the network, and the learning rule or the update rule of the connecting weights. Publication of the article "Learning Representations by Back-propagating Errors," by Rumelhart, Hinton, and Williams in 1986 further extended and improved the learning ability of neural networks. This concept of trainable neural networks further strengthens the idea of utilizing the learning ability of neural networks to learn the fuzzy control rules, the membership functions, and other parameters of a fuzzy logic control or decision system.

Fuzzy sets and fuzzy logic were developed as a means for representing, manipulating, and utilizing uncertain information and to provide a framework for handling uncertainties and imprecision in real-world applications, while neural networks were developed to provide computational power, fault tolerance, and learning capability to the systems. This happy marriage of techniques from fuzzy logic systems and neural networks reaps the benefits of both neural networks and fuzzy logic systems. That is, the neural networks provide the connectionist structure (fault tolerance and distributed representation properties) and learning abilities to the fuzzy logic systems, and the fuzzy logic systems provide a structural framework with high-level fuzzy IF-THEN rule thinking and reasoning to the neural networks. It is this synergistic integration of neural networks and fuzzy logic systems into functional systems with low-level learning capability and high-level thinking and reasoning that separates our book from other books on fuzzy set theory, neural networks, neural fuzzy systems, and fuzzy neural networks.
This textbook was written to provide engineers, scientists, researchers, and students involved in fuzzy systems, neural networks, and fuzzy neural integrated systems with a comprehensive, well-organized, and up-to-date account of basic principles underlying the design, analysis, and synthesis of fuzzy neural integrated systems.

This book is the outgrowth of lecture notes for courses taught by the authors at Purdue University and the National Chiao-Tung University. The material has been tested extensively in the classroom as well as through numerous tutorial short courses and several intensive summer courses taught by both authors for the past several years. The suggestions and criticisms of students in these courses have had a significant influence on the way the material is presented in this book.

The mathematical level in all chapters is well within the grasp of first-year graduate students in a technical discipline such as engineering, computer science, or technology requiring introductory preparation in classical set theory, discrete mathematics, matrix computations, probability, and computer programming. In presenting the material, emphasis is placed on the utilization of developed results from basic concepts. In addition to the MATLAB® examples in the appendices, numerous examples are worked out in the text to illustrate the discussion, and exercises of various types and complexity are included at the end of each chapter. Some of these problems allow the reader to gain further insight into the points discussed in the text through practice in problem solution. Others serve as supplements and extensions of the material in the book. A complete solutions manual is also available to the instructor from the publisher.

This textbook consists of three major parts and two appendices:
Part I (Chapters 2 to 8) covers fundamental concepts and operations of fuzzy sets, fuzzy relations, fuzzy measures, possibility theory, fuzzy logic and approximate reasoning, and their application to fuzzy logic control systems and other systems such as pattern recognition systems and fuzzy expert systems.

Part II (Chapters 9 to 16) covers important concepts and topics in neural networks, including single-layer and multilayer feedforward networks, recurrent networks, and unsupervised learning, supervised learning, and reinforcement learning for training neural networks. Genetic algorithms and their use for the structure and parameter learning of neural networks are also explored.

Part III (Chapters 17 to 21) covers three major integrated systems and their rationale for integration. Neural fuzzy (control) systems, fuzzy neural networks (for pattern recognition), and fuzzy neural hybrid systems are discussed and explored.

Finally, Appendices A and B illustrate the computer simulations of fuzzy logic systems and neural networks using the MATLAB Fuzzy Logic Toolbox and the Neural Network Toolbox. These toolboxes are useful for student projects and independent studies.

The authors have used the materials from this book to teach three different graduate courses (15-week semester courses):
• Fuzzy Control Systems: Chapters 1–8 and Appendix A
• Introduction to Neural Networks: Chapter 1, Chapters 9–16, and Appendix B
• Neuro-Fuzzy Control Systems: Chapters 1 and 2, Sections 3.1, 3.2, and 3.4, Sections 6.1–6.3, Chapter 7, Chapters 9–10, Sections 12.1–12.4 and 12.6, Chapter 17, and Sections 19.1–19.3 and 19.5–19.6

Other combinations of the material in this book for teaching are also possible. For instance, for a neural network graduate course with an emphasis on (fuzzy) pattern recognition, Chapters 2–3 and 9–13 plus selected sections in Chapters 20 and 21 can be covered in one semester; Chapters 17–21 can be taught in a second-year graduate course on neural fuzzy systems with a prerequisite of courses in fuzzy logic and neural networks.
ACKNOWLEDGMENTS
We are indebted to a number of individuals who, directly or indirectly, assisted in the preparation of the text. In particular, we wish to extend our appreciation to Professors L. A. Zadeh, G. N. Saridis, T. T. Lee, J. Yen, W. H. Tsai, C. C. Teng, M. J. Syu, W. P. Yang, C. C. Jou, M. J. Chung, C. L. Chen, S. F. Su, and Dr. Harold Su. As is true with most projects carried out in a university environment, our students over the past few years have influenced not only our thinking but also the topics covered in this book. The following individuals have worked with us in the course of their graduate programs: C. J. Lin, C. F. Juang, I. F. Chung, Y. C. Lu, Dr. W. Hsu, Dr. S. Park, G. H. Kim, T. K. Yin, Y. J. Wang, L. K. Sheu, M. C. Kan, W. F. Lin, S. C. Hsiao, W. C. Lin, C. P. Lee, and C. Y. Hu. Thanks are also due to I. F. Chung, Linda Stovall, Kitty Cooper, C. F. Juang, W. F. Lin, S. C. Hsiao, and Dee Dee Dexter for typing numerous versions of the manuscript.

We would also like to express our appreciation to Purdue University's National Science Foundation Engineering Research Center for Intelligent Manufacturing Systems, the
Preface
_{x}_{v}
National Science Council of Taiwan, R.O.C., the Ford Foundation, and the Spring Foundation of National Chiao-Tung University for their sponsorship of our research activities in neuro-fuzzy systems, robotic assembly and manufacturing, and related areas. Finally, we thank Bernard M. Goodwin, editor and publisher at Prentice Hall, for his strong support throughout this project; Karen Fortgang, production editor at bookworks, for her skillful coordination of the production of the book; and Cristina Palumbo of The MathWorks, Inc. for her continued support of our use of MATLAB and the Fuzzy Logic Toolbox and the Neural Network Toolbox. Last but not least, we would like to thank our parents, M. T. Lin and Y. L. Kuo, and M. K. Lee and K. H. Un, for their constant encouragement. Without their constant encouragement, this book would not have been possible. C. T. Lin would also like to thank his brother C. I. Lin for providing an excellent environment for him to finish the manuscript; C. S. G. Lee would like to thank his wife, Pei-Ling, and his children, Lawrence, Steven, and Anthony, for understanding why their daddy could not spend more time with them at their ballgames.
C. T. Lin
Hsinchu, Taiwan

C. S. G. Lee
West Lafayette, Indiana
1
Introduction
1.1 MOTIVATION
Fuzzy systems (FSs) and neural networks (NNs) have attracted the growing interest of researchers in various scientific and engineering areas. The number and variety of applications of fuzzy logic and neural networks have been increasing, ranging from consumer products and industrial process control to medical instrumentation, information systems, and decision analysis. Fuzzy logic (FL) is based on the way the brain deals with inexact information, while neural networks are modeled after the physical architecture of the brain. Although the fundamental inspirations for these two fields are quite different, there are a number of parallels that point out their similarities. Fuzzy systems and neural networks are both numerical model-free estimators and dynamical systems. They share the common ability to improve the intelligence of systems working in an uncertain, imprecise, and noisy environment. To a certain extent, both systems and their techniques have been successfully applied to a variety of control systems and devices to improve their intelligence. Both fuzzy systems and neural networks have been shown to have the capability of modeling complex nonlinear processes to arbitrary degrees of accuracy.

Although fuzzy systems and neural networks are formally similar, there are also significant differences between them. Fuzzy systems are structured numerical estimators. They start from highly formalized insights about the structure of categories found in the real world and then articulate fuzzy IF-THEN rules as a kind of expert knowledge. Fuzzy systems combine fuzzy sets with fuzzy rules to produce overall complex nonlinear behavior. Neural networks, on the other hand, are trainable dynamical systems whose learning, noise
tolerance, and generalization abilities grow out of their connectionist structures, their dynamics, and their distributed data representation. Neural networks have a large number of highly interconnected processing elements (nodes) which demonstrate the ability to learn and generalize from training patterns or data; these simple processing elements also collectively produce complex nonlinear behavior.

In light of their similarities and differences, fuzzy systems and neural networks are suitable for solving many of the same problems and achieving some degree of machine intelligence. Their differences have prompted a recent surge of interest in merging or combining them into a functional system to overcome their individual weaknesses. This innovative idea of integration reaps the benefits of both fuzzy systems and neural networks. That is, neural networks provide fuzzy systems with learning abilities, and fuzzy systems provide neural networks with a structural framework with high-level fuzzy IF-THEN rule thinking and reasoning. Consequently, the two technologies can complement each other, with neural networks supplying the brute force necessary to accommodate and interpret large amounts of sensory data, and fuzzy logic providing a structural framework that utilizes and exploits these low-level results. The incorporation of fuzzy logic and neural network techniques has been used in the conception and design of complex systems in which analytical and expert system techniques are used in combination. Moreover, the synergism of fuzzy logic and neural network techniques has been employed in a wide variety of consumer products, endowing these products with the capability to adapt and learn from experience. Most neuro-fuzzy products are fuzzy rule-based systems in which neural network techniques are used for learning and/or adaptation.
Viewed from a much broader perspective, fuzzy logic and neural networks are constituents of an emerging research area, called soft computing, a term coined by Lotfi Zadeh (the father of fuzzy logic). It is believed that the most important factor that underlies the marked increase in machine intelligence nowadays is the use of soft computing to mimic the ability of the human mind to effectively employ modes of reasoning that are approximate rather than exact. Unlike traditional hard computing, whose prime desiderata are precision, certainty, and rigor, soft computing is tolerant of imprecision, uncertainty, and partial truth. The primary aim of soft computing is to exploit such tolerance to achieve tractability, robustness, a high level of machine intelligence, and a low cost in practical applications. In addition to fuzzy logic and neural networks, another principal constituent of soft computing is probabilistic reasoning, which subsumes genetic algorithms, evolutionary programming, belief networks, chaotic systems, and parts of learning theory. Among these, genetic algorithms and evolutionary programming are similar to neural networks in that they are based on low-level microscopic biological models. They evolve toward finding better solutions to problems, just as species evolve toward better adaptation to their environments. Coincidentally, fuzzy logic, neural networks, genetic algorithms, and evolutionary programming are also considered the building blocks of computational intelligence as conceived by James Bezdek. Computational intelligence is low-level cognition in the style of the human mind and is in contrast to conventional (symbolic) artificial intelligence. In the partnership of fuzzy logic, neural networks, and probabilistic reasoning, fuzzy logic is concerned in the main with imprecision and approximate reasoning, neural networks with learning, and probabilistic reasoning with uncertainty.
Since fuzzy logic, neural networks, and probabilistic reasoning are complementary rather than competitive, it is frequently advantageous to employ them in combination rather than
exclusively. Various important topics, in both the theory and applications of fuzzy systems and neural networks (with genetic algorithms), and their synergism, are covered in this book.
1.2 FUZZY SYSTEMS
In the past decade, fuzzy systems have supplanted conventional technologies in many scientific applications and engineering systems, especially in control systems and pattern recognition. We have also witnessed a rapid growth in the use of fuzzy logic in a wide variety of consumer products and industrial systems. Prominent examples include washing machines, camcorders, autofocus cameras, air conditioners, palmtop computers, vacuum cleaners, automobile transmissions, ship navigators, subway trains, combustion control regulators, and cement kilns. The same fuzzy technology, in the form of approximate reasoning, is also resurfacing in information technology, where it provides decision-support and expert systems with powerful reasoning capabilities bound by a minimum of rules. It is this wealth of deployed, successful applications of fuzzy technology that is, in the main, responsible for current interest in fuzzy systems.

Fuzzy sets, introduced by Zadeh in 1965 as a mathematical way to represent vagueness in linguistics, can be considered a generalization of classical set theory. The basic idea of fuzzy sets is quite easy to grasp. In a classical (nonfuzzy) set, an element of the universe either belongs to or does not belong to the set. That is, the membership of an element is crisp—it is either yes (in the set) or no (not in the set). A fuzzy set is a generalization of an ordinary set in that it allows the degree of membership for each element to range over the unit interval [0, 1]. Thus, the membership function of a fuzzy set maps each element of the universe of discourse to its range space which, in most cases, is set to the unit interval. One of the biggest differences between crisp and fuzzy sets is that the former always have unique membership functions, whereas every fuzzy set has an infinite number of membership functions that may represent it. This enables fuzzy systems to be adjusted for maximum utility in a given situation.
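The crisp-versus-fuzzy distinction can be made concrete with a short sketch. This is not an example from the book: the set "young" and the triangular shape and breakpoints of its membership function below are illustrative assumptions.

```python
# Crisp membership is binary; fuzzy membership ranges over the unit
# interval [0, 1].  The breakpoints (ages 20 and 40) are hypothetical
# choices made only for this illustration.

def crisp_young(age):
    """Classical set: an age either is or is not in the set 'young'."""
    return 1.0 if age < 30 else 0.0

def fuzzy_young(age):
    """Fuzzy set: the membership degree decreases gradually with age."""
    if age <= 20:
        return 1.0
    if age >= 40:
        return 0.0
    return (40.0 - age) / 20.0  # linear descent between ages 20 and 40

for age in (18, 25, 35, 50):
    print(age, crisp_young(age), fuzzy_young(age))
```

Note how ages 25 and 35 simply pass or fail the crisp test, while the fuzzy set grades them as 0.75 and 0.25 "young," respectively.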
In a broad sense, as pointed out by Lotfi Zadeh, any field can be fuzzified and hence generalized by replacing the concept of a crisp set in the target field by the concept of a fuzzy set. For example, we can fuzzify some basic fields such as arithmetic, graph theory, and probability theory to develop fuzzy arithmetic, fuzzy graph theory, and fuzzy probability theory, respectively; we can also fuzzify some applied fields such as neural networks, genetic algorithms, stability theory, pattern recognition, and mathematical programming to obtain fuzzy neural networks, fuzzy genetic algorithms, fuzzy stability theory, fuzzy pattern recognition, and fuzzy mathematical programming, respectively. The benefits of such fuzzification include greater generality, higher expressive power, an enhanced ability to model real-world problems, and a methodology for exploiting the tolerance for imprecision. Hence, fuzzy logic can help to achieve tractability, robustness, and lower solution cost.

Fuzziness is often confused with probability. The fundamental difference between them is that fuzziness deals with deterministic plausibility, while probability concerns the likelihood of nondeterministic, stochastic events. Fuzziness is one aspect of uncertainty. It is the ambiguity (vagueness) found in the definition of a concept or the meaning of a term such as "young person" or "large room." However, the uncertainty of probability generally relates to the occurrence of phenomena, as symbolized by the concept of randomness. In
other words, a statement is probabilistic if it expresses some kind of likelihood or degree of certainty or if it is the outcome of clearly defined but randomly occurring events. For example, the statements "There is a 50-50 chance that he will be there," "It will rain tomorrow," and "Roll the dice and get a four" have the uncertainty of randomness. Hence, fuzziness and randomness differ in nature; that is, they are different aspects of uncertainty. The former conveys "subjective" human thinking, feelings, or language, and the latter indicates an "objective" statistic in the natural sciences. From the modeling point of view, fuzzy models and statistical models also possess philosophically different kinds of information: fuzzy memberships represent similarities of objects to imprecisely defined properties, while probabilities convey information about relative frequencies.

One major feature of fuzzy logic is its ability to express the amount of ambiguity in human thinking and subjectivity (including natural language) in a comparatively undistorted manner. Thus, when is it appropriate to use fuzzy logic? When the process is concerned with continuous phenomena (e.g., one or more of the control variables are continuous) that are not easily broken down into discrete segments; when a mathematical model of the process does not exist, or exists but is too difficult to encode, or is too complex to be evaluated fast enough for real-time operation, or involves too much memory on the designated chip architecture; when high ambient noise levels must be dealt with or it is important to use inexpensive sensors and/or low-precision microcontrollers; when the process involves human interaction (e.g., human descriptive or intuitive thinking); and when an expert is available who can specify the rules underlying the system behavior and the fuzzy sets that represent the characteristics of each variable.
With these properties, fuzzy logic techniques find their applications in such areas as [Munakata and Jani, 1994] (1) control (the most widely applied area), (2) pattern recognition (e.g., image, audio, signal processing), (3) quantitative analysis (e.g., operations research, management), (4) inference (e.g., expert systems for diagnosis, planning, and prediction; natural language processing; intelligent interfaces; intelligent robots; software engineering), and (5) information retrieval (e.g., databases).
1.3 NEURAL NETWORKS
Fundamentally, there are two major different approaches in the field of artificial intelligence (AI) for realizing human intelligence in machines [Munakata, 1994]. One is symbolic AI, which is characterized by a high level of abstraction and a macroscopic view. Classical psychology operates at a similar level, and knowledge engineering systems and logic programming fall in this category. The second approach is based on low-level microscopic biological models. It is similar to the emphasis of physiology or genetics. Artificial neural networks and genetic algorithms are the prime examples of this latter approach. They originated from modeling of the brain and evolution. However, these biological models do not necessarily resemble their original biological counterparts. Neural networks are a new generation of information processing systems that are deliberately constructed to make use of some of the organizational principles that characterize the human brain. The main theme of neural network research focuses on modeling of the brain as a parallel computational device for various computational tasks that were performed poorly by traditional serial computers. Neural networks have a large number of highly interconnected processing
elements (nodes) that usually operate in parallel and are configured in regular architectures. The collective behavior of an NN, like a human brain, demonstrates the ability to learn, recall, and generalize from training patterns or data. Since the first application of NNs to consumer products appeared at the end of 1990, scores of industrial and commercial applications have come into use. Like fuzzy systems, the applications where neural networks have the most promise are also those with a real-world flavor, such as speech recognition, speech-to-text conversion, image processing and visual perception, medical applications, loan applications and counterfeit checks, and investing and trading.

Models of neural networks are specified by three basic entities: models of the processing elements themselves, models of interconnections and structures (network topology), and the learning rules (the ways information is stored in the network).

Each node in a neural network collects the values from all its input connections, performs a predefined mathematical operation, and produces a single output value. The information processing of a node can be viewed as consisting of two parts: input and output. Associated with the input of a node is an integration function (typically a dot product) which serves to combine information or activation from an external source or other nodes into a net input to the node. A second action of each node is to output an activation value as a function of its net input through an activation function which is usually nonlinear. Each connection has an associated weight that determines the effect of the incoming input on the activation level of the node. The weights may be positive (excitatory) or negative (inhibitory). The connection weights store the information, and the value of the connection weights is often determined by a neural network learning procedure. It is through adjustment of the connection weights that the neural network is able to learn.
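The input-output behavior of a single node described above (dot-product integration followed by a nonlinear activation) can be sketched in a few lines. Python is used for illustration; the weight values, bias, and the choice of a sigmoid activation are assumptions for the example, not prescriptions from the text:

```python
import math

def node_output(inputs, weights, bias=0.0):
    """One processing element: integration function, then activation function."""
    # Integration function: combine incoming activations via a dot product.
    net = sum(x * w for x, w in zip(inputs, weights)) + bias
    # Activation function: a nonlinear squashing of the net input (sigmoid here).
    return 1.0 / (1.0 + math.exp(-net))

# One positive (excitatory) and one negative (inhibitory) weight.
y = node_output([1.0, 0.5], [0.8, -0.4], bias=0.1)
print(round(y, 4))
```

Learning would consist of adjusting the `weights` (and `bias`) values; the computation itself stays fixed.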
In a neural network, each node output is connected, through weights, to other nodes or to itself. Hence, the structure that organizes these nodes and the connection geometry among them should be specified for a neural network. We can first take a node and combine it with other nodes to make a layer of nodes. Inputs can be connected to many nodes with various weights, resulting in a series of outputs, one per node. This results in a single-layer feedforward network. We can further interconnect several layers to form a multilayer feedforward network. The layer that receives inputs is called the input layer and typically performs no computation other than buffering of the input signal. The outputs of the network are generated from the output layer. Any layer between the input and the output layers is called a hidden layer because it is internal to the network and has no direct contact with the external environment. There may be from zero to several hidden layers in a neural network. The two types of networks mentioned above are feedforward networks since no node output is an input to a node in the same layer or a preceding layer. When outputs are directed back as inputs to nodes in the same or a preceding layer, the network is a feedback network. Feedback networks that have closed loops are called recurrent networks.
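A forward pass through a small multilayer feedforward network of the kind just described might look like the following sketch. The 2-3-1 topology and all weight values are made up for illustration:

```python
import math

def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))

def layer(inputs, weight_matrix):
    # Each row of the weight matrix holds the incoming weights of one node.
    return [sigmoid(sum(w * x for w, x in zip(row, inputs))) for row in weight_matrix]

# A 2-3-1 network: two inputs, one hidden layer of three nodes, one output node.
W_hidden = [[0.5, -0.2], [0.3, 0.8], [-0.6, 0.1]]   # assumed values
W_output = [[1.0, -1.0, 0.5]]                       # assumed values

x = [0.9, 0.4]           # input layer: no computation, just buffering the signal
h = layer(x, W_hidden)   # hidden layer: internal, no contact with the environment
y = layer(h, W_output)   # output layer: produces the network output
print([round(v, 3) for v in y])
```

Because data only flows from `x` to `h` to `y` and never back, this is a feedforward network; feeding `y` back into an earlier layer would make it a feedback (recurrent) network.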
The third important element in specifying a neural network is the learning scheme. Broadly speaking, there are two kinds of learning in neural networks: parameter learning, which concerns the updating of the connection weights in a neural network, and structure learning, which focuses on the change in the network structure, including the number of nodes and their connection types. These two kinds of learning can be performed simultaneously or separately. Each kind of learning can be further classified into three categories: supervised learning, reinforcement learning, and unsupervised learning. These three categories of learning will be covered in detail in Part II of this book.
Among the existing neural network models, two of the important ones that started the modern era of neural networks are the Hopfield network and the backpropagation network. The Hopfield network is a single-layer feedback network with symmetric weights proposed by John Hopfield in 1982. The network has point attractor dynamics, making its behavior relatively simple to understand and analyze. This result provides the mathematical foundation for understanding the dynamics of an important class of networks. The backpropagation network is a multilayer feedforward network combined with a gradient-descent-type learning algorithm called the backpropagation learning rule by Rumelhart, Hinton, and Williams [1986b]. In this learning scheme, the error at the output layer is propagated backward to adjust the connection weights of preceding layers and to minimize output errors. The backpropagation learning rule is an extension of the work of Rosenblatt, Widrow, and Hoff to deal with learning in complex multilayer networks and thereby provide an answer to one of the most severe criticisms of the neural network field. With this learning algorithm, multilayer networks can be reliably trained. The backpropagation network has had a major impact on the field of neural networks and is the primary method employed in most of the applications.

Another important tool for the structure and parameter learning of neural networks is genetic algorithms (GAs). Genetic algorithms are search algorithms based on the mechanics of natural selection and natural genetics. The mathematical framework of GAs was developed in the 1960s and is presented in Holland's pioneering book [Holland, 1975]. Genetic algorithms have been used primarily in optimization and machine learning problems. A simple GA processes a finite population of fixed-length binary strings called genes.
Genetic algorithms have three basic operators: reproduction of solutions based on their fitness, crossover of genes, and mutation for random change of genes. Another operator associated with each of these three operators is the selection operator, which produces survival of the fittest in the GA. Reproduction directs the search toward the best existing strings but does not create any new strings, the crossover operator explores different structures by exchanging genes between two strings at a crossover position, and mutation introduces diversity into the population by altering a bit position of the selected string. The mutation operation is used to escape the local minima in the search space. The combined action of reproduction and crossover is responsible for much of the effectiveness of a GA's search, while reproduction and mutation combine to form a parallel, noise-tolerant hill-climbing algorithm. The structure and parameter learning problems of neural networks are coded as genes (or chromosomes), and GAs are used to search for better solutions (optimal structure and parameters) for NNs. Furthermore, GAs can be used to find the membership functions and fuzzy rules of a fuzzy logic system.

Neural networks offer the following salient characteristics and properties:
1. Nonlinear input-output mapping: Neural networks are able to learn arbitrary nonlinear input-output mappings directly from training data.
2. Generalization: Neural networks can sensibly interpolate input patterns that are new to the network. From a statistical point of view, neural networks can fit the desired function in such a way that they have the ability to generalize to situations that are different from the collected training data.
3. Adaptivity: Neural networks can automatically adjust their connection weights, or even network structures (number of nodes or connection types), to optimize their behavior as controllers, predictors, pattern recognizers, decision makers, and so on.
4. Fault tolerance: The performance of a neural network is degraded gracefully under faulty conditions such as damaged nodes or connections. The inherent fault-tolerance capability of neural networks stems from the fact that the large number of connections provides much redundancy, each node acts independently of all the others, and each node relies only on local information.
With these properties, when is it appropriate to use neural networks? When nonlinear mappings must be automatically acquired (e.g., robot control and noise removal); when only a few decisions are required from a massive amount of data (e.g., speech recognition and fault prediction); when a near-optimal solution to a combinatorial optimization problem is required in a short time (e.g., airline scheduling and network routing); and when there are more input variables than can be feasibly utilized by other approaches [Simpson, 1992a]. Moreover, despite the possibility of equally comparable solutions to a given problem, several additional aspects of a neural network solution are appealing, including parallel (VLSI) implementations that allow fast processing; less hardware, which allows faster response time, lower cost, and quicker design cycles; and on-line adaptation that allows the networks to change constantly according to the needs of the environment.
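The three genetic-algorithm operators described earlier in this section (reproduction, crossover, and mutation) can be made concrete with a minimal sketch. The toy "one-max" fitness function, the population size, the string length, and the mutation rate below are all illustrative assumptions:

```python
import random

def fitness(bits):
    # Toy objective ("one-max"): the number of 1s; the GA maximizes this.
    return sum(bits)

def select(pop):
    # Reproduction: fitness-proportionate selection (survival of the fittest).
    total = sum(fitness(p) for p in pop)
    r, acc = random.uniform(0, total), 0.0
    for p in pop:
        acc += fitness(p)
        if acc >= r:
            return p
    return pop[-1]

def crossover(a, b):
    # Exchange genes between two strings at a random crossover position.
    cut = random.randint(1, len(a) - 1)
    return a[:cut] + b[cut:]

def mutate(bits, rate=0.01):
    # Randomly flip bit positions; this helps escape local minima.
    return [1 - b if random.random() < rate else b for b in bits]

random.seed(0)
pop = [[random.randint(0, 1) for _ in range(16)] for _ in range(20)]
for _ in range(40):  # one generation per iteration
    pop = [mutate(crossover(select(pop), select(pop))) for _ in pop]
print(max(fitness(p) for p in pop))
```

Over the generations the population drifts toward strings of all 1s; coding a network's weights or structure into the bit string, as the text suggests, turns the same loop into a neural network learning procedure.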
1.4 FUZZY NEURAL INTEGRATED SYSTEMS
Fuzzy logic and neural networks (with genetic algorithms) are complementary technologies in the design of intelligent systems. Each method has merits and demerits. A comparison of these techniques with symbolic AI and conventional control theory is presented in Table 1.1 [Fukuda and Shibata, 1994]. To combine their merits and overcome their demerits, some integration and synthesis techniques have been proposed. In this book, we shall mainly discuss the synergism of fusing fuzzy logic techniques and neural network techniques into an integrated system. Neural networks are essentially low-level computational structures and algorithms that offer good performance in dealing with sensory data, while fuzzy logic techniques often deal with issues such as reasoning on a higher level than neural networks. However, since fuzzy systems do not have much learning capability, it is difficult for a human operator
TABLE 1.1  Comparisons of Fuzzy Systems (FS), Neural Networks (NN), Genetic Algorithms (GA), Conventional Control Theory, and Symbolic AI*

                          FS   NN   GA   Control Theory   Symbolic AI
Mathematical model        SG   B    B    G                SB
Learning ability          B    G    SG   B                B
Knowledge representation  G    B    SB   SB               G
Expert knowledge          G    B    B    SB               G
Nonlinearity              G    G    G    B                SB
Optimization ability      B    SG   G    SB               B
Fault tolerance           G    G    G    B                B
Uncertainty tolerance     G    G    G    B                B
Real-time operation       G    SG   SB   G                B

*The fuzzy terms used for grading are good (G), slightly good (SG), slightly bad (SB), and bad (B).
to tune the fuzzy rules and membership functions from the training data set. Also, because the internal layers of neural networks are always opaque to the user, the mapping rules in the network are not visible and are difficult to understand; furthermore, the convergence of learning is usually very slow and not guaranteed. Thus, a promising approach for reaping the benefits of both fuzzy systems and neural networks (and solving their respective problems) is to merge or fuse them into an integrated system. This fusion of two different technologies can be realized in three directions, resulting in systems with different characteristics:
1. Neural fuzzy systems: use of neural networks as tools in fuzzy models
2. Fuzzy neural networks: fuzzification of conventional neural network models
3. Fuzzy-neural hybrid systems: incorporation of fuzzy logic technology and neural networks into hybrid systems
The first two systems represent supportive combinations, where one technology assists the other, and the third system exhibits collaborative combination, where two technologies are incorporated intimately to perform a common task.

Neural fuzzy systems, or neuro-fuzzy systems, aim at providing fuzzy systems with automatic tuning abilities. With this approach, we witness the use of neural networks in learning or tuning membership functions and fuzzy rules of fuzzy systems. Neural network learning techniques can thus substantially reduce development time and cost while improving the performance of fuzzy systems. After learning, the user can understand the acquired rules in the network. With respect to learning speed, neural fuzzy systems are usually faster than conventional neural networks.

Fuzzy neural networks retain the basic properties and functions of neural networks with some of their elements being fuzzified. In this approach, a network's domain knowledge becomes formalized in terms of fuzzy sets, later being applied to enhance the learning of the network and augment its interpretation capabilities. For instance, a neural network can be fuzzified in such a way that it learns the mapping between input-output fuzzy sets. Furthermore, fuzzy logic can be used to determine the learning step of neural networks according to the state of convergence. By incorporating fuzzy principles into a neural network, more user flexibility is attained and the resultant network or system becomes more robust.

In a fuzzy-neural hybrid system, both fuzzy logic techniques and neural networks are utilized separately to establish two decoupled subsystems which perform their own tasks in serving different functions in the combined system. The architecture of fuzzy-neural hybrid systems is usually application-oriented. Making use of their individual strengths, fuzzy logic and neural network subsystems complement each other efficiently and effectively to achieve a common goal.
1.5 ORGANIZATION OF THE BOOK
This book covers basic concepts and applications of fuzzy sets, fuzzy logic, and neural networks and their integration synergism. Thus, it is logically divided into three major parts.

Part I (Chaps. 2-8) covers fuzzy set theory and its applications. In these seven chapters, we cover fundamental concepts and operations of fuzzy sets, fuzzy relations, fuzzy measures, possibility theory, fuzzy logic and approximate reasoning, and their application
to fuzzy logic control systems and other systems such as pattern recognition systems and fuzzy expert systems.

Part II (Chaps. 9-16) covers important concepts and topics involving neural networks. Single-layer and multilayer feedforward networks as well as recurrent networks are covered. Unsupervised learning, supervised learning, and reinforcement learning for training neural networks are also discussed. Genetic algorithms along with their applications (Chap. 14) are briefly introduced, and the use of GAs and other techniques for the structure learning of neural networks is explored in Chap. 15. Various applications of neural networks are then discussed in Chap. 16.

Part III (Chaps. 17-21) covers the basic concepts of integrating fuzzy logic and neural networks into a working functional system. Three major integrated systems are discussed and explored: neural fuzzy (control) systems in Chaps. 18 and 19, fuzzy neural networks (for pattern recognition) in Chaps. 20 and 21, and neural fuzzy hybrid systems in Chap. 21. The rationale for their integration is discussed, and methods for realizing their integration are also explored with examples. Various applications of such integration are also considered.

Finally, Apps. A and B are included to illustrate computer simulations of fuzzy logic systems and neural networks using the MATLAB neural network and fuzzy logic toolboxes. Some typical and interesting examples are worked out to illustrate the characteristics of these toolboxes, which will be useful for student projects and independent study.
1.6 REFERENCES
A complete bibliography consisting of more than 1100 references is included at the end of the book for the reader to further pursue the subject area of interest to him or her. The bibliography is organized in alphabetical order by author and contains all the pertinent information for each reference cited in the text. In addition, the concluding remarks at the end of each chapter discuss references that are keyed to specific topics discussed in the chapter.

In addition to those, the general references cited below are representative of publications dealing with topics of interest regarding fuzzy neural (integrated) systems (including fuzzy systems, neural networks, genetic algorithms, and their synergism) and related fields. They include major journals and conference proceedings: IEEE Transactions on Fuzzy Systems; Fuzzy Sets and Systems; IEEE Transactions on Neural Networks; Neural Networks (the International Neural Network Society journal); Evolutionary Computation; IEEE Transactions on Systems, Man, and Cybernetics; International Journal of Approximate Reasoning; Adaptive Behavior; Complex Systems; Proceedings of the IEEE International Conference on Fuzzy Systems; Proceedings of the IEEE International Conference on Neural Networks; Proceedings of the IEEE Conference on Evolutionary Computation; Proceedings of the International Conference on Genetic Algorithms; Proceedings of the International Fuzzy Systems Association (IFSA) World Congress; Proceedings of the International Conference on Fuzzy Logic, Neural Nets, and Soft Computing; Proceedings of the North American Fuzzy Information Processing Society (NAFIPS) Biannual Conference; Proceedings of the NASA Joint Technology Workshop on Neural Networks and Fuzzy Logic; Proceedings of the Conference on Parallel Problem Solving from Nature; Proceedings of the Workshop on the Foundations of Genetic Algorithms.
2
Basics of Fuzzy Sets
In this chapter, we introduce the principal concepts and mathematical notions of fuzzy set theory, a theory of classes of objects with unsharp boundaries. We first view fuzzy sets as a generalization of classical crisp sets by generalizing the range of the membership function (or characteristic function) from {0, 1} to a real number in the unit interval [0, 1]. Various basic concepts of fuzzy sets such as representation, support, α-cuts, convexity, and fuzzy numbers are then introduced. The resolution principle, which can be used to expand a fuzzy set in terms of its α-cuts, is discussed and proved. Various set-theoretic operations and properties involving crisp sets and fuzzy sets are discussed; further fuzzy set operations such as t-norms, t-conorms, and other aggregation operations are also considered. Finally, the extension principle, which allows the generalization of crisp mathematical concepts to the fuzzy set framework, is presented with several examples. The material covered in this chapter will be used extensively in later chapters.
2.1 FUZZY SETS AND OPERATION ON FUZZY SETS
A classical (crisp) set is a collection of distinct objects. It is defined in such a way as to dichotomize the elements of a given universe of discourse into two groups: members and nonmembers. Finally, a crisp set can be defined by the so-called characteristic function. Let U be a universe of discourse. The characteristic function μ_A(x) of a crisp set A in U takes its values in {0, 1} and is defined such that μ_A(x) = 1 if x is a member of A (i.e., x ∈ A) and 0 otherwise. That is,

    μ_A(x) = 1   if and only if x ∈ A,
             0   if and only if x ∉ A.        (2.1)
Note that (i) the boundary of set A is rigid and sharp and performs a two-class dichotomization (i.e., x ∈ A or x ∉ A), and (ii) the universe of discourse U is a crisp set.
A fuzzy set, on the other hand, introduces vagueness by eliminating the sharp boundary that divides members from nonmembers in the group. Thus, the transition between full membership and nonmembership is gradual rather than abrupt. Hence, fuzzy sets may be viewed as an extension and generalization of the basic concepts of crisp sets; however, some theories are unique to the fuzzy set framework.
A fuzzy set A in the universe of discourse U can be defined as a set of ordered pairs,

    A = {(x, μ_A(x)) | x ∈ U},        (2.2)

where μ_A(·) is called the membership function (or characteristic function) of A and μ_A(x) is the grade (or degree) of membership of x in A, which indicates the degree that x belongs to A. The membership function μ_A(·) maps U to the membership space M; that is, μ_A: U → M. When M = {0, 1}, set A is nonfuzzy and μ_A(·) is the characteristic function of the crisp set A. For fuzzy sets, the range of the membership function (i.e., M) is a subset of the nonnegative real numbers whose supremum is finite. In most general cases, M is set to the unit interval [0, 1].
Example 2.1
Let U be the real line R and let crisp set A represent "real numbers greater than or equal to 5"; then we have

    A = {(x, μ_A(x)) | x ∈ U},

where the characteristic function is

    μ_A(x) = 0   for x < 5,
             1   for x ≥ 5,

which is shown in Fig. 2.1(a). Now let fuzzy set Ã represent "real numbers close to 5."¹ Then we have

    Ã = {(x, μ_Ã(x)) | x ∈ U},

[Figure 2.1  Characteristic functions of crisp set A and fuzzy set Ã in Example 2.1: (a) the crisp step function; (b) the fuzzy membership function peaking at x = 5.]
¹If there is no confusion between a fuzzy set Ã and a crisp set A, the tilde above the fuzzy set A will be eliminated to simplify the notation.
where the membership function is

    μ_Ã(x) = 1 / [1 + 10(x − 5)²],

which is shown in Fig. 2.1(b).
The fuzzy set Ã in the above example can also be represented as

    Ã = {(x, μ_Ã(x)) | μ_Ã(x) = [1 + 10(x − 5)²]⁻¹}.        (2.3)

It is noted that we can use another membership function for fuzzy set Ã in Example 2.1; for example,

    μ_Ã(x) = [1 + (x − 5)²]⁻¹.        (2.4)

These two different membership functions show that assignment of the membership function of a fuzzy set is subjective in nature; however, it cannot be assigned arbitrarily. A qualitative estimation reflecting a given ordering of the elements in Ã may be sufficient. Furthermore, estimating membership functions is complicated, and a better approach is to utilize the learning power of neural networks to approximate them.

The above example shows that fuzziness is a type of imprecision that stems from a grouping of elements into classes that do not have sharply defined boundaries; it is not a lack of knowledge about the elements in the classes (e.g., a particular parametric value). It is worth pointing out that μ_A(x) ∈ [0, 1] indicates the membership grade of an element x ∈ U in fuzzy set A and that it is not a probability, because the membership grades need not sum to 1. The grades of membership basically reflect an ordering of the objects in fuzzy set A. Another way of representing a fuzzy set is through use of the support of a fuzzy set. The support of a fuzzy set A is the crisp set of all x ∈ U such that μ_A(x) > 0. That is,

    Supp(A) = {x ∈ U | μ_A(x) > 0}.        (2.5)
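The membership function of Eq. (2.3), "real numbers close to 5," can be checked numerically with a short sketch (Python, for illustration only):

```python
def mu_close_to_5(x):
    # Membership function of Eq. (2.3): 1 / (1 + 10 (x - 5)^2).
    return 1.0 / (1.0 + 10.0 * (x - 5.0) ** 2)

print(mu_close_to_5(5))                                        # full membership at 5
print(round(mu_close_to_5(4), 4), round(mu_close_to_5(6), 4))  # symmetric decay
```

The grade is exactly 1 at the prototype point x = 5 and falls off symmetrically on both sides, so the support of this fuzzy set is the whole real line even though grades far from 5 are negligible.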
Example 2.2
Assume that in an examination, all the possible scores are U = {10, 20, ..., 100}. Consider three fuzzy sets, A = "High Score," B = "Medium Score," and C = "Low Score," whose membership functions are defined in Table 2.1.
TABLE 2.1  Fuzzy Sets in Example 2.2

Numerical Score   High Score (A)   Medium Score (B)   Low Score (C)
10                0                0                  1
20                0                0                  1
30                0                0.1                0.9
40                0                0.5                0.7
50                0.1              0.8                0.5
60                0.3              1                  0.3
70                0.5              0.8                0.1
80                0.8              0.5                0
90                1                0                  0
100               1                0                  0
Then we have

    Supp(A) = Supp("High Score") = {50, 60, 70, 80, 90, 100},
    Supp(B) = Supp("Medium Score") = {30, 40, 50, 60, 70, 80},
    Supp(C) = Supp("Low Score") = {10, 20, 30, 40, 50, 60, 70}.

Instead, if U = [0, 120] and μ_A(·) is defined by the following continuous membership function,

    μ_A(x) = 0                              for 0 ≤ x ≤ 40,
             {1 + [(x − 40)/5]⁻²}⁻¹         for 40 < x ≤ 120,

then Supp(A) = Supp(High_Score) = the interval (40, 120]. Note that the summation of membership degrees of a fuzzy set [i.e., Σᵢ μ_A(xᵢ)] is not necessarily equal to 1. An empty fuzzy set has empty support; that is, the membership function assigns 0 to all elements of the universal set U.
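The discrete supports computed above can be verified with a small sketch (Python; the dictionaries simply transcribe Table 2.1):

```python
# Fuzzy sets of Example 2.2 as dictionaries mapping score -> membership grade.
high   = {10: 0, 20: 0, 30: 0, 40: 0, 50: 0.1, 60: 0.3, 70: 0.5, 80: 0.8, 90: 1, 100: 1}
medium = {10: 0, 20: 0, 30: 0.1, 40: 0.5, 50: 0.8, 60: 1, 70: 0.8, 80: 0.5, 90: 0, 100: 0}
low    = {10: 1, 20: 1, 30: 0.9, 40: 0.7, 50: 0.5, 60: 0.3, 70: 0.1, 80: 0, 90: 0, 100: 0}

def support(fs):
    # Eq. (2.5): the crisp set of elements with strictly positive membership.
    return {x for x, mu in fs.items() if mu > 0}

def height(fs):
    # Supremum of the membership grades over U (finite here, so just max).
    return max(fs.values())

print(sorted(support(high)))   # [50, 60, 70, 80, 90, 100]
print(height(medium))          # 1
```

A height of 1 means the fuzzy set is normalized, which anticipates the definitions given next.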
A fuzzy set A whose support is a single point in U with μ_A(x) = 1 is referred to as a fuzzy singleton. Moreover, the element x ∈ U at which μ_A(x) = 0.5 is called the crossover point. The kernel of a fuzzy set A consists of the elements x whose membership grade is 1; that is, ker(A) = {x | μ_A(x) = 1}. The height of a fuzzy set A is the supremum of μ_A(x) over U. That is,

    Height(A) = sup_x μ_A(x).        (2.7)
A fuzzy set is normalized when the height of the fuzzy set is unity [i.e., Height(A) = 1]; otherwise it is subnormal. The three fuzzy sets in Example 2.2 are all normalized. A nonempty fuzzy set A can always be normalized by dividing μ_A(x) by the height of A. The representation of a fuzzy set can be expressed in terms of the support of the fuzzy set. For a discrete universe of discourse U = {x₁, x₂, ..., xₙ}, a fuzzy set A can be represented using the ordered pairs concept and written as

    A = {(x₁, μ_A(x₁)), (x₂, μ_A(x₂)), ..., (xₙ, μ_A(xₙ))}.        (2.8)

For example, in Example 2.2,

    B = "Medium Score" = {(10, 0), (20, 0), (30, 0.1), (40, 0.5), (50, 0.8), (60, 1), (70, 0.8), (80, 0.5), (90, 0), (100, 0)}.

Using the support of a fuzzy set A, we can simplify the representation of a fuzzy set A as

    A = μ₁/x₁ + μ₂/x₂ + ··· + μᵢ/xᵢ + ··· + μₙ/xₙ = Σᵢ₌₁ⁿ μᵢ/xᵢ,        (2.9)

where + indicates the union of the elements and μᵢ is the grade of membership of xᵢ; that is, μᵢ = μ_A(xᵢ) > 0. For example, the above fuzzy set B can be represented as

    B = 0.1/30 + 0.5/40 + 0.8/50 + 1/60 + 0.8/70 + 0.5/80.

Note that in using the support of a fuzzy set to represent a fuzzy set, we consider only those elements in the universe of discourse that have a nonzero degree of membership grade in the fuzzy set. If U is not discrete, but is an interval of real numbers, we can use the notation

    A = ∫_U μ_A(x)/x,        (2.10)
where ∫ indicates the union of the elements in A. For example, the fuzzy set Ã in Example 2.1 can be written as

    Ã = ∫_R [1 + 10(x − 5)²]⁻¹ / x.
Another important notion and property of fuzzy sets is the resolution principle, which requires us to understand α-cuts or α-level sets. An α-cut (or α-level set) of a fuzzy set A is a crisp set A_α that contains all the elements of the universal set U that have a membership grade in A greater than or equal to α. That is,

    A_α = {x ∈ U | μ_A(x) ≥ α},   α ∈ (0, 1].        (2.11)

If A_α = {x ∈ U | μ_A(x) > α}, then A_α is called a strong α-cut. Furthermore, the set of all levels α ∈ (0, 1] that represents distinct α-cuts of a given fuzzy set A is called a level set of A. That is,

    Λ_A = {α | μ_A(x) = α, for some x ∈ U}.        (2.12)
Example 2.3
Consider the test score example in Example 2.2 again (Table 2.1). We have

    A_0.5 = High_Score_0.5 = {70, 80, 90, 100},
    B_0.8 = Medium_Score_0.8 = {50, 60, 70},
    C_0.2 = Low_Score_0.2 = {10, 20, 30, 40, 50, 60},

and

    Λ_A = Λ_High_Score = {0.1, 0.3, 0.5, 0.8, 1},
    Λ_C = Λ_Low_Score = {0.1, 0.3, 0.5, 0.7, 0.9, 1}.

It is clear that if α < β, then A_β ⊆ A_α.
With this understanding of α-cuts, we shall introduce an important property of fuzzy set theory, called the resolution principle, which indicates that a fuzzy set A can be expanded in terms of its α-cuts.
Theorem 2.1
Let A be a fuzzy set in the universe of discourse U. Then the membership function of A can be expressed in terms of the characteristic functions of its α-cuts according to

    μ_A(x) = sup_{α∈(0,1]} [α ∧ μ_{A_α}(x)],   ∀x ∈ U,        (2.13)

where ∧ denotes the min operation and μ_{A_α}(x) is the characteristic function of the crisp set A_α:

    μ_{A_α}(x) = 1   if and only if x ∈ A_α,
                 0   otherwise.        (2.14)
Proof: Let ∨ denote the max operation. Since μ_{A_α}(x) = 1 [i.e., μ_A(x) ≥ α] if x ∈ A_α, and μ_{A_α}(x) = 0 [i.e., μ_A(x) < α] if x ∉ A_α, we have
    sup_{α∈(0,1]} [α ∧ μ_{A_α}(x)] = sup_{α∈(0, μ_A(x)]} [α ∧ μ_{A_α}(x)] ∨ sup_{α∈(μ_A(x), 1]} [α ∧ μ_{A_α}(x)]
                                   = sup_{α∈(0, μ_A(x)]} [α ∧ 1] ∨ sup_{α∈(μ_A(x), 1]} [α ∧ 0]
                                   = sup_{α∈(0, μ_A(x)]} α
                                   = μ_A(x).
Theorem 2.1 leads to the following representation of a fuzzy set A using the resolution principle. Let A be a fuzzy set in the universe of discourse U. Let αA_α denote a fuzzy set with the membership function

    μ_{αA_α}(x) = α ∧ μ_{A_α}(x),   ∀x ∈ U.        (2.15)

Then the resolution principle states that the fuzzy set A can be expressed in the form

    A = ∪_{α∈(0,1]} αA_α    or    A = ∫_0^1 αA_α.        (2.16)

The resolution principle indicates that a fuzzy set A can be decomposed into αA_α, α ∈ (0, 1]. On the other hand, a fuzzy set A can be retrieved as a union of its αA_α, which is called the representation theorem. In other words, a fuzzy set can be expressed in terms of its α-cuts without resorting to the membership function. This concept is illustrated in Fig. 2.2.
Example 2.4
Consider the fuzzy set A in Example 2.2. That is,

A = 0.1/50 + 0.3/60 + 0.5/70 + 0.8/80 + 1/90 + 1/100.

Figure 2.2 Decomposition of a fuzzy set.
Sec. 2.1    Fuzzy Sets and Operation on Fuzzy Sets    15
Using the resolution principle, A can be written as

A = 0.1/50 + 0.3/60 + 0.5/70 + 0.8/80 + 1/90 + 1/100
  = 0.1/50 + 0.1/60 + 0.1/70 + 0.1/80 + 0.1/90 + 0.1/100
  + 0.3/60 + 0.3/70 + 0.3/80 + 0.3/90 + 0.3/100
  + 0.5/70 + 0.5/80 + 0.5/90 + 0.5/100
  + 0.8/80 + 0.8/90 + 0.8/100
  + 1/90 + 1/100

or, in terms of αA_α,

A = 0.1/50 + 0.3/60 + 0.5/70 + 0.8/80 + 1/90 + 1/100
  = 0.1 (1/50 + 1/60 + 1/70 + 1/80 + 1/90 + 1/100)
  + 0.3 (1/60 + 1/70 + 1/80 + 1/90 + 1/100)
  + 0.5 (1/70 + 1/80 + 1/90 + 1/100)
  + 0.8 (1/80 + 1/90 + 1/100)
  + 1 (1/90 + 1/100)
  = 0.1A_0.1 + 0.3A_0.3 + 0.5A_0.5 + 0.8A_0.8 + 1A_1
  = ∪_{α∈Λ_A} αA_α,  where Λ_A = {0.1, 0.3, 0.5, 0.8, 1},

and where

0.1A_0.1 = 0.1 (1/50 + 1/60 + 1/70 + 1/80 + 1/90 + 1/100),
0.3A_0.3 = 0.3 (1/60 + 1/70 + 1/80 + 1/90 + 1/100),
0.5A_0.5 = 0.5 (1/70 + 1/80 + 1/90 + 1/100),
0.8A_0.8 = 0.8 (1/80 + 1/90 + 1/100),
1A_1 = 1 (1/90 + 1/100).
On the other hand, if we are given A_0.1 = {1, 2, 3, 4, 5}, A_0.4 = {2, 3, 5}, A_0.8 = {2, 3}, and A_1 = {3}, then using the representation theorem, A can be expressed as

A = ∪_{α∈Λ_A} αA_α = ∪_{α∈{0.1, 0.4, 0.8, 1}} αA_α
  = 0.1A_0.1 + 0.4A_0.4 + 0.8A_0.8 + 1A_1
  = 0.1 (1/1 + 1/2 + 1/3 + 1/4 + 1/5) + 0.4 (1/2 + 1/3 + 1/5)
  + 0.8 (1/2 + 1/3) + 1 (1/3)
  = 0.1/1 + 0.8/2 + 1/3 + 0.1/4 + 0.4/5.
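The round trip of Example 2.4 — decompose A into its α-cuts, then recover it via the representation theorem — can be replayed in code. A sketch, assuming a dict encoding of discrete fuzzy sets (helper names are ours):

```python
# Discrete fuzzy set as a dict: element -> membership grade (High_Score, Example 2.2).
high_score = {50: 0.1, 60: 0.3, 70: 0.5, 80: 0.8, 90: 1.0, 100: 1.0}

def alpha_cut(fs, alpha):
    """Crisp alpha-cut A_alpha of Eq. (2.11)."""
    return {x for x, mu in fs.items() if mu >= alpha}

def reconstruct(fs):
    """Representation theorem: the union of the alpha * A_alpha recovers mu_A."""
    rebuilt = {}
    for a in sorted({mu for mu in fs.values() if mu > 0}):
        for x in alpha_cut(fs, a):
            rebuilt[x] = max(rebuilt.get(x, 0.0), a)  # union = pointwise max
    return rebuilt

def from_cuts(cuts):
    """Build a fuzzy set from {alpha: crisp set} pairs, as in Example 2.4."""
    fs = {}
    for a, cut in cuts.items():
        for x in cut:
            fs[x] = max(fs.get(x, 0.0), a)
    return fs

print(reconstruct(high_score) == high_score)  # True
print(sorted(from_cuts({0.1: {1, 2, 3, 4, 5}, 0.4: {2, 3, 5},
                        0.8: {2, 3}, 1.0: {3}}).items()))
# [(1, 0.1), (2, 0.8), (3, 1.0), (4, 0.1), (5, 0.4)]
```

The second print reproduces the result 0.1/1 + 0.8/2 + 1/3 + 0.1/4 + 0.4/5 computed above.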
Convexity of fuzzy sets plays an important role in fuzzy set theory, and it can be defined in terms of α-cuts or membership functions. A fuzzy set is convex if and only if each of its α-cuts is a convex set. Or, equivalently, a fuzzy set A is convex if and only if

μ_A(λx_1 + (1 − λ)x_2) ≥ min(μ_A(x_1), μ_A(x_2)),    (2.17)

where x_1, x_2 ∈ U and λ ∈ [0,1]. Equation (2.17) can be interpreted as follows: take two elements x_1 and x_2 in a fuzzy set A and draw a connecting straight line between them; then the membership grades of all the points on the line must be greater than or equal to the minimum of μ_A(x_1) and μ_A(x_2). For example, the fuzzy set A in Fig. 2.3(a) is convex, but it is not normalized. The fuzzy set B in Fig. 2.3(b) is not convex, but it is normalized. Note that the convexity definition does not imply that the membership function of a convex fuzzy set is a convex function [see Fig. 2.3(c)].

Figure 2.3 Convex and nonconvex fuzzy sets.

A convex, normalized fuzzy set defined on the real line R whose membership function is piecewise continuous or, equivalently, whose every α-cut is a closed interval, is called a fuzzy number. Two typical fuzzy numbers are the S-function and the π-function, which are, respectively, defined by

S(x; a, b) = 0                          for x ≤ a,
             2[(x − a)/(b − a)]²        for a ≤ x ≤ (a + b)/2,
             1 − 2[(x − b)/(b − a)]²    for (a + b)/2 ≤ x ≤ b,
             1                          for x ≥ b,    (2.18)

π(x; a, b) = S(x; b − a, b)             for x ≤ b,
             1 − S(x; b, b + a)         for x ≥ b.    (2.19)

These two functions are shown in Fig. 2.4(a) and (b), respectively. In S(x; a, b), the crossover point is (a + b)/2. In π(x; a, b), b is the point at which π is unity, while the two crossover points are b − a/2 and b + a/2. The separation between these two crossover points, a, is the bandwidth. Sometimes the π function is simply defined as

π′(x; a, b) = 1 / (1 + [(x − a)/b]²).    (2.20)
Figure 2.4 The S-function and the π-function: (a) S(x; a, b); (b) π(x; a, b); (c) another definition of π(x; a, b).
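Equations (2.18)–(2.20) translate directly into code. A minimal sketch of the three membership-function shapes (the function names are ours, not the book's):

```python
def s_function(x, a, b):
    """Zadeh S-function, Eq. (2.18); crossover point at (a + b) / 2."""
    if x <= a:
        return 0.0
    if x >= b:
        return 1.0
    if x <= (a + b) / 2:
        return 2 * ((x - a) / (b - a)) ** 2
    return 1 - 2 * ((x - b) / (b - a)) ** 2

def pi_function(x, a, b):
    """pi-function, Eq. (2.19): unity at b, bandwidth a."""
    return s_function(x, b - a, b) if x <= b else 1 - s_function(x, b, b + a)

def pi_simple(x, a, b):
    """Simpler pi-function, Eq. (2.20): peak at a, crossovers at a - b and a + b."""
    return 1 / (1 + ((x - a) / b) ** 2)

print(s_function(5, 0, 10))    # 0.5 (crossover point)
print(pi_function(10, 4, 10))  # 1.0 (peak)
print(pi_simple(13, 10, 3))    # 0.5 (crossover at a + b)
```

The checks confirm the crossover and peak locations stated in the text: S crosses 0.5 at (a + b)/2, and π reaches unity at b with crossovers a/2 on either side.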
This simpler form is shown in Fig. 2.4(c), where 2b is the bandwidth. With the S-function and the π-function, it is convenient to express the membership function of a fuzzy subset of the real line in terms of one of these "standard" functions, whose parameters can be adjusted accordingly to fit a specified membership function approximately. More details on fuzzy numbers, including the arithmetic operations (e.g., addition and subtraction) of two fuzzy numbers, will be presented in Sec. 5.2.

Similar to the cardinality of a crisp set, which is defined as the number of elements in the crisp set, the cardinality (or scalar cardinality) of a fuzzy set A is the summation of the membership grades of all the elements x in A. That is,
|A| = Σ_{x∈U} μ_A(x).    (2.21)

The relative cardinality of A is

‖A‖ = |A| / |U|,    (2.22)

where |U| is finite. The relative cardinality evaluates the proportion of elements of U having the property A when U is finite. When a fuzzy set A has a finite support, its cardinality can be defined as a fuzzy set. This fuzzy cardinality is denoted as |A|_f and is defined by Zadeh [1978a] as

|A|_f = Σ_{α∈Λ_A} α / |A_α|.    (2.23)
Example 2.5
Consider the fuzzy set A in Table 2.1 (Example 2.2). We have

|A| = |High_Score| = 0.1 + 0.3 + 0.5 + 0.8 + 1 + 1 = 3.7,

‖A‖ = |A| / |U| = 3.7 / 10 = 0.37,

|A|_f = 0.1/6 + 0.3/5 + 0.5/4 + 0.8/3 + 1/2,

where the fuzzy cardinality of A can be interpreted as "approximately 3."
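The three cardinalities of Eqs. (2.21)–(2.23) can be computed directly for Example 2.5. A sketch (helper names are ours; the universe size 10 comes from the ten scores of Table 2.1):

```python
# Discrete fuzzy set as a dict: element -> membership grade (High_Score, Example 2.2).
high_score = {50: 0.1, 60: 0.3, 70: 0.5, 80: 0.8, 90: 1.0, 100: 1.0}

def cardinality(fs):
    """Scalar cardinality |A|: sum of the membership grades, Eq. (2.21)."""
    return sum(fs.values())

def relative_cardinality(fs, universe_size):
    """Relative cardinality |A| / |U| for a finite universe, Eq. (2.22)."""
    return cardinality(fs) / universe_size

def fuzzy_cardinality(fs):
    """Fuzzy cardinality of Eq. (2.23): grade alpha attached to the count |A_alpha|."""
    levels = sorted({mu for mu in fs.values() if mu > 0})
    return {sum(1 for mu in fs.values() if mu >= a): a for a in levels}

print(round(cardinality(high_score), 2))            # 3.7
print(round(relative_cardinality(high_score, 10), 2))  # 0.37
print(fuzzy_cardinality(high_score))  # {6: 0.1, 5: 0.3, 4: 0.5, 3: 0.8, 2: 1.0}
```

The fuzzy cardinality printed last is exactly 0.1/6 + 0.3/5 + 0.5/4 + 0.8/3 + 1/2 in the book's grade/element notation.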
With these basic notations and definitions for fuzzy sets, we are now ready to introduce some basic set-theoretic definitions and operations for fuzzy sets. Let A and B be fuzzy sets in the universe of discourse U.

1. Complement: When μ_A(x) ∈ [0,1], the complement of A, denoted as Ā, is defined by its membership function as [see Fig. 2.5(a)]
Figure 2.5 Complement, intersection, and union of two fuzzy sets A and B: (a) Ā; (b) A ∩ B; (c) A ∪ B.
μ_Ā(x) ≜ 1 − μ_A(x),  ∀x ∈ U.    (2.24)

2. Intersection: The intersection of fuzzy sets A and B, denoted as A ∩ B, is defined by [see Fig. 2.5(b)]

μ_{A∩B}(x) = min[μ_A(x), μ_B(x)] = μ_A(x) ∧ μ_B(x),  ∀x ∈ U,    (2.25)

where ∧ indicates the min operation. It is clear that

A ∩ B ⊆ A  and  A ∩ B ⊆ B.    (2.26)
3. Union: The union of fuzzy sets A and B, denoted as A ∪ B, is defined by [see Fig. 2.5(c)]

μ_{A∪B}(x) = max[μ_A(x), μ_B(x)] = μ_A(x) ∨ μ_B(x),  ∀x ∈ U,    (2.27)

where ∨ indicates the max operation. It is clear that

A ⊆ A ∪ B  and  B ⊆ A ∪ B.    (2.28)
4. Equality: A and B are equal if and only if

μ_A(x) = μ_B(x),  ∀x ∈ U.    (2.29)

Hence, if μ_A(x) ≠ μ_B(x) for some x ∈ U, then A ≠ B. This definition of "equality" is crisp. To check the degree of equality of two fuzzy sets, we can use the similarity measure [Lin and Lee, 1992]:

E(A, B) = degree(A = B) ≜ |A ∩ B| / |A ∪ B|,    (2.30)

where ∩ and ∪ denote the intersection and union of A and B, respectively. When A = B, E(A, B) = 1; when A ∩ B = ∅ (i.e., A and B do not overlap at all), E(A, B) = 0. In the most general case, 0 ≤ E(A, B) ≤ 1.
5. Subset: A is a subset of B, that is, A ⊆ B, if and only if

μ_A(x) ≤ μ_B(x),  ∀x ∈ U.    (2.31)

If A ⊆ B and A ≠ B, then A is a proper subset of B; that is, A ⊂ B. Again, the definition of subset is crisp. To check the degree to which A is a subset of B, we can use the subsethood measure [Kosko, 1992a]:

S(A, B) = degree(A ⊆ B) ≜ |A ∩ B| / |A|.    (2.32)
Example 2.6
Consider the two membership functions of the fuzzy sets A and B in Table 2.1. Obviously, A ≠ B, A is not a subset of B, and B is not a subset of A either. Moreover, we have

Ā = 1/10 + 1/20 + 1/30 + 1/40 + 0.9/50 + 0.7/60 + 0.5/70 + 0.2/80,

A ∩ B = 0.1/50 + 0.3/60 + 0.5/70 + 0.5/80,

A ∪ B = 0.1/30 + 0.5/40 + 0.8/50 + 1/60 + 0.8/70 + 0.8/80 + 1/90 + 1/100,

E(A, B) = |A ∩ B| / |A ∪ B| = (0.1 + 0.3 + 0.5 + 0.5) / (0.1 + 0.5 + 0.8 + 1 + 0.8 + 0.8 + 1 + 1) = 1.4 / 6 ≈ 0.23,

S(A, B) = |A ∩ B| / |A| = (0.1 + 0.3 + 0.5 + 0.5) / (0.1 + 0.3 + 0.5 + 0.8 + 1 + 1) = 1.4 / 3.7 ≈ 0.38.
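The set operations and the two measures of Example 2.6 can be verified in code. A sketch, assuming a dict encoding; the grades for B (Medium_Score) are inferred from the example's results rather than quoted from Table 2.1 directly, and the helper names are ours:

```python
# Discrete fuzzy sets: A = High_Score; B = Medium_Score (grades inferred from Example 2.6).
A = {50: 0.1, 60: 0.3, 70: 0.5, 80: 0.8, 90: 1.0, 100: 1.0}
B = {30: 0.1, 40: 0.5, 50: 0.8, 60: 1.0, 70: 0.8, 80: 0.5}

def mu(fs, x):
    """Membership grade, 0 outside the support."""
    return fs.get(x, 0.0)

def f_intersection(A, B):
    """Eq. (2.25): pointwise min (zero-grade elements dropped)."""
    xs = set(A) | set(B)
    return {x: min(mu(A, x), mu(B, x)) for x in xs if min(mu(A, x), mu(B, x)) > 0}

def f_union(A, B):
    """Eq. (2.27): pointwise max."""
    return {x: max(mu(A, x), mu(B, x)) for x in set(A) | set(B)}

def f_complement(A, universe):
    """Eq. (2.24) over an explicit finite universe."""
    return {x: 1 - mu(A, x) for x in universe}

def similarity(A, B):
    """Eq. (2.30): E(A, B) = |A ∩ B| / |A ∪ B|."""
    return sum(f_intersection(A, B).values()) / sum(f_union(A, B).values())

def subsethood(A, B):
    """Eq. (2.32): S(A, B) = |A ∩ B| / |A|."""
    return sum(f_intersection(A, B).values()) / sum(A.values())

print(round(similarity(A, B), 2))  # 0.23
print(round(subsethood(A, B), 2))  # 0.38
```

Both printed values match the hand computation above.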
A justification of the choice of negation, min, and max as the complement, intersection, and union operators, respectively, was given in Bellman and Giertz [1973], where these operators were shown to be the only operators that meet a set of requirements (axioms). With the above definitions, the reader can verify the following facts [refer to Eqs. (2.29) and (2.31) for the definitions of "equality" and "subset"]:
(A ∩ B)_α = A_α ∩ B_α  and  (A ∪ B)_α = A_α ∪ B_α,  but  (Ā)_α ≠ (A_α)¯;    (2.33)

that is, the α-cut of the complement is in general not the complement of the α-cut.
As in the case of crisp sets, we have the double-negation law (involution) and De Morgan's laws for fuzzy sets.
6. Double-negation law (involution):

(Ā)¯ = A.    (2.34)

7. De Morgan's laws:

(A ∪ B)¯ = Ā ∩ B̄,    (2.35)

(A ∩ B)¯ = Ā ∪ B̄.    (2.36)
However, the law of the excluded middle (i.e., E ∪ Ē = U) and the law of contradiction (i.e., E ∩ Ē = ∅) of the crisp set E are no longer valid for fuzzy sets. That is, for a fuzzy set A,

A ∪ Ā ≠ U  and  A ∩ Ā ≠ ∅,    (2.37)
which means that, because of the lack of precise boundaries, complementary fuzzy sets overlap and cannot cover the universal set U perfectly. By contrast, these two laws are necessary characteristics of crisp sets; they bring together a crisp set E and its complement Ē to provide the whole set — that is, nothing exists between E and Ē.

The other properties of fuzzy set operations that are common to crisp set operations are listed in Table 2.2. However, it should be noted that the above discussion of the properties of fuzzy set operations is based on the definitions of the complement, intersection, and union operations in Eqs. (2.24), (2.25), and (2.27), respectively. As we will see in the next section, there are other definitions of these operations, and hence some properties mentioned above may fail to hold while others may become true under various definitions. As a matter of fact, it has been shown that if the excluded-middle laws hold for fuzzy sets, then union and intersection cannot be idempotent and are no longer mutually distributive [Kandel, 1986; Klir
TABLE 2.2 Properties of Fuzzy Set Operations

Idempotence:          A ∪ A = A,  A ∩ A = A
Distributivity:       A ∪ (B ∩ C) = (A ∪ B) ∩ (A ∪ C),  A ∩ (B ∪ C) = (A ∩ B) ∪ (A ∩ C)
Commutativity:        A ∪ B = B ∪ A,  A ∩ B = B ∩ A
Associativity:        (A ∪ B) ∪ C = A ∪ (B ∪ C),  (A ∩ B) ∩ C = A ∩ (B ∩ C)
Absorption:           A ∪ (A ∩ B) = A,  A ∩ (A ∪ B) = A
Law of identity:      A ∪ U = U,  A ∩ U = A
Law of zero:          A ∪ ∅ = A,  A ∩ ∅ = ∅
Double-negation law:  (Ā)¯ = A
De Morgan's laws:     (A ∪ B)¯ = Ā ∩ B̄,  (A ∩ B)¯ = Ā ∪ B̄
and Folger, 1988]. Hence, when we choose union and intersection to combine fuzzy sets, we have to give up either the excluded-middle laws or distributivity and idempotency. Next, we shall introduce some popular algebraic operations on fuzzy sets. Further fuzzy set operations will be discussed in the next section.
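The trade-off just described is easy to confirm numerically: under the standard min/max operators, idempotency and distributivity hold pointwise on membership grades, while the excluded-middle laws fail for any grade strictly between 0 and 1. A small self-check (sample grades and names are ours):

```python
grades = [0.0, 0.2, 0.5, 0.7, 1.0]  # sample membership values

for a in grades:
    comp = 1 - a  # standard complement, Eq. (2.24)
    # Excluded middle and contradiction fail whenever 0 < a < 1:
    if 0 < a < 1:
        assert max(a, comp) < 1 and min(a, comp) > 0
    # Idempotency holds for min/max:
    assert min(a, a) == a and max(a, a) == a
    for b in grades:
        for c in grades:
            # Mutual distributivity of max and min holds pointwise:
            assert max(a, min(b, c)) == min(max(a, b), max(a, c))
            assert min(a, max(b, c)) == max(min(a, b), min(a, c))

print("checked")
```

The same loop with any excluded-middle-preserving pair of operators (e.g., bounded sum/difference) would trip the idempotency or distributivity assertions instead.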
8. Cartesian product: Let A_1, A_2, ..., A_n be fuzzy sets in U_1, U_2, ..., U_n, respectively. The Cartesian product of A_1, A_2, ..., A_n, denoted as A_1 × A_2 × ··· × A_n, is a fuzzy set in the product space U_1 × U_2 × ··· × U_n with the membership function

μ_{A_1 × A_2 × ··· × A_n}(x_1, x_2, ..., x_n) = min[μ_{A_1}(x_1), μ_{A_2}(x_2), ..., μ_{A_n}(x_n)],
x_1 ∈ U_1, x_2 ∈ U_2, ..., x_n ∈ U_n.    (2.38)
9. Algebraic sum: The algebraic sum of two fuzzy sets, A + B, is defined by

μ_{A+B}(x) ≜ μ_A(x) + μ_B(x) − μ_A(x) · μ_B(x).    (2.39)
10. Algebraic product: The algebraic product of two fuzzy sets, A · B, is defined by

μ_{A·B}(x) ≜ μ_A(x) · μ_B(x).    (2.40)

11. Bounded sum: The bounded sum of two fuzzy sets, A ⊕ B, is defined by

μ_{A⊕B}(x) ≜ min{1, μ_A(x) + μ_B(x)}.    (2.41)

12. Bounded difference: The bounded difference of two fuzzy sets, A ⊖ B, is defined by

μ_{A⊖B}(x) ≜ max{0, μ_A(x) − μ_B(x)}.    (2.42)

Example 2.7
Let A = {(3, 0.5), (5, 1), (7, 0.6)} and B = {(3, 1), (5, 0.6)}. Then we can obtain

A × B = {[(3,3), 0.5], [(5,3), 1], [(7,3), 0.6], [(3,5), 0.5], [(5,5), 0.6], [(7,5), 0.6]},
A + B = {(3, 1), (5, 1), (7, 0.6)},
A · B = {(3, 0.5), (5, 0.6), (7, 0)},
A ⊕ B = {(3, 1), (5, 1), (7, 0.6)},
A ⊖ B = {(3, 0), (5, 0.4), (7, 0.6)}.
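The five operations of Eqs. (2.38)–(2.42) can be reproduced on the data of Example 2.7 with a few lines of Python. A sketch (dict representation and helper names are ours):

```python
A = {3: 0.5, 5: 1.0, 7: 0.6}
B = {3: 1.0, 5: 0.6}

def mu(fs, x):
    """Membership grade, 0 outside the support."""
    return fs.get(x, 0.0)

def cartesian(A, B):
    """Eq. (2.38): min over the product space."""
    return {(x, y): min(ma, mb) for x, ma in A.items() for y, mb in B.items()}

def alg_sum(A, B):
    """Eq. (2.39): mu_A + mu_B - mu_A * mu_B."""
    return {x: mu(A, x) + mu(B, x) - mu(A, x) * mu(B, x) for x in set(A) | set(B)}

def alg_product(A, B):
    """Eq. (2.40): mu_A * mu_B."""
    return {x: mu(A, x) * mu(B, x) for x in set(A) | set(B)}

def bounded_sum(A, B):
    """Eq. (2.41): min(1, mu_A + mu_B)."""
    return {x: min(1.0, mu(A, x) + mu(B, x)) for x in set(A) | set(B)}

def bounded_diff(A, B):
    """Eq. (2.42): max(0, mu_A - mu_B)."""
    return {x: max(0.0, mu(A, x) - mu(B, x)) for x in set(A) | set(B)}

print([(x, round(v, 2)) for x, v in sorted(alg_sum(A, B).items())])
# [(3, 1.0), (5, 1.0), (7, 0.6)]
print(sorted(bounded_diff(A, B).items()))  # [(3, 0.0), (5, 0.4), (7, 0.6)]
```

Both printed results match Example 2.7; elements missing from a set are treated as having grade 0, exactly as in the hand computation.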
2.2 EXTENSIONS OF FUZZY SET CONCEPTS
In the previous section, we introduced basic definitions and operations on fuzzy sets. The membership functions were assumed to be crisp, and the membership space M was restricted to the space of real numbers. Specific operations of set complement, intersection, and union were given in Eqs. (2.24), (2.25), and (2.27), where the negation, min, and max operators, respectively, were adopted. These specific operations are called the standard operations of fuzzy sets and are always used in possibility theory [Zadeh, 1978a]. More on possibility theory will be presented in Chaps. 4 and 5.

The basic concepts covered in Sec. 2.1 can be extended in two possible directions. The first concerns other kinds of fuzzy sets, including different structures of the membership space and different assumptions about the membership functions. This extension is treated in Sec. 2.2.1. The second extension concerns the operations on fuzzy sets. It is understood that the standard operations of fuzzy sets are not the only possible generalization of crisp set operations. Several different classes of functions, which possess proper properties, have been proposed. These will be discussed in Sec. 2.2.2.
2.2.1 Other Kinds of Fuzzy Sets
An extension of ordinary fuzzy sets is to allow the membership values to be a fuzzy set instead of a crisply defined degree. A fuzzy set whose membership function is itself a fuzzy set is called a type-2 fuzzy set. A type-1 fuzzy set is an ordinary fuzzy set. Hence, a type-2 fuzzy set is a fuzzy set whose membership values are type-1 fuzzy sets on [0,1]. A type-2 fuzzy set in a universe of discourse U is characterized by a fuzzy membership function μ_A as [Mizumoto and Tanaka, 1976]

μ_A: U → [0,1]^[0,1],    (2.43)

where μ_A(x) is the fuzzy grade and is a fuzzy set in [0,1] represented by

μ_A(x) = ∫ f(α)/α,  α ∈ [0,1],    (2.44)

where f is a membership function for the fuzzy grade μ_A(x) and is defined as

f: [0,1] → [0,1].    (2.45)
For example, we may define a type-2 fuzzy set "Beautiful" with membership values that are type-1 fuzzy sets such as Below average, Average, Above average, Superior, and so on. We can recursively define a type-m fuzzy set (m > 1) in U whose membership values are type-(m − 1) fuzzy sets on [0,1].

A different extension of the concept of fuzzy sets is to consider a fuzzy set of fuzzy sets of U, that is, a fuzzy set whose elements are fuzzy sets. Such fuzzy sets are called level-2 fuzzy sets. For example, a level-2 fuzzy set is the collection of desired attributes for an electric razor. The elements of this level-2 fuzzy set are ordinary (level-1) fuzzy sets such as Reliable, Inexpensive, Good appearance, and so on. Recursively, a level-k (k > 1) fuzzy set can be defined, where k indicates the depth of nesting. Given a universe of discourse U, let 𝒫(U) denote the set of all fuzzy subsets of U and let 𝒫^k(U) be defined by

𝒫^k(U) = 𝒫(𝒫^{k−1}(U))    (2.46)
for all integers k ≥ 2. Then, a level-k fuzzy set A is defined by

A: 𝒫^{k−1}(U) → [0,1].    (2.47)

2.2.2 Further Operations on Fuzzy Sets
As we have mentioned, the standard operations — that is, the negation, min, and max operations in Eqs. (2.24), (2.25), and (2.27), respectively — are not the only possible generalization of the crisp set complement, intersection, and union operations. This raises a question concerning the requirements, specifications, and properties of other functions that can be viewed as a generalization of the crisp set operations. We shall first discuss several different classes of functions for each of the above three standard set operators. These functions will possess appropriate properties. For each operation, the corresponding functions can be divided into two categories. One is nonparametric functions such as Eqs. (2.24), (2.25), and (2.27); the other is parametric functions, in which parameters are used to adjust the "strength" of the corresponding operations. Based on these functions, we will introduce other kinds of fuzzy set operations.

Let us consider the fuzzy complement first. A complement of a fuzzy set A, denoted as Ā, is specified by a function

c: [0,1] → [0,1]    (2.48)

such that

μ_Ā(x) = c(μ_A(x)),    (2.49)

where the function c(·) satisfies the following conditions:

c1. Boundary conditions: c(0) = 1 and c(1) = 0.
c2. Monotonic property: For any x_1, x_2 ∈ U, if μ_A(x_1) ≤ μ_A(x_2), then c(μ_A(x_1)) ≥ c(μ_A(x_2)); that is, c(·) is monotonic nonincreasing.
c3. Continuity: c(·) is a continuous function.
c4. Involution: c(·) is involutive, which means that c(c(μ_A(x))) = μ_A(x), ∀x ∈ U.
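Conditions c1, c2, and c4 can be checked numerically on a grid of membership grades. The sketch below verifies them for the standard complement c(a) = 1 − a; the parametric form shown alongside is an assumed Sugeno-style complement, included only to illustrate how a parameter can adjust an operation's "strength" (it is not defined in the text above):

```python
def standard_c(a):
    """Standard fuzzy complement, Eq. (2.24)."""
    return 1.0 - a

def sugeno_c(a, lam=2.0):
    """Assumed Sugeno-class parametric complement, (1 - a) / (1 + lambda * a), lambda > -1."""
    return (1.0 - a) / (1.0 + lam * a)

def satisfies_axioms(c, tol=1e-9):
    """Check boundary (c1), monotonicity (c2), and involution (c4) on a sample grid."""
    samples = [i / 100 for i in range(101)]
    ok = abs(c(0.0) - 1.0) < tol and abs(c(1.0)) < tol               # c1
    ok &= all(c(a) >= c(b) - tol for a, b in zip(samples, samples[1:]))  # c2
    ok &= all(abs(c(c(a)) - a) < tol for a in samples)               # c4
    return ok

print(satisfies_axioms(standard_c))  # True
print(satisfies_axioms(sugeno_c))    # True
```

Continuity (c3) is not checked here; on a finite grid one can only observe the absence of large jumps, not continuity itself.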