Neural Networks by Hajek

Table of contents
1 Introduction.............................................................................................................4
1.1 What is a neural network?..................................................................................4
1.2 Benefits of neural networks ...............................................................................6
1.3 Recommended Literature...................................................................................7
1.3.1 Books .....................................................................................................7
1.3.2 Web sites................................................................................................8
1.4 Brief history .......................................................................................................9
1.5 Models of a neuron ............................................................................................9
1.6 Types of activation functions...........................................................................10
1.6.1 Threshold activation function (McCulloch–Pitts model) ....................10
1.6.2 Piecewise-linear activation function ....................................................11
1.6.3 Sigmoid (logistic) activation function .................................................11
1.6.4 Hyperbolic tangent function ................................................................12
1.6.5 Softmax activation function.................................................................13
1.7 Multilayer feedforward network ......................................................................13
1.8 Problems ..........................................................................................................14
2 Learning process...............................................................................16
2.1 Error-correction learning .................................................................................17
2.2 Hebbian learning..............................................................................................18
2.3 Supervised learning..........................................................................................19
2.4 Unsupervised learning .....................................................................................19
2.5 Learning tasks ..................................................................................................20
2.5.1 Pattern recognition...............................................................................20
2.5.2 Function approximation .......................................................................21
2.6 Problems ..........................................................................................................22
3 Perceptron .................................................................................................23
3.1 Batch learning ..................................................................................................26
3.2 Sample-by-sample learning .............................................................................26
3.3 Momentum learning.........................................................................................27
3.4 Simulation results.............................................................................................27
3.4.1 Batch training.......................................................................................27
3.4.2 Sample-by-sample training ..................................................................29
3.5 Perceptron networks.........................................................................................30
3.6 Problems ..........................................................................................................32
4 Back-propagation networks.............................................................................................33
4.1 Forward pass ....................................................................................................34
4.2 Back-propagation.............................................................................................35
4.2.1 Update of weight matrix w2 .................................................................36
4.2.2 Update of weight matrix w1 .................................................................37
4.3 Two passes of computation..............................................................................39
4.4 Stopping criteria...............................................................................................39
4.5 Momentum learning.........................................................................................40
4.6 Rate of learning................................................................................................41
4.7 Pattern and batch modes of training.................................................................42
4.8 Weight initialization.........................................................................................42
4.9 Generalization..................................................................................................43
4.10 Training set size ...............................................................................................45
4.11 Network size ....................................................................................................45
4.11.1 Complexity regularization .................................................................46
4.12 Training, testing, and validation sets ...............................................................47
4.13 Approximation of functions.............................................................................47
4.14 Examples..........................................................................................................48
4.14.1 Classification problem – interlocked spirals......................................48
4.14.2 Classification statistics.......................................................................51
4.14.3 Classification problem – overlapping classes ....................................52
4.14.4 Function approximation .....................................................................56
4.15 Problems ..........................................................................................................57
5 The Hopfield network.......................................................................................................61
5.1 The storage of patterns.....................................................................................62
5.1.1 Example ...............................................................................................63
5.2 The retrieval of patterns ...................................................................................63
5.3 Summary of the Hopfield model......................................................................64
5.4 Energy function................................................................................................65
5.5 Spurious states .................................................................................................66
5.6 Computer experiment.......................................................................................68
5.7 Combinatorial optimization problem...............................................................70
5.7.1 Energy function for TSP ......................................................................71
5.7.2 Weight matrix ......................................................................................72
5.7.3 An analog Hopfield network for TSP ..................................................74
5.8 Problems ..........................................................................................................75
6 Self-organizing feature maps ...........................................................................................76
6.1 Activation bubbles ...........................................................................................76
6.2 Self-organizing feature-map algorithm............................................................78
6.2.1 Adaptive Process..................................................................................79
6.2.2 Summary of the SOFM algorithm .......................................................80
6.2.3 Selection of Parameters........................................................................80
6.2.4 Reformulation of the topological neighbourhood................................81
6.3 Examples..........................................................................................................82
6.3.1 Classification problem – overlapping classes ......................................82
6.4 Problems ..........................................................................................................84
7 Temporal processing with neural networks ...................................................................86
7.1 Spatio-temporal model of a neuron..................................................................87
7.2 Finite duration impulse response (FIR) model ................................................88
7.3 FIR back-propagation network ........................................................................90
7.3.1 Modelling time series...........................................................................93
7.4 Real-time recurrent network ............................................................................95
7.4.1 Real-time temporal supervised learning algorithm..............................96
8 Radial-basis function networks .......................................................................................99
8.1 Basic RBFN ...................................................................................................100
8.1.1 Fixed centre at each training sample..................................................100
8.1.2 Example of function approximation – large RBFN...........................101
8.1.3 Fixed centres selected at random .......................................................102
8.1.4 Example of function approximation – small RBFN ..........................102
8.1.5 Example of function approximation – noisy data ..............................103
8.2 Generalized RBFN.........................................................................................104
8.2.1 Self-organized selection of centres ....................................................104
8.2.2 Example – noisy data, self-organized centres....................................105
8.2.3 Supervised selection of centres..........................................................106
9 Adaline (Adaptive Linear System) ................................................................................108
9.1 Linear regression............................................................................................108
9.2 Linear processing element .............................................................................109
9.3 Gradient method.............................................................................................109
9.3.1 Batch and sample-by-sample learning ...............................................110
9.4 Optimal hyperplane for linearly separable patterns (Adatron) ......................111
9.4.1 Separation boundary generated by a NeuroSolutions SVM ..............113