RKJHA
Lecture-01
Neural Network:
Work on artificial neural networks is commonly referred to as neural networks.
Soft Computing
1. Knowledge is acquired by the network from its environment through a learning process.
2. Inter-neuron connection strengths, known as synaptic weights, are used to store the acquired knowledge.
The procedure used to carry out the learning process is called a learning algorithm; its function is to modify the synaptic weights of the network in an orderly fashion so as to attain a desired design objective.
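The idea of a learning algorithm that modifies synaptic weights in an orderly fashion can be sketched with the classic Hebbian update rule. This is an illustrative sketch only; the rule, names, and learning rate below are assumptions, not taken from these notes.

```python
# Minimal sketch of a learning algorithm: Hebbian weight update.
# w_new = w_old + rate * x * y  -- strengthen a synaptic weight when
# its input and the neuron's output are active together.
# The learning rate and the example values are illustrative assumptions.

def hebbian_update(weights, inputs, output, rate=0.1):
    """Return new synaptic weights after one Hebbian learning step."""
    return [w + rate * x * output for w, x in zip(weights, inputs)]

weights = [0.5, -0.2, 0.1]
inputs = [1.0, 0.0, 1.0]
output = 1.0  # the neuron fired

weights = hebbian_update(weights, inputs, output)
print(weights)  # weights on the active inputs are strengthened
```

Note that knowledge ends up stored only in the weights, matching point 2 above: nothing else in the network changes during learning.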
Benefits of Neural Networks:
A neural network derives its computing power from: 1. its massively parallel distributed structure, and 2. its ability to learn and therefore generalize.
Generalization means that the neural network can produce reasonable outputs for inputs not encountered during training (learning). **A neural network cannot provide a solution by working in isolation; rather, it has to be integrated into a system engineering process.
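Generalization can be illustrated with a toy experiment (an assumed setup, not from the notes): a perceptron is trained on the four corner points of logical AND and then queried on noisy inputs it never saw during training.

```python
# Illustrative sketch of generalization: train on exact Boolean inputs,
# then test on nearby real-valued inputs not seen during training.
# The task, data, and learning rate are assumptions for demonstration.

def predict(w, b, x):
    """Threshold unit: fire if the weighted sum plus bias is positive."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

def train_perceptron(samples, epochs=20, rate=1.0):
    """Classic perceptron error-correction training."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, target in samples:
            error = target - predict(w, b, x)
            w = [wi + rate * error * xi for wi, xi in zip(w, x)]
            b += rate * error
    return w, b

samples = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w, b = train_perceptron(samples)

# Inputs not encountered during training still get reasonable outputs:
print(predict(w, b, [0.9, 0.9]))  # near (1, 1) -> 1
print(predict(w, b, [0.1, 0.9]))  # near (0, 1) -> 0
```

The trained decision boundary covers the whole input plane, so the network responds sensibly to inputs between the training points.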
Lecture-02
Neural networks offer the following useful properties:
1. Nonlinearity: A neuron can be linear as well as nonlinear.
2. Input-Output Mapping: A neural network can be trained using sample data or task examples. Each example consists of a unique input signal and a corresponding desired response. The network is trained by adjusting the weights to minimize the difference between the desired output and the actual output.
3. Adaptivity: Neural networks have a built-in capability to adapt their synaptic weights to changes in the surrounding environment. In particular, a neural network trained to operate in a specific environment can easily be retrained to deal with minor changes in the operating environmental conditions. Moreover, if a neural network is meant to operate in a nonstationary environment, it can be designed to change its synaptic weights in real time. This makes it a useful tool in adaptive pattern classification, adaptive signal processing, and adaptive control. **To realize the full benefit of adaptivity, the principal time constants of the system should be long enough for the system to ignore spurious disturbances, yet short enough to respond to meaningful changes in the environment.
4. Evidential Response: In the context of pattern classification, a neural network can be designed to provide information not only about which pattern to select but also about the confidence in the decision made. The latter information can be used to reject ambiguous patterns.
5. Contextual Information: Knowledge is represented by the very structure and activation state of a neural network. Every neuron in the network is potentially affected by the global activity of all other neurons in the network.
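The input-output mapping property above — adjusting weights to minimize the difference between desired and actual outputs — can be sketched with the Widrow-Hoff (LMS) rule on a single linear neuron. The training data, target mapping, and learning rate below are illustrative assumptions.

```python
# Sketch of input-output mapping by error-correction training
# (Widrow-Hoff / LMS rule) on a single linear neuron.

def train_lms(samples, n_inputs, rate=0.1, epochs=100):
    """Adjust weights to minimize the desired-minus-actual output error."""
    w = [0.0] * n_inputs
    for _ in range(epochs):
        for x, desired in samples:
            actual = sum(wi * xi for wi, xi in zip(w, x))
            error = desired - actual
            # Move each weight in proportion to the error and its input.
            w = [wi + rate * error * xi for wi, xi in zip(w, x)]
    return w

# Teach the neuron the mapping y = 2*x1 - x2 from task examples.
samples = [([1.0, 0.0], 2.0), ([0.0, 1.0], -1.0), ([1.0, 1.0], 1.0)]
w = train_lms(samples, n_inputs=2)
print(w)  # approaches [2.0, -1.0]
```

Each example pairs a unique input signal with its desired response, exactly as described in point 2; nothing but the weights is changed by training.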
6. Fault Tolerance: A neural network, implemented in hardware form, has the potential to be inherently fault tolerant, or capable of robust computation, in the sense that its performance degrades gradually under adverse operating conditions. Thus, in principle, a neural network exhibits a graceful degradation in performance rather than catastrophic failure.
7. VLSI Implementability: The massively parallel nature of a neural network makes it potentially fast for the computation of certain tasks. This same feature makes a neural network well suited for implementation using very-large-scale-integration (VLSI) technology.
8. Uniformity of Analysis and Design: Neural networks enjoy universality as information processors, i.e., the same notation is used in all domains involving the application of neural networks.
9. Neurobiological Analogy: The design of a neural network is motivated by analogy with the brain, which is living proof that fault-tolerant parallel processing is not only physically possible but also fast and powerful.
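Graceful degradation can be made concrete with a toy illustration (an assumed setup, not a hardware model): when a quantity is represented redundantly across many units, knocking out a single unit perturbs the result only slightly instead of causing catastrophic failure.

```python
# Toy illustration of graceful degradation: a value represented as the
# mean of many redundant unit outputs. The unit count and values are
# illustrative assumptions.

units = [1.0] * 20  # 20 units, each contributing the value 1.0

def readout(active):
    """Population readout: average over the full population size."""
    return sum(active) / len(units)

healthy = readout(units)                # all units working
damaged = readout(units[:-1] + [0.0])   # one faulty unit stuck at 0
print(healthy, damaged)  # 1.0 vs 0.95 -- only a 5% degradation
```

With a single stored weight instead of a distributed representation, the same fault would wipe out the output entirely; distribution is what buys the gradual degradation described above.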
Lecture-03

Neural Networks:
The brain contains about 10^10 basic units called neurons. A neuron is a small cell that receives electro-chemical signals from its various sources and in turn responds by transmitting electrical impulses to other neurons. Some neurons perform input operations and are referred to as afferent cells; some perform output operations and are referred to as efferent cells; the remaining neurons form part of an interconnected network responsible for signal transformation and the storage of information.
Structure of a Neuron: [Figure: structure of a biological neuron]
Dendrites: Behave as input channels, i.e. all inputs from other neurons arrive
through the dendrites.
Axon: Is electrically active and serves as an output channel. Axons are nonlinear threshold devices which produce a voltage pulse called an action potential. If the cumulative inputs received by the soma raise the internal electric potential of the cell, known as the membrane potential, above a threshold, then the neuron fires by propagating the action potential down the axon to excite or inhibit other neurons.
Synapse or Synaptic Junction:
The axon terminates in a specialized contact called a synapse or synaptic junction that connects the axon to the dendritic links of other neurons.
This synaptic junction, which is a very minute gap at the end of the dendritic link, contains a neurotransmitter fluid. The size of the synaptic junction, or synapse, is believed to be related to learning: synapses with a large area are thought to be excitatory, while those with a small area are believed to be inhibitory.
Model of an Artificial Neuron: The human brain is a highly interconnected network of simple processing elements called neurons. The behavior of a neuron can be captured by a simple model termed an artificial neuron.
In artificial neurons, the acceleration and retardation of input signals are modeled by weights: an efficient synapse, which transmits a stronger signal, has a correspondingly larger weight.
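The artificial neuron described above — weighted inputs summed and compared to a threshold, mirroring the membrane potential and firing of the biological neuron — can be sketched as follows. The particular weights, inputs, and threshold are illustrative assumptions.

```python
# Sketch of a threshold-type artificial neuron: weighted sum of inputs
# compared against a firing threshold (the artificial analogue of the
# membrane potential exceeding its threshold).

def artificial_neuron(inputs, weights, threshold):
    """Fire (output 1) if the weighted input sum exceeds the threshold."""
    activation = sum(w * x for w, x in zip(weights, inputs))
    return 1 if activation > threshold else 0

inputs = [1.0, 1.0, 0.0]
weights = [0.9, 0.2, 0.4]  # the first synapse is the most efficient
print(artificial_neuron(inputs, weights, threshold=1.0))  # 0.9 + 0.2 > 1.0, so it fires: 1
```

As in the biological description, a large weight plays the role of an excitatory synapse with a large area: it dominates the cumulative input and largely decides whether the neuron fires.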