
Code No: R05320505 Set No. 1
III B.Tech II Semester Regular Examinations, Apr/May 2009
NEURAL NETWORKS
(Computer Science & Engineering)
Time: 3 hours Max Marks: 80
Answer any FIVE Questions
All Questions carry equal marks
⋆⋆⋆⋆⋆

1. (a) Explain the concept of similarity between patterns.


(b) What is a pattern? Explain. [4+12]

2. (a) Compare Error-correction learning and competitive learning.


(b) Compare Hebbian learning and competitive learning. [8+8]

3. (a) Explain the linear least-squares filter in detail.


(b) Explain the least-mean-square algorithm. [8+8]

4. “Hidden neurons play a critical role in the operation of a multilayer perceptron with back-propagation learning.” Explain. [16]

5. (a) Explain the Hessian matrix.


(b) Discuss a few tasks that can be performed by the back-propagation algorithm.
[8+8]

6. (a) Summarize the self-organizing map algorithm.


(b) Illustrate the behavior of the self-organizing map algorithm by using computer simulations to study a network of 100 neurons arranged in the form of a two-dimensional lattice with 10 rows and 10 columns. [8+8]
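
A minimal sketch in Python/NumPy of the kind of simulation part (b) asks for, with 100 neurons on a 10-by-10 lattice; the two-dimensional uniform input distribution, the learning-rate schedule, and the Gaussian neighbourhood schedule are illustrative assumptions, not prescribed by the question.

import numpy as np

rng = np.random.default_rng(0)

rows, cols, dim = 10, 10, 2               # 10 x 10 lattice, 2-D inputs (assumed)
weights = rng.random((rows, cols, dim))   # random initial synaptic weights

# Lattice coordinates of every neuron, used by the neighbourhood function.
grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij"), axis=-1)

n_steps = 10000
eta0, sigma0 = 0.1, 5.0                   # initial learning rate and neighbourhood width (assumed)
tau = n_steps / np.log(sigma0)            # time constant for shrinking the neighbourhood

for t in range(n_steps):
    x = rng.random(dim)                              # sample drawn uniformly from the unit square
    dists = np.linalg.norm(weights - x, axis=-1)     # distance from x to every weight vector
    winner = np.unravel_index(np.argmin(dists), (rows, cols))   # best-matching neuron

    sigma = sigma0 * np.exp(-t / tau)                # neighbourhood width at step t
    eta = eta0 * np.exp(-t / n_steps)                # learning rate at step t
    lattice_dist2 = np.sum((grid - np.array(winner)) ** 2, axis=-1)
    h = np.exp(-lattice_dist2 / (2.0 * sigma ** 2))  # Gaussian neighbourhood function

    # Cooperative update: the winner and its lattice neighbours move toward x.
    weights += eta * h[..., None] * (x - weights)

# After training, the weight vectors should form an ordered map of the input square.
print(weights.reshape(-1, dim))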

7. Explain the design procedure of a neuro-controller for a dynamical system with a case study. [16]

8. (a) A Hopfield network made up of 5 neurons is required to store the following three fundamental memories:
ξ1 = [+1, +1, +1, +1, +1]^T
ξ2 = [+1, -1, -1, +1, -1]^T
ξ3 = [-1, +1, -1, +1, +1]^T
Evaluate the 5-by-5 synaptic weight matrix of the network.
(b) Contrast and compare a recurrent network configuration with a feedforward network. [8+8]
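
A minimal sketch in Python/NumPy of how part (a) could be evaluated, assuming the standard outer-product (Hebbian) storage rule W = (1/N) Σ ξ_μ ξ_μ^T with the self-connections (diagonal entries) set to zero.

import numpy as np

# The three fundamental memories from part (a), one per row.
memories = np.array([
    [+1, +1, +1, +1, +1],
    [+1, -1, -1, +1, -1],
    [-1, +1, -1, +1, +1],
], dtype=float)

M, N = memories.shape        # M = 3 stored patterns, N = 5 neurons

# Outer-product (Hebbian) storage rule with zero self-connections:
# W = (1/N) * sum over patterns of xi xi^T - (M/N) * I
W = (memories.T @ memories) / N - (M / N) * np.eye(N)

print(W)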

⋆⋆⋆⋆⋆

1 of 1
Code No: R05320505 Set No. 2
III B.Tech II Semester Regular Examinations, Apr/May 2009
NEURAL NETWORKS
(Computer Science & Engineering)
Time: 3 hours Max Marks: 80
Answer any FIVE Questions
All Questions carry equal marks
⋆⋆⋆⋆⋆

1. (a) Explain the tapped-delay-line model.


(b) Explain the lattice filter model. [8+8]

2. (a) Explain Boltzmann learning in detail.


(b) Explain competitive learning in detail. [8+8]

3. Explain the following:

(a) Independence theory


(b) Learning rate. [8+8]

4. Explain the following

(a) Solution for XOR problem


(b) Sequential mode of training for back propagation. [8+8]

5. What is a multilayer network? Explain the two-layer back-propagation network. [16]

6. (a) Illustrate the behavior of the self-organizing map algorithm by using computer simulations to study a network with neurons arranged in the form of a one-dimensional lattice.
(b) What are the applications of the self-organizing map in image processing and pattern recognition? Explain. [8+8]

7. Explain the stability of equilibrium states of an autonomous dynamical system. [16]

8. What is a gradient-type Hopfield network? Differentiate between the discrete-time Hopfield network and the gradient-type Hopfield network. [16]

⋆⋆⋆⋆⋆

1 of 1
Code No: R05320505 Set No. 3
III B.Tech II Semester Regular Examinations, Apr/May 2009
NEURAL NETWORKS
(Computer Science & Engineering)
Time: 3 hours Max Marks: 80
Answer any FIVE Questions
All Questions carry equal marks
⋆⋆⋆⋆⋆

1. Explain the following:

(a) An acyclic network with a single layer of neurons
(b) An acyclic network with one hidden layer and one output layer. [8+8]

2. Write about associative memory models. [16]

3. Consider the cost function
ξ(w) = σ²/2 − r_xd^T w + (1/2) w^T R_x w
where σ² is a constant,
r_xd = [0.8182, 0.354]^T and
R_x = [[1, 0.8182], [0.8182, 1]].

(a) Find the optimum value w* for which ξ(w) reaches its minimum value.
(b) Use the method of steepest descent to compute w* for a learning-rate parameter of 0.3. [16]
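
A minimal sketch in Python/NumPy covering both parts: the quadratic cost is minimized in closed form at w* = R_x⁻¹ r_xd for part (a), and part (b) iterates w(n+1) = w(n) − η (R_x w(n) − r_xd); the zero starting point and the stopping rule are illustrative assumptions.

import numpy as np

r_xd = np.array([0.8182, 0.354])
R_x = np.array([[1.0, 0.8182],
                [0.8182, 1.0]])

# (a) The cost function is quadratic, so its gradient R_x w - r_xd
#     vanishes at the closed-form minimizer w* = R_x^-1 r_xd.
w_star = np.linalg.solve(R_x, r_xd)
print("closed-form w* =", w_star)

# (b) Steepest descent: w(n+1) = w(n) - eta * (R_x w(n) - r_xd).
eta = 0.3                    # learning-rate parameter from the question
w = np.zeros(2)              # starting point (assumed)
for n in range(500):         # iteration budget (assumed)
    gradient = R_x @ w - r_xd
    w = w - eta * gradient
    if np.linalg.norm(gradient) < 1e-10:
        break

print("steepest-descent w =", w)
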
4. Investigate the use of back-propagation learning with a sigmoidal nonlinearity to achieve one-to-one mapping, as described below:
f(x) = log10 x, 1 ≤ x ≤ 10
Do the following:

(a) Set up two sets of data, one for network training and the other for testing.
(b) Use the training data set to compute the synaptic weights of the network, assumed to have a single hidden layer. [8+8]
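
A minimal sketch in Python/NumPy of part (a), drawing separate training and test samples of f(x) = log10 x on 1 ≤ x ≤ 10; the sample sizes are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(1)

# (a) Two independent data sets for the mapping f(x) = log10(x), 1 <= x <= 10.
x_train = rng.uniform(1.0, 10.0, size=200)   # training inputs (size assumed)
x_test = rng.uniform(1.0, 10.0, size=50)     # test inputs (size assumed)
d_train = np.log10(x_train)                  # desired responses for training
d_test = np.log10(x_test)                    # desired responses for testing

print(x_train[:3], d_train[:3])
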
5. What is a multilayer network? Explain the two-layer back-propagation network. [16]

6. (a) Write about the Willshaw-von der Malsburg model of the self-organized feature map.
(b) Write short notes on parameter specifications for computer simulations of the self-organizing map algorithm. [8+8]

7. What is meant by the stability of equilibrium states? [16]

8. Explain the relation between the stable states of the discrete and continuous versions of the Hopfield model. [16]

⋆⋆⋆⋆⋆

1 of 1
Code No: R05320505 Set No. 4
III B.Tech II Semester Regular Examinations, Apr/May 2009
NEURAL NETWORKS
(Computer Science & Engineering)
Time: 3 hours Max Marks: 80
Answer any FIVE Questions
All Questions carry equal marks
⋆⋆⋆⋆⋆

1. Draw and explain in detail the single-loop feedback system. [16]

2. Explain the geometric interpretation of the competitive learning process, with an example. [10+6]

3. Consider the cost function
ξ(w) = σ²/2 − r_xd^T w + (1/2) w^T R_x w
where σ² is a constant,
r_xd = [0.8182, 0.354]^T and
R_x = [[1, 0.8182], [0.8182, 1]].

(a) Find the optimum value w* for which ξ(w) reaches its minimum value.
(b) Use the method of steepest descent to compute w* for a learning-rate parameter of 0.3. [16]

4. Explain in detail the forward pass and backward pass of back-propagation. [16]

5. (a) Explain the multilayer perceptron with two hidden layers and one output neuron.
(b) Implement a back-propagation network to simulate the Exclusive-OR function given by the truth table below. [8+8]

Input1  Input2  Output
  1       1       0
  0       1       1
  1       0       1
  0       0       0
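
A minimal sketch in Python/NumPy of part (b): a single-hidden-layer network of sigmoidal units trained by back-propagation on the truth table above; the hidden-layer size, learning rate, and epoch budget are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(2)

# Exclusive-OR truth table from the question.
X = np.array([[1, 1], [0, 1], [1, 0], [0, 0]], dtype=float)
D = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer with 3 sigmoidal neurons (assumed) and one sigmoidal output neuron.
W1 = rng.normal(size=(2, 3)); b1 = np.zeros((1, 3))
W2 = rng.normal(size=(3, 1)); b2 = np.zeros((1, 1))
eta = 0.5                          # learning rate (assumed)

for epoch in range(20000):         # epoch budget (assumed)
    # Forward pass.
    h = sigmoid(X @ W1 + b1)       # hidden-layer activations
    y = sigmoid(h @ W2 + b2)       # network outputs

    # Backward pass: local gradients (deltas) for sigmoidal units.
    delta_out = (y - D) * y * (1.0 - y)
    delta_hid = (delta_out @ W2.T) * h * (1.0 - h)

    # Gradient-descent updates of synaptic weights and biases.
    W2 -= eta * h.T @ delta_out
    b2 -= eta * delta_out.sum(axis=0, keepdims=True)
    W1 -= eta * X.T @ delta_hid
    b1 -= eta * delta_hid.sum(axis=0, keepdims=True)

print(np.round(y, 3))              # should be close to [0, 1, 1, 0]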
6. Mention some of the feature mapping capabilities of neural networks and explain
any two of them in detail. [16]

7. What is meant by the stability of equilibrium states? [16]

8. (a) What is the Hopfield network? Explain.


(b) Describe how the Hopfield network can be used for analog-to-digital conversion. [4+12]

⋆⋆⋆⋆⋆

1 of 1
