
A Weightless Neural Node based on a Probabilistic Quantum Memory

Adenilton Silva*, Wilson de Oliveira† and Teresa Ludermir*

* Universidade Federal de Pernambuco
Centro de Informática
Recife, PE - Brazil
Email: {ajs3, tbl}@cin.ufpe.br

† Universidade Federal Rural de Pernambuco
Departamento de Estatística e Informática
Recife, PE - Brazil
Email: wrdo@gmail.com
Abstract: The success of quantum computation is most commonly associated with the speed-up of classical algorithms, such as Shor's factoring algorithm and Grover's search algorithm. But it should also be related with exponential storage capacity, such as in superdense coding. In this work we use a probabilistic quantum memory proposed by Trugenberger, where one can store 2^n patterns with only n quantum bits (qubits). We define a new model of a quantum weightless neural node with this memory, in a similar fashion as the classical Random Access Memory (RAM) node is used in classical weightless neural networks. Some advantages of the proposed model are that the memory of the node does not grow exponentially with the number of inputs and that the node can generalise.
Keywords: Neural Computing; Quantum Computing; RAM Based Neural Networks; Weightless Neural Networks; Probabilistic Quantum Memories
I. INTRODUCTION
The main aim of this work is to propose a novel quantum weightless neural node and to investigate its properties, individually and collectively. This new model can be seen as a quantisation of the classical RAM node. As such, it will be shown that it overcomes some of the limitations of its classical sibling. The study of quantum weightless neural networks was started by Oliveira et al. in [1], where it is shown that, in contrast to the MCP node [2], the probabilistic logic node PLN [3] and the multi-valued probabilistic logic node MPLN [3] can be easily quantised and implemented on a quantum computer. The structure of the node used in [1], [4] differs substantially from the one used here. In op. cit. the nodes are special kinds of matrices, whereas here we use a state or quantum register. Their matrices grow exponentially with the input size while our registers grow linearly.
The first model of a (classical) weightless neuron, denominated the RAM node, was proposed by Igor Aleksander [5] in 1966. Several variations of this model were afterwards proposed, such as the PLN, the MPLN and the goal seeking neuron GSN [6]. A serious drawback of RAM-based nodes is that their memory size grows exponentially with the number of inputs. In this paper we propose a new model of quantum weightless neural networks based on a probabilistic quantum memory (PQM) [7], in the same way that the RAM node is based on the RAM memory.
In contrast to the other classical and quantum RAM nodes, the memory of the PQM node does not grow exponentially with the number of inputs. The number of inputs in RAM-based nodes must be limited due to this exponential growth. This fact is reflected in the kinds of architecture used in networks of RAM-based nodes; in several works they are usually only partially connected. With the PQM node any kind of architecture can now be used.
Another point in favour of the PQM node is that a single PQM node is able to generalise while a single RAM node cannot: only a combination of RAM nodes in a network generalises. One PQM node can generalise in a probabilistic way.
The PQM node also has advantages over the PQM memory itself. In [8] Trugenberger uses the PQM as an associative memory, but the memory needs to be cloned, and it is known that a quantum state can be probabilistically cloned by a general unitary-reduction operation if and only if the patterns stored are linearly independent [9]. This is a strong limitation for the use of the PQM in machine learning. We shall see that with a small modification of the original algorithm employed in [8] the PQM node does not suffer from this limitation.
The remainder of this paper is divided into seven sections. Sections 2, 3 and 4 present the basic definitions of weightless neural networks, quantum computation and quantum neural networks. In Section 5 we describe the probabilistic quantum memory, in Section 6 the quantum weightless neural node is proposed, and Section 7 is the conclusion.
II. WEIGHTLESS NEURAL NETWORK
In contrast to biologically motivated nodes, RAM nodes were designed as an engineering tool to solve pattern recognition problems efficiently.
An n-input RAM node has 2^n memory locations, addressed by the n-bit string a = (a_1 a_2 … a_n), a_i ∈ {0, 1}. A binary signal x = (x_1 x_2 … x_n), x_i ∈ {0, 1}, on the input lines will access only one of these locations, resulting in y = C[x] [10]. In Figure 1 below, s and d are respectively the learning strategy and the desired output to be learned.
2010 Eleventh Brazilian Symposium on Neural Networks. 978-0-7695-4210-2/10 $26.00 © 2010 IEEE. DOI 10.1109/SBRN.2010.52

Learning in a RAM node takes place simply by writing into the corresponding look-up table entries. This learning process is much simpler than the adjustment of weights. The RAM node, as defined above, can compute all binary functions of its input, while weighted nodes can only compute linearly separable functions.
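The addressing and training scheme just described can be sketched in a few lines of Python. This is an illustrative sketch only; the class and method names are ours, not from the paper:

```python
class RAMNode:
    """Classical n-input RAM node: a look-up table with 2^n locations."""

    def __init__(self, n):
        self.n = n
        self.memory = [0] * (2 ** n)  # contents C[0] ... C[2^n - 1]

    def address(self, x):
        # The binary input tuple x is interpreted directly as a memory address.
        return int("".join(map(str, x)), 2)

    def train(self, x, d):
        # Learning is simply writing the desired output d into the addressed entry.
        self.memory[self.address(x)] = d

    def output(self, x):
        # y = C[x]: the input accesses exactly one location.
        return self.memory[self.address(x)]

node = RAMNode(3)
node.train((1, 0, 1), 1)          # store desired output 1 at address 101
assert node.output((1, 0, 1)) == 1
assert node.output((0, 0, 0)) == 0  # untrained locations keep their initial content
```

Note that the table, and hence the memory cost, doubles with every extra input line, which is exactly the exponential growth the paper sets out to avoid.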
Figure 1. RAM node. The inputs x_1, …, x_n address a look-up table with contents C[0], C[1], …, C[2^n - 1]; y is the node output, s the learning strategy and d the desired output.
A Probabilistic Logic Node (PLN) differs from a RAM node in that a 2-bit number (rather than a single bit) is now stored at the addressed memory location. The content of this location is turned into the probability of firing (i.e., generating 1) at the overall output of the node. In other words, a PLN consists of a RAM node where a 2-bit number is stored at the addressed memory location (0, 1 or u, respectively represented as, say, 00, 11 and 01 (or 10)), together with a probabilistic output generator. The output of the PLN node is given by [10]:

$$y = \begin{cases} 0, & \text{if } C[x] = 0 \\ 1, & \text{if } C[x] = 1 \\ \text{random}(0, 1), & \text{if } C[x] = u \end{cases}$$
The Multi-Valued Probabilistic Logic Node (MPLN) differs from the PLN by allowing a wider, but still discrete, range of probabilities to be stored at each memory location.
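The PLN output rule above is easy to state in code. A minimal sketch, assuming the three possible memory contents are encoded as `0`, `1` and the string `'u'`:

```python
import random

def pln_output(content):
    """Output generator of a PLN node, following the rule y = 0 / 1 / random(0,1)."""
    if content == 0:
        return 0
    if content == 1:
        return 1
    # 'u' (undefined): fire with probability 1/2.
    return random.randint(0, 1)

assert pln_output(0) == 0
assert pln_output(1) == 1
assert pln_output('u') in (0, 1)   # probabilistic branch
```

An MPLN would replace the three-valued content with a finer discrete probability (e.g. k/m for a stored integer k), but the output generator has the same shape.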
III. QUANTUM COMPUTATION
Quantum computing [11] was originally proposed by Richard Feynman [12] in the 1980s and had its formalisation with David Deutsch, who proposed the quantum Turing machine [13]. Quantum computing has been popularised through the quantum circuit model [14], which is a mathematical quantisation [15] of the classical boolean circuit model of computation. Quantum computers, if ever built, can also be seen as parallel devices for potentially improving the computational efficiency of neural networks [16].
The quantum information unit is the quantum bit, or qubit. A very intuitive view of the quantisation procedure used almost everywhere is put forward by Nik Weaver in the preface of his book Mathematical Quantization [15], which says in a nutshell: the fundamental idea of mathematical quantisation is that sets are replaced with Hilbert spaces. In order to properly deal with operations we would add: functions (morphisms) are replaced with linear operators in general, and in some cases with unitary operators in particular.
The quantisation of boolean circuit logic starts by simply embedding the classical bits {0, 1} in a convenient Hilbert space. The natural way of doing this is to represent them as an (orthonormal) basis of a complex Hilbert space. In this context these basis elements are called the computational-basis states [17]. Linear combinations (from linear algebra [18]) of the basis span the whole space, whose elements, called states, are said to be in superposition. The general state of a qubit is a superposition (linear combination) of the two computational-basis states: |ψ⟩ = α|0⟩ + β|1⟩, where α, β are complex coefficients (called probability amplitudes) constrained by the normalisation condition |α|² + |β|² = 1, and |0⟩, |1⟩ are a pair of orthonormal basis vectors representing each classical bit, or cbit, as a column vector. For example, the cbits can be represented as

$$|0\rangle = \begin{pmatrix} 1 \\ 0 \end{pmatrix}, \quad |1\rangle = \begin{pmatrix} 0 \\ 1 \end{pmatrix}$$
It should be clear that any pair of orthonormal basis vectors would work, but those above are the canonical ones employed in the majority of works. Thus qubits live in the two-dimensional complex Hilbert space C². Tensor products are used in order to represent more than one qubit, as follows: |i⟩ ⊗ |j⟩ = |i⟩|j⟩ = |ij⟩, where i, j ∈ {0, 1}. The n-qubit states live in the space C^{2^n}. Recall that the tensor product of two two-dimensional vectors

$$|\psi\rangle = \begin{pmatrix} \alpha \\ \beta \end{pmatrix}, \quad |\varphi\rangle = \begin{pmatrix} \gamma \\ \delta \end{pmatrix}$$

is the 4-dimensional vector

$$|\psi\rangle \otimes |\varphi\rangle = \begin{pmatrix} \alpha\gamma \\ \alpha\delta \\ \beta\gamma \\ \beta\delta \end{pmatrix}$$
This obviously generalises to any pair of n- and m-dimensional vectors, producing an nm-dimensional vector; or, more generally, an (n₀, m₀)-dimensional matrix tensored with an (n₁, m₁)-dimensional one produces an (n₀n₁, m₀m₁)-dimensional matrix, and so on. Operations on qubits are carried out by unitary operators, so quantum algorithms on n qubits are represented by unitary operators U over the 2^n-dimensional complex Hilbert space: |ψ⟩ → U|ψ⟩.
In quantum computation, all operations on qubits are reversible, except for the process called measurement, which loses the information about the superposition of states. Measuring a general state |ψ⟩ = α|0⟩ + β|1⟩ collapses (projects) it into either the |0⟩ state or the |1⟩ state, with probabilities |α|² and |β|² respectively.
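These basic facts, amplitudes, measurement probabilities and tensor products, can be checked numerically. A small sketch using numpy (variable names are ours):

```python
import numpy as np

# Computational-basis states |0> and |1> as column vectors in C^2.
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# A general qubit state a|0> + b|1> with |a|^2 + |b|^2 = 1.
a, b = 0.6, 0.8
psi = a * ket0 + b * ket1

# Measurement probabilities are the squared moduli of the amplitudes.
p0 = abs(np.vdot(ket0, psi)) ** 2
p1 = abs(np.vdot(ket1, psi)) ** 2
assert np.isclose(p0, 0.36) and np.isclose(p1, 0.64)

# Two-qubit states live in C^4 via the tensor (Kronecker) product: |0>|1> = |01>.
ket01 = np.kron(ket0, ket1)
assert np.allclose(ket01, [0, 1, 0, 0])
```

The `np.kron` call realises exactly the 4-dimensional tensor-product vector shown above.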
Other widely used operators and their corresponding matrices are [11]:

I, the identity operator, which is particularly useful when combined with others in the case one wants to manipulate a particular qubit in a register, leaving the others intact:

$$I = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}, \quad I|0\rangle = |0\rangle, \quad I|1\rangle = |1\rangle$$

X, the flip operator, which behaves as the classical NOT on the computational basis:

$$X = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}, \quad X|0\rangle = |1\rangle, \quad X|1\rangle = |0\rangle$$

H, the Hadamard transformation, which generates superpositions of states:

$$H = \frac{1}{\sqrt{2}} \begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}, \quad H|0\rangle = \frac{1}{\sqrt{2}}(|0\rangle + |1\rangle), \quad H|1\rangle = \frac{1}{\sqrt{2}}(|0\rangle - |1\rangle)$$

W, the phase operator used in the PQM node:

$$W = \begin{pmatrix} e^{i\pi/2n} & 0 \\ 0 & 1 \end{pmatrix}, \quad W|0\rangle = e^{i\pi/2n}|0\rangle, \quad W|1\rangle = |1\rangle$$
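The four matrices can be written down and their stated actions verified directly. A numpy sketch (the choice n = 8 for the phase of W is an illustrative assumption, matching the 8-qubit example later in the paper):

```python
import numpy as np

n = 8  # number of qubits; sets the phase pi/(2n) of the W gate (our assumption)

I = np.eye(2)
X = np.array([[0, 1], [1, 0]])
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
W = np.array([[np.exp(1j * np.pi / (2 * n)), 0], [0, 1]])

ket0 = np.array([1, 0])
ket1 = np.array([0, 1])

assert np.allclose(X @ ket0, ket1)                       # X flips |0> to |1>
assert np.allclose(H @ ket0, (ket0 + ket1) / np.sqrt(2)) # H creates superposition
assert np.allclose(W @ ket1, ket1)                       # W leaves |1> unchanged
assert np.isclose(W[0, 0], np.exp(1j * np.pi / 16))      # phase on |0> for n = 8
```

All four are unitary, so each is a legal quantum operation.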
Quantum operators may also be represented in quantum circuits by corresponding quantum gates. Figure 2 shows an n-qubit controlled gate U whose action on the target qubit (bottommost) is activated or not by the n - 1 (topmost) control qubits [11]. The output is checked by measurement gates.

Figure 2. A quantum circuit
IV. QUANTUM NEURAL NETWORKS
The concept of quantum neural computation was first introduced by Kak in 1995, creating a new paradigm combining neural networks and quantum computation which opens several new directions in neural network research [19]. It is expected that quantum neural networks are more efficient than classical neural networks, parallel to what is expected from quantum computation in relation to classical computation.

In that same year, Menneer and Narayanan proposed a quantum-inspired neural network that applied the multiple-universes view from quantum theory to one-layer artificial neural networks [20]. In 1998 Ventura and Martinez introduced a quantum associative memory with a capacity exponential in the number of neurons [21].
The non-linear activation functions used in models of neural networks delayed further development of their quantum analogues, but in 2001 Altaisky proposed a quantum system in which the rules of perceptron learning were formulated [22]. Several quantum weighted neural network models have been proposed, but there remain the challenges of direct implementation in quantum circuits, natural adaptation of the learning algorithms, and quantum learning algorithms respecting the postulates of quantum mechanics; these characteristics are not altogether found in any of the proposed quantum weighted neural network models. The first truly quantum neural network is the weightless neural network based on quantum RAM in [1].
V. PROBABILISTIC QUANTUM MEMORIES
The probabilistic quantum memory (PQM) was proposed by Trugenberger in 2001 [7]. The advantage of the PQM is its exponential storage capacity. In [8] Trugenberger uses this memory to define a quantum associative memory, taking advantage of this memory capacity.

In the PQM, the information retrieval step seems to contradict the general idea of a memory [23], since after the retrieval step the memory needs to be prepared again. Trugenberger argues that his model is efficient if the number of inputs is polynomial [24]. The trouble with the PQM is the need to measure the memory register in order to recover the data stored in the memory.
The PQM is a state which represents a superposition of the p patterns p^j to be stored, as shown in equation (1). The algorithm to create this state is described in detail in [7]; with this algorithm, up to 2^n patterns of n bits can be stored in n qubits.

$$|m\rangle = \frac{1}{\sqrt{p}} \sum_{j=1}^{p} |p^j\rangle \qquad (1)$$
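Classically simulating a PQM of course forfeits the exponential saving (the state vector has 2^n entries), but equation (1) is easy to build explicitly. A sketch with assumed example patterns:

```python
import numpy as np

n = 3
patterns = ["011", "101"]  # p = 2 stored n-bit patterns (illustrative choice)

# |m> = (1/sqrt(p)) * sum_j |p^j>: a 2^n-dimensional state vector with equal
# amplitude on each stored pattern's basis state and zero elsewhere.
m = np.zeros(2 ** n, dtype=complex)
for pat in patterns:
    m[int(pat, 2)] += 1
m /= np.sqrt(len(patterns))

assert np.isclose(np.linalg.norm(m), 1.0)            # properly normalised
assert np.isclose(abs(m[int("011", 2)]) ** 2, 0.5)   # each pattern carries weight 1/p
assert np.isclose(abs(m[0]) ** 2, 0.0)               # unstored patterns have amplitude 0
```

On a quantum computer the same state is prepared pattern by pattern by the storage algorithm of [7], using only n qubits for the memory register.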
The retrieval algorithm of the PQM requires three registers: the first, i, will receive the input pattern; the second, m, will contain the memory |m⟩; and the third, c, is an auxiliary quantum bit |c⟩ initialised as H|0⟩.

$$|\psi_0\rangle = \frac{1}{\sqrt{2p}} \sum_{k=1}^{p} |i_1, \ldots, i_n; p^k_1, \ldots, p^k_n; 0\rangle + \frac{1}{\sqrt{2p}} \sum_{k=1}^{p} |i_1, \ldots, i_n; p^k_1, \ldots, p^k_n; 1\rangle \qquad (2)$$
After the deterministic part of the retrieval algorithm, the quantum state will be as indicated in equation (3), where d_H(i, p^k) denotes the Hamming distance between i and p^k. A measurement of register |c⟩ says whether the pattern is recognised (|c⟩ = |0⟩) or unrecognised (|c⟩ = |1⟩). This process can be repeated T times. If one gets the value |c⟩ = |0⟩, one classifies i as recognised and measures the memory register to identify it [7].
$$|\psi\rangle = \frac{1}{\sqrt{p}} \sum_{k=1}^{p} \cos\left(\frac{\pi}{2n} d_H(i, p^k)\right) |i_1, \ldots, i_n; p^k_1, \ldots, p^k_n; 0\rangle + \frac{1}{\sqrt{p}} \sum_{k=1}^{p} \sin\left(\frac{\pi}{2n} d_H(i, p^k)\right) |i_1, \ldots, i_n; p^k_1, \ldots, p^k_n; 1\rangle \qquad (3)$$
VI. PQM NODE
The measurement of the memory register in a PQM is a negative characteristic, since the memory collapse makes it lose its future uses and one then needs to prepare its state again. In this section we propose a weightless neural node based on the PQM and without the need to measure the memory register.
In the RAM node, the decision about the number of inputs n₁ has a direct influence on the generalisation ability of the network; however, one of the main disadvantages of the RAM node is that its memory grows exponentially with n₁. In this section we define a quantum node similar to the RAM node, but with an architecture based on the PQM instead of the RAM memory, in such a way that the size of the memory of the PQM node is equal to its number of inputs.
A PQM node is implemented as a PQM memory. As in the retrieval algorithm of the PQM, a PQM node with n inputs has three registers: the first is the register i, with n qubits, in which we feed the input of the node; the second is the memory register m, with n qubits, which holds the memory of the node |m⟩; and the third register is c, with a single qubit, whose measurement determines the output of the node in a probabilistic way. A PQM node with n inputs is thus represented by a quantum state with three registers containing 2n + 1 qubits.
As with the RAM node, the PQM node is used to recognise only one class: a new input pattern will make the node answer positively if the pattern is in the class that the node was trained for, or otherwise negatively, rejecting the pattern.
The learning algorithm of a PQM node is to create a superposition |m⟩ of all the patterns in the training set in the memory register m of the node. This process can be realised using the storage algorithm of the PQM [7]. The cost of training is linear in the number n₂ of patterns in the training set; the computational cost is Θ(n₂).
After feeding a new pattern into the input register of the node, the deterministic part of the retrieval algorithm of the PQM is applied to the state (Fig. 3); at this moment, equation (3) describes the state of the node. A measurement of the register c will determine the output of the node in a probabilistic way, described by equation (4), where d_H(i, p^k) is the Hamming distance between the input pattern and the kth pattern stored in the memory.
$$P(|c\rangle = |0\rangle) = \sum_{k=1}^{p} \frac{1}{p} \cos^2\left(\frac{\pi}{2n} d_H(i, p^k)\right)$$
$$P(|c\rangle = |1\rangle) = \sum_{k=1}^{p} \frac{1}{p} \sin^2\left(\frac{\pi}{2n} d_H(i, p^k)\right) \qquad (4)$$
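Equation (4) can be evaluated numerically without simulating the full quantum state. A short Python sketch (the function names are our own, not from the paper):

```python
import numpy as np

def hamming(a, b):
    """Hamming distance between two equal-length bit strings."""
    return sum(x != y for x, y in zip(a, b))

def pqm_probabilities(i, patterns):
    """Output distribution of the PQM node, following equation (4)."""
    n, p = len(i), len(patterns)
    angles = [np.pi * hamming(i, pk) / (2 * n) for pk in patterns]
    p0 = sum(np.cos(t) ** 2 for t in angles) / p  # P(|c> = |0>): recognised
    p1 = sum(np.sin(t) ** 2 for t in angles) / p  # P(|c> = |1>): not recognised
    return p0, p1

p0, p1 = pqm_probabilities("00000010", ["00000000", "00000001"])
assert np.isclose(p0 + p1, 1.0)
assert p0 > 0.9  # an input close in Hamming distance is very likely recognised
```

This makes the generalisation behaviour explicit: the closer an input is, in Hamming distance, to the stored patterns, the larger the cos² terms and hence the probability of the recognising output |0⟩.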
For instance, if one class is represented in the training set by {(00000000), (00000001)}, then the algorithm proposed in [7] is used to create the state described in equation (5).

$$|m\rangle = \frac{1}{\sqrt{2}} (|00000000\rangle + |00000001\rangle) \qquad (5)$$
Feeding the pattern (11111111) into the input of the network and initialising the register c with the state |c⟩ = (1/√2)(|0⟩ + |1⟩), the state of the node at this moment is described by equation (6).

$$|\psi\rangle = \frac{1}{2} \big[ |11111111\rangle (|00000000\rangle + |00000001\rangle) |0\rangle + |11111111\rangle (|00000000\rangle + |00000001\rangle) |1\rangle \big] \qquad (6)$$
After the action of the node, described in Fig. 3, the state |ψ⟩ will be as described by equation (7).

$$|\psi\rangle = \frac{1}{\sqrt{2}} \Big[ \cos\left(\tfrac{8\pi}{16}\right) |11111111\rangle |00000000\rangle |0\rangle + \cos\left(\tfrac{7\pi}{16}\right) |11111111\rangle |00000001\rangle |0\rangle + \sin\left(\tfrac{8\pi}{16}\right) |11111111\rangle |00000000\rangle |1\rangle + \sin\left(\tfrac{7\pi}{16}\right) |11111111\rangle |00000001\rangle |1\rangle \Big] \qquad (7)$$
A measurement of the register c of the node will result in |0⟩ (the node recognises the pattern) with probability

$$\left(\frac{1}{\sqrt{2}}\cos\left(\tfrac{8\pi}{16}\right)\right)^2 + \left(\frac{1}{\sqrt{2}}\cos\left(\tfrac{7\pi}{16}\right)\right)^2 = 1.9 \times 10^{-2} \qquad (8)$$
and will result in |1⟩ with probability

$$\left(\frac{1}{\sqrt{2}}\sin\left(\tfrac{8\pi}{16}\right)\right)^2 + \left(\frac{1}{\sqrt{2}}\sin\left(\tfrac{7\pi}{16}\right)\right)^2 = 98.1 \times 10^{-2} \qquad (9)$$

so the pattern (11111111) will not be recognised.
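The numbers in equations (8) and (9) can be reproduced directly from the amplitudes of equation (7), with d_H = 8 and 7, n = 8 and p = 2:

```python
import numpy as np

# Amplitude prefactor 1/sqrt(p) with p = 2 stored patterns.
amp = 1 / np.sqrt(2)

# P(|c> = |0>): sum of squared cosine amplitudes from equation (7).
p_recognised = (amp * np.cos(8 * np.pi / 16)) ** 2 + \
               (amp * np.cos(7 * np.pi / 16)) ** 2

# P(|c> = |1>): sum of squared sine amplitudes.
p_rejected = (amp * np.sin(8 * np.pi / 16)) ** 2 + \
             (amp * np.sin(7 * np.pi / 16)) ** 2

assert np.isclose(p_recognised, 0.019, atol=5e-4)  # ~1.9e-2, equation (8)
assert np.isclose(p_rejected, 0.981, atol=5e-4)    # ~98.1e-2, equation (9)
```

The cos(8π/16) = cos(π/2) term vanishes, so almost all of the (already small) recognition probability comes from the pattern at Hamming distance 7.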
The probabilities in equations (8) and (9) can be estimated by repeating the execution of the node T times or by using a register c with more qubits [8].
Another disadvantage of the RAM node is its inability to generalise; the ability to generalise is achieved only with RAM networks. As can be seen in equation (4), the output of a PQM node takes the Hamming distance into account: the output |0⟩ means that the pattern was recognised, and will occur with high probability if the input is near the values stored in |m⟩; the output |1⟩ means that the pattern was not recognised, and will occur if the input is distant from the values in |m⟩. A new pattern |i′⟩ not presented to the node in the learning phase can be recognised by the node if the values of d_H(i′, p^k) are near zero. On the other hand, a pattern |p′⟩ distant from the other data stored in the memory register will have a low probability of being recognised.

Figure 3. PQM node. The retrieval circuit acts on the input qubits |i₁⟩, …, |i_n⟩, the memory qubits |m₁⟩, …, |m_n⟩ (through X, W and W² gates) and the control qubit |c⟩ = H|0⟩.
One of the limitations of the PQM node is its probabilistic output: for a given input i, different outputs can be generated by the same PQM node. This probabilistic behaviour is inherited from the probabilistic quantum memory. To reduce this limitation, one can allow the node to produce b temporary outputs and use the most frequent output as the output of the node [25].
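The majority-vote scheme just described can be sketched as follows (an illustrative sketch; the function names and the value of b are our assumptions):

```python
import random
from collections import Counter

def pqm_node_run(p0):
    """One probabilistic run of the node: output 0 (recognised) with probability p0."""
    return 0 if random.random() < p0 else 1

def majority_output(p0, b=101):
    """Produce b temporary outputs and return the most frequent one."""
    runs = [pqm_node_run(p0) for _ in range(b)]
    return Counter(runs).most_common(1)[0][0]

random.seed(0)  # fixed seed so the sketch is reproducible
assert majority_output(0.9) == 0  # a strongly recognised input settles on 0
assert majority_output(0.1) == 1  # a strongly rejected input settles on 1
```

An odd b avoids ties, and the variance of the voted output shrinks as b grows, at the cost of b preparations of the node state.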
In [8], Trugenberger uses the probabilistic quantum memory as an associative memory; however, in his work, after the measurement of the output register c, if the pattern was recognised the memory register m is measured. The memory register then collapses and loses its utility, and the memory needs to be recreated or probabilistically cloned.
Trugenberger does not define a neural structure; he directly uses the PQM as an associative memory. The PQM node defined above can be used in neural networks in exactly the same kinds of architectures as networks of RAM nodes, for example the discriminator architecture [10], but with more flexibility in choosing the number of inputs of each node.
Experiments with the PQM node require the simulation of a PQM. However, the simulation of quantum systems on current classical computers has exponential cost in the number of qubits used, which restricts the kind and size of experiments until a quantum computer is built.
VII. CONCLUSION
The results obtained in this paper show that quantum computation can be used to overcome problems in classical models of neural networks; this same conclusion was verified in other works, such as [26], [27]. In particular, in this article we defined the PQM node, a quantum weightless neural node similar to the RAM node, whose memory does not have exponential cost in relation to the number of inputs of the node.
Another point to be considered, applicable to any model of quantum neural network, is the creation of a quantum state |m⟩ where all the patterns in the training set are in superposition. This allows the design of learning algorithms for neural networks where all patterns are presented at the same time, implying a drastic reduction in the execution time of the learning algorithms. This reduction makes it possible to extend the use of neural networks to problems where the cost of classical learning is prohibitive.
In particular, a learning algorithm for the q-PLN [1] with one single execution of the network could be created using the PQM. It suffices to superpose the patterns of the training set in a single input |m⟩ to be presented to the network, and to use the retrieval algorithm or Grover's algorithm to obtain the desired values for the network parameters.
ACKNOWLEDGMENT
This work is supported by research grants from
MCT/CNPq (Edital Universal) and PRONEX/FACEPE.
263
REFERENCES
[1] W. Oliveira, A. J. Silva, T. B. Ludermir, A. Leonel, W. Galindo, and J. Pereira, "Quantum logical neural networks," Neural Networks, 2008. SBRN '08. 10th Brazilian Symposium on, pp. 147-152, Oct. 2008.
[2] W. S. McCulloch and W. Pitts, "A logical calculus of the ideas immanent in nervous activity," Bulletin of Mathematical Biophysics, vol. 5, pp. 115-133, 1943.
[3] I. Aleksander, M. de Gregorio, F. França, P. Lima, and H. Morton, "A brief introduction to weightless neural systems," in European Symposium on Artificial Neural Networks, 2009. Proceedings., April 2009, pp. 299-305.
[4] W. Oliveira, "Quantum RAM based neural networks," in European Symposium on Artificial Neural Networks, 2009. Proceedings., April 2009, pp. 331-336.
[5] I. Aleksander, "Self-adaptive universal logic circuits," Electronics Letters, vol. 2, no. 8, pp. 321-322, 1966.
[6] W. Martins and C. G. Pinheiro, On-line expansion of goal
seeking neuron networks, Neural Networks, 2000. IJCNN
2000, Proceedings of the IEEE-INNS-ENNS International
Joint Conference on, vol. 4, pp. 523526, 2000.
[7] C. Trugenberger, Probabilistic quantum memories, Phys.
Rev. Lett., vol. 87, no. 6, p. 067901, Jul 2001.
[8] C. A. Trugenberger, Quantum pattern recognition, Quantum
Information Processing, vol. 1, pp. 471493, 2002.
[9] L.-M. Duan and G.-C. Guo, "Probabilistic cloning and identification of linearly independent quantum states," Phys. Rev. Lett., vol. 80, no. 22, pp. 4999-5002, Jun 1998.
[10] T. B. Ludermir, A. Carvalho, A. P. Braga, and M. C. P. Souto, "Weightless neural models: A review of current and past works," Neural Computing Surveys, vol. 2, pp. 41-61, 1999.
[11] M. A. Nielsen and I. L. Chuang, Quantum Computation and
Quantum Information. Cambridge University Press, 2000.
[12] R. Feynman, Simulating physics with computers, Interna-
tional Journal of Theoretical Physics, vol. 21, pp. 467488,
1982.
[13] D. Deutsch, Quantum theory, the Church-Turing principle
and the universal quantum computer, Proceedings of the
Royal Society of London Ser. A, vol. 400, no. 1818, pp. 97
117, 1985.
[14] , Quantum computational networks, Proceedings of the
Royal Society of London, vol. A 425, pp. 7390, 1989.
[15] N. Weaver, Mathematical Quantization, ser. Studies in Ad-
vanced Mathematics. Boca Raton, Florida: Chapman &
Hall/CRC, 2001.
[16] J. Faber and G. Giraldi, "Quantum models of artificial neural networks," in I Meeting of Quantum Information, Belo Horizonte, 2002, electronically available: http://arquivosweb.lncc.br/pdfs/QNN-Review.pdf.
[17] N. D. Mermin, From cbits to qbits: Teaching computer
scientists quantum mechanics, American Journal of Physics,
vol. 71, p. 23, 2003. [Online]. Available: http://www.citebase.
org/abstract?id=oai:arXiv.org:quant-ph/0207118
[18] K. Hoffman and R. Kunze, Linear Algebra. Prentice-Hall, 1971.
[19] S. C. Kak, On quantum neural computing, Information
Sciences, vol. 83, no. 3, pp. 143160, 1995.
[20] T. Menneer and A. Narayanan, Quantum-inspired
neural networks, Department of Computer Sci-
ence, University of Exeter, Exeter,United Kingdom,
Technical Report R329, 1995. [Online]. Available:
citeseer.ist.psu.edu/menneer95quantuminspired.html
[21] D. Ventura and T. Martinez, Quantum associative memory,
Information Sciences, vol. 124, no. 1-4, pp. 273 296, 2000.
[22] M. V. Altaisky, Quantum neural network, Joint Institute for
Nuclear Research, Russia, Technical report, 2001.
[23] T. Brun, H. Klauck, A. Nayak, M. Rötteler, and C. Zalka, "Comment on 'Probabilistic quantum memories'," Phys. Rev. Lett., vol. 91, no. 20, p. 209801, Nov 2003.
[24] C. A. Trugenberger, Trugenberger replies, Phys. Rev. Lett.,
vol. 91, no. 20, p. 209802, Nov 2003.
[25] C. Trugenberger, Phase transitions in quantum pattern recog-
nition, Phys. Rev. Lett., vol. 89, no. 27, p. 277903, Dec 2002.
[26] D. Ventura and T. Martinez, "An artificial neuron with quantum mechanical properties," in Proceedings of the International Conference on Artificial Neural Networks and Genetic Algorithms, 1997, pp. 482-485.
[27] R. Zhou and Q. Ding, "Quantum M-P neural network," Int. J. Theor. Phys., vol. 46, pp. 3209-3215, 2007.