
Presented by: Diwakar Tyagi (0813310408), Sarwesh Singh (0813310071), Namit Varshney

Project guide: Mr. Suryaprakash

In symmetric cryptography, the sender and receiver use a shared key to encode and decode plain text. In asymmetric algorithms, each user has their own private and public keys. Both approaches have advantages and disadvantages in terms of speed and level of security: symmetric algorithms are much faster than asymmetric ones, but they require a shared key. How can we exchange a shared key over a public channel and protect it against opponents? There are many ways to do this, more or less effective, but we want to present a relatively new and secure method: neural cryptography.

The objective is to develop one neural network each for the sender and the receiver. After synchronization, the two networks will share a common key established over the public channel.

A neural network generally maps a set of inputs to a set of outputs. The number of inputs and outputs is variable. The network itself is composed of an arbitrary number of nodes with an arbitrary topology.

[Figure: a neural network mapping inputs 0..n to outputs 0..m]

Definition of a node: a node is an element which performs the function y = fH(Σi wi·xi + Wb).

[Figure: a single node summing the weighted inputs W0·x0 ... Wn·xn plus a bias Wb, then applying the activation fH(x) to produce the output on its connection]

Binary logic application:
- fH(x) = u(x) [linear threshold]
- Wi = random(-1, 1)
- Y = u(W0·X0 + W1·X1 + Wb)

[Figure: a two-input node with weights W0 and W1, bias Wb, a summation, and the activation fH(x) producing the output]
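As an illustration, here is a minimal C# sketch of the binary-logic node described above, assuming the linear threshold u(x) returns 1 for non-negative input and 0 otherwise. The names ThresholdNode and Compute are ours, not part of the project code.

```csharp
using System;

// Minimal sketch of the node above: y = fH(sum(wi * xi) + Wb),
// with fH(x) = u(x), the unit step (linear threshold).
class ThresholdNode
{
    private readonly double[] weights;              // W0..Wn
    private readonly double bias;                   // Wb
    private static readonly Random rng = new Random();

    public ThresholdNode(int inputs)
    {
        weights = new double[inputs];
        for (int i = 0; i < inputs; i++)
            weights[i] = rng.NextDouble() * 2 - 1;  // Wi = random(-1, 1)
        bias = rng.NextDouble() * 2 - 1;
    }

    // Y = u(W0*X0 + W1*X1 + ... + Wb)
    public int Compute(double[] x)
    {
        double sum = bias;
        for (int i = 0; i < weights.Length; i++)
            sum += weights[i] * x[i];
        return sum >= 0 ? 1 : 0;                    // u(x): 1 if x >= 0, else 0
    }
}
```

For example, `new ThresholdNode(2).Compute(new double[] { 1, 0 })` evaluates one two-input node with random weights, as in the figure above.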

Here we will use a neural network which consists of an input vector X, a hidden layer sigma, weight coefficients W between the input vector and the hidden layer, and an activation procedure that computes the result value tau. Let's call such a neural network a neural machine. It can be described by three parameters: K, the number of hidden neurons; N, the number of input neurons connected to each hidden neuron; and L, the maximum absolute weight value, so each weight lies in {-L..+L}.

To compute the output value, we use a simple method:
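For reference, the standard tree parity machine rule from the neural cryptography literature matches the sigma/tau description above: each hidden value sigma_k is the sign of its weighted input sum, and the output tau is the product of all K hidden values.

sigma_k = sgn( w_k1·x_k1 + w_k2·x_k2 + ... + w_kN·x_kN ),   k = 1, ..., K
tau = sigma_1 · sigma_2 · ... · sigma_K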

How do we update the weights? We update the weights only if the output values of the two neural machines are equal. There are three different rules:
- Hebbian learning rule
- Anti-Hebbian learning rule
- Random-walk learning rule
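Below is a minimal C# sketch of one machine under these rules, assuming integer weights in {-L..+L}, random ±1 inputs, and the Hebbian rule (the anti-Hebbian and random-walk variants change only the value added to the weight). The names TreeParityMachine, ComputeOutput and UpdateHebbian are illustrative, not the project's actual API.

```csharp
using System;

// Sketch of a tree parity machine with K hidden units, N inputs per unit,
// and integer weights bounded by L.
class TreeParityMachine
{
    private readonly int K, N, L;
    private readonly int[,] w;                       // weights w[k, j] in {-L..+L}
    public int[] Sigma { get; private set; }         // hidden outputs, +/-1
    private static readonly Random rng = new Random();

    public TreeParityMachine(int k, int n, int l)
    {
        K = k; N = n; L = l;
        w = new int[K, N];
        Sigma = new int[K];
        for (int i = 0; i < K; i++)
            for (int j = 0; j < N; j++)
                w[i, j] = rng.Next(-L, L + 1);       // random initial weights
    }

    // tau = product over k of sgn(sum_j w[k,j] * x[k,j])
    public int ComputeOutput(int[,] x)
    {
        int tau = 1;
        for (int k = 0; k < K; k++)
        {
            int sum = 0;
            for (int j = 0; j < N; j++) sum += w[k, j] * x[k, j];
            Sigma[k] = sum >= 0 ? 1 : -1;            // treat 0 as +1 here
            tau *= Sigma[k];
        }
        return tau;
    }

    // Hebbian rule, applied only when the two machines' outputs agree:
    // update a hidden unit's weights only if that unit agrees with tau,
    // then clip each weight back into {-L..+L}.
    public void UpdateHebbian(int[,] x, int tau)
    {
        for (int k = 0; k < K; k++)
        {
            if (Sigma[k] != tau) continue;
            for (int j = 0; j < N; j++)
                w[k, j] = Math.Max(-L, Math.Min(L, w[k, j] + x[k, j] * tau));
        }
    }
}
```

One public round would then look like this: both parties create machines with the same K, N and L, draw the same random ±1 input, exchange their two outputs over the public channel, and call UpdateHebbian only when the outputs agree. After enough rounds the weight matrices become identical and can serve as the shared key.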

This application will allow the key to be shared over a public channel, provide a secure method of key exchange, and enhance the level of security when sharing secret data over a public channel.

All team members are well familiar with Windows. All members are familiar with C# and have some of the skills required for working with neural networks.

