Université Paris 8
Département Micro-Informatique Micro-Electronique
2, rue de la Liberté 93526 SAINT DENIS CEDEX 02
Applied Neuro-cryptography
by
Sébastien Dourlens
Summary
This thesis was carried out in the context of the MIME master's degree at the University of
Paris 8. Its purpose is to investigate applications of neural networks to cryptography.
A survey of existing work shows that no thesis, conference report, book, or Internet source
(Web pages and newsgroups) mentions applying neural networks to cryptography.
We therefore thought it would be interesting to define a new field, called neuro-cryptography,
whose aim is the use of neural networks to encrypt a message, decrypt a message, or exchange
messages over a network. Cryptography contains another area, cryptanalysis, which studies in
a probabilistic manner the strengths and weaknesses of an encryption algorithm. Neural
networks can play a decisive role in this area, which is why we have also defined
neuro-cryptanalysis.
The fields of Artificial Intelligence (with neural networks), cryptography, and cryptanalysis
have long been studied intensively by universities around the world and, among others, by
companies designing electronic circuits.
We begin by choosing the most efficient neural network model and learning procedure, based
on its ability to synthesize complex functions and to perform statistical analyses. This model
is the network of perceptrons with gradient back-propagation. The hardware realization should
not be neglected, since cryptography requires very fast learning, which depends on the
number of possible keys and texts.
We have added elements allowing the creation of hardware architectures.
Then we choose the field of cryptographic applications: primarily the study of DES (the Data
Encryption Standard) and its cryptanalysis.
We then test and measure the performance of neuro-cryptography and neuro-cryptanalysis,
which prove to be quite interesting from every point of view. The computation time can be
improved by designing machine architectures dedicated to the learning of cryptographic
algorithms, using arsenide-based components or massively parallel machines, as has already
been done for neural networks and the D.E.S., but separately.
With regard to the neuro-cryptanalysis of DES, we realize differential and linear
neuro-cryptanalyzers that study the probabilities of obtaining the inputs of the S-tables from
their outputs, allowing us to obtain characteristics of an unknown subkey.
This line of research is now open; it should continue toward combining a neural network that
learns the cryptosystem globally with neuro-cryptanalyzers of the internal structure of this
cryptosystem, which learn very quickly. Another reason is the synthesis ability of the gradient
back-propagation network.
Acknowledgements
I want to thank my research director, Mr. Christian Riesner, a researcher in Artificial
Intelligence specializing in neural networks.
Thanks also to the teacher-researchers of the Department of Micro-Computing and
Micro-Electronics of the University of Paris 8.
Thanks again to the students, researchers, and professors of the universities who provided me
with valuable and useful information for this thesis.
Table of contents
1 Introduction
1.1 Survey of existing work
1.2 Neural networks
1.3 Contemporary Cryptography
1.4 The neuro-Cryptography applied
1.5 The memory map
2. Neural networks
2.1 Introduction
2.2 Basic concepts and terminology
2.3 The present situation
2.4 Are neural networks used in cryptography?
2.5 What types of neural networks are used in cryptography?
2.6 The model structure of perceptrons with back-propagation of the gradient
2.7 The gradient back-propagation algorithm
2.8 Analysis of linear multi-layer networks
2.8.1 Problem of the linear perceptron multilayer
2.8.2 Discriminant analysis of rank p
2.8.3 Incremental learning of the hidden layer
2.8.4 Relations with the principal component analysis
2.9 Hardware
2.10 Conclusion
3. The Cryptography
3.1 Introduction
3.2 Definitions
3.3 Contemporary Cryptography
3.3.1 The cryptosystem and strength
3.3.2 Protocols
3.3.3. The types of attacks in cryptanalysis
3.4 Cryptographic algorithms
3.4.1 The coding of blocks and the stream encoding
3.4.2 The Vigenère cipher
3.4.3 Strong ciphers
3.5 Reference: the Data Encryption Standard (DES)
3.5.1 History
3.5.2 Architecture
3.5.3 Cryptanalysis
3.5.4 The physical aspect
3.6 The cryptanalysis of DES
3.6.1 Differential cryptanalysis
3.6.2 Linear cryptanalysis
3.7 Conclusion
4. The Neuro-Cryptography
4.1 Introduction
4.2 Can cryptography and neural networks be linked?
4.3 The new definitions
4.3.1 Neuro-encryption
4.3.2 Neuro-decryption
4.3.3 The neuro-generator
4.3.4 Neuro-cryptanalysis
4.4 The generation of bases of learning
4.4.1 Examples
4.4.2 Order of presentation
4.4.3 Automatic generation of texts
4.4.4 The learning coefficient
4.5 Self-learning
4.6 The realization of applications
4.6.1 The learning of the exclusive or (XOR)
4.6.2 The learning of cryptographic algorithms
4.6.3 Key learning
4.7 The advantages and disadvantages
4.8 Conclusion
5. The Neuro-cryptanalysis
5.1 Introduction
5.2 Definition
5.3 General principle
5.4 Applied Neuro-cryptanalysis
5.4.1 The neuro-cryptanalysis of the Vigenère cipher
5.4.2 The differential neuro-cryptanalysis of DES
5.4.3 The linear neuro-cryptanalysis of DES
5.4.4 Global neuro-cryptanalysis of UNIX crypt(3)
5.5 Analysis of the results of cryptanalysis
5.6 Hardware implementations
5.6.1 Dedicated Machine
7 Conclusion
Bibliography
Neural networks
Cryptography
Mathematics
Chapter 1 - Introduction
1.1 Survey of existing work
The purpose of this thesis is to research neural applications for cryptography.
A survey of existing work shows that no thesis, conference report, book, or Internet source
(Web pages and newsgroups) mentions applying neural networks to cryptography.
Indeed, David Pointcheval of the École Normale Supérieure de Paris used the perceptron
problem to create an authentication protocol, but it was a purely mathematical and theoretical
study.
The fields of Artificial Intelligence (with neural networks), cryptography, and cryptanalysis
have long been studied intensively by researchers at universities around the world and, among
others, by companies designing electronic circuits.
We therefore thought it would be interesting to define a new field, called neuro-cryptography,
whose aim is the use of neural networks to encrypt a message, decrypt a message, or exchange
messages over a network. Cryptography contains another area, cryptanalysis, which studies in
a probabilistic manner the strengths and weaknesses of an encryption algorithm. Neural
networks can play a decisive role in this area, which is why we have also defined
neuro-cryptanalysis.
ease of use
a high signal-to-noise ratio
easy-to-implement cascading of circuits
high adaptability (these circuits can solve various tasks)
a reduced manufacturing cost
the non-linear transfer function implemented either as a full computation circuit, as a table
containing approximate values of the function, or as a circuit computing approximations (for
the sigmoid with steps of 1/5 and an error below 13%, just 4 comparators and a few logic
gates suffice (ALIPPI 1990))
memorization of values (S-RAM or D-RAM memories)
We then present the three types of components existing on the market or in research
laboratories:
1. components dedicated to digital neural networks, whose speeds go up to 1 billion
connections processed per second
2. special-purpose digital coprocessors (also called neuro-accelerators): special circuits that
can be connected to hosts (PCs or workstations) and work with a neuro-simulator program.
The mix of hardware and software gives these benefits: higher speed, flexibility, and an
improved user interface
3. neural networks on massively parallel machines
An implementation of the above-mentioned algorithm has been developed on the Connection
Machine CM-2 (built by THINKING MACHINES Corp.) with a 64K-processor hypercube
topology, which gave 180 million interconnections computed per second (IPS), or 40 million
weights updated per second.
Here are the measured performances, by machine, in interconnections computed per second
(figure below).
CM-2        180 million
CRAY X-MP    50 million
WARP (10)    17 million
ANZA PLUS    10 million
The use of such configurations would give good results in the learning of cryptographic
ciphers.
ciphertext-only: the attacker must find the plaintext from the encrypted text alone. A
ciphertext-only attack is practically impossible; everything depends on the cipher.
known-plaintext: the attacker has the plaintext and the corresponding ciphertext. The
ciphertext was not chosen by the attacker, but the message is compromised anyway. In
some cryptosystems, a ciphertext-plaintext pair can compromise the security of the
system as well as the transmission medium.
chosen-plaintext: the attacker has the ability to obtain the ciphertext corresponding to
an arbitrary plaintext of his choice.
chosen-ciphertext: the attacker can arbitrarily choose a ciphertext and obtain the
corresponding plaintext. This attack may expose weaknesses in public-key systems,
and can even recover the private key.
adaptive chosen-plaintext: the attacker can determine the ciphertexts of plaintexts
chosen in an iterative or interactive process based on the results previously found. An
example is differential cryptanalysis.
We quickly describe the encryption modes, with Ci the i-th encrypted message, Mi the i-th
plaintext message, E the encryption function, D the inverse function for the key (or subkey)
K, and Vi an intermediate encrypted value:
The ECB (Electronic Code Book) mode, where Ci = EK(Mi) and Mi = DK(Ci)
The CBC (Cipher Block Chaining) mode, where Ci = EK(Mi XOR Ci-1) and Mi = DK(Ci)
XOR Ci-1
The OFB (Output FeedBack) mode, where Vi = EK(Vi-1) and Ci = Mi XOR Vi
The CFB (Cipher FeedBack) mode, where Ci = Mi XOR EK(Ci-1) and Mi = Ci XOR
EK(Ci-1)
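These definitions can be sketched in code with a toy one-byte "cipher" standing in for EK (an assumption for illustration only; a real block cipher works on 64-bit blocks):

```python
# Toy sketch of the modes above. ek stands in for EK on one-byte "blocks".
def ek(k, b):
    return k ^ b          # toy cipher: XOR with a key byte

dk = ek                   # XOR is its own inverse

def ecb_encrypt(k, ms):
    return [ek(k, m) for m in ms]                 # Ci = EK(Mi)

def cbc_encrypt(k, ms, iv=0):
    cs, prev = [], iv
    for m in ms:
        prev = ek(k, m ^ prev)                    # Ci = EK(Mi XOR Ci-1)
        cs.append(prev)
    return cs

def cbc_decrypt(k, cs, iv=0):
    ms, prev = [], iv
    for c in cs:
        ms.append(dk(k, c) ^ prev)                # Mi = DK(Ci) XOR Ci-1
        prev = c
    return ms

def ofb_encrypt(k, ms, v0=1):
    cs, v = [], v0
    for m in ms:
        v = ek(k, v)                              # Vi = EK(Vi-1)
        cs.append(m ^ v)                          # Ci = Mi XOR Vi
    return cs

msg = [0x41, 0x42, 0x43]
assert cbc_decrypt(0x5A, cbc_encrypt(0x5A, msg)) == msg
assert ofb_encrypt(0x5A, ofb_encrypt(0x5A, msg)) == msg   # OFB is symmetric
```

Note that OFB never feeds the message into E, so encrypting twice with the same key and start value recovers the plaintext.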
DES was designed to withstand differential cryptanalysis, which at the time was classified by
the military and unknown to researchers.
It uses 64-bit input blocks L0 and R0; the length of the key K is 56 bits (8 bytes, without the
last bit of each byte, which is used for parity). This key generates 16 different 48-bit subkeys
K1 to K16. Contrary to appearances, this was quite long enough, and it is a little less so these
days, because it takes 2^56 encryptions to find the key by exhaustive search.
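Exhaustive search itself is easy to sketch on a toy cipher with an 8-bit key (the cipher below is an invented placeholder, not DES; the real search would scan 2**56 keys):

```python
# Exhaustive key search on a toy 8-bit-key cipher (a stand-in for DES).
def toy_encrypt(key, block):
    return ((block ^ key) * 167 + key) % 256   # placeholder cipher, not DES

def exhaustive_search(plaintext, ciphertext):
    for key in range(256):                     # 2**8 candidate keys
        if toy_encrypt(key, plaintext) == ciphertext:
            return key                         # first key consistent with the pair
    return None

c = toy_encrypt(0x77, 0x41)
key = exhaustive_search(0x41, c)
assert toy_encrypt(key, 0x41) == c
```

With several known plaintext-ciphertext pairs the surviving candidate keys narrow down quickly; with a single pair more than one key may be consistent.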
The function f is called a round; the i-th round receives as inputs the right part Ri (32 bits of
the text being encrypted) and the subkey Ki (48 bits). The rounds of DES are detailed below.
It outputs 32 bits that are XORed with Li. While Ri is passed unchanged to become Li+1, the
encrypted bits are transmitted to Ri+1 (except in the final round).
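The round chaining described above can be sketched as a generic Feistel skeleton; the round function f below is a simple placeholder, not the real DES f:

```python
# Minimal Feistel skeleton: Li+1 = Ri ; Ri+1 = Li XOR f(Ri, Ki).
def f(r, k):
    return (r * 31 + k) & 0xFFFFFFFF   # placeholder round function (assumption)

def feistel_encrypt(l, r, subkeys):
    for k in subkeys:
        l, r = r, l ^ f(r, k)          # one round
    return r, l                        # final swap, as in DES

def feistel_decrypt(l, r, subkeys):
    for k in reversed(subkeys):        # same structure, subkeys reversed
        l, r = r, l ^ f(r, k)
    return r, l

keys = [3, 7, 11, 13]
c = feistel_encrypt(0x01234567, 0x89ABCDEF, keys)
assert feistel_decrypt(*c, keys) == (0x01234567, 0x89ABCDEF)
```

The structure makes decryption use the very same round function with the subkey order reversed, which is why f itself never needs to be invertible.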
The physical aspect is very important for execution speed. VLSI components are widespread
and effective, but there are even more interesting components based on a technology that
should not be disregarded: Gallium Arsenide (GaAs), or arsenide technology. It has already
been used in supercomputers.
The major differences between GaAs and VLSI are:
With regard to the D.E.S., there is a circuit running at 50 MHz that performs one encryption
in 20 ns, which makes it possible to perform 50 million encryptions per second.
Since late 1995, AMD has sold a circuit encrypting at 250 MHz.
In August 1993, the Canadian Michael J. WIENER described how to build, for one million
dollars, a machine that performs an exhaustive search of the DES keys and finds the right key
in 3.5 hours. Each of its basic circuits has a power equivalent to 14 million SUN workstations.
It thus seems obvious that exhaustive search is faster to perform than these types of
cryptanalysis: even if their number of attempts is smaller, their search time is much longer.
Cryptanalysis nevertheless remains very interesting for measuring the performance of
cryptographic algorithms.
We then analyze the two most successful cryptanalyses against DES.
Differential cryptanalysis consists of looking at the specifics of a pair of ciphertexts obtained
from a pair of plaintexts with a particular difference.
It analyses the evolution of these differences as the plaintexts propagate through the rounds of
DES while being encrypted with the same key.
After randomly choosing a pair of plaintexts with a fixed difference, one calculates the
difference of the resulting ciphertexts. Using these differences, it is possible to assign
probabilities to the various bits of the subkeys. The more ciphertexts are analyzed, the more
clearly the most likely encryption key will emerge.
The strength of DES resides in its rounds, and since all the operations of a round are
completely linear except the S-tables, Eli BIHAM and Adi SHAMIR analyzed the 8 S-tables
with respect to differences in input texts and differences in output texts. This information is
synthesized in 8 tables, called the tables of distribution of differences of DES (see the 8 tables
in annex 3). We implemented the algorithm that generates these tables.
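The generation of such a difference-distribution table can be sketched on a small S-box. Here we use the first row of DES S-table 1 read as a 4-bit S-box, a simplification: the real DES S-tables map 6 input bits to 4 output bits.

```python
# Difference distribution table of a 4-bit S-box (first row of DES S1).
SBOX = [0xE, 0x4, 0xD, 0x1, 0x2, 0xF, 0xB, 0x8,
        0x3, 0xA, 0x6, 0xC, 0x5, 0x9, 0x0, 0x7]

def difference_table(sbox):
    n = len(sbox)
    table = [[0] * n for _ in range(n)]
    for dx in range(n):                      # every input difference
        for x in range(n):                   # every input value
            dy = sbox[x] ^ sbox[x ^ dx]      # resulting output difference
            table[dx][dy] += 1
    return table

t = difference_table(SBOX)
assert t[0][0] == 16                         # zero in-difference gives zero out-difference
assert all(sum(row) == 16 for row in t)      # each row counts all 16 inputs
```

Entries much larger than the average reveal the high-probability differentials a differential cryptanalyzer (or a neuro-cryptanalyzer trained on the same counts) exploits.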
Linear cryptanalysis consists of studying the statistical linear relationships between the bits of
a plaintext, the bits of the ciphertext, and the key used for encryption. These relationships
make it possible to obtain the values of some bits of the key when the plaintexts and the
associated ciphertexts are known.
The linear relationships of each S-table are deduced by choosing a subset of input bits and
output bits and calculating the parity (XOR) of these bits; for some subsets this parity is zero.
In general, some subsets will be inputs with parity 0 (linear) and others with parity 1 (affine).
MATSUI calculated the number of zero parities for each subset of input and output bits of
each S-table, among the 64 x 16 = 1024 possible subsets. It is then possible to assign
probabilities to the various bits of the subkeys. The probabilities of a zero parity (linear
relationship) are synthesized in 8 tables, called the tables of linear approximations of DES
(see the 8 tables in annex 4). We implemented the algorithm that generates these tables.
(Measured probabilities: 0.05, 0.06, 0.06, 0.05, 0.08, 0.07, 0.05, 0.08)
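MATSUI's counting can be sketched on the same toy 4-bit S-box (first row of DES S-table 1); for each input mask and output mask, we count the inputs for which the two parities agree. The real computation runs over 64 x 16 mask pairs per S-table; the toy version uses 16 x 16.

```python
# Linear approximation table of a 4-bit S-box (first row of DES S1).
SBOX = [0xE, 0x4, 0xD, 0x1, 0x2, 0xF, 0xB, 0x8,
        0x3, 0xA, 0x6, 0xC, 0x5, 0x9, 0x0, 0x7]

def parity(v):
    return bin(v).count("1") & 1             # XOR of the selected bits

def linear_table(sbox):
    n = len(sbox)
    # entry [a][b]: number of x with parity(a & x) == parity(b & S[x])
    return [[sum(parity(a & x) == parity(b & sbox[x]) for x in range(n))
             for b in range(n)] for a in range(n)]

lat = linear_table(SBOX)
assert lat[0][0] == 16                       # trivial masks always agree
assert all(0 <= v <= 16 for row in lat for v in row)
```

Entries far from the neutral value 8 (i.e. a probability far from 1/2) are the biased linear relationships that leak key bits.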
Regarding the automatic generation of contiguous texts, we present an algorithm that can
generate clear-text examples regardless of the number of nested loops, using a single loop
body executed on each iteration of the innermost loop.
The learning coefficient, usually noted Epsilon and also called the learning rate, allows faster
or slower learning, with chances of convergence of the network that are inversely proportional
to it because of the local minima of the error curve, measured between the learning base and
the output values computed by the neural network. Epsilon should be varied empirically
between 0.1 and 2.0. If the network still does not converge, this is certainly due to a
non-linearly-separable problem, which is the case when learning XOR. One should then use a
momentum term, with a real value between 0.1 and 1.0, whose aim is to help avoid local
minima of the error function, because it takes the previous steps into account in the current
learning step.
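The weight update with Epsilon and a momentum term can be sketched as follows; the numerical values and the gradients are placeholders, not values from the thesis:

```python
# Weight update with learning rate (Epsilon) and momentum: the velocity
# term carries a fraction of the previous step into the current one.
def update_weights(weights, grads, velocity, epsilon=0.5, momentum=0.9):
    new_w, new_v = [], []
    for w, g, v in zip(weights, grads, velocity):
        v = momentum * v - epsilon * g       # previous steps damp oscillations
        new_v.append(v)
        new_w.append(w + v)
    return new_w, new_v

w, v = [1.0, -2.0], [0.0, 0.0]
w, v = update_weights(w, [0.2, -0.4], v)
assert abs(w[0] - 0.9) < 1e-9 and abs(w[1] + 1.8) < 1e-9
```

With zero momentum this reduces to plain gradient descent; with momentum near 1.0 the step direction changes slowly, which helps roll through shallow local minima.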
Self-learning can be interesting for the neural learning of cryptographic algorithms. The
neural system has two parts, the emulator and the controller, whose learning is carried out
separately.
The task of the emulator is to simulate the complex function or the encryption algorithm. It
therefore receives as inputs the state of the system at a given time and an input at that time,
and its output is the output of the algorithm at the following time. The input of the controller
is the state of the system at time k; its output is the value to feed as input to the algorithm or
complex function.
The proper role of the controller is to learn the adaptive control law. But for this learning, the
error signal is not computed on the command itself but on its result: the gap between the
desired state and the current state. This is the idea of guided rather than supervised learning,
because no teacher teaches the system its control law. In fact, the system teaches itself by
processing the information it receives in return for its actions. To make learning through
back-propagation possible and to back-propagate the error on the position, the structure of the
emulator must be homogeneous with that of the controller.
Another quality of this device is its capacity for self-learning. The learning of the controller is
fast. In addition, the synthesized control law is sufficiently robust to small random
perturbations. It is therefore possible to build self-learning neural networks on a
communication line, for encryption as well as for real-time message authentication.
We present several different applications.
For the learning of XOR, i.e. to achieve C = A XOR B, we need a network with 16 input bits
(two bytes A and B) and 8 output bits (one byte C). The network must therefore have 16 input
neurons, at least 16 neurons in the hidden layer(s), and 8 output neurons. The learning base
consists of 65536 cause-effect pairs. After various tests, the success rate of XOR learning is
very close to 100%, depending on the random weight initialization and the number of
presentations. The larger the number of input and hidden-layer neurons, the more the number
of presentations of the base can be reduced. If the random initialization of the weights is right,
a single presentation can be sufficient and of better quality.
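The full 16-16-8 learner is too large to reproduce here, but the structure it must converge to can be shown on one bit: a minimal 2-2-1 perceptron network computing XOR, with weights set by hand rather than learned (a sketch of the target solution, not the training code).

```python
# XOR is not linearly separable, so a single perceptron cannot compute it,
# but a 2-2-1 network can: hidden units compute OR and NAND, output ANDs them.
def step(x):
    return 1 if x > 0 else 0

def xor_net(a, b):
    h1 = step(a + b - 0.5)        # hidden unit 1: OR(a, b)
    h2 = step(-a - b + 1.5)       # hidden unit 2: NAND(a, b)
    return step(h1 + h2 - 1.5)    # output unit: AND(h1, h2)

assert [xor_net(a, b) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]] == [0, 1, 1, 0]
```

Eight independent copies of this sub-network, one per bit position, realize the byte-wise C = A XOR B described above.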
The learning of cryptographic algorithms consists of determining a function or an algorithm
that combines input data (causes) into output data (effects). It is therefore a matter of
determining the input and output structures of the network and of finding a base of causes and
associated effects sufficient for the learning of the network to converge with a minimal
amount of error, or even almost none.
The question that arises is how to make the neural network memorize the algorithm. The
answer is to present virtually all possible encryption keys (e.g. 64 bits) and all possible
plaintexts (e.g. 64 bits) as input, and to compute all the resulting ciphertexts with the
encryption algorithm. Thus, the neural network will have synthesized the algorithm: when
presented with an encryption key and a plaintext as input, it will output the corresponding
ciphertext.
If the encryption algorithm is bijective (that is, if presenting an encrypted text as input yields
the plaintext as output), then the encryption algorithm is the same as the decryption algorithm
and the neural network also decrypts.
With regard to key learning, an encryption key must be linked to an encryption or decryption
algorithm and to a plain or encrypted text.
If the key has a fixed size of N bits, the neural network should have N output bits and M input
bits, equal to twice the number of bits of the plaintext and ciphertext blocks.
In fact, the neural network realizes a function that finds the key directly from a plaintext and
its ciphertext.
We then present the advantages and disadvantages of the neural methods used. The learning
time of neural networks remains rather long, as it depends on the number of bits of the key
and of the clear and encrypted texts; this time can be reduced if the neural network is
implemented on a parallel machine.
Regarding the memorization of keys and encryption algorithms, neural networks are high
achievers, with over 90% success in learning weak ciphers. A strong encryption algorithm
calls for rapid learning. Neural networks are widely used in image recognition, so it is simple
to use them for authentication. At the hardware level, it is easy to parallelize the algorithms,
for the neural networks as well as for ciphers based on hardware architectures; but this
solution is quite expensive. The design of neuro-ciphers can be useful in cases where a secret
key and an encryption algorithm are taught to the network in order to hide information from
the user, in particular at the level of the key generator, which could be kept secret by a
distributing body. It would be hard for a cryptanalyst to discover the function of the
encryption-key generator algorithm. Neuro-cryptanalysis seems to be a much better
application of neural networks, owing to their emergent properties of massively parallel
statistical analysis and their ability to concentrate information, that is, to approximate
statistical matrices.
One of the most important applications of neuro-cryptography is neuro-cryptanalysis.
Neuro-cryptanalysis consists of performing the cryptanalysis of cryptographic algorithms
with the use of neural networks, i.e. building one or more neural networks to find, or help
find, the key of an encryption algorithm. The important principle is the presentation to the
neural network of a ciphertext and of the encryption algorithm.
In neuro-cryptanalysis, the neural network helps to find the encryption key used to produce
the ciphertext.
A neural network can learn a cryptographic algorithm, or can 'remember' (by function
approximation) a set of keys. This neural network structure is identical to that of
self-learning. It is clear that neural networks can take an important place in cryptography, in
the design, use, and verification of protocols.
We test and present possible forms of neuro-cryptanalysis.
To neuro-cryptanalyze a Vigenère cipher, our neural network would need either a frequency
analysis or an analysis of subsets of n characters of a given language, and then to measure the
correlation between the learned plaintext and the ciphertext for all subsets of n characters.
This type of problem is solvable by a neural network but would take very long in supervised
learning. However, it is possible to carry it out in self-learning mode, provided the ciphertext
is large enough.
We measure the performance of neural networks at the statistical level by differential
neuro-cryptanalysis and linear neuro-cryptanalysis of DES, according to the following
scheme:
We added two visualization programs for the statistics: the first is graphical, the second
provides the information quickly.
We can deduce the following results.
Differential and linear neuro-cryptanalysis are probabilistic calculation methods for quickly
obtaining information about a part of DES. They make it possible to realize the inverse
function of an S-table, for a chosen text difference in one case and for a linear relationship
with a selected subkey in the other. The learning of such neural networks is very fast.
For a given method, differential or linear, it is possible to gather 8 x 16 = 128 neural networks
(one per S-table and per round) and to operate them in parallel on the information given by
the ciphertext output of DES and the plaintext input. These networks may then supervise
other, unsupervised neural networks that modify the key bits while different texts pass
through the D.E.S. This would amount to self-learning of the subkeys. From the subkeys, we
find the encryption key.
The results of the statistical analysis program under MS-DOS are surprising, with 90% of the
encryption function found by the neural network for the learning base, and about 80% of the
bits for examples close to this base but not presented to the network. This proves that, for a
small learning base, it is easy for a neural network to find a clear password from an encrypted
password, without taking into account the salt included by the Unix system.
We then present two architectures.
The first is a dedicated parallel architecture, since a neuro-cryptanalyzer of strong ciphers
needs very fast supervised learning. It is necessary to present all plaintexts, ciphertexts, and
keys to the neural network. The following figure shows the overview of learning dedicated to
an encryption algorithm.
A complete machine can be built on the same pattern with a large number of units of binary
counters and circuits implementing the encryption algorithm. This number is limited by the
learning time of the single neural circuit, approximately 1 s. For the D.E.S., it is preferable to
treat a fixed subset of the data, as we have done in the previous applications.
In the second, we present our algorithms written for the distributed architecture of the CM-5,
using 3 layers of processors with one processor per neuron. The first layer is used to initialize
the input (plaintext) and output (ciphertext) of the neural network, which is located on layers
2 and 3. It is likely that the learning time of the examples is longer than on the dedicated
machine of the preceding paragraph.
(Figure: input, hidden and output layers)
Figure 2.3.1 - Models of neural networks (model, authors and date, advantages /
disadvantages):
Back-propagation network - WERBOS, PARKER, RUMELHART, 1987
Bidirectional associative memory - KOSKO, 1987
CAUCHY machine - CAUCHY, 1986
Brain-state-in-a-box - ANDERSON, 1977 - unknown performance
Self-association memory - HOPFIELD, 1982 - low memory capacity
Self-association memory - KOHONEN, 1981
Learning vector quantization and self-organizing maps - KOHONEN, 1981 -
self-organization and learning
Among these networks, we should choose for cryptography the one that allows fast learning
with little memory capacity, because the purpose of using such a network is the
approximation of a transfer function or the synthesis of cryptographic algorithms.
The perceptron network has the advantage of being well known at present and of meeting our
needs; it is easy to implement, and its performance is very interesting.
2.4 Are neural networks used in cryptography?
A few applications have been studied in the context of the compression of images or files and
the identification of messages (no completed application) (PATHMA 1995). We believe that,
apart from secret military projects, no neural network is used for encryption, decryption, or
cryptanalysis. However, some students specialized in cryptography in France and Belgium
appear to be interested. But no literature or media contains information on this subject.
2.5 What types of neural networks are used in Cryptography?
As we saw in paragraph 2.3, the model of perceptrons with gradient back-propagation is the
most studied and has demonstrated its reliability on the learning of XOR; these networks are
simple to implement and learn quickly.
This model is best suited to synthesis and to searching for associations, or to recognition. In
addition, all the states and outputs of the neurons of these networks can be updated
simultaneously. (See the code in annex 1 for the learning of XOR.) A critique of learning
algorithms (CAMARGO 1990) supports our choice of this model. Paragraph 2.8 shows these
benefits specifically.
2.6 The model structure of perceptrons with back-propagation of the gradient
Figure 2.6.1 on the next page shows the structure of the model of perceptrons with gradient
back-propagation. There are the input bits, the hidden layer, and the output layer; the deltas of
the hidden layer, those of the output layer, and the activations for learning.
The choice of the number of hidden-layer neurons must obey a compromise that optimizes
learning while avoiding overfitting, which would be the consequence of too large a number of
hidden units. This choice is often the result of know-how and practical experience. It can be
guided by statistical considerations.
Supervised learning in this case consists of measuring the error between the computed outputs
and the desired outputs, and then propagating this error back to the neurons of the hidden
layers and of the input layer. The transfer function F is a sigmoid function, whose
differentiability plays an important role.
Figure 2.7.1 shows (a) the layer architecture and the transfer function, (b) the calculation of
the error signal for an output unit, and (c) the calculation by back-propagation of the error
signal of a hidden unit.
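The sigmoid and the derivative identity that back-propagation exploits, f'(x) = f(x)(1 - f(x)), can be written as a short generic sketch (not the thesis code):

```python
import math

# The sigmoid transfer function F and its derivative, used when
# back-propagating the error signal through each layer.
def f(x):
    return 1.0 / (1.0 + math.exp(-x))

def f_prime(x):
    y = f(x)
    return y * (1.0 - y)       # the identity f' = f * (1 - f)

assert abs(f(0.0) - 0.5) < 1e-12
assert abs(f_prime(0.0) - 0.25) < 1e-12

# numerical check of the identity against a central difference
h = 1e-6
assert abs((f(1.0 + h) - f(1.0 - h)) / (2 * h) - f_prime(1.0)) < 1e-6
```

Because the derivative is obtained from the already-computed activation, each back-propagation step reuses the forward pass instead of recomputing the exponential.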
In the following parts of this thesis, the neural networks discussed will be networks of
perceptrons with gradient back-propagation.
2.8 Analysis of linear multi-layer networks
The success of the gradient back-propagation algorithm has led researchers to analyze the
process in detail. They have shown analogies with various statistical data-analysis methods, in
particular linear regression and discriminant analysis. In this paragraph, we rely on the
publications of P. GALLINARI and F. FOGELMAN-SOULIÉ (GALLINARI 1988), which
compare the classical method of discriminant analysis with the linear multi-layer perceptron
(with one layer of hidden units). In the linear case, it is shown that back-propagation performs
a discriminant analysis of a population of N individuals (N being the number of examples in
the learning base), described by n parameters (where n is the number of input neurons) and
projected onto a hyperplane of dimension p (where p is the number of hidden units).
These results are then used to validate an incremental construction of the hidden layer. It is
thus shown that when a set of q hidden units is added, it is not necessary to repeat all the
learning: it suffices to freeze the existing connections and to train only the connections of the
units just added. We can therefore consider an incremental construction of the hidden layer
that saves precious learning time, at the cost of a variable structure.
The general interest of this approach is to show how the comparison of connectionist
algorithms with classical methods suggests a permanent enrichment of the former, allowing
them to increase their performance.
ease of use
a high signal-to-noise ratio
easy-to-implement cascading of circuits
high adaptability (these circuits can solve various tasks)
a reduced manufacturing cost
For more details, one should read the reports written by Dr. VALERIU BEIU on the
implementation and optimization of VLSI neural networks (BEIU 1995a), (BEIU 1995b).
Figure 2.9.1 below shows a comparison of different hardware for the implementation of
neural networks.
With regard to back-propagation, NIGRI completed a circuit containing a table of all the real
values of the sigmoid between -2 and 2, with 8-bit precision, which is regarded as sufficiently
precise (NIGRI 1991).
Here are the three types of components existing on the market or in research laboratories:
1. components dedicated to digital neural networks, whose speeds go up to 1 billion
connections processed per second
2. special-purpose digital coprocessors (also called neuro-accelerators): special circuits that
can be connected to hosts (PCs or workstations) and work with a neuro-simulator program.
The mix of hardware and software gives these benefits: higher speed, flexibility, and an
improved user interface.
3. neural networks on massively parallel machines
For more information or the references of the machines above (marked with an asterisk),
consult (BEIU 1995c).
You will find in annex 2 a list of electronics manufacturers who have realized neural
networks in silicon.
An implementation of the above-mentioned algorithm has been developed on the Connection
Machine CM-2 (built by THINKING MACHINES Corp.) with a 64K-processor hypercube
topology, which gave 180 million interconnections computed per second (IPS), or 40 million
weights updated per second.
Here are the measured performances, by machine, in interconnections computed per second
(figure 2.9.3).
CM-2        180 million
CRAY X-MP    50 million
WARP (10)    17 million
ANZA PLUS    10 million
The use of such configurations would make it possible to obtain excellent results in the
learning of cryptographic ciphers.
Chapters 4 and 5 show how to use the implementation of neural networks on the Connection
Machine CM-2 or CM-5 in cryptography.
We detail the functioning of the MASPAR and CM-5 machines in annex 9.
2.10 Conclusion
In this chapter, we have seen that the most interesting neural network model is the perceptron
with gradient back-propagation, and that supervised learning is the most suitable. In addition,
the use of neural networks in cryptography is very limited and even very little known, while
the study of neural networks made so far shows that perceptron networks are able to learn to
synthesize a transfer function fairly easily. They can provide statistics based on the input
values, as well as more traditional statistical methods can, which makes them very useful in
cryptography. It also emerges that hardware implementations of neural networks are currently
comprehensive enough and manufactured at an industrial level. These networks can be
perfectly parallel and extremely fast.
Everything shows that neural networks should be bound to cryptography, but which
cryptography is appropriate? And which cryptographic tools should be used? The answers are
in the following chapters.
Chapter 3 - Cryptography
3.1 Introduction
In this chapter we give the definitions needed to understand the remainder of our work, as well as some clarifications on the current state of publicly known cryptography; we then describe the composition of weak and strong cryptographic algorithms. We detail the D.E.S. in particular because, after more than 20 years of existence, it remains the most used and the most studied, especially regarding its cryptanalysis, which is very difficult.
3.2 Definitions
However, even if a cryptosystem meets the previous criteria, one cannot conclude that the system is infallible!
Cryptosystems are of two types: public key or private key.
A private-key cryptosystem with key K is defined by DK(CK(M)) = M, where CK is the encryption function, DK the decryption function, M a plaintext message and CK(M) the encrypted message.
3.3.2 Protocols
A protocol is a series of steps involving at least two participants, designed to accomplish a task. Cryptographic protocols allow the participants to exchange secret information among themselves.
Applications using them include data communications, authentication, management of private and public keys, message splitting, message mixing, database access, timestamping services, subliminal messages, digital signatures, collective signatures, bit commitment, coin flipping, blind poker, zero-knowledge proofs, electronic cash and anonymous messages.
The ideal would be a protocol with intrinsic discipline, which would itself guarantee the integrity of the transaction (without an intermediary or "arbitrator") and whose construction would make disputes impossible; no such protocol exists!
The study of protocols is well documented in (SCHNEIER 95). In the pages that follow we concentrate on the neural implementation and neuro-cryptanalysis of cryptosystems, rather than on the protocols that make the exchange of information between participants more secure.
3.3.3. The types of attacks in cryptanalysis
Cryptanalysis distinguishes between the following different types of possible attacks:
ciphertext-only: the attacker must find the plaintext having only the ciphertext. A ciphertext-only attack is practically impossible; everything depends on the cipher.
known-plaintext: the attacker has a plaintext and the corresponding ciphertext. The ciphertext was not chosen by the attacker, but the message is compromised anyway. In some cryptosystems, a single plaintext-ciphertext pair can compromise the security of the whole system as well as the transmission medium.
chosen-plaintext: the attacker has the ability to obtain the ciphertext corresponding to an arbitrary plaintext of his choice.
chosen-ciphertext: the attacker can arbitrarily choose a ciphertext and obtain the corresponding plaintext. This attack may reveal weaknesses in public-key systems, and may even yield the private key.
adaptive chosen-plaintext: the attacker can determine the ciphertexts of chosen plaintexts in an iterative, interactive process based on the results previously obtained. An example is differential cryptanalysis.
Some of these attacks remain interesting even when used against strong ciphers. See (FAQ 96) and (SCHNEIER 95) for details of these attacks.
3.4 Cryptographic algorithms
3.4.1 The coding of blocks and the stream encoding
In general, the plaintext M is divided into blocks of bits of fixed length: M = M1M2...MN. Each block Mi is encrypted, Ci = EK(Mi), and the result is appended to the ciphertext: C = C1C2...CN.
There are two main types of coding: block coding and stream coding.
In block coding, the size of a block must be large to prevent an attack: it is usual to use 64 bits, i.e. 2^64 possibilities to search. The transformation function T(M) = C is the same for each block, which saves memory and makes encoding relatively fast.
In stream coding, blocks are encoded sequentially and each block is encoded by a separate transformation which depends on:
1. the previous coded blocks, and/or
2. the previous transformations, and/or
3. the index of the block.
This information must be kept in memory between each block coding. Since the transformation varies with each block, the block size can be short (usually between 1 and 8 bits).
The same plaintext message M will therefore not necessarily give the same ciphertext C.
Block coding is a substitution coding in which the plaintext and ciphertext blocks are binary vectors of length N. For each key, the encryption function EK(M) is a permutation of the set {0,1}^N onto itself. DK(C) is the decryption function (the inverse permutation), so that DK(EK(M)) = M and EK(DK(C)) = C.
There are 4 modes of encryption: ECB, CBC, OFB and CFB.
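To illustrate the difference between two of these modes, here is a minimal sketch in C using a toy one-byte XOR "block cipher" of our own (not a real cipher): in ECB every block is enciphered independently, so identical plaintext blocks give identical ciphertext blocks, while in CBC each block is chained to the previous ciphertext block through an XOR.

```c
#include <stddef.h>
#include <stdint.h>

/* Toy 8-bit "block cipher": XOR with the key (illustration only). */
static uint8_t toy_encrypt(uint8_t block, uint8_t key) { return block ^ key; }
static uint8_t toy_decrypt(uint8_t block, uint8_t key) { return block ^ key; }

/* ECB: each block is enciphered independently. */
void ecb_encrypt(const uint8_t *in, uint8_t *out, size_t n, uint8_t key) {
    for (size_t i = 0; i < n; i++)
        out[i] = toy_encrypt(in[i], key);
}

/* CBC: each plaintext block is XORed with the previous ciphertext block
   (or the IV) before encryption, hiding plaintext repetitions. */
void cbc_encrypt(const uint8_t *in, uint8_t *out, size_t n,
                 uint8_t key, uint8_t iv) {
    uint8_t prev = iv;
    for (size_t i = 0; i < n; i++) {
        out[i] = toy_encrypt(in[i] ^ prev, key);
        prev = out[i];
    }
}

void cbc_decrypt(const uint8_t *in, uint8_t *out, size_t n,
                 uint8_t key, uint8_t iv) {
    uint8_t prev = iv;
    for (size_t i = 0; i < n; i++) {
        out[i] = toy_decrypt(in[i], key) ^ prev;
        prev = in[i];
    }
}
```

On a plaintext with repeated blocks, the ECB output repeats as well, while the CBC output does not; this is the main practical weakness of the ECB mode.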
This algorithm is trivial to break if we accept that the characters are ASCII, even if the length of the key is unknown:
1. First discover the key length by a process called counting of coincidences (FRIEDMAN 1920): compare the ciphertext to itself shifted by a given number of bytes, and count the number of identical bytes. If the two blocks of text put face to face have been encoded with the same key bytes, more than 6% of the bytes will be equal. If they have been encoded with different key bytes, fewer than 0.4% of the bytes will be equal. The smallest shift showing a high coincidence rate is the length of the key sought.
2. Then shift the ciphertext by this length and XOR it with the shifted text. This operation removes the key and leaves the XOR of the plaintext with itself shifted. The entropy rate of English is between 1 and 1.5 bits/letter (1.2 for Shannon); that of French is between 1 and 1.8 bits/letter (see chapter 6). There is enough redundancy to choose the correct decryption.
The code in C of this program is in the annex.
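As a hedged illustration of step 1 (distinct from the annex program; the function names and the artificial test text below are ours), the counting of coincidences can be sketched in C as follows:

```c
#include <stddef.h>
#include <stdint.h>

/* Count the bytes that are equal between the ciphertext and a copy of
   itself shifted by `shift` bytes, as a percentage of the compared
   length.  For a shift equal to the key length the rate is noticeably
   higher (around 6% for English text) than for a wrong shift (under
   0.4%). */
double coincidence_rate(const uint8_t *ct, size_t n, size_t shift) {
    size_t hits = 0, len = n - shift;
    for (size_t i = 0; i < len; i++)
        if (ct[i] == ct[i + shift]) hits++;
    return 100.0 * (double)hits / (double)len;
}

/* Return the shift in [1, max_shift] with the highest coincidence
   rate: the smallest such shift is the probable key length. */
size_t probable_key_length(const uint8_t *ct, size_t n, size_t max_shift) {
    size_t best = 1;
    double best_rate = -1.0;
    for (size_t s = 1; s <= max_shift; s++) {
        double r = coincidence_rate(ct, n, s);
        if (r > best_rate) { best_rate = r; best = s; }
    }
    return best;
}
```

On a repeating-key XOR ciphertext, only shifts that are multiples of the key length preserve plaintext coincidences; all other shifts mix different key bytes and drive the rate down.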
This figure is too low to be sure!
3.4.3 Strong ciphers
There are two kinds of strong encryption algorithms: ciphers based on very large prime numbers, and ciphers based only on XOR operations between text and code.
An example of the first case is the R.S.A. (RIVEST, SHAMIR and ADLEMAN), which is a public-key cryptosystem.
Here is the algorithm:
1. Decompose the data into blocks of length equal to the length of the code word.
2. XOR the block (possibly modified by a given encryption) with the code (key or encrypted subkey).
3. Write the encrypted block.
4. Repeat from step 2 for each block.
This scheme is shared by almost all encryption algorithms; the differences come from the generation of the keys used to encrypt or decrypt.
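The XOR step at the heart of this scheme can be sketched as follows (a minimal illustration of ours, not the code of any particular cipher); since (M xor K) xor K = M, the same routine both encrypts and decrypts:

```c
#include <stddef.h>
#include <stdint.h>

/* Generic XOR step shared by many ciphers: each byte of the message is
   combined with the corresponding byte of the (repeating) key by XOR.
   Applying the function twice with the same key restores the data. */
void xor_blocks(uint8_t *data, size_t n, const uint8_t *key, size_t klen) {
    for (size_t i = 0; i < n; i++)
        data[i] ^= key[i % klen];
}
```

In real strong ciphers, the key material fed to this step is not the raw key but round subkeys produced by the key-generation process, which is where the actual strength lies.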
In the R.S.A., it is necessary to generate the codes (2 public codes and 3 secret codes) used to encrypt and decrypt, so the authors had to:
1. choose two very large prime numbers p and q (kept secret);
2. compute n = p * q (made public);
3. choose a public exponent e having no common factor with (p-1)(q-1);
4. compute the secret exponent d such that e * d = 1 mod (p-1)(q-1).
The R.S.A. is based on number theory (see chapter 6) and, in particular, on the difficulty of factoring a number into its prime factors. Its effectiveness lies in the size of these factors. For more details on the R.S.A., one should absolutely read (ADLEMAN 78).
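As a sketch of how the R.S.A. operations run, here is a modular exponentiation routine in C applied to the classic textbook parameters p = 61, q = 53, n = 3233, e = 17, d = 2753 (deliberately tiny primes, the opposite of a secure choice):

```c
#include <stdint.h>

/* Modular exponentiation by squaring: computes (base^exp) mod m.
   This is the single operation behind both RSA encryption
   C = M^e mod n and decryption M = C^d mod n. */
uint64_t mod_pow(uint64_t base, uint64_t exp, uint64_t m) {
    uint64_t result = 1;
    base %= m;
    while (exp > 0) {
        if (exp & 1)
            result = (result * base) % m;
        base = (base * base) % m;
        exp >>= 1;
    }
    return result;
}
```

With n = 3233, e = 17 and d = 2753 (where 17 * 2753 = 1 mod 3120 = (61-1)(53-1)), encrypting a message M < n as C = M^17 mod 3233 and then computing C^2753 mod 3233 restores M.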
PGP (Pretty Good Privacy), by ZIMMERMANN, combines the R.S.A. and the use of very long prime numbers.
In the second case, there is the D.E.S., which we describe in the next paragraph; it works with a private key. (LUCIFER is its ancestor; REDOC II, SNEFRU, KHAFRE, IDEA, LOKI and FEAL are algorithms of the same type, weaker than the D.E.S.)
We see that all of this encryption is based on expansions, reductions and permutations of bits. Apart from the round function, these operations are linear.
L, R: low and high halves of the current text block
Separation of the (56-bit) key into the 16 subkeys (48 bits per round):
C(0), D(0) = PC1(key)
LS: left shift
For each round i:
L(i) = R(i-1)
R(i) = L(i-1) xor f(R(i-1), K(i))
3.5.2.2 Figure (b) - the D.E.S. algorithm
The D.E.S. combines two mathematical techniques: confusion and diffusion (see chapter 6). The round function f applies a substitution to the text (8 S-boxes, or S-tables) followed by a permutation (P-box, or P-table), based on the text and the key.
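This round structure can be sketched on a toy 16-bit Feistel cipher (our own round function f below stands in for the expansion, S-boxes and permutation of the real D.E.S.); as in the D.E.S., running the subkeys in reverse order makes the same routine decrypt:

```c
#include <stdint.h>

#define ROUNDS 4

/* Toy round function f: a small nonlinear mix of the half-block and
   the subkey (a stand-in for DES's S-boxes and permutations). */
static uint8_t f(uint8_t half, uint8_t subkey) {
    uint8_t x = half ^ subkey;
    return (uint8_t)((x << 3) | (x >> 5)) ^ (uint8_t)(x * 7);
}

/* Feistel network on a 16-bit block split into halves L and R:
   L(i) = R(i-1);  R(i) = L(i-1) xor f(R(i-1), K(i)).
   With decrypt != 0 the subkeys are used in reverse order. */
uint16_t feistel(uint16_t block, const uint8_t subkeys[ROUNDS], int decrypt) {
    uint8_t L = (uint8_t)(block >> 8), R = (uint8_t)(block & 0xFF);
    for (int i = 0; i < ROUNDS; i++) {
        uint8_t k = subkeys[decrypt ? ROUNDS - 1 - i : i];
        uint8_t newR = L ^ f(R, k);
        L = R;
        R = newR;
    }
    /* Final swap of the halves, so the same routine inverts itself. */
    return (uint16_t)((uint16_t)R << 8 | L);
}
```

Note that f itself never needs to be invertible: the Feistel structure guarantees that decryption works for any round function, which is what allows the D.E.S. S-tables to be chosen freely for strength.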
3.5.2.3 The following figure presents the synopsis of a round (the function f).
The content of this round is presented differently in the figure of paragraph 3.5.3.1.
Various standards have emerged to standardize the exchange of D.E.S.-encrypted information. The ANSI standard references are X3.92: D.E.S., X3.106: modes of operation, X3.105: network, X9.19: authentication, X9.24: key distribution; the Federal Standard references are 1027 and 1028.
3.5.3 Cryptanalysis
3.5.3.1 The following figure shows the architecture of a round with its S-tables which, unlike the other operations, are more or less half-linear/half-affine. If they were completely linear, the D.E.S. would be very easy to break, but they have been selected to withstand attacks. The subkey bits and those of the expanded text block are added, then substituted through the S-tables, then permuted.
Current research on breaking the D.E.S. without exhaustive search has managed to weaken it, but only a little. The results are in figure 3.5.3.2 and in (SCHNEIER 1996).
                     Exhaustive search   Differential cryptanalysis   Linear cryptanalysis
Chosen plaintexts    2^56                2^47                         -
Known plaintexts     2^56                2^55                         2^43
D.E.S. operations    2^56                2^37                         -
There are two main types of cryptanalysis, differential cryptanalysis and linear cryptanalysis; they are described in paragraph 3.6.
The complete and commented code in C to the D.E.S. is located in Appendix 1.
3.5.4 The physical aspect
The physical aspect is very important for execution speed. VLSI components are very widespread and effective, but there are even more interesting components based on a technology that should not be disregarded: gallium arsenide (GaAs). It has already been used in supercomputers.
The major differences between GaAs and VLSI are:
GaAs (DCFL E/D-MESFET) gate times are less than or equal to 50 picoseconds, while at least a nanosecond is needed in silicon (NMOS).
Access to GaAs RAM memory takes approximately 500 picoseconds, against 10 nanoseconds in silicon. This suggests that the performance of computers based on GaAs technology should be 20 times higher than that of the fastest silicon-based supercomputers. On the other hand, the level of integration of GaAs is about 50,000 transistors per integrated circuit, against 1 million in silicon, because of the problem of heat dissipation. This problem increases the number of GaAs circuits required to design a computer, whereas a high-performance computer should minimize the number of integrated circuits on the motherboard.
Communication between GaAs circuits and the outside is another factor. The problem is the slowdown forced by the other components. However, signal propagation is not very different between silicon and GaAs. The only solution to this exchange-rate problem is to introduce a memory with a multi-level hierarchy; however, none exists for the moment that works with GaAs technology.
Although GaAs technology cannot be fully exploited for the moment, it is certainly a very interesting technology of the future for cryptography, due to its excellent performance. If the CM-2 has an equivalent in gallium arsenide, it is the property of the military.
With regard to the D.E.S., there is a circuit running at 50 MHz that performs an encryption in 20 ns, which makes it possible to compute 50 million ciphers per second.
Since late 1995, AMD sells a circuit encrypting at 250 MHz.
In August 1993, the Canadian Michael J. WIENER described how to build, for one million dollars, a machine that performs an exhaustive search of the D.E.S. keys and finds the right key in 3.5 hours. Each of its basic circuits has a power equivalent to 14 million SUN stations.
See (WIENER 1993) for more details on this machine.
It thus seems obvious that exhaustive search is faster to perform than the other types of cryptanalysis: even if their number of attempts is smaller, their search time is much longer. Cryptanalysis nevertheless remains very interesting for measuring the performance of cryptographic algorithms.
You will find in annex 9 the characteristics of the MasPar and CM-5 machines.
3.6 The cryptanalysis of the D.E.S.
3.6.1 Differential cryptanalysis
This is a chosen-plaintext attack on the rounds of the D.E.S. to find the key (the presentation of the various attacks was made in paragraph 3.3.3). In 1990 and 1991, Eli BIHAM and Adi SHAMIR created differential cryptanalysis; this method looks at the specifics of a pair of ciphertexts produced from a pair of plaintexts with a particular difference.
Differential cryptanalysis analyzes the evolution of these differences as the plaintexts propagate through the rounds of the D.E.S. while being encrypted with the same key.
After randomly choosing pairs of plaintexts with a fixed difference, one computes the difference of the resulting ciphertexts. Using these differences, it is possible to assign probabilities to the various bits of the subkeys. The larger the number of ciphertexts analyzed, the more clearly the most likely encryption key will emerge.
The strength of the D.E.S. resides in its rounds, and all the operations of a round are completely linear except the S-tables (or S-boxes); Eli BIHAM and Adi SHAMIR therefore analyzed the 8 S-tables for the differences between input texts and the differences between output texts. This information is synthesized in 8 tables called difference distribution tables of the D.E.S. (see the 8 tables in annex 3). We implemented the algorithm that generates these tables in figure 3.6.1.1. P is a plaintext, P* is another plaintext, X is the encrypted text of P, X* is the encrypted text of P*, P' is the difference of P and P*, and X' that of X and X*.
Once these tables are generated (pictured in figure 3.6.1.2 on the next page), it is possible to obtain information about B' (B' = B xor B*) as a function of C' (C' = C xor C*). So for a known text difference A' (A' = A xor A*), the combination of A' and C' suggests values for the bits of A xor Ki and A* xor Ki, which gives information on a few bits of the subkey Ki.
With this information, it is possible to do without a large number of chosen plaintexts.
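The table-generation algorithm of figure 3.6.1.1 can be sketched in C on a toy 4-bit S-box (an arbitrary permutation of ours, not one of the 8 tables of the D.E.S.): for each input difference P', count how often each output difference X' occurs over all input pairs.

```c
#include <stdint.h>

#define SBOX_SIZE 16

/* Toy 4-bit S-box: an arbitrary permutation of {0..15}, standing in
   for one of the real (6-bit to 4-bit) DES S-tables. */
static const uint8_t sbox[SBOX_SIZE] = {
    0xE, 0x4, 0xD, 0x1, 0x2, 0xF, 0xB, 0x8,
    0x3, 0xA, 0x6, 0xC, 0x5, 0x9, 0x0, 0x7
};

/* Difference distribution table: ddt[dP][dX] counts how many input
   pairs (P, P* = P xor dP) give output difference
   dX = S(P) xor S(P*). */
void build_ddt(int ddt[SBOX_SIZE][SBOX_SIZE]) {
    for (int dp = 0; dp < SBOX_SIZE; dp++)
        for (int dx = 0; dx < SBOX_SIZE; dx++)
            ddt[dp][dx] = 0;
    for (int dp = 0; dp < SBOX_SIZE; dp++)
        for (int p = 0; p < SBOX_SIZE; p++)
            ddt[dp][sbox[p] ^ sbox[p ^ dp]]++;
}
```

Each row of the table sums to the number of inputs, the row for dP = 0 is concentrated on dX = 0, and every entry is even (the pairs (P, P*) and (P*, P) are counted together); the non-uniform rows are exactly what the attack exploits.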
3.7 Conclusion
In this chapter, we have presented the terminology and a set of points on which it is interesting to apply neuro-cryptography, especially the study of encryption algorithms and their cryptanalysis, and the hardware and software means of implementing cryptography. The study of the D.E.S. and its cryptanalysis with neural network architectures should prove their effectiveness in memorization and in probabilistic search for complex encryption algorithms. The following chapters present the theories and the applications implemented to prove them.
Chapter 4 - Neuro-Cryptography
4.1 Introduction
In this chapter, we define the possible associations between neural networks and cryptography. We then present neuro-cryptography as well as the range of possible applications performing encryption, decryption and cryptanalysis of a chosen algorithm. This chapter also covers the formation of a learning base and the different parameters related to the learning of ciphers, and discusses self-learning in the context of applications controlling a line of communication.
Begin
d =Integer(p * k/n);
return ((p- Integer(d*n/k)) * k) + d;
End
Figure 4.4.3.2 - variable nested loops for the generation of ordained texts
In this case, the body function takes as arguments the values of the loop counters and returns a Boolean value indicating whether or not to exit the loops; b is the value of the current loop. An example of C source code is located in annex 1 (automatic generation of a learning base for the D.E.S.).
4.4.4 The coefficient of learning
This coefficient, generally noted Epsilon and also called the learning rate, allows a more or less rapid learning, with chances of convergence of the network that are inversely proportional to it, because of the local minima of the error curve measured between the learning base and the output values computed by the neural network.
Epsilon should be varied empirically between 0.1 and 2.0. If the network still does not converge, it is certainly due to a non-linearly separable problem, which is the case when learning the XOR. One should then use a momentum term, whose real value lies between 0.1 and 1.0 and whose aim is to avoid the local minima of the error function; that is, it allows the current learning step to take the previous steps into account.
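The weight update with learning rate and momentum term can be sketched as follows (a generic illustration; the constant names EPSILON and ALPHA are ours):

```c
/* Gradient-descent weight update with a momentum term: the current
   step mixes the fresh gradient (scaled by the learning rate EPSILON)
   with the previous step (scaled by ALPHA), which helps the network
   ride over small local minima of the error surface. */
#define EPSILON 0.5  /* learning rate, typically tried between 0.1 and 2.0 */
#define ALPHA   0.9  /* momentum coefficient, between 0.1 and 1.0 */

void update_weights(double *w, double *dw_prev, const double *grad, int n) {
    for (int i = 0; i < n; i++) {
        double step = -EPSILON * grad[i] + ALPHA * dw_prev[i];
        w[i] += step;
        dw_prev[i] = step;  /* remembered for the next iteration */
    }
}
```

When the gradient vanishes (a flat region or a local minimum), the ALPHA * dw_prev term keeps the weights moving in the previous direction for a while, which is exactly the behavior described above.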
4.5 Self-learning
Self-learning can be interesting for the neuronal learning of cryptographic algorithms. The neuronal system consists of two parts, the emulator and the controller, whose learnings are carried out separately.
The task of the emulator is to simulate the complex function or the encryption algorithm. It therefore takes as input its state at a given time and an input at that time, and its output is the output of the algorithm at the following time. Learning is done by presenting a different input at every moment (figure 4.5.1).
Once the learning of the emulator is completed, it is connected to the controller (figure 4.5.2).
To achieve C = A XOR B, we need a network with 16 bits of input (i.e. the 2 bytes A and B) and 8 bits of output (the byte C). The network must therefore have 16 input neurons, at least 16 hidden-layer neurons and 8 output neurons. The learning base consists of 65536 cause-effect pairs.
You can find the C code of this network in annex 1 (the learning coefficient is named EPSILON there). The success rate when learning the XOR is very close to 100%, depending on the random weight initialization and the number of presentations.
The greater the number of input and hidden-layer neurons, the more the number of presentations of the base can be reduced. If the random initialization of the weights is right, a single presentation can be sufficient and of better quality.
The table in annex 8 gives the measured error rate for each presentation.
4.6.2 The learning of cryptographic algorithms
Just as in the previous paragraph, the aim is to learn a function or an algorithm combining input data (causes) into output data (effects).
It is therefore a matter of determining the input and output structures of the network, and of finding a base of causes and associated effects sufficient for the learning of the network to converge towards a minimal, or even near-zero, amount of error.
Any encryption algorithm is composed as in figure 4.6.2.1.
The question that arises is how to make the neural network memorize the algorithm. The only answer is to present virtually all possible encryption keys (e.g. 64 bits) and all possible plaintexts (e.g. 64 bits) as input, and to compute all the resulting ciphertexts with the encryption algorithm.
Thus, the neural network will have synthesized the algorithm: when presented with an encryption key and a plaintext as input, it will output the corresponding ciphertext.
If the encryption algorithm is bijective (that is, if presenting the ciphertext as input yields the plaintext as output), then the encryption algorithm is the same as the decryption algorithm, and the neural network also decrypts.
The learning of a complete algorithm being quite long and requiring the use of parallel machines, it is preferable to use neural networks to synthesize an encryption algorithm with a given key, this algorithm and this key being kept secret, for example by a distributing body.
Chapter 5 - Neuro-cryptanalysis
5.1 Introduction
In this chapter, we present the neuro-cryptanalysis of strong ciphers, the general principle being the search for the key through a study based on neural networks, namely learning the functions linking plaintexts, ciphertexts and keys. Then we describe applications. We present the differential neuro-cryptanalysis and the linear neuro-cryptanalysis of the D.E.S., allowing us to measure the statistical performance of neural networks. A dedicated hardware application is described last.
5.2 Definition
Neuro-cryptanalysis consists in performing the cryptanalysis of cryptographic algorithms using neural networks, i.e. building one or more neural networks to find, or help find, the key of an encryption algorithm.
The reader will find in chapter 3 an introduction to the applications of neuro-cryptanalysis.
We then call neuro-cryptanalyzer a system performing the cryptanalysis of a cryptographic algorithm; this system is a hardware device or a software program containing at least one neural network useful for the cryptanalysis in question.
5.3 General principle
The important principle is the presentation, to the neural network, of a ciphertext and of the encryption algorithm.
In neuro-cryptanalysis, the neural network must help find the encryption key used to produce the ciphertext; figure 5.3.1 shows a possible architecture of a neuro-cryptanalyzer.
For examples of the learning algorithm used and of the realization of this neural network, you can read the C code in annex 1. Figure 5.4.2.1 presents the neuro-cryptanalyzer after learning; it returns information about the probability of each bit of P' being 1. One does not directly obtain probabilities on the bits of the subkey: one simply XORs the bits of the input text pair with those computed, to get information on the bits of the subkey.
The neural network, after 10 presentations of 4096 examples (pairs of texts among the 64 S-table input texts), gives the results contained in the table in annex 6. It suffices to increase the number of presentations to get more accurate probability values. Note that the obtained probabilities exactly match the values given by the classical method of differential cryptanalysis.
The advantages of the neural network are its condensation of the S-table statistics into a single set of matrices, and its massively parallel operation, which allows the 8 S-table neuro-cryptanalyzers to be computed simultaneously.
5.4.3 Linear neuro-cryptanalysis of the D.E.S.
Linear cryptanalysis is described in section 3.6.2.
The neural network will generate all the quadratic forms allowing information on the outputs to be obtained from its inputs, which amounts to a generalized linear cryptanalysis of the D.E.S. Generalized linear cryptanalysis looks for information about the key by studying the rounds of the D.E.S., and more precisely its S-tables, which differs from the global study of the cryptosystem made by our neuro-cryptanalyzer.
As before, one does not directly obtain probabilities on the bits of the subkey: one simply XORs the plaintext bits with those computed, to get information on the bits of the subkey.
The results are given in annex 7. Just increase the number of presentations to get more
accurate probability values.
5.4.4 Global neuro-cryptanalysis of the UNIX crypt(3)
The Unix command crypt(3), or ufc_crypt (ultra fast crypt), is an implementation of the D.E.S. used to encrypt the passwords stored in the /etc/passwd file. It is special in the sense that the key is unknown to the user: no one has the ability to decrypt a password. This key is specific to the Unix system in use. The goal is therefore not to recover the clear password: a submitted clear password is encrypted with the same key and compared with the password from the /etc/passwd file. If they are identical, the user is authenticated and gains access to his own account.
Crack is an application that seeks the passwords of users on a Unix server. Its role is to generate a set of clear passwords on the basis of a multitude of syntactic rules and/or from a dictionary. It takes several hours to several days to penetrate a system, retrieve the password file and search for the other passwords.
We thought it would be interesting to have a neural network learn a certain number of clear passwords and the corresponding encrypted passwords. The learning base should be large enough that the learning of the D.E.S. does not become a mere memorization of the examples of this base, which would make the network unable to find the solutions for other examples close to those of the base.
We have therefore written two applications. One, for UNIX (or GNU Linux), synthesizes the crypt function of Unix for clear passwords of 4 characters whose values are a lowercase letter, a point or a slash, i.e. about 615,000 passwords and 2 hours of computation per presentation. The other, for MS-DOS, learns 1024 clear passwords of 7 characters and the corresponding encrypted passwords of 11 characters (we remove the first 2 characters of salt used to re-encrypt the encrypted password, which yields 65536 different encrypted passwords for the same clear text).
We added to the first application a program for visualizing statistics graphically; the second provides quick information.
The sources and the results are available in the annex.
5.5 Analysis of the results of cryptanalysis
Differential and linear neuro-cryptanalysis are probabilistic calculation methods for quickly getting information about a part of the D.E.S. They make it possible to compute the inverse function of an S-table, for a chosen difference of texts in one case and for a linear relationship with a selected subkey in the other. The learning of such neural networks is very fast.
For a given method, differential or linear, it is possible to gather 8 x 16 = 128 neural networks (one for each S-table of each round) and to operate them in parallel on the information given by the ciphertext output of the D.E.S. and the plaintext input. These networks may thus supervise other, unsupervised-learning neural networks that amend the bits of the key as the different texts pass through the D.E.S. This would be a self-learning of the subkeys; from the subkeys, we find the encryption key.
The statistical analysis of the results of the MS-DOS version is surprising: 90% of the encryption function is found by the neural network for the learning base, and about 80% of the bits are correct for examples close to this base but not submitted to the network. This proves that, for a small learning base, it is easy for a neural network to find a clear password from an encrypted password, without taking into account the salt included by the Unix system.
5.6 Hardware implementations
There are two possible hardware implementations. One is based on existing architectures and consists more precisely of an implementation on a massively parallel machine of the MasPar or Connection Machine type (the characteristics of these machines are given in annex 9).
The other is based on the design of an architecture dedicated to the cryptanalysis of the encryption algorithm.
5.6.1 Dedicated Machine
The idea is to present a strong cipher to a very fast supervised-learning neuro-cryptanalyzer. As we showed in paragraph 4.6.2, it is necessary to present all plaintexts, ciphertexts and keys to the neural network. Figure 5.6.1.1 shows the overview of the learning dedicated to an encryption algorithm.
A complete machine can be constructed on this pattern with a large number N of units made of binary counters (120 bits: 64 bits of text and 56 bits of key) and of circuits implementing the encryption algorithm (for the D.E.S., AMD has built a gallium arsenide circuit with a clock frequency of 250 MHz, about 5.10^9 encryptions per second). The number N is limited by the learning time of the single neural circuit, approximately 1 µs. Each unit takes less than 14 ns.
For the D.E.S., the time interval between each unit is necessarily 1 µs, which gives 10^6 learnings per second for the 2^56 possible keys, i.e. 10^30 s for all possible values of text and key, or 422 years per presentation. If the neural circuit took 14 ns, 318 years would be needed.
In the case of a single key, it would take 41 years, and for a single text, 2 months, while the exhaustive search for a key takes 3.5 hours on a dedicated non-neuronal machine costing 5 million francs.
Nevertheless, it is possible that the neural circuits of the future will go much faster. For the D.E.S., it is preferable to treat a fixed data subset, as we did in paragraph 5.4.4.
Repeat forever
  generate key & texte_clair in M
  For i = 0 to NB_CACHES-1 do send M to all layer-2 processors end
  encrypt M with the encryption algorithm into C
  For i = 0 to NB_SORTIES-1 do send C to all layer-3 processors end
End repeat
Figure 5.6.2.1 - algorithm of the first layer of processors
We define a small macro for the following algorithm: bit(i, m) { return !!(m & (1 << i)); }

Repeat forever
  output = 0.0
  For i = 0 to NB_ENTREES-1 do
    if bit(i, M) then output += poids_cachee[i]; tempo[i] = M
  end
  activation_cachee = sigmoid(output - seuil_cachee)
  For i = 0 to NB_SORTIES-1 do transmit activation_cachee to layer 3 end
  error = 0.0
  For i = 0 to NB_SORTIES-1 do
    receive M from layer 3            /* poids_sortie for this hidden neuron */
    error = error + receive M from layer 3   /* delta_sortie */
  end
  delta_cachee = error * activation_cachee * (1 - activation_cachee)
  For i = 0 to NB_ENTREES-1 do
    poids_cachee[i] = poids_cachee[i] + EPSILON * delta_cachee * tempo[i]
  end
  seuil_cachee = seuil_cachee - EPSILON * delta_cachee
End repeat
Figure 5.6.2.2 - algorithm of the second layer of processors
PSPACE: problems that can be solved in polynomial space and unbounded time.
PSPACE-complete: the hardest problems of PSPACE; any PSPACE problem reduces to them.
EXPTIME: problems that can be solved in exponential time.
If n is prime, φ(n) = n-1, and if n = p * q where p and q are prime, then φ(n) = (p-1) * (q-1).
Given gcd(a, n) = 1 and (a * x) mod n = b, to compute x:
- by Euler's generalization: x = (b * a^(φ(n)-1)) mod n
- by Euclid's algorithm: x = (b * inverse(a, n)) mod n.
See (SCHNEIER 1995, pages 212-213).
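The computation of inverse(a, n) by Euclid's algorithm can be sketched in C as follows (the extended Euclid construction; the function names are ours):

```c
#include <stdint.h>

/* Extended Euclid: returns the inverse of a modulo n (assuming
   gcd(a, n) = 1), i.e. the x such that (a * x) mod n = 1. */
int64_t mod_inverse(int64_t a, int64_t n) {
    int64_t r0 = n, r1 = a % n, t0 = 0, t1 = 1;
    while (r1 != 0) {
        int64_t q = r0 / r1, tmp;
        tmp = r0 - q * r1; r0 = r1; r1 = tmp;
        tmp = t0 - q * t1; t0 = t1; t1 = tmp;
    }
    return (t0 % n + n) % n;  /* normalize into [0, n) */
}

/* Solve (a * x) mod n = b as in the text:
   x = (b * inverse(a, n)) mod n. */
int64_t solve_linear(int64_t a, int64_t b, int64_t n) {
    return (b % n) * mod_inverse(a, n) % n;
}
```

For example, with a = 3 and n = 7 the inverse is 5 (since 3 * 5 = 15 = 1 mod 7), so the solution of (3 * x) mod 7 = 4 is x = (4 * 5) mod 7 = 6.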
The Chinese remainder theorem
Given a and b such that a < p and b < q (p and q prime), there is a unique x such that x < p * q, with x = a mod p and x = b mod q.
By Euclid, compute u such that u * q = 1 mod p, which gives us x = (((a - b) * u) mod p) * q + b.
Details and C code in (SCHNEIER 1995, pages 213-214).
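A minimal C sketch of this formula (with a naive search for u, workable only for small p; the function name is ours):

```c
#include <stdint.h>

/* Chinese remainder theorem for two prime moduli p and q: find the
   unique x < p*q with x = a (mod p) and x = b (mod q), using the
   formula u*q = 1 (mod p), x = (((a - b) * u) mod p) * q + b. */
int64_t crt(int64_t a, int64_t b, int64_t p, int64_t q) {
    /* find u with (u * q) mod p = 1 by brute force (p is small here;
       extended Euclid would be used for real sizes) */
    int64_t u = 1;
    while ((u * q) % p != 1) u++;
    int64_t t = ((a - b) % p + p) % p;  /* keep the difference positive */
    return (t * u % p) * q + b;
}
```

Checking the formula: x mod q = b by construction, and x mod p = (a - b) * (u * q mod p) + b = (a - b) + b = a.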
Quadratic residues modulo p
If p is prime and a < p, then a is a quadratic residue modulo p if x² = a mod p for some x.
The LEGENDRE symbol
It is noted L(a, p) or (a/p), with a a natural number and p a prime > 2. We then have:
L(a, p) = 0 if a is divisible by p
L(a, p) = 1 if a is a quadratic residue modulo p
L(a, p) = -1 if a is not a quadratic residue modulo p
To compute it, we have the formula L(a, p) = a^((p-1)/2) mod p.
There are also the following recursive expressions:
if a = 1, L(a, p) = 1
if a is even, L(a, p) = L(a/2, p) * (-1)^((p*p-1)/8), else L(a, p) = L(p mod a, a) * (-1)^((a-1)*(p-1)/4)
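The formula L(a, p) = a^((p-1)/2) mod p can be sketched in C with a square-and-multiply exponentiation, mapping the result p-1 to -1:

```c
#include <stdint.h>

/* Modular exponentiation by squaring, for Euler's criterion. */
static uint64_t mod_pow(uint64_t base, uint64_t exp, uint64_t m) {
    uint64_t result = 1;
    base %= m;
    while (exp > 0) {
        if (exp & 1)
            result = (result * base) % m;
        base = (base * base) % m;
        exp >>= 1;
    }
    return result;
}

/* Legendre symbol L(a, p) for an odd prime p, by the formula
   L(a, p) = a^((p-1)/2) mod p: returns 0, 1 or -1. */
int legendre(uint64_t a, uint64_t p) {
    uint64_t r = mod_pow(a % p, (p - 1) / 2, p);
    if (r == 0) return 0;
    return (r == 1) ? 1 : -1;
}
```

For instance, modulo 7 the quadratic residues are {1, 2, 4}: 2^3 mod 7 = 1 gives L(2, 7) = 1, while 3^3 mod 7 = 6 = 7 - 1 gives L(3, 7) = -1.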
The JACOBI symbol
Noted J(a, n), it is a generalization of L(a, n). To compute it:
if n is prime, J(a, n) = 1 if a is a quadratic residue modulo n
J(a, n) = -1 if a is not a quadratic residue modulo n
if n = p1 * ... * pm (the pi being the prime factors of n), then J(a, n) = L(a, p1) * ... * L(a, pm).
Number field sieve: the number of operations is e^((ln n)^(1/3) * (ln(ln n))^(2/3)); see (LENSTRA 1993).
Elliptic curve methods: see (MONTGOMERY 1987) and (MONTGOMERY 1990).
POLLARD's Monte Carlo algorithm: see (POLLARD 1975), (BRENT 1980), (KNUTH 1981, page 370).
Continued fraction algorithm: see (KNUTH 1981, pages 381-382).
Trial division: division of the number by all smaller primes.
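The last method can be sketched in C (trial division by every candidate divisor up to the square root of n; the function name is ours):

```c
#include <stdint.h>

/* Trial division: factor n by dividing by every candidate divisor up
   to sqrt(n).  The prime factors are written into `factors` in
   increasing order; the count of factors is returned. */
int trial_division(uint64_t n, uint64_t *factors) {
    int count = 0;
    for (uint64_t d = 2; d * d <= n; d++)
        while (n % d == 0) {
            factors[count++] = d;
            n /= d;
        }
    if (n > 1)             /* what remains is itself prime */
        factors[count++] = n;
    return count;
}
```

This is only practical for small numbers; it is precisely its exponential cost in the size of n that makes the factorization-based ciphers above strong.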
Chapter 7 - Conclusion
We have presented neural networks, and defined and determined which neural network model is the most appropriate for cryptography, both on the algorithmic learning side and in material terms, as regards the architectures already built and their observed performance.
The most interesting connectionist model turns out to be the network of perceptrons with back-propagation of the gradient, through the various properties analyzed and demonstrated by different scientists:
This architecture can be realized in software as well as in hardware. Neural networks have already been implemented on massively parallel machines.
An analysis of linear multilayer networks showed us the analogies with different statistical methods of data analysis, in particular linear regression and discriminant analysis. It has been shown that back-propagation performs a discriminant analysis of a population of N individuals (N being the number of examples included in the learning) described by n parameters (where n is the number of input neurons) and projected onto a hyperplane of dimension p (where p is the number of hidden units). It is therefore possible to handle non-linearly separable problems and to build a classifier or a probabilistic model, which proves the interest of such an algorithm in cryptography and especially in cryptanalysis.
On the hardware side, the benefits of VLSI components are:
ease of use
a high signal-to-noise ratio
easy cascading of circuits
high adaptability (these circuits can solve various tasks)
a reduced manufacturing cost
We then presented the three types of components existing on the market or in research laboratories:
1. components dedicated to digital neural networks, whose speeds go up to 1 billion connections processed per second.
2. special-purpose digital coprocessors (also called neuro-accelerators), special circuits that can be connected to host machines (PCs or workstations) and that work with a neuro-simulator program. The mix of hardware and software gives these benefits: increased speed, flexibility and an improved user interface.
3. neural networks on massively parallel machines.
An implementation of the algorithm has been developed on the Connection Machine CM-2 (built by THINKING MACHINES Corp.), with a hypercube topology of 64K processors, which gave 180 million interconnections computed per second (IPS), or 40 million weight updates per second.
Here are the performances measured per machine, in interconnections computed per second (table below).
CM-2 : 180 million
CRAY X-MP : 50 million
WARP (10) : 17 million
ANZA PLUS : 10 million
The use of such configurations would make it possible to obtain excellent results in the learning of cryptographic ciphers.
We have seen that cryptography is a very large area, popular among mathematicians and computer scientists. We saw that the strength of a cryptosystem depends entirely on the key used, whether public or private, and on the cryptographic exchange protocols. We chose to focus on the realization of neuro-cryptography and neuro-cryptanalysis of cryptosystems.
Our work specifically concerned the ECB mode, which is the best suited to learning by neural networks, with a fixed number of input and output bits and no feedback loop. It is also possible to connect one or more neural networks in this way.
We chose to tackle the D.E.S. because it is the oldest encryption standard and the most studied algorithm.
The physical aspect is very important for execution speed. VLSI components are widespread and effective, but there are even more interesting components based on a technology that should not be disregarded: Gallium Arsenide (GaAs). It has already been used in supercomputers.
With regard to the D.E.S., there is a circuit running at 50 MHz that performs an encryption in 20 ns, which makes it possible to compute 50 million ciphers per second. Since late 1995, AMD has been selling a circuit encrypting the D.E.S. at 250 MHz.
In August 1993, the Canadian Michael J. WIENER described how to build, for one million dollars, a machine that performs an exhaustive search of the D.E.S. keys and finds the right key in 3.5 hours. Each of its basic circuits has a power equivalent to 14 million SUN workstations.
We analyzed the two most successful cryptanalyses against the D.E.S.
Differential cryptanalysis consists of looking at the particularities of a pair of ciphertexts obtained for a pair of plaintexts with a particular difference.
The strength of the D.E.S. residing in its rounds, and all the operations of a round being completely linear except the S-tables, Eli BIHAM and Adi SHAMIR analyzed the 8 S-tables for differences of input texts and differences of output texts; this information is synthesized in 8 tables called difference distribution tables of the D.E.S. (see the 8 tables in annex 3). We implemented the algorithm to generate these tables.
Linear cryptanalysis consists of studying the statistical linear relations between bits of the plaintext, bits of the ciphertext and bits of the key used for the encryption. These relations give the values of some bits of the key when the plaintexts and the associated ciphertexts are known. The linear relations of each S-table are deduced by choosing a subset of input bits and output bits and computing the parity (XOR) of these bits; the relation is linear when this parity is zero. In general, some subsets will have parity 0 (linear) and others parity 1 (affine). MATSUI calculated the number of zero parities of each subset of input and output bits for each S-table, among the 64 x 16 = 1024 possible subsets. It is thus possible to associate probabilities with the various bits of the subkeys. The probabilities of obtaining a zero parity (a linear relation) are synthesized in 8 tables called linear approximation tables of the D.E.S. (see the 8 tables in annex 4). We implemented the algorithm to generate these tables.
After showing the possible association between neural networks and cryptography, we defined the field of neuro-cryptography.
We then identified some important points for the correct use of neural networks. The way the training set is generated is very important for the realization of neural applications. Learning depends on the random initialization of the weights of the network, as well as on the number of examples, the order of presentation of these examples, and the consistency of the choice of the set of examples.
We have seen that an example consists of a value to be presented at the input of the neural network and a value to be presented at its output, the output depending on the input value. If the number of examples is too low, it is clear that the network will not seek a transfer function of the studied cryptosystem but will instead memorize the given examples, and will therefore be unable to find a result for an input value different from those given in the example base. In cryptography, more than half of all possible examples must be presented to be certain of the results, even if it is true that in strong cryptography the number of possible input values is very large.
We then implemented an algorithm to present the examples in a more or less shuffled order. It consists of cutting the base into k sub-bases, then presenting in turn the elements of each of the sub-bases (k may be even or odd). The following table shows the final error rate TSS for different values of k (the number of presentations being fixed at 500, with 256 examples).
We note that the order of presentation of the training base does not matter.
(Table: final TSS error rate for the eight tested values of k: 0.05, 0.06, 0.06, 0.05, 0.08, 0.07, 0.05, 0.08.)
Concerning the automatic generation of contiguous texts, we presented an algorithm that can generate clear examples whatever the number of nested loops, with a single loop body executed on each iteration of the innermost loop.
We analyzed the learning coefficient, which allows a more or less rapid learning, the chances of convergence of the network towards a solution being inversely proportional to it because of the local minima of the error curve measured between the training base and the output values computed by the neural network.
Epsilon should be varied empirically between 0.1 and 2.0. If the network still does not converge, it is certainly due to a non-linearly separable problem, which is the case when learning the XOR. One should then use a momentum term, a real value between 0.1 and 1.0, whose aim is to avoid the local minima of the error function: it allows the current learning step to take the previous steps into account.
We presented self-learning, which is interesting for the neural learning of cryptographic algorithms. The neural system has two parts, the emulator and the controller, whose learning phases are carried out separately.
The task of the emulator is to simulate the complex function or the encryption algorithm. It therefore receives as input the state at a given time and an input value at that time, and its output is the output of the algorithm at the following time. The input of the controller is the state of the system at time k; its output is the value to be presented as input to the algorithm or complex function.
The proper role of the controller is to learn the adaptive control law. But for this learning, the error signal is not computed on the command but on its result, the gap between the desired state and the current state. This amounts to a guided rather than supervised learning, since no teacher teaches the controller the control law. In fact, the system teaches itself by processing the information it receives in return for its actions. To make learning by backpropagation possible and to backpropagate the error on the position, the structure of the emulator must be homogeneous with that of the controller.
Another quality of this device is its capacity for on-line learning. The learning of the controller is fast. In addition, the synthesized control law is sufficiently robust to small random perturbations. It is therefore possible to implement self-learning neural networks on a communication line, for encryption as well as for real-time authentication of messages.
We have shown how to make the neural network memorize the algorithm. The answer is to present virtually all possible encryption keys (e.g. 64 bits) and all possible plaintexts (e.g. 64 bits) as input, and to compute all the resulting ciphertexts with the encryption algorithm. The neural network will then have synthesized the algorithm: when an encryption key and a plaintext are presented at its input, it will give us the corresponding ciphertext at its output. If the encryption algorithm is bijective, then the encryption algorithm is the same as the decryption algorithm and the neural network also decrypts.
We have seen that, with regard to key learning, an encryption key must be linked to an encryption or decryption algorithm and to a plaintext or ciphertext. If the key has a fixed size of N bits, then the neural network has N output bits and M input bits, M being equal to twice the number of bits of the plaintext and ciphertext blocks.
In fact, the neural network realizes a function that finds the key directly from a plaintext and its ciphertext.
We then presented the advantages and disadvantages of the neural methods used. The learning time of neural networks remains rather long, depending on the number of bits of the key and of the clear and encrypted texts; this time can be optimized if the neural network is implemented on a parallel machine.
As regards the memorization of keys and ciphers, neural networks are high achievers, with over 90% success in the learning of weak ciphers. For a strong encryption algorithm, learning is no longer so rapid. Neural networks are used extensively in image recognition, so they are easy to realize.
In neuro-cryptanalysis, the neural network helps to find the encryption key used in the ciphertext. Since a neural network can learn a cryptographic algorithm or can 'remember' (by function approximation) a set of keys, we infer that the neuro-cryptanalyzer can be broken down into 2 neural subnetworks, as in the figure on the next page. This neural network structure is identical to that of self-learning. It is clear that neural networks must take an important place in cryptography, in the design, use and verification of protocols.
stored in the /etc/passwd file. It is a little special in the sense that the key is unknown to the user; no one has the ability to decrypt a password. This key is specific to the Unix system in use. We thought it would be interesting to have a neural network learn a certain number of clear passwords and the corresponding encrypted passwords. The training base should be large enough that the learning of the D.E.S. does not become a memorization of the examples of this base, which would leave the network unable to find the solutions for other examples close to those of the database.
We have therefore made two applications. One, for UNIX (or GNU Linux), synthesizes the crypt function of Unix for clear passwords of 4 characters whose values are a lowercase letter, a period or a slash, i.e. about 615,000 passwords, with 2 hours of computation per presentation. The other, for MS-DOS, learns 1024 clear passwords of 7 characters and encrypted passwords of 11 characters (we remove the first 2 characters of salt used to re-encrypt the encrypted password, giving 65,536 different encrypted passwords for the same plaintext).
We added visualization programs for the statistics: the first is graphical, the second provides quick information.
The results of the statistical analysis of the MS-DOS version are surprising, with 90% of the encryption function found by the neural network on the training base, and about 80% of the bits found on examples close to this base but not presented to the network. This proves that, for a small training base, it is easy for a neural network to find a clear password from an encrypted password, without taking into account the salt included by the Unix system.
We then proposed two implementations on two types of hardware architectures. The first is a dedicated parallel architecture, since a neuro-cryptanalyzer of strong ciphers needs a very fast supervised learning: it is necessary to present all the plaintexts, ciphertexts and keys to the neural network.
A complete machine has been studied, with a large number of binary counter units and circuits implementing the encryption algorithm. This number is limited by the learning time of the single neural circuit, approximately 1 s. For the D.E.S., it is preferable to treat a fixed subset of the data, as we did in the previous applications.
In the second, we presented our algorithms written for the distributed architecture of the CM-5, using 3 layers of processors with one processor per neuron. The first layer is used to initialize the input (plaintext) and the output (ciphertext) of the neural network located on layers 2 and 3. It is likely that the learning time per example is longer than for the dedicated machine of the preceding paragraph.
Neuro-cryptography and neuro-cryptanalysis are two very interesting and helpful areas for cryptography. We hope that various studies and research will be carried out to refine all of our conclusions.
The software and hardware applications that we have studied or made can be implemented and optimized.
The results obtained are very promising for the future of neural networks.
Bibliography
Neural networks
(Aleksander) I. Aleksander, H. Morton, An introduction to neural computing, Editions
CHAPMAN & HALL
(Alippi 1990a) C. Alippi, S. Bonfanti, G. Storti-Gajani, " Some simple bounds for approximations of sigmoidal functions in layered neural nets ", Report n°90-022, Dipartimento di Elettronica, Politecnico di Milano, 1990, pages 1-25
(Alippi 1990b) C. Alippi, S. Bonfanti, G. Storti-Gajani, " Approximating sigmoidal functions for VLSI implementations of neural nets ", Proceedings MicroNeuro'90, 1990, pages 165-170
(Beiu 1995a) V. Beiu and J.G. Taylor, " Optimal mapping of neural networks onto FPGA ", Lecture Notes in Computer Science: Proceedings of the Intl. Workshop on Artificial Neural Networks (IWANN95), Springer-Verlag, Málaga, Spain, 1995, pages 822-829
(Beiu 1995b) V. Beiu, " Constant fan-in neural networks are VLSI optimal ", First Intl. Conf. on Mathematics of Neural Networks and Applications (MANNA95), Oxford, UK, 1995
(Bourret 1991) P. Bourret, J. Reggia, M. Samuelides, RESEAUX NEURONAUX - Une approche connexionniste de l'intelligence artificielle, Editions TEKNEA, Toulouse, October 1991, ISBN 2-87717-016-0
(Camargo 1990) F.A. Camargo, " Learning algorithms in neural networks ", DCC
Laboratory, Columbia University, NY, 1990.
(GALLINARI 1988) GALLINARI P., FOGELMAN-SOULIE F., " Progressive Design of M.L.P. Architecture ", Neuro-Nîmes, pages 171-182, 1988
(Grossberg 1986) Carpenter, Grossberg, " Neural dynamics of category learning and
recognition in brain structure, learning and memory ", AAAS Symposium Series, 1986
(Hebb 1975) Hebb, The organization of the behavior, JOHN WILEY & SONS, NY, 1975
(Hopfield 1982) Hopfield, " Neural networks ", Proc. National Academy of Sciences USA, vol. 79, April 1982, pages 2554-2558
(Kohonen 1984) Kohonen, Self organization and associative memory, SPRINGER
VERLAG, BERLIN, 1984
(Lippman 1987) Lippman, " Introduction to computing with neural nets ", IEEE ASSP MAGAZINE, April 1987, pages 4-22
(Maren) A.J. Maren, Handbook of neural computing applications, Editions ACADEMIC
PRESS INC.
(McCulloch 1943) McCulloch and Pitts, " A logical calculus of the ideas immanent in nervous activity ", BULLETIN OF MATHEMATICAL BIOPHYSICS, vol. 5, 1943, pages 115-133
(Nigri 1991) M.E. Nigri, " Hardware emulation of backpropagation neural nets ", Research Notes RN/91/21, Department of Computer Science, University College London, February 1991
(Rosenblatt 1959) Rosenblatt, Principles of neurodynamics, SPARTAN BOOKS, NY, 1959
(Rumelhart 1986) Rumelhart, McClelland, Parallel distributed processing exploration in
the micro-structure of cognition, 2 Volumes, MIT PRESS, 1986
(Weisbuch 1989) Weisbuch, Dynamique des systèmes complexes, INTEREDITIONS, 1989
Cryptography
(ADLEMAN 1978) RIVEST, SHAMIR, ADLEMAN, " A method for obtaining digital signatures and public key cryptosystems ", CACM, vol. 21, n°2, pages 120-126, February 1978
(BIHAM 1990) E. BIHAM and A. SHAMIR, " Differential cryptanalysis of DES-like cryptosystems ", Advances in Cryptology - CRYPTO'90 Proceedings, Editions Springer-Verlag, Berlin, 1990, pages 2-21
(BIHAM 1993a) E. BIHAM and A. SHAMIR, Differential cryptanalysis of the Data Encryption Standard, Editions Springer-Verlag, Berlin, 1993
(BIHAM 1993b) E. BIHAM and A. SHAMIR, " Differential cryptanalysis of the full 16-round DES ", Advances in Cryptology - CRYPTO'92 Proceedings, Editions Springer-Verlag, Berlin, 1993
(Diffie 1992) W. Diffie, The first ten years of public key cryptography - Contemporary cryptology: The science of information integrity, IEEE Press, Piscataway, NJ, 1992, pages 65-134
(Friedman 1920) W.F. Friedman, " The index of coincidence and its applications in
cryptography ", RIVERBANK PUBLICATION, N22, Riverbank Labs, 1920
(Harpes 1995) C. Harpes, G. Kramer and J.L. Massey, " A generalization of Linear Cryptanalysis and the Applicability of Matsui's Piling-Up Lemma ", Advances in Cryptology - EUROCRYPT '95, Lecture Notes in Computer Science, vol. 921, Springer-Verlag, New York, 1995, pages 24-38
(MATSUI 1994) M. MATSUI, " Linear cryptanalysis method for DES cipher ", Advances in Cryptology - EUROCRYPT'93 Proceedings, Editions Springer-Verlag, Berlin, 1994
(Meier 1994) W. Meier, " On the security of the IDEA block cipher ", Advances in Cryptology - EUROCRYPT '93, Lecture Notes in Computer Science, vol. 765, Springer-Verlag, Berlin, 1994, pages 371-385
(Pointcheval 1995) David Pointcheval, " Les réseaux de neurones et leurs applications cryptographiques ", LIENS 95-2, Mémoire effectué au département Mathématiques et Informatique, Ecole Normale Supérieure, PARIS, February 1995
(SCHNEIER 1994) B. SCHNEIER, Applied cryptography, Editions JOHN WILEY & SONS INC., U.S., 1994
(SCHNEIER 1995) B. SCHNEIER, Cryptographie appliquée, Editions INTERNATIONAL THOMSON PUBLISHING, Paris, 1995, ISBN 2-84180-000-8
(SCHNEIER 1996) B. SCHNEIER, " Differential and linear cryptanalysis: attacking the Data Encryption Standard ", DOCTOR DOBBS JOURNAL, US, January 1996, page 42
(WIENER 1993) M.J. WIENER, " Efficient DES key search ", Bell-Northern Research, P.O. Box 3511 Station C, Ottawa, Ontario, K1Y 4H7, CANADA
Mathematics
(Brent 1980) R.P. Brent, " An improved Monte Carlo factorization algorithm ", BIT, vol. 20, 1980, pages 176-184
(Kranakis 1986) E. Kranakis, Primality and cryptography, Editions WILEY-TEUBNER Series in Computer Science, 1986
(KNUTH 1981) D. KNUTH, The art of computer programming, Volume 2 - Seminumerical Algorithms, Editions Addison-Wesley, Reading, MA, 2nd edition, 1981
(Lenstra 1993) A.K. Lenstra, H.W. Lenstra, M.S. Manasse, J.M. Pollard, " The factorization of the ninth FERMAT number ", Mathematics of Computation, vol. 67, n°20, July 1993, pages 319-350
(Montgomery 1987) P. Montgomery, " Speeding the Pollard and elliptic curve methods of factorization ", Mathematics of Computation, vol. 48, n°177, January 1987, pages 243-264
(Montgomery 1990) P. Montgomery, R. Silverman, " An FFT extension to the P-1 factoring algorithm ", Mathematics of Computation, vol. 54, n°190, 1990, pages 839-854
(Pollard 1975) J.M. Pollard, " A Monte Carlo method for factorization ", BIT, vol. 15, 1975, pages 331-334
(Pomerance 1985) C. Pomerance, " The quadratic sieve factoring algorithm ", Advances in Cryptology: Proceedings of EUROCRYPT 84, Editions Springer-Verlag, Berlin, 1985, pages 169-182
(Pomerance 1988) C. Pomerance, J.W. Smith, R. Tuler, " A pipe-line architecture for factoring large integers with the quadratic sieve factoring algorithm ", SIAM Journal, vol. 17, n°2, April 1988, pages 387-403
(Wunderlich 1983) M.C. WUNDERLICH, " Recent advances in the design and implementation of large integer factorization algorithms ", Proceedings of the 1983 Symposium on Security and Privacy, IEEE Press, Piscataway, NJ, 1983, pages 67-71
Annexes
1 - C source code
Gradient backpropagation neural network
(XOR learning on 8 bits)
/*
XOR.C : 8 bits (8:input1, 8:input2, 8:output) - Learning with a momentum term
Sébastien DOURLENS - V1.00 - BORLAND C 3.1 compiler for DOS
*/
#include <stdio.h>
#include <math.h>
#include <stdlib.h>
#include <time.h>
#include <conio.h>   /* getch() (Borland) */
#define NB_ENTREES 16          /* network inputs */
#define NB_CACHEES 16          /* hidden neurons */
#define NB_SORTIES 8           /* output neurons */
#define NB_EXEMPLES 65536L
#define NB_PRESENTATIONS 500
#define EPSILON 0.9            /* learning coefficient */
#define MOMENTUM 0.3
#define MAX_ALEA 0.3
#define MIN_ALEA -0.3
#define TYPE_REEL double
/* global variables of the network (declarations reconstructed from their use below) */
TYPE_REEL activations_entrees[NB_ENTREES];
TYPE_REEL activations_cachees[NB_CACHEES], seuils_cachees[NB_CACHEES];
TYPE_REEL activations_sorties[NB_SORTIES], seuils_sorties[NB_SORTIES];
TYPE_REEL activations_apprentissage[NB_SORTIES];
TYPE_REEL poids_cachees[NB_CACHEES][NB_ENTREES], delta_poids_cachees[NB_CACHEES][NB_ENTREES];
TYPE_REEL poids_sorties[NB_SORTIES][NB_CACHEES], delta_poids_sorties[NB_SORTIES][NB_CACHEES];
TYPE_REEL delta_cachees[NB_CACHEES], delta_sorties[NB_SORTIES];
TYPE_REEL delta_seuils_cachees[NB_CACHEES], delta_seuils_sorties[NB_SORTIES];
/****************************************************************/
/* transfer function of the neuron */
TYPE_REEL sigmoid(TYPE_REEL x)
{
return( 1.0/(1.0+exp(-1.0*x)) );
}
/****************************************************************/
/* returns a random number between min and max */
TYPE_REEL calculer_nb_aleatoire(TYPE_REEL min,TYPE_REEL max)
{
TYPE_REEL r=((rand()*(max-min))/32767.0)+min;
return r;
}
/****************************************************************/
/* randomly initializes the weights of each connection */
void initialiser_poids(void)
{
register int i,j,k;
for(k=0;k<NB_SORTIES;k++) {
for(j=0;j<NB_CACHEES;j++) poids_sorties[k][j]=calculer_nb_aleatoire(MIN_ALEA,MAX_ALEA);
seuils_sorties[k]=calculer_nb_aleatoire(MIN_ALEA,MAX_ALEA);
}
for(j=0;j<NB_CACHEES;j++) {
for(i=0;i<NB_ENTREES;i++) poids_cachees[j][i]=calculer_nb_aleatoire(MIN_ALEA,MAX_ALEA);
seuils_cachees[j]=calculer_nb_aleatoire(MIN_ALEA,MAX_ALEA);
}
}
/****************************************************************/
/* computes the activations of the hidden and output neurons */
void calculer_sorties(void)
{
register int i,j,k;
TYPE_REEL net;
for(j=0;j<NB_CACHEES;j++) {
net=0.0;
for(i=0;i<NB_ENTREES;i++) net+=activations_entrees[i]*poids_cachees[j][i];
activations_cachees[j]=sigmoid(net-seuils_cachees[j]);
}
for(k=0;k<NB_SORTIES;k++) {
net=0.0;
for(j=0;j<NB_CACHEES;j++) net+=activations_cachees[j]*poids_sorties[k][j];
activations_sorties[k]=sigmoid(net-seuils_sorties[k]);
}
}
/****************************************************************/
/* computes the deltas between the training values and the current values */
void calculer_deltas(void)
{
register int j,k;
TYPE_REEL erreur;
for(k=0;k<NB_SORTIES;k++) {
erreur=activations_apprentissage[k]-activations_sorties[k];
delta_sorties[k]=erreur*activations_sorties[k]*(1.0-activations_sorties[k]);
}
for(j=0;j<NB_CACHEES;j++) {
for(erreur=0.0,k=0;k<NB_SORTIES;k++) erreur+=delta_sorties[k]*poids_sorties[k][j];
delta_cachees[j]=erreur*activations_cachees[j]*(1.0-activations_cachees[j]);
}
}
/****************************************************************/
/* modifies the connection weights according to the measured deltas */
void changer_poids(void)
{
register int i,j,k;
for(k=0;k<NB_SORTIES;k++) {
for(j=0;j<NB_CACHEES;j++) {
delta_poids_sorties[k][j]=EPSILON*delta_sorties[k]*activations_cachees[j]+
MOMENTUM*delta_poids_sorties[k][j];
poids_sorties[k][j]+=delta_poids_sorties[k][j];
}
delta_seuils_sorties[k]=-1.0*EPSILON*delta_sorties[k]+MOMENTUM*delta_seuils_sorties[k];
seuils_sorties[k]+=delta_seuils_sorties[k];
}
for(j=0;j<NB_CACHEES;j++) {
for(i=0;i<NB_ENTREES;i++) {
delta_poids_cachees[j][i]=EPSILON*delta_cachees[j]*activations_entrees[i]+
MOMENTUM*delta_poids_cachees[j][i];
poids_cachees[j][i]+=delta_poids_cachees[j][i];
}
delta_seuils_cachees[j]=-1.0*EPSILON*delta_cachees[j]+MOMENTUM*delta_seuils_cachees[j];
seuils_cachees[j]+=delta_seuils_cachees[j];
}
}
/****************************************************************/
/* measures the Pss error (sum of squared differences) over the output neuron activations */
TYPE_REEL calculer_pss(void)
{
register int k;
TYPE_REEL res=0.0;
for(k=0;k<NB_SORTIES;k++)
res+=(activations_apprentissage[k]-activations_sorties[k])*
(activations_apprentissage[k]-activations_sorties[k]);
return res;
}
/****************************************************************/
/* displays the neuron thresholds and the connection weights */
void afficher_seuils_et_poids(void)
{
register int i,j,k;
for(k=0;k<NB_SORTIES;k++) {
printf("seuils_sorties(%2d)=%5.2f ",k,seuils_sorties[k]);
for(j=0;j<NB_CACHEES;j++) printf("%5.2f ",poids_sorties[k][j]); printf("\n");
}
printf("\n");
for(k=0;k<NB_CACHEES;k++) {
printf("seuils_cachees(%2d)=%5.2f ",k,seuils_cachees[k]);
for(j=0;j<NB_ENTREES;j++) printf("%5.2f ",poids_cachees[k][j]); printf("\n");
}
printf("\n");
}
/****************************************************************/
/* saves the thresholds and the weights for later reuse */
void sauver_seuils_et_poids(char *fichier)
{
register int i,j,k;
FILE *fp=fopen(fichier,"wt");
fprintf(fp,"seuils_sorties\n");
for(k=0;k<NB_SORTIES;k++) {
fprintf(fp,"%5.2f\n",seuils_sorties[k]);
for(j=0;j<NB_CACHEES;j++) fprintf(fp,"%5.2f\n",poids_sorties[k][j]);
}
fprintf(fp,"seuils_cachees\n");
for(k=0;k<NB_CACHEES;k++) {
fprintf(fp,"%5.2f\n",seuils_cachees[k]);
for(j=0;j<NB_ENTREES;j++) fprintf(fp,"%5.2f\n",poids_cachees[k][j]);
}
fclose(fp);
}
/****************************************************************/
void main(void)
{
register int i,k,nb_presentations;
long p;
unsigned char s[30],a,b;
TYPE_REEL tss;
randomize();
initialiser_poids();
/* learning (the body of this loop was lost in the listing and is
   reconstructed: each presentation runs through all the examples,
   the target output being a XOR b) */
for(nb_presentations=0;nb_presentations<NB_PRESENTATIONS;nb_presentations++){
printf("Presentation: %3d ",nb_presentations+1);
tss=0.0;
for(p=0;p<NB_EXEMPLES;p++){
a=(unsigned char)(p&255); b=(unsigned char)(p>>8);
for(i=0;i<8;i++) activations_entrees[i]=(TYPE_REEL)((a>>i)&1);
for(i=0;i<8;i++) activations_entrees[8+i]=(TYPE_REEL)((b>>i)&1);
for(i=0;i<8;i++) activations_apprentissage[i]=(TYPE_REEL)(((a^b)>>i)&1);
calculer_sorties();
calculer_deltas();
changer_poids();
tss+=calculer_pss();
}
printf("Tss=%f\n",tss);
}
sauver_seuils_et_poids("XOR.PDS");
/* interactive test: the 16 input bits are typed as characters '0' and '1' */
do {
printf("Value for a (8 bits) and b (8 bits) ? "); gets((char *)s);
for(i=0;i<NB_ENTREES;i++) activations_entrees[i]=(TYPE_REEL)(s[i]-'0');
calculer_sorties();
printf(" a XOR b=");
for(k=0;k<NB_SORTIES;k++) printf("%1.1f ",activations_sorties[k]);
printf("\tAny key... (ESC to quit)\n");
} while(getch()!=27);
}
/* fragment of a separate XOR file cipher found spliced into the listing
   (its header was lost and is reconstructed: usage XOR <key> <in> <out>) */
int main_xor(int argc,char *argv[])
{
FILE *fi,*fo; int c; char *cp;
if (argc==4) {
cp=argv[1];
if ((fi=fopen(argv[2],"rb"))==NULL) return 1;
if ((fo=fopen(argv[3],"wb"))==NULL) return 2;
while ((c=getc(fi)) != EOF) {
if (!*cp) cp=argv[1];
c ^= *(cp++);
putc(c,fo);
}
fclose(fo);
fclose(fi);
}
return 0;
}
Cryptanalysis of the Vigenère Cipher
/*
Use: DEVIGENE <encrypted file (input)> <decrypted file (output)> (<most used char>)
Most used char is an integer: 32 for a text file, 0 for a binary file
*/
#include <stdio.h>
#include <stdlib.h>
#include <math.h>
#include <alloc.h>
#define TRUE -1
#define FALSE 0
#define BLOCK            /* maximum key length (value lost) */
#define NB_BYTES 256
/* declarations (the types were lost and are reconstructed from use) */
int byte, byte_num;
unsigned long num_bytes, file_size;
FILE *infile, *outfile;
if (file_size < (unsigned long) BLOCK*20L) return 3; /* file too small to decode */
if (argc == 4) most_common=(unsigned char) atoi(argv[3]); else most_common=32;
fclose(outfile);
fclose(infile);
for (byte_num=(BLOCK/2)-1; byte_num >= 0; byte_num--) free(freq[byte_num]);
return 0;
}
D.E.S. Code
/*
CODEDES.C - D.E.S. encryption of a string
Dourlens Sébastien
V1.0
04.01.1996
*/
#define PERMUT_OK 0
/* the 8 S-tables of the D.E.S. (the opening of the array was lost and is reconstructed) */
int table[8][4][16]={
{{14,4,13,1,2,15,11,8,3,10,6,12,5,9,0,7}, {0,15,7,4,14,2,13,1,10,6,12,11,9,5,3,8},
{4,1,14,8,13,6,2,11,15,12,9,7,3,10,5,0}, {15,12,8,2,4,9,1,7,5,11,3,14,10,0,6,13}},
{{15,1,8,14,6,11,3,4,9,7,2,13,12,0,5,10}, {3,13,4,7,15,2,8,14,12,0,1,10,6,9,11,5},
{0,14,7,11,10,4,13,1,5,8,12,6,9,3,2,15}, {13,8,10,1,3,15,4,2,11,6,7,12,0,5,14,9}},
{{10,0,9,14,6,3,15,5,1,13,12,7,11,4,2,8}, {13,7,0,9,3,4,6,10,2,8,5,14,12,11,15,1},
{13,6,4,9,8,15,3,0,11,1,2,12,5,10,14,7}, {1,10,13,0,6,9,8,7,4,15,14,3,11,5,2,12}},
{{7,13,14,3,0,6,9,10,1,2,8,5,11,12,4,15}, {13,8,11,5,6,15,0,3,4,7,2,12,1,10,14,9},
{10,6,9,0,12,11,7,13,15,1,3,14,5,2,8,4}, {3,15,0,6,10,1,13,8,9,4,5,11,12,7,2,14}},
{{2,12,4,1,7,10,11,6,8,5,3,15,13,0,14,9}, {14,11,2,12,4,7,13,1,5,0,15,10,3,9,8,6},
{4,2,1,11,10,13,7,8,15,9,12,5,6,3,0,14}, {11,8,12,7,1,14,2,13,6,15,0,9,10,4,5,3}},
{{12,1,10,15,9,2,6,8,0,13,3,4,14,7,5,11}, {10,15,4,2,7,12,9,5,6,1,13,14,0,11,3,8},
{9,14,15,5,2,8,12,3,7,0,4,10,1,13,11,6}, {4,3,2,12,9,5,15,10,11,14,1,7,6,0,8,13}},
{{4,11,2,14,15,0,8,13,3,12,9,7,5,10,6,1}, {13,0,11,7,4,9,1,10,14,3,5,12,2,15,8,6},
{1,4,11,13,12,3,7,14,10,15,6,8,0,5,9,2}, {6,11,13,8,1,4,10,7,9,5,0,15,14,2,3,12}},
{{13,2,8,4,6,15,11,1,10,9,3,14,5,0,12,7}, {1,15,13,8,10,3,7,4,12,5,6,11,0,14,9,2},
{7,11,4,1,9,12,14,2,0,6,10,13,15,3,5,8}, {2,1,14,7,4,10,8,13,15,12,9,0,3,5,6,11}}};
/* initial permutation IPERM of the given text */
char perm[65]={ 58,50,42,34,26,18,10,2,60,52,44,36,28,20,12,4,62,54,46,38,30,22,14,6,64,56,48,40,32,24,16,8,
57,49,41,33,25,17,9,1,59,51,43,35,27,19,11,3,61,53,45,37,29,21,13,5,63,55,47,39,31,23,15,7};
/* final permutation IPERM^-1 of the given text */
char invperm[65]={
40,8,48,16,56,24,64,32,39,7,47,15,55,23,63,31,38,6,46,14,54,22,62,30,37,5,45,13,53,21,61,29,
36,4,44,12,52,20,60,28,35,3,43,11,51,19,59,27,34,2,42,10,50,18,58,26,33,1,41,9,49,17,57,25 };
/* expansion permutation */
char select[49]={ 32,1,2,3,4,5,4,5,6,7,8,9,8,9,10,11,12,13,12,13,14,15,16,17,
16,17,18,19,20,21,20,21,22,23,24,25,24,25,26,27,28,29,28,29,30,31,32,1 };
/* Permutation-P */
char perm3[33]={16,7,20,21,29,12,28,17,1,15, 23,26,5,18,31,10,2,8,24,14,32,27,3,9,19,13,30,6,22,11,4,25 };
char bloc[8][6]={ {1,2,3,4,5,6}, {7,8,9,10,11,12}, {13,14,15,16,17,18}, {19,20,21,22,23,24},
{25,26,27,28,29,30}, {31,32,33,34,35,36},{37,38,39,40,41,42}, {43,44,45,46,47,48} };
/* coded key */
int Cl[17][49];
/* computation of the key K (the function header and the local
   declarations were lost and are reconstructed here) */
void calculer_cle(void)
{
int i,iter,p;
int bk[65],l[29],r[29];
for(i=1;i<=64;i++) { p=bi[i-1]; bk[i]=(CleDess[(i-1)/8]&(1<<p))>>p; }
/* initial permutation of the key */
for(i=1;i<=56;i++) CleDessk[i]=bk[s_CleDess[i-1]];
/* compute the 16 key blocks */
for(iter=1;iter<=16;iter++) {
for(i=1;i<=28-decal[iter-1];i++) {
l[i]=CleDessk[i+decal[iter-1]];
r[i]=CleDessk[i+28+decal[iter-1]]; }
for(i=28-decal[iter-1]+1;i<=28;i++) {
l[i]=CleDessk[i-28+decal[iter-1]];
r[i]=CleDessk[i+decal[iter-1]]; }
for(i=1;i<=28;i++){ CleDessk[i]=l[i]; CleDessk[i+28]=r[i]; }
/* compressive permutation of a key block */
for(i=1;i<=48;i++) Cl[iter][i]=CleDessk[perm2[i-1]];
}
}
/* encryption (a != 100) or decryption (a == 100) of the buffer
   (the function header was lost and is reconstructed here) */
void coder(unsigned char *buffer,long taille,int a)
{
int i,j,iter; /* counters */
long lgr; /* counter */
int b[8],lig,col,e=0,f=1,k=0,p;
int res[8],g[33],temp[33],d[33],s[49];
int bk[65],bk1[65];
unsigned char *ptr;
unsigned char buf[2048]; /* temporary buffer */
ptr=buf;
/* init coding or decoding */
if (a==100) { e=1;f=-1;} k=0;
for(lgr=0L;lgr<taille;) {
b[k]=buffer[lgr++];
if (k<=6) k++;
else {
k=0;
for(i=1;i<=64;i++) {
p=bi[i-1];
#if (PERMUT_OK)
bk1[i]=(b[(i-1)/8]&(1<<p))>>p;
#else
bk[i]=(b[(i-1)/8]&(1<<p))>>p;
#endif
}
#if (PERMUT_OK)
/* initial permutation */
for(i=1;i<=64;i++) bk[i]=bk1[perm[i-1]];
#endif
/* the 16 rounds */
for(iter=1;iter<=16;iter++) {
for(i=1;i<=32;i++) { g[i]=bk[i]; d[i]=bk[i+32]; }
for(i=1;i<=48;i++) s[i]=d[select[i-1]];
j=e*17+f*iter;
for(i=1;i<=48;i++) s[i]=s[i]^Cl[j][i];
/* 8 S-tables */
for(j=0;j<8;j++){
lig=(s[bloc[j][0]]<<1)+s[bloc[j][5]];
col=(s[bloc[j][1]]<<3)+(s[bloc[j][2]]<<2)+(s[bloc[j][3]]<<1)+s[bloc[j][4]];
res[j]=table[j][lig][col];
}
for(i=1;i<=32;i++) { p=bj[i]; s[i]=(res[(i-1)/4]&(1<<p))>>p; }
if (iter!=16) for(i=1;i<=32;i++) { temp[i]=s[perm3[i-1]]^g[i];g[i]=d[i];d[i]=temp[i]; }
else for(i=1;i<=32;i++){ temp[i]=s[perm3[i-1]]^g[i];g[i]=temp[i]; }
for(i=1;i<=32;i++){ bk[i]=g[i];bk[i+32]=d[i]; }
#if (PERMUT_OK)
/* final permutation */
if (iter==16) {
for(i=1;i<=64;i++) bk1[i]=bk[invperm[i-1]];
for(i=1;i<=64;i++) bk[i]=bk1[i];
}
#endif
}
/* write one block */
for(i=0;i<=7;i++) {
for(p=0,j=1;j<=8;j++) p+=(1<<(8-j))*bk[8*i+j];
*ptr++=p;
}}}
/* copy back */
ptr=buf;
for(lgr=0L;lgr<taille;lgr++) buffer[lgr]=*ptr++;
}
#define NB_ENTREES
#define NB_CACHEES
#define NB_SORTIES
500
#define EPSILON 2.0
#define MOMENTUM 0.1
#define MAX_ALEA 0.3
#define MIN_ALEA -0.3
#define TYPE_REEL double
"010110101111","110110100111","001110101001","101110100001","011110101101","111110100101","000001100110","100001101110","010001100010",
"110001101010","001001100100","101001101100","011001100000","111001101000","000101100111","100101101111","010101100011","110101101011",
"001101100101","101101101101","011101100001","111101101001","000011101110","100011100110","010011101010","110011100010","001011101100",
"101011100100","011011101000","111011100000","000111101111","100111100111","010111101011","110111100011","001111101101","101111100101",
"011111101001","111111100001","000000010001","100000011001","010000010101","110000011101","001000010011","101000011011","011000010111",
"111000011111","000100010000","100100011000","010100010100","110100011100","001100010010","101100011010","011100010110","111100011110",
"000010011001","100010010001","010010011101","110010010101","001010011011","101010010011","011010011111","111010010111","000110011000",
"100110010000","010110011100","110110010100","001110011010","101110010010","011110011110","111110010110","000001010101","100001011101",
"010001010001","110001011001","001001010111","101001011111","011001010011","111001011011","000101010100","100101011100","010101010000",
"110101011000","001101010110","101101011110","011101010010","111101011010","000011011101","100011010101","010011011001","110011010001",
"001011011111","101011010111","011011011011","111011010011","000111011100","100111010100","010111011000","110111010000","001111011110",
"101111010110","011111011010","111111010010","000000110011","100000111011","010000110111","110000111111","001000110001","101000111001",
"011000110101","111000111101","000100110010","100100111010","010100110110","110100111110","001100110000","101100111000","011100110100",
"111100111100","000010111011","100010110011","010010111111","110010110111","001010111001","101010110001","011010111101","111010110101",
"000110111010","100110110010","010110111110","110110110110","001110111000","101110110000","011110111100","111110110100","000001110111",
"100001111111","010001110011","110001111011","001001110101","101001111101","011001110001","111001111001","000101110110","100101111110",
"010101110010","110101111010","001101110100","101101111100","011101110000","111101111000","000011111111","100011110111","010011111011",
"110011110011","001011111101","101011110101","011011111001","111011110001","000111111110","100111110110","010111111010","110111110010",
"001111111100","101111110100","011111111000","111111110000" };
/****************************************************************/
TYPE_REEL sigmoid(TYPE_REEL x)
{
return( 1.0/(1.0+exp(-1.0*x)) );
}
/****************************************************************/
for(k=0;k<NB_SORTIES;k++) {
for(j=0;j<NB_CACHEES;j++)
poids_sorties[k][j]=calculer_nb_aleatoire(MIN_ALEA,MAX_ALEA);
seuils_sorties[k]=calculer_nb_aleatoire(MIN_ALEA,MAX_ALEA);
}
for(j=0;j<NB_CACHEES;j++) {
for(i=0;i<NB_ENTREES;i++)
poids_cachees[j][i]=calculer_nb_aleatoire(MIN_ALEA,MAX_ALEA);
seuils_cachees[j]=calculer_nb_aleatoire(MIN_ALEA,MAX_ALEA);
}
}
/****************************************************************/
void calculer_sorties(void)
{
int i,j,k;
TYPE_REEL net;
for(j=0;j<NB_CACHEES;j++) {
net=0.0;
for(i=0;i<NB_ENTREES;i++) net+=activations_entrees[i]*poids_cachees[j][i];
activations_cachees[j]=sigmoid(net-seuils_cachees[j]);
}
for(k=0;k<NB_SORTIES;k++) {
net=0.0;
for(j=0;j<NB_CACHEES;j++) net+=activations_cachees[j]*poids_sorties[k][j];
activations_sorties[k]=sigmoid(net-seuils_sorties[k]);
}
}
/****************************************************************/
void calculer_deltas(void)
{
int j,k;
TYPE_REEL erreur;
for(k=0;k<NB_SORTIES;k++) {
erreur=activations_apprentissage[k]-activations_sorties[k];
delta_sorties[k]=erreur*activations_sorties[k]*(1.0-activations_sorties[k]);
}
for(j=0;j<NB_CACHEES;j++) {
erreur=0.0;
for(k=0;k<NB_SORTIES;k++) erreur+=delta_sorties[k]*poids_sorties[k][j];
delta_cachees[j]=erreur*activations_cachees[j]*(1.0-activations_cachees[j]);
}
}
/****************************************************************/
void changer_poids(void)
{
int i,j,k;
for(k=0;k<NB_SORTIES;k++) {
for(j=0;j<NB_CACHEES;j++) {
delta_poids_sorties[k][j]=EPSILON*delta_sorties[k]*activations_cachees[j]+
MOMENTUM*delta_poids_sorties[k][j];
poids_sorties[k][j]+=delta_poids_sorties[k][j];
}
delta_seuils_sorties[k]=-1.0*EPSILON*delta_sorties[k]+MOMENTUM*delta_seuils_sorties[k];
seuils_sorties[k]+=delta_seuils_sorties[k];
}
for(j=0;j<NB_CACHEES;j++) {
for(i=0;i<NB_ENTREES;i++) {
delta_poids_cachees[j][i]=EPSILON*delta_cachees[j]*activations_entrees[i]+
MOMENTUM*delta_poids_cachees[j][i];
poids_cachees[j][i]+=delta_poids_cachees[j][i];
}
delta_seuils_cachees[j]=-1.0*EPSILON*delta_cachees[j]+MOMENTUM*delta_seuils_cachees[j];
seuils_cachees[j]+=delta_seuils_cachees[j];
}
}
/****************************************************************/
TYPE_REEL calculer_pss(void)
{
int k;
TYPE_REEL res=0.0;
for(k=0;k<NB_SORTIES;k++)
res+=(activations_apprentissage[k]-activations_sorties[k])*
(activations_apprentissage[k]-activations_sorties[k]);
return res;
}
/****************************************************************/
void afficher_seuils_et_poids(void)
{
int i,j,k;
for(k=0;k<NB_SORTIES;k++) {
printf("seuils_sorties[%2d]=%5.2lf ",k,seuils_sorties[k]);
for(j=0;j<NB_CACHEES;j++) printf("%5.2lf ",poids_sorties[k][j]);
printf("\n");
}
printf("\n");
for(k=0;k<NB_CACHEES;k++) {
printf("seuils_cachees[%2d]=%5.2lf ",k,seuils_cachees[k]);
for(j=0;j<NB_ENTREES;j++) printf("%5.2lf ",poids_cachees[k][j]);
printf("\n");
}
printf("\n");
}
/****************************************************************/
int choose_sample(int p)
{
int K=8; /* split the set of examples into K subsets */
int d,n=NB_EXEMPLES/K;
d=(int) (p/n);
return( ((p-(d*n))*K)+d );
}
/****************************************************************/
void main(void)
{
int nb_presentations,p,n;
unsigned char i,s[10];
TYPE_REEL tss,pss[NB_EXEMPLES];
clrscr();
randomize();
initialiser_poids();
/* afficher_seuils_et_poids(); */
for(nb_presentations=0;nb_presentations<NB_PRESENTATIONS;nb_presentations++){
gotoxy(1,1); printf("Presentation: %3d ",nb_presentations);
tss=0.0;
for(p=0;p<NB_EXEMPLES;p++) {
n=choose_sample(p);
for(i=0;i<NB_ENTREES;i++) activations_entrees[i]=(TYPE_REEL)(ch[n][i]-'0');
for(i=0;i<NB_SORTIES;i++) activations_apprentissage[i]=(TYPE_REEL)(ch[n][i+NB_ENTREES]-'0');
calculer_sorties();
calculer_deltas();
changer_poids();
pss[p]=calculer_pss();
tss+=pss[p];
}
printf("\ttss(%d)=%7.6f\n",nb_presentations,tss); /*getch();*/
}
afficher_seuils_et_poids();
#define NB_BCL 8
/*
inputs: values of the loop counters
return: 0=continue, 1=stop
*/
int corps(int i_bcl[NB_BCL])
{
int i,c;
static long n=0L;
unsigned char s[256];
for(i=0;i<NB_BCL;i++) s[i]=alphabet[i_bcl[i]];
for(c=0;c<8;c++) printf("%c",s[c]);
coder_des(s);
printf(" ");
for(c=0;c<8;c++) printf("%02X",s[c]);
printf("\n");
n++;
if (n==64267L) { printf("%ld keys tested\n",n); return 1; }
if (kbhit()) { if (!getch()) getch(); printf("%ld keys tested\n",n); return 1;}
return 0;
}
clrscr();
traiter_clef(maclef);
gettime(&T);
while(1) {
b=NB_BCL-1;
if (corps(i_bcl)) break;
if (i_bcl[b]<f_bcl[b]) i_bcl[b]++;
else {
precedent:
i_bcl[b]=d;
if (b==0) break; else b--;
if (i_bcl[b]<f_bcl[b]) i_bcl[b]++;
else goto precedent;
}
}
gettime(&F);
printf("\nStart time: %2d:%02d:%02d.%02d\n",
T.ti_hour, T.ti_min, T.ti_sec, T.ti_hund);
printf("End time:   %2d:%02d:%02d.%02d\n",
F.ti_hour, F.ti_min, F.ti_sec, F.ti_hund);
}
main()
{
int l=strlen(alphabet);
compter(0,l);
}
#include <stdio.h>
/* the 8 S-tables */
char Table[8][4][16]={
{{14,4,13,1,2,15,11,8,3,10,6,12,5,9,0,7},{0,15,7,4,14,2,13,1,10,6,12,11,9,5,3,8},
{4,1,14,8,13,6,2,11,15,12,9,7,3,10,5,0},{15,12,8,2,4,9,1,7,5,11,3,14,10,0,6,13}},
{{15,1,8,14,6,11,3,4,9,7,2,13,12,0,5,10},{3,13,4,7,15,2,8,14,12,0,1,10,6,9,11,5},
{0,14,7,11,10,4,13,1,5,8,12,6,9,3,2,15},{13,8,10,1,3,15,4,2,11,6,7,12,0,5,14,9}},
{{10,0,9,14,6,3,15,5,1,13,12,7,11,4,2,8},{13,7,0,9,3,4,6,10,2,8,5,14,12,11,15,1},
{13,6,4,9,8,15,3,0,11,1,2,12,5,10,14,7},{1,10,13,0,6,9,8,7,4,15,14,3,11,5,2,12}},
{{7,13,14,3,0,6,9,10,1,2,8,5,11,12,4,15},{13,8,11,5,6,15,0,3,4,7,2,12,1,10,14,9},
{10,6,9,0,12,11,7,13,15,1,3,14,5,2,8,4},{3,15,0,6,10,1,13,8,9,4,5,11,12,7,2,14}},
{{2,12,4,1,7,10,11,6,8,5,3,15,13,0,14,9},{14,11,2,12,4,7,13,1,5,0,15,10,3,9,8,6},
{4,2,1,11,10,13,7,8,15,9,12,5,6,3,0,14},{11,8,12,7,1,14,2,13,6,15,0,9,10,4,5,3}},
{{12,1,10,15,9,2,6,8,0,13,3,4,14,7,5,11},{10,15,4,2,7,12,9,5,6,1,13,14,0,11,3,8},
{9,14,15,5,2,8,12,3,7,0,4,10,1,13,11,6},{4,3,2,12,9,5,15,10,11,14,1,7,6,0,8,13}},
{{4,11,2,14,15,0,8,13,3,12,9,7,5,10,6,1},{13,0,11,7,4,9,1,10,14,3,5,12,2,15,8,6},
{1,4,11,13,12,3,7,14,10,15,6,8,0,5,9,2},{6,11,13,8,1,4,10,7,9,5,0,15,14,2,3,12}},
{{13,2,8,4,6,15,11,1,10,9,3,14,5,0,12,7},{1,15,13,8,10,3,7,4,12,5,6,11,0,14,9,2},
{7,11,4,1,9,12,14,2,0,6,10,13,15,3,5,8},{2,1,14,7,4,10,8,13,15,12,9,0,3,5,6,11}}};
/****************************************************************/
/* Inputs: 0<=tb<=7, 0<=val<=63 */
/* note: the row corresponds to bits 0 and 5 of val and the column to bits 1 to 4 */
return Table[tb][lig][col];
}
/****************************************************************/
void main(void)
{
char p,pe,pp,t,x,xe,xp;
char tab_dif[64][16],tb;
for(tb=0;tb<8;tb++) {
printf("\nS-Table %d\n\n",tb+1);
/* initialize the table to 0 */
for(p=0;p<64;p++) for(t=0;t<16;t++) tab_dif[p][t]=0;
/* the 8 S-tables */
char Table[8][4][16]={
{{14,4,13,1,2,15,11,8,3,10,6,12,5,9,0,7},{0,15,7,4,14,2,13,1,10,6,12,11,9,5,3,8},
{4,1,14,8,13,6,2,11,15,12,9,7,3,10,5,0},{15,12,8,2,4,9,1,7,5,11,3,14,10,0,6,13}},
{{15,1,8,14,6,11,3,4,9,7,2,13,12,0,5,10},{3,13,4,7,15,2,8,14,12,0,1,10,6,9,11,5},
{0,14,7,11,10,4,13,1,5,8,12,6,9,3,2,15},{13,8,10,1,3,15,4,2,11,6,7,12,0,5,14,9}},
{{10,0,9,14,6,3,15,5,1,13,12,7,11,4,2,8},{13,7,0,9,3,4,6,10,2,8,5,14,12,11,15,1},
{13,6,4,9,8,15,3,0,11,1,2,12,5,10,14,7},{1,10,13,0,6,9,8,7,4,15,14,3,11,5,2,12}},
{{7,13,14,3,0,6,9,10,1,2,8,5,11,12,4,15},{13,8,11,5,6,15,0,3,4,7,2,12,1,10,14,9},
{10,6,9,0,12,11,7,13,15,1,3,14,5,2,8,4},{3,15,0,6,10,1,13,8,9,4,5,11,12,7,2,14}},
{{2,12,4,1,7,10,11,6,8,5,3,15,13,0,14,9},{14,11,2,12,4,7,13,1,5,0,15,10,3,9,8,6},
{4,2,1,11,10,13,7,8,15,9,12,5,6,3,0,14},{11,8,12,7,1,14,2,13,6,15,0,9,10,4,5,3}},
{{12,1,10,15,9,2,6,8,0,13,3,4,14,7,5,11},{10,15,4,2,7,12,9,5,6,1,13,14,0,11,3,8},
{9,14,15,5,2,8,12,3,7,0,4,10,1,13,11,6},{4,3,2,12,9,5,15,10,11,14,1,7,6,0,8,13}},
{{4,11,2,14,15,0,8,13,3,12,9,7,5,10,6,1},{13,0,11,7,4,9,1,10,14,3,5,12,2,15,8,6},
{1,4,11,13,12,3,7,14,10,15,6,8,0,5,9,2},{6,11,13,8,1,4,10,7,9,5,0,15,14,2,3,12}},
{{13,2,8,4,6,15,11,1,10,9,3,14,5,0,12,7},{1,15,13,8,10,3,7,4,12,5,6,11,0,14,9,2},
{7,11,4,1,9,12,14,2,0,6,10,13,15,3,5,8},{2,1,14,7,4,10,8,13,15,12,9,0,3,5,6,11}}};
/****************************************************************/
/* 0<=tb<=7, 0<=val<=63 */
/* note: the row corresponds to bits 0 and 5 of val
and the column to bits 1 to 4 */
int calculer_s_table(char tb,char val)
{
char j,t[6],lig,col;
for(j=0;j<6;j++) t[j]=(val>>j)&1; /* extract the 6 bits of val */
lig=(t[5]<<1)+t[0]; col=(t[4]<<3)+(t[3]<<2)+(t[2]<<1)+t[1];
return Table[tb][lig][col];
}
/****************************************************************/
void main(void)
{
char parite;
unsigned tb,aa,k,i,j;
int tab_lin[64][16];
for(tb=0;tb<8;tb++) {
printf("\nS-Table %d\n\n",tb+1);
/* fill the table */
for(i=0;i<64;i++) {
for(j=0;j<16;j++) {
tab_lin[i][j]=-32;
for(k=0;k<64;k++) {
#ifndef __NEURONES__
#define __NEURONES__
/****************************************************************/
for(k=0;k<NB_SORTIES;k++) {
res += (activations_apprentissage[k] == 1.0 && activations_sorties[k] > 0.5) ||
(activations_apprentissage[k] == 0.0 && activations_sorties[k] < 0.5);
}
return res;
}
/****************************************************************/
/* returns 1 if the example is realized (all outputs correct)
*/
int calculer_ok(void)
{
int k,res = 0;
for(k=0;k<NB_SORTIES;k++) {
res += (activations_apprentissage[k] == 1.0 && activations_sorties[k] > 0.5) ||
(activations_apprentissage[k] == 0.0 && activations_sorties[k] < 0.5);
}
return ((res==NB_SORTIES)?1:0);
}
/****************************************************************/
/* Transfer function of the neural network
*/
TYPE_REEL sigmoid(TYPE_REEL x)
{
if (x>10.0) return 0.9999999998;
if (x<-10.0) return 0.0000000001;
return( 1.0/(1.0+exp(-1.0*x)) );
}
/****************************************************************/
return r;
}
/****************************************************************/
/* initializes the weight of each connection and the threshold of
each neuron in the output layer and the hidden layer with a
random value between MIN_ALEA and MAX_ALEA
*/
void initialiser_poids(void)
{
int i,j,k;
for(k=0;k<NB_SORTIES;k++) {
for(j=0;j<NB_CACHEES;j++)
poids_sorties[k][j]=calculer_nb_aleatoire(MIN_ALEA,MAX_ALEA);
seuils_sorties[k]=calculer_nb_aleatoire(MIN_ALEA,MAX_ALEA);
delta_sorties[k]=calculer_nb_aleatoire(MIN_ALEA,MAX_ALEA);
}
for(j=0;j<NB_CACHEES;j++) {
for(i=0;i<NB_ENTREES;i++)
poids_cachees[j][i]=calculer_nb_aleatoire(MIN_ALEA,MAX_ALEA);
seuils_cachees[j]=calculer_nb_aleatoire(MIN_ALEA,MAX_ALEA);
delta_cachees[j]=calculer_nb_aleatoire(MIN_ALEA,MAX_ALEA);
}
}
/****************************************************************/
/* Computes the outputs of the network from the given inputs,
through the layers of the neural network
*/
void calculer_sorties(void)
{
int i,j,k;
TYPE_REEL net;
for(j=0;j<NB_CACHEES;j++) {
net=0.0;
for(i=0;i<NB_ENTREES;i++) net+=activations_entrees[i]*poids_cachees[j][i];
activations_cachees[j]=sigmoid(net-seuils_cachees[j]);
}
for(k=0;k<NB_SORTIES;k++) {
net=0.0;
for(j=0;j<NB_CACHEES;j++) net+=activations_cachees[j]*poids_sorties[k][j];
activations_sorties[k]=sigmoid(net-seuils_sorties[k]);
}
}
/****************************************************************/
/* Updates the output variations from the desired output values
and the output values of the neural network, for each neuron
of the output layer and of the hidden layer
=> this is the back-propagation of the error
*/
void calculer_deltas(void)
{
int j,k;
TYPE_REEL erreur;
for(k=0;k<NB_SORTIES;k++) {
erreur=activations_apprentissage[k]-activations_sorties[k];
delta_sorties[k]=erreur*activations_sorties[k]*(1.0-activations_sorties[k]);
}
for(j=0;j<NB_CACHEES;j++) {
erreur=0.0;
for(k=0;k<NB_SORTIES;k++) erreur+=delta_sorties[k]*poids_sorties[k][j];
delta_cachees[j]=erreur*activations_cachees[j]*(1.0-activations_cachees[j]);
}
}
#if AVEC_MOMENTUM
/****************************************************************/
/* Changes the weight of each connection between neurons as a
function of the "delta" variations on the output values and
of the previous weight values, with a "momentum term".
=> the momentum helps avoid local minima (getting stuck)
=> Epsilon is the learning rate
*/
void changer_poids(void)
{
int i,j,k;
for(k=0;k<NB_SORTIES;k++) {
for(j=0;j<NB_CACHEES;j++) {
delta_poids_sorties[k][j]=EPSILON*delta_sorties[k]*activations_cachees[j]+
MOMENTUM*delta_poids_sorties[k][j];
poids_sorties[k][j]+=delta_poids_sorties[k][j];
}
delta_seuils_sorties[k]=-1.0*EPSILON*delta_sorties[k]+MOMENTUM*delta_seuils_sorties[k];
seuils_sorties[k]+=delta_seuils_sorties[k];
}
for(j=0;j<NB_CACHEES;j++) {
for(i=0;i<NB_ENTREES;i++) {
delta_poids_cachees[j][i]=EPSILON*delta_cachees[j]*activations_entrees[i]+
MOMENTUM*delta_poids_cachees[j][i];
poids_cachees[j][i]+=delta_poids_cachees[j][i];
}
delta_seuils_cachees[j]=-1.0*EPSILON*delta_cachees[j]+MOMENTUM*delta_seuils_cachees[j];
seuils_cachees[j]+=delta_seuils_cachees[j];
}
}
#else
/****************************************************************/
/* Changes the weight of each connection between neurons solely
as a function of the "delta" variations on the output values
(no momentum term here)
=> Epsilon is the learning rate
*/
void changer_poids(void)
{
int i,j,k;
for(k=0;k<NB_SORTIES;k++) {
for(j=0;j<NB_CACHEES;j++)
poids_sorties[k][j]+=EPSILON*delta_sorties[k]*activations_cachees[j];
seuils_sorties[k]-=EPSILON*delta_sorties[k];
}
for(j=0;j<NB_CACHEES;j++) {
for(i=0;i<NB_ENTREES;i++) {
poids_cachees[j][i]+=EPSILON*delta_cachees[j]*activations_entrees[i];
}
seuils_cachees[j]-=EPSILON*delta_cachees[j];
}
}
#endif
/****************************************************************/
/* Computes the PSS error between the desired output
and the output of the neural network
*/
TYPE_REEL calculer_pss(void)
{
int k;
TYPE_REEL res=0.0;
for(k=0;k<NB_SORTIES;k++)
res+=(activations_apprentissage[k]-activations_sorties[k])*
(activations_apprentissage[k]-activations_sorties[k]);
return res;
}
/****************************************************************/
/* Displays all the thresholds of each neuron and all the
weights of each connection between neurons
*/
void afficher_seuils_et_poids(void)
{
int i,j,k;
for(k=0;k<NB_SORTIES;k++) {
printf("seuils_sorties[%2d]=%5.2f ",k,seuils_sorties[k]);
for(j=0;j<NB_CACHEES;j++) printf("%5.2f ",poids_sorties[k][j]);
printf("\n");
}
printf("\n");
for(k=0;k<NB_CACHEES;k++) {
printf("seuils_cachees[%2d]=%5.2f ",k,seuils_cachees[k]);
for(j=0;j<NB_ENTREES;j++) printf("%5.2f ",poids_cachees[k][j]);
printf("\n");
}
printf("\n");
}
/**************************************************************/
void sauver_poids(t)
int t;
{
FILE *fp;
char s[256];
sprintf(s,"PDS%05d.RDN",t);
fp=fopen(s,"wb");
if (fp==NULL) { printf("SAVE ERROR %s\n",s); return; }
fwrite(seuils_sorties,sizeof(TYPE_REEL),NB_SORTIES,fp);
fwrite(seuils_cachees,sizeof(TYPE_REEL),NB_CACHEES,fp);
fwrite(poids_cachees,sizeof(TYPE_REEL),NB_CACHEES*NB_ENTREES,fp);
fwrite(poids_sorties,sizeof(TYPE_REEL),NB_SORTIES*NB_CACHEES,fp);
fclose(fp);
}
/**************************************************************/
void relire_poids(t)
int t;
{
FILE *fp;
char s[256];
sprintf(s,"PDS%05d.RDN",t);
fp=fopen(s,"rb");
if (fp==NULL) { printf("FILE %s NOT FOUND\n",s); return; }
fread(seuils_sorties,sizeof(TYPE_REEL),NB_SORTIES,fp);
fread(seuils_cachees,sizeof(TYPE_REEL),NB_CACHEES,fp);
fread(poids_cachees,sizeof(TYPE_REEL),NB_CACHEES*NB_ENTREES,fp);
fread(poids_sorties,sizeof(TYPE_REEL),NB_SORTIES*NB_CACHEES,fp);
fclose(fp);
}
#endif
Differential Neuro-generator for D.E.S.
// PROBDIF.C - Differential cryptanalysis of the D.E.S. S-tables by a neural network
// Dourlens Sebastien
// V1.0
#include <stdio.h>
#include <conio.h>
#include <math.h>
#include <stdlib.h>
#include <time.h>
#include <mem.h>
#include <alloc.h>
#include <dos.h>
#define NB_ENTREES 16
#define NB_CACHEES 16
#define NB_SORTIES 6
#define EPSILON 0.5
#define AVEC_MOMENTUM 0
#define MOMENTUM 0.3
#define MAX_ALEA 0.3
#define MIN_ALEA -0.3
#define TYPE_REEL double
char *tabexemples[]={
"1000000000000000","0100000000000000","0010000000000000","0001000000000000",
"0000100000000000","0000010000000000","0000001000000000","0000000100000000",
"0000000010000000","0000000001000000","0000000000100000","0000000000010000",
"0000000000001000","0000000000000100","0000000000000010","0000000000000001" };
int NB_PRESENTATIONS = 10;
TYPE_REEL activations_entrees[NB_ENTREES]; // init.
TYPE_REEL activations_cachees[NB_CACHEES];
TYPE_REEL activations_sorties[NB_SORTIES]; // init.
TYPE_REEL activations_apprentissage[NB_SORTIES]; // init.
TYPE_REEL delta_sorties[NB_SORTIES];
TYPE_REEL delta_cachees[NB_CACHEES];
TYPE_REEL seuils_sorties[NB_SORTIES];
TYPE_REEL seuils_cachees[NB_CACHEES];
TYPE_REEL poids_sorties[NB_SORTIES][NB_CACHEES];
TYPE_REEL poids_cachees[NB_CACHEES][NB_ENTREES];
#include "neurones.c"
/* the 8 S-tables */
char Table[8][4][16]={{
{14,4,13,1,2,15,11,8,3,10,6,12,5,9,0,7},{0,15,7,4,14,2,13,1,10,6,12,11,9,5,3,8},{4,1,14,8,13,6,2,11,15,12,9,7,3,10,5,0},{15,12,8,2,4,9,1,7,5,11,3,14,10,0,6,13}},
{{15,1,8,14,6,11,3,4,9,7,2,13,12,0,5,10},{3,13,4,7,15,2,8,14,12,0,1,10,6,9,11,5},{0,14,7,11,10,4,13,1,5,8,12,6,9,3,2,15},{13,8,10,1,3,15,4,2,11,6,7,12,0,5,14,9}},
{{10,0,9,14,6,3,15,5,1,13,12,7,11,4,2,8},{13,7,0,9,3,4,6,10,2,8,5,14,12,11,15,1},{13,6,4,9,8,15,3,0,11,1,2,12,5,10,14,7},{1,10,13,0,6,9,8,7,4,15,14,3,11,5,2,12}},
{{7,13,14,3,0,6,9,10,1,2,8,5,11,12,4,15},{13,8,11,5,6,15,0,3,4,7,2,12,1,10,14,9},{10,6,9,0,12,11,7,13,15,1,3,14,5,2,8,4},{3,15,0,6,10,1,13,8,9,4,5,11,12,7,2,14}},
{{2,12,4,1,7,10,11,6,8,5,3,15,13,0,14,9},{14,11,2,12,4,7,13,1,5,0,15,10,3,9,8,6},{4,2,1,11,10,13,7,8,15,9,12,5,6,3,0,14},{11,8,12,7,1,14,2,13,6,15,0,9,10,4,5,3}},
{{12,1,10,15,9,2,6,8,0,13,3,4,14,7,5,11},{10,15,4,2,7,12,9,5,6,1,13,14,0,11,3,8},{9,14,15,5,2,8,12,3,7,0,4,10,1,13,11,6},{4,3,2,12,9,5,15,10,11,14,1,7,6,0,8,13}},
{{4,11,2,14,15,0,8,13,3,12,9,7,5,10,6,1},{13,0,11,7,4,9,1,10,14,3,5,12,2,15,8,6},{1,4,11,13,12,3,7,14,10,15,6,8,0,5,9,2},{6,11,13,8,1,4,10,7,9,5,0,15,14,2,3,12}},
{{13,2,8,4,6,15,11,1,10,9,3,14,5,0,12,7},{1,15,13,8,10,3,7,4,12,5,6,11,0,14,9,2},{7,11,4,1,9,12,14,2,0,6,10,13,15,3,5,8},{2,1,14,7,4,10,8,13,15,12,9,0,3,5,6,11}}};
/****************************************************************/
/* 0<=tb<=7, 0<=val<=63 */
/* note: the row corresponds to bits 0 and 5 of val
and the column to bits 1 to 4 */
return Table[tb][lig][col];
}
/****************************************************************/
void main(void)
{
int nb_presentations;
char p,pe,pp,x,xe,xp,j,tb=4,i;
TYPE_REEL tss;
/* init network */
clrscr();
randomize();
initialiser_poids();
/* learning */
for(tb=0;tb<8;tb++){
printf("S-table %d",tb+1);
for(nb_presentations=0;nb_presentations<NB_PRESENTATIONS;nb_presentations++){
tss=0.0;
gotoxy(1,5); fprintf(stderr,"Presentation: %3ld ",nb_presentations+1L);
for(j=0;j<NB_ENTREES;j++)
activations_entrees[j]=(TYPE_REEL) (tabexemples[p][j]-'0');
calculer_sorties();
printf("%1.3lf ",activations_sorties[i]);
}
printf("\n");
}
}
}
#define NB_ENTREES 16
#define NB_CACHEES 16
#define NB_SORTIES 6
#define EPSILON 0.5
#define AVEC_MOMENTUM 0
#define MOMENTUM 0.3
#define MAX_ALEA 0.3
#define MIN_ALEA -0.3
#define TYPE_REEL double
TYPE_REEL activations_entrees[NB_ENTREES];
TYPE_REEL activations_cachees[NB_CACHEES];
TYPE_REEL activations_sorties[NB_SORTIES];
TYPE_REEL activations_apprentissage[NB_SORTIES];
TYPE_REEL delta_sorties[NB_SORTIES];
TYPE_REEL delta_cachees[NB_CACHEES];
TYPE_REEL seuils_sorties[NB_SORTIES];
TYPE_REEL seuils_cachees[NB_CACHEES];
TYPE_REEL poids_sorties[NB_SORTIES][NB_CACHEES];
TYPE_REEL poids_cachees[NB_CACHEES][NB_ENTREES];
#if AVEC_MOMENTUM
TYPE_REEL delta_poids_sorties[NB_SORTIES][NB_CACHEES];
TYPE_REEL delta_seuils_sorties[NB_SORTIES];
TYPE_REEL delta_poids_cachees[NB_CACHEES][NB_ENTREES];
TYPE_REEL delta_seuils_cachees[NB_CACHEES];
#endif
#include "neurones.c"
/* the 8 S-tables */
char Table[8][4][16]={{
{14,4,13,1,2,15,11,8,3,10,6,12,5,9,0,7},{0,15,7,4,14,2,13,1,10,6,12,11,9,5,3,8},{4,1,14,8,13,6,2,11,15,12,9,7,3,10,5,0},{15,12,8,2,4,9,1,7,5,11,3,14,10,0,6,13}},
{{15,1,8,14,6,11,3,4,9,7,2,13,12,0,5,10},{3,13,4,7,15,2,8,14,12,0,1,10,6,9,11,5},{0,14,7,11,10,4,13,1,5,8,12,6,9,3,2,15},{13,8,10,1,3,15,4,2,11,6,7,12,0,5,14,9}},
{{10,0,9,14,6,3,15,5,1,13,12,7,11,4,2,8},{13,7,0,9,3,4,6,10,2,8,5,14,12,11,15,1},{13,6,4,9,8,15,3,0,11,1,2,12,5,10,14,7},{1,10,13,0,6,9,8,7,4,15,14,3,11,5,2,12}},
{{7,13,14,3,0,6,9,10,1,2,8,5,11,12,4,15},{13,8,11,5,6,15,0,3,4,7,2,12,1,10,14,9},{10,6,9,0,12,11,7,13,15,1,3,14,5,2,8,4},{3,15,0,6,10,1,13,8,9,4,5,11,12,7,2,14}},
{{2,12,4,1,7,10,11,6,8,5,3,15,13,0,14,9},{14,11,2,12,4,7,13,1,5,0,15,10,3,9,8,6},{4,2,1,11,10,13,7,8,15,9,12,5,6,3,0,14},{11,8,12,7,1,14,2,13,6,15,0,9,10,4,5,3}},
{{12,1,10,15,9,2,6,8,0,13,3,4,14,7,5,11},{10,15,4,2,7,12,9,5,6,1,13,14,0,11,3,8},{9,14,15,5,2,8,12,3,7,0,4,10,1,13,11,6},{4,3,2,12,9,5,15,10,11,14,1,7,6,0,8,13}},
{{4,11,2,14,15,0,8,13,3,12,9,7,5,10,6,1},{13,0,11,7,4,9,1,10,14,3,5,12,2,15,8,6},{1,4,11,13,12,3,7,14,10,15,6,8,0,5,9,2},{6,11,13,8,1,4,10,7,9,5,0,15,14,2,3,12}},
{{13,2,8,4,6,15,11,1,10,9,3,14,5,0,12,7},{1,15,13,8,10,3,7,4,12,5,6,11,0,14,9,2},{7,11,4,1,9,12,14,2,0,6,10,13,15,3,5,8},{2,1,14,7,4,10,8,13,15,12,9,0,3,5,6,11}}};
/****************************************************************/
/* 0<=tb<=7, 0<=val<=63 */
/* note: the row corresponds to bits 0 and 5 of val
and the column to bits 1 to 4 */
char calculer_s_table(char tb,char val)
{
char j,t[6],lig,col;
for(j=0;j<6;j++) t[j]=(val>>j)&1; /* extract the 6 bits of val */
lig=(t[5]<<1)+t[0]; col=(t[4]<<3)+(t[3]<<2)+(t[2]<<1)+t[1];
return Table[tb][lig][col];
}
/****************************************************************/
/* computes the parity of the nbbits bits of a given value val */
char calculer_parite(char val,char nbbits)
{
char j,n=0;
/****************************************************************/
void main(void)
{
int nb_presentations;
char p,aa,bb,i,j,k,tb;
TYPE_REEL tss;
/* init network */
clrscr();
for(tb=0;tb<8;tb++) {
randomize();
initialiser_poids();
/* learning */
printf("S-table %d",tb+1);
for(nb_presentations=0;nb_presentations<NB_PRESENTATIONS;nb_presentations++){
tss=0.0;
gotoxy(1,5); fprintf(stderr,"Presentation: %3ld ",nb_presentations+1L);
for(i=0;i<64;i++) {
for(j=0;j<16;j++) {
for(k=0;k<64;k++) {
aa = calculer_s_table(tb,k) & j; // 4 bits
bb = k & i; // 6 bits
}}}
fprintf(stderr,"\nN %d: Tss=%lf \n",nb_presentations+1,tss);
}
/* RESULTS */
printf(" "); for(p=0;p<16;p++) printf(" %Xx",p); printf("\n");
for(i=0;i<6;i++) {
printf("%Xx ",i);
for(p=0;p<16;p++) {
for(j=0;j<NB_ENTREES;j++)
activations_entrees[j]=(TYPE_REEL) (tabexemples[p][j]-'0');
calculer_sorties();
printf("%1.3lf ",activations_sorties[i]);
}
printf("\n");
}
printf("\n");
}
}
The SNAP was awarded the 1993 IEEE Gordon Bell Prize for the best price/performance in its
class of supercomputers.
Micro Devices
30 Skyline Drive
Lake Mary
FL 32746-6201
Tel.: (407) 333-4379
Micro Devices made the MD1220, a 'Neural Bit Slice'.
Each of these completed circuits has very different uses.
They seem similar to Intel's, but the architectures are not.
Intel Corp
2250 Mission College Blvd
NeuralWare, Inc
United Kingdom
Tel.: +44 730 60256
NeuroDynamX, Inc.
4730 Walnut St., Suite 101B
Boulder, CO 80301
Tel.: (303) 442-3539
Fax: (303) 442-2854
Internet: techsupport@ndx.com
NDX sells several neural-network products:
The NDX Neural Accelerator: a set of i860-based PC accelerator cards that deliver more
than 45 million connections per second.
The DynaMind neural network software.
The iNNTS: Intel's 80170NX (ETANN) Neural Network Training System.
ICT1050 motherboard:
- IBM-compatible PC
- DANN050L board
- digital interface
- configurations on demand
In the table, each row corresponds to an input XOR value and each column to an output XOR
value (in hexadecimal). The value of each cell counts the number of pairs (in decimal),
among the 64 x 64 = 4096 possible pairs, whose input XOR and output XOR are those specified
by the cell's row and column. Since there are only 64 x 16 = 1024 cells in the table, the
average cell value is 4.
The first row of the table is special. Since its input XOR is 0, the output XOR must also
be 0: the cell for output XOR 0 counts all 64 pairs whose input XOR is 0, and the other
cells of this row count no pairs at all. Different values appear in the other rows.
You can also consult, on the following pages, the C code used to generate these
distribution tables.
How should these tables be read?
Along the horizontal axis you will find the 16 possible output differences of the S-table,
each being the XOR of a pair of outputs. Along the vertical axis you will find the 64
possible input differences, each being the XOR of a pair of inputs.
For example, in the distribution table of S-table 1, for the input XOR 01x there are 11
possible output XORs. For the input XOR 34x and the output XOR 02x, the number of possible
pairs is 16; that is, 16 of the pairs whose input XOR is 34x lead to the output XOR 02x.
0Bx 02 04 00 10 02 02 04 00 02 06 02 06 06 04 02 12
0Cx 00 00 00 08 00 06 06 00 00 06 06 04 06 06 14 02
0Dx 06 06 04 08 04 08 02 06 00 06 04 06 00 02 00 02
0Ex 00 04 08 08 06 06 04 00 06 06 04 00 00 04 00 08
0Fx 02 00 02 04 04 06 04 02 04 08 02 02 02 06 08 08
10x 00 00 00 00 00 00 02 14 00 06 06 12 04 06 08 06
11x 06 08 02 04 06 04 08 06 04 00 06 06 00 04 00 00
12x 00 08 04 02 06 06 04 06 06 04 02 06 06 00 04 00
13x 02 04 04 06 02 00 04 06 02 00 06 08 04 06 04 06
14x 00 08 08 00 10 00 04 02 08 02 02 04 04 08 04 00
15x 00 04 06 04 02 02 04 10 06 02 00 10 00 04 06 04
16x 00 08 10 08 00 02 02 06 10 02 00 02 00 06 02 06
17x 04 04 06 00 10 06 00 02 04 04 04 06 06 06 02 00
18x 00 06 06 00 08 04 02 02 02 04 06 08 06 06 02 02
19x 02 06 02 04 00 08 04 06 10 04 00 04 02 08 04 00
1Ax 00 06 04 00 04 06 06 06 06 02 02 00 04 04 06 08
1Bx 04 04 02 04 10 06 06 04 06 02 02 04 02 02 04 02
1Cx 00 10 10 06 06 00 00 12 06 04 00 00 02 04 04 00
1Dx 04 02 04 00 08 00 00 02 10 00 02 06 06 06 14 00
1Ex 00 02 06 00 14 02 00 00 06 04 10 08 02 02 06 02
1Fx 02 04 10 06 02 02 02 08 06 08 00 00 00 04 06 04
20x 00 00 00 10 00 12 08 02 00 06 04 04 04 02 00 12
21x 00 04 02 04 04 08 10 00 04 04 10 00 04 00 02 08
22x 10 04 06 02 02 08 02 02 02 02 06 00 04 00 04 10
23x 00 04 04 08 00 02 06 00 06 06 02 10 02 04 00 10
24x 12 00 00 02 02 02 02 00 14 14 02 00 02 06 02 04
25x 06 04 04 12 04 04 04 10 02 02 02 00 04 02 02 02
26x 00 00 04 10 10 10 02 04 00 04 06 04 04 04 02 00
27x 10 04 02 00 02 04 02 00 04 08 00 04 08 08 04 04
28x 12 02 02 08 02 06 12 00 00 02 06 00 04 00 06 02
29x 04 02 02 10 00 02 04 00 00 14 10 02 04 06 00 04
2Ax 04 02 04 06 00 02 08 02 02 14 02 06 02 06 02 02
2Bx 12 02 02 02 04 06 06 02 00 02 06 02 06 00 08 04
2Cx 04 02 02 04 00 02 10 04 02 02 04 08 08 04 02 06
2Dx 06 02 06 02 08 04 04 04 02 04 06 00 08 02 00 06
2Ex 06 06 02 02 00 02 04 06 04 00 06 02 12 02 06 04
2Fx 02 02 02 02 02 06 08 08 02 04 04 06 08 02 04 02
30x 00 04 06 00 12 06 02 02 08 02 04 04 06 02 02 04
31x 04 08 02 10 02 02 02 02 06 00 00 02 02 04 10 08
32x 04 02 06 04 04 02 02 04 06 06 04 08 02 02 08 00
33x 04 04 06 02 10 08 04 02 04 00 02 02 04 06 02 04
34x 00 08 16 06 02 00 00 12 06 00 00 00 00 08 00 06
35x 02 02 04 00 08 00 00 00 14 04 06 08 00 02 14 00
36x 02 06 02 02 08 00 02 02 04 02 06 08 06 04 10 00
37x 02 02 12 04 02 04 04 10 04 04 02 06 00 02 02 04
38x 00 06 02 02 02 00 02 02 04 06 04 04 04 06 10 10
39x 06 02 02 04 12 06 04 08 04 00 02 04 02 04 04 00
3Ax 06 04 06 04 06 08 00 06 02 02 06 02 02 06 04 00
3Bx 02 06 04 00 00 02 04 06 04 06 08 06 04 04 06 02
3Cx 00 10 04 00 12 00 04 02 06 00 04 12 04 04 02 00
3Dx 00 08 06 02 02 06 00 08 04 04 00 04 00 12 04 04
3Ex 04 08 02 02 02 04 04 14 04 02 00 02 00 08 04 04
3Fx 04 08 04 02 04 00 02 04 04 02 04 08 08 06 02 02
00x 64 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00
01x 00 00 00 04 00 02 06 04 00 14 08 06 08 04 06 02
02x 00 00 00 02 00 04 06 04 00 00 04 06 10 10 12 06
03x 04 08 04 08 04 06 04 02 04 02 02 04 06 02 00 04
04x 00 00 00 00 00 06 00 14 00 06 10 04 10 06 04 04
05x 02 00 04 08 02 04 06 06 02 00 08 04 02 04 10 02
06x 00 12 06 04 06 04 06 02 02 10 02 08 02 00 00 00
07x 04 06 06 04 02 04 04 02 06 04 02 04 04 06 00 06
08x 00 00 00 04 00 04 00 08 00 10 16 06 06 00 06 04
09x 14 02 04 10 02 08 02 06 02 04 00 00 02 02 02 04
0Ax 00 06 06 02 10 04 10 02 06 02 02 04 02 02 04 02
0Bx 06 02 02 00 02 04 06 02 10 02 00 06 06 04 04 08
0Cx 00 00 00 04 00 14 00 10 00 06 02 04 04 08 06 06
0Dx 06 02 06 02 10 02 00 04 00 10 04 02 08 02 02 04
0Ex 00 06 12 08 00 04 02 00 08 02 04 04 06 02 00 06
0Fx 00 08 02 00 06 06 08 02 04 04 04 06 08 00 04 02
10x 00 00 00 08 00 04 10 02 00 02 08 10 00 10 06 04
11x 06 06 04 06 04 00 06 04 08 02 10 02 02 04 00 00
12x 00 06 02 06 02 04 12 04 06 04 00 04 04 06 02 02
13x 04 00 04 00 08 06 06 00 00 02 00 06 04 08 02 14
14x 00 06 06 04 10 00 02 12 06 02 02 02 04 04 02 02
15x 06 08 02 00 08 02 00 02 02 02 02 02 02 14 10 02
16x 00 08 06 04 02 02 04 02 06 04 06 02 06 00 06 06
17x 06 04 08 06 04 04 00 04 06 02 04 04 04 02 04 02
18x 00 06 04 06 10 04 00 02 04 08 00 00 04 08 02 06
19x 02 04 06 04 04 02 04 02 06 04 06 08 00 06 04 02
1Ax 00 06 08 04 02 04 02 02 08 02 02 06 02 04 04 08
1Bx 00 06 04 04 00 12 06 04 02 02 02 04 04 02 10 02
1Cx 00 04 06 06 12 00 04 00 10 02 06 02 00 00 10 02
1Dx 00 06 02 02 06 00 04 16 04 04 02 00 00 04 06 08
1Ex 00 04 08 02 10 06 06 00 08 04 00 02 04 04 00 06
1Fx 04 02 06 06 02 02 02 04 08 06 10 06 04 00 00 02
20x 00 00 00 02 00 12 10 04 00 00 00 02 14 02 08 10
21x 00 04 06 08 02 10 04 02 02 06 04 02 06 02 00 06
22x 04 12 08 04 02 02 00 00 02 08 08 06 00 06 00 02
23x 08 02 00 02 08 04 02 06 04 08 02 02 06 04 02 04
24x 10 04 00 00 00 04 00 02 06 08 06 10 08 00 02 04
25x 06 00 12 02 08 06 10 00 00 08 02 06 00 00 02 02
26x 02 02 04 04 02 02 10 14 02 00 04 02 02 04 06 04
27x 06 00 00 02 06 04 02 04 04 04 08 04 08 00 06 06
28x 08 00 08 02 04 12 02 00 02 06 02 00 06 02 00 10
29x 00 02 04 10 02 08 06 04 00 10 00 02 10 00 02 04
2Ax 04 00 04 08 06 02 04 04 06 06 02 06 02 02 04 04
2Bx 02 02 06 04 00 02 02 06 02 08 08 04 04 04 08 02
2Cx 10 06 08 06 00 06 04 04 04 02 04 04 00 00 02 04
2Dx 02 02 02 04 00 00 00 02 08 04 04 06 10 02 14 04
2Ex 02 04 00 02 10 04 02 00 02 02 06 02 08 08 10 02
2Fx 12 04 06 08 02 06 02 08 00 04 00 02 00 08 02 00
30x 00 04 00 02 04 04 08 06 10 06 02 12 00 00 00 06
31x 00 10 02 00 06 02 10 02 06 00 02 00 06 06 04 08
32x 08 04 06 00 06 04 04 08 04 06 08 00 02 02 02 00
33x 02 02 06 10 02 00 00 06 04 04 12 08 04 02 02 00
34x 00 12 06 04 06 00 04 04 04 00 04 06 04 02 04 04
35x 00 12 04 06 02 04 04 00 10 00 00 08 00 08 00 06
36x 08 02 04 00 04 00 04 02 00 08 04 02 06 16 02 02
37x 06 02 02 02 06 06 04 08 02 02 06 02 02 02 04 08
38x 00 08 08 10 06 02 02 00 04 00 04 02 04 00 04 10
39x 00 02 00 00 08 00 10 04 10 00 08 04 04 04 04 06
3Ax 04 00 02 08 04 02 02 02 04 08 02 00 04 10 10 02
3Bx 16 04 04 02 08 02 02 06 04 04 04 02 00 02 02 02
3Cx 00 02 06 02 08 04 06 00 10 02 02 04 04 10 04 00
3Dx 00 16 10 02 04 02 04 02 08 00 00 08 00 06 02 00
3Ex 04 04 00 10 02 04 02 14 04 02 06 06 00 00 06 00
3Fx 04 00 00 02 00 08 02 04 00 02 04 04 04 14 10 06
11x 08 02 02 06 04 00 02 00 08 04 12 02 10 00 02 02
12x 00 02 08 02 04 08 00 08 08 00 02 02 04 02 14 00
13x 04 04 12 00 02 02 02 10 02 02 02 02 04 04 04 08
14x 00 06 04 04 06 04 06 02 08 06 06 02 02 00 00 08
15x 04 08 02 08 02 04 08 00 04 02 02 02 02 06 08 02
16x 00 06 10 02 08 04 02 00 02 02 02 08 04 06 04 04
17x 00 06 06 00 06 02 04 04 06 02 02 10 06 08 02 00
18x 00 08 04 06 06 00 06 02 04 00 04 02 10 00 06 06
19x 04 02 04 08 04 02 10 02 02 02 06 08 02 06 00 02
1Ax 00 08 06 04 04 00 06 04 04 08 00 10 02 02 02 04
1Bx 04 10 02 00 02 04 02 04 08 02 02 08 04 02 08 02
1Cx 00 06 08 08 04 02 08 00 12 00 10 00 04 00 02 00
1Dx 00 02 00 06 02 08 04 06 02 00 04 02 04 10 00 14
1Ex 00 04 08 02 04 06 00 04 10 00 02 06 04 08 04 02
1Fx 00 06 08 00 10 06 04 06 04 02 02 10 04 00 00 02
20x 00 00 00 00 00 04 04 08 00 02 02 04 10 16 12 02
21x 10 08 08 00 08 04 02 04 00 06 06 06 00 00 02 00
22x 12 06 04 04 02 04 10 02 00 04 04 02 04 04 00 02
23x 02 02 00 06 00 02 04 00 04 12 04 02 06 04 08 08
24x 04 08 02 12 06 04 02 10 02 02 02 04 02 00 04 00
25x 06 00 02 00 08 02 00 02 08 08 02 02 04 04 10 06
26x 06 02 00 04 04 00 04 00 04 02 14 00 08 10 00 06
27x 00 02 04 16 08 06 06 06 00 02 04 04 00 02 02 02
28x 06 02 10 00 06 04 00 04 04 02 04 08 02 02 08 02
29x 00 02 08 04 00 04 00 06 04 10 04 08 04 04 04 02
2Ax 02 06 00 04 02 04 04 06 04 08 04 04 04 02 04 06
2Bx 10 02 06 06 04 04 08 00 04 02 02 00 02 04 04 06
2Cx 10 04 06 02 04 02 02 02 04 10 04 04 00 02 06 02
2Dx 04 02 04 04 04 02 04 16 02 00 00 04 04 02 06 06
2Ex 04 00 02 10 00 06 10 04 02 06 06 02 02 00 02 08
2Fx 08 02 00 00 04 04 04 02 06 04 06 02 04 08 04 06
30x 00 10 08 06 02 00 04 02 10 04 04 06 02 00 06 00
31x 02 06 02 00 04 02 08 08 02 02 02 00 02 12 06 06
32x 02 00 04 08 02 08 04 04 08 04 02 08 06 02 00 02
33x 04 04 06 08 06 06 00 02 02 02 06 04 12 00 00 02
34x 00 06 02 02 16 02 02 02 12 02 04 00 04 02 00 08
35x 04 06 00 10 08 00 02 02 06 00 00 06 02 10 02 06
36x 04 04 04 04 00 06 06 04 04 04 04 04 00 06 02 08
37x 04 08 02 04 02 02 06 00 02 04 08 04 10 00 06 02
38x 00 08 12 00 02 02 06 06 02 10 02 02 00 08 00 04
39x 02 06 04 00 06 04 06 04 08 00 04 04 02 04 08 02
3Ax 06 00 02 02 04 06 04 04 04 02 02 06 12 02 06 02
3Bx 02 02 06 00 00 10 04 08 04 02 04 08 04 04 00 06
3Cx 00 02 04 02 12 02 00 06 02 00 02 08 04 06 04 10
3Dx 04 06 08 06 02 02 02 02 10 02 06 06 02 04 02 00
3Ex 08 06 04 04 02 10 02 00 02 02 04 02 04 02 10 02
3Fx 02 06 04 00 00 10 08 02 02 08 06 04 06 02 00 04
06x 00 08 08 04 08 08 00 00 08 00 08 00 04 00 00 08
07x 04 02 06 04 06 00 16 06 02 00 00 02 04 02 06 04
08x 00 00 00 04 00 08 04 08 00 04 08 08 04 08 08 00
09x 08 04 04 04 04 00 08 04 04 00 00 04 04 04 04 08
0Ax 00 06 06 00 06 04 04 06 06 04 04 06 00 06 06 00
0Bx 00 12 00 08 00 00 00 00 12 00 00 12 08 12 00 00
0Cx 00 00 00 04 00 08 04 08 00 04 08 08 04 08 08 00
0Dx 08 04 04 04 04 00 00 04 04 08 00 04 04 04 04 08
0Ex 00 06 06 04 06 00 04 06 06 04 00 06 04 06 06 00
0Fx 00 06 06 04 06 04 00 06 06 00 04 06 04 06 06 00
10x 00 00 00 00 00 08 12 04 00 12 08 04 00 04 04 08
11x 04 02 02 16 02 04 00 02 02 00 04 02 16 02 02 04
12x 00 00 00 08 00 04 04 08 00 04 04 08 08 08 08 00
13x 08 02 06 00 06 04 00 06 02 08 04 02 00 02 06 08
14x 00 08 08 00 08 00 08 00 08 08 00 00 00 00 00 16
15x 08 04 04 00 04 08 00 04 04 00 08 04 00 04 04 08
16x 00 08 08 04 08 08 00 00 08 00 08 00 04 00 00 08
17x 04 06 02 04 02 00 00 02 06 16 00 06 04 06 02 04
18x 00 08 08 08 08 04 00 00 08 00 04 00 08 00 00 08
19x 04 04 04 00 04 04 16 04 04 00 04 04 00 04 04 04
1Ax 00 06 06 04 06 00 04 06 06 04 00 06 04 06 06 00
1Bx 00 06 06 04 06 04 00 06 06 00 04 06 04 06 06 00
1Cx 00 08 08 08 08 04 00 00 08 00 04 00 08 00 00 08
1Dx 04 04 04 00 04 04 00 04 04 16 04 04 00 04 04 04
1Ex 00 06 06 00 06 04 04 06 06 04 04 06 00 06 06 00
1Fx 00 00 12 08 12 00 00 12 00 00 00 00 08 00 12 00
20x 00 00 00 08 00 00 00 12 00 00 00 12 08 12 12 00
21x 00 04 08 00 08 04 08 08 04 00 04 04 00 04 08 00
22x 08 02 02 00 02 04 08 06 02 08 04 06 00 06 06 00
23x 04 06 02 08 02 04 00 02 06 00 04 06 08 06 02 04
24x 00 06 06 04 06 04 00 06 06 00 04 06 04 06 06 00
25x 00 08 04 04 04 00 00 04 08 08 00 08 04 08 04 00
26x 00 06 06 00 06 04 08 02 06 08 04 02 00 02 02 08
27x 04 06 02 08 02 04 00 02 06 00 04 06 08 06 02 04
28x 16 04 04 00 04 04 04 04 04 04 04 04 00 04 04 00
29x 00 06 02 08 02 04 00 02 06 08 04 06 08 06 02 00
2Ax 00 02 02 16 02 04 04 02 02 04 04 02 16 02 02 00
2Bx 08 00 04 00 04 08 16 04 00 00 08 00 00 00 04 08
2Cx 08 04 04 04 04 00 08 04 04 08 00 04 04 04 04 00
2Dx 04 02 06 04 06 08 00 06 02 00 08 02 04 02 06 04
2Ex 16 00 00 00 00 16 00 00 00 00 16 00 00 00 00 16
2Fx 16 00 00 00 00 00 16 00 00 16 00 00 00 00 00 16
30x 00 06 06 04 06 04 00 06 06 00 04 06 04 06 06 00
31x 00 08 04 04 04 00 00 04 08 08 00 08 04 08 04 00
32x 16 06 06 04 06 00 04 02 06 04 00 02 04 02 02 00
33x 00 02 06 04 06 08 08 06 02 00 08 02 04 02 06 00
34x 00 12 12 08 12 00 00 00 12 00 00 00 08 00 00 00
35x 00 04 08 00 08 04 08 08 04 00 04 04 00 04 08 00
36x 00 02 02 04 02 00 04 06 02 04 00 06 04 06 06 16
37x 00 02 06 04 06 08 08 06 02 00 08 02 04 02 06 00
38x 00 04 04 00 04 04 04 04 04 04 04 04 00 04 04 16
39x 00 06 02 08 02 04 00 02 06 08 04 06 08 06 02 00
3Ax 00 04 04 00 04 08 08 04 04 08 08 04 00 04 04 00
3Bx 16 04 04 00 04 00 00 04 04 00 00 04 00 04 04 16
3Cx 00 04 04 04 04 00 08 04 04 08 00 04 04 04 04 08
3Dx 04 02 06 04 06 08 00 06 02 00 08 02 04 02 06 04
3Ex 00 02 02 08 02 12 04 02 02 04 12 02 08 02 02 00
3Fx 08 04 00 08 00 00 00 00 04 16 00 04 08 04 00 08
17x 00 06 08 04 08 04 00 02 08 04 00 02 02 08 06 02
18x 00 10 08 00 06 04 00 04 04 04 06 04 04 04 00 06
19x 00 04 06 02 04 04 02 06 04 02 02 04 12 02 10 00
1Ax 00 02 16 02 12 02 00 06 04 00 00 04 00 04 04 08
1Bx 02 08 12 00 00 02 02 06 08 04 00 06 00 00 08 06
1Cx 00 10 02 06 06 06 06 04 08 02 00 04 04 04 02 00
1Dx 04 06 02 00 08 02 04 06 06 00 08 06 02 04 02 04
1Ex 00 02 06 02 04 00 00 02 12 02 02 06 02 10 10 04
1Fx 00 06 08 04 08 08 00 06 06 02 00 06 00 06 02 02
20x 00 00 00 08 00 08 02 06 00 04 04 04 06 06 08 08
21x 00 00 00 06 06 02 06 04 06 10 14 04 00 00 04 02
22x 14 04 00 10 00 02 12 02 02 02 10 02 00 00 02 02
23x 02 00 00 04 02 02 10 04 00 08 08 02 06 08 00 08
24x 06 02 08 04 04 04 06 02 02 06 06 02 06 02 02 02
25x 06 00 00 08 02 08 02 06 06 04 02 02 04 02 06 06
26x 12 00 00 04 00 04 04 04 00 08 04 00 12 08 00 04
27x 12 02 00 02 00 12 02 02 04 04 08 04 08 02 02 00
28x 02 08 04 06 02 04 06 00 06 06 04 00 02 02 02 10
29x 06 04 06 08 08 04 06 02 00 00 02 02 10 00 02 04
2Ax 04 04 00 02 02 04 06 02 00 00 06 04 10 04 04 12
2Bx 04 06 02 06 00 00 12 02 00 04 12 02 06 04 00 04
2Cx 08 06 02 06 04 08 06 00 04 04 00 02 06 00 06 02
2Dx 04 04 00 04 00 06 04 02 04 12 00 04 04 06 04 06
2Ex 06 00 02 04 00 06 06 04 02 10 06 10 06 02 00 00
2Fx 10 04 00 02 02 06 10 02 00 02 02 04 06 02 02 10
30x 00 04 08 04 06 04 00 06 10 04 02 04 02 06 04 00
31x 00 06 06 04 10 02 00 00 04 04 00 00 04 06 12 06
32x 04 06 00 02 06 04 06 00 06 00 04 06 04 10 06 00
33x 08 10 00 14 08 00 00 08 02 00 02 04 00 04 04 00
34x 00 04 04 02 14 04 00 08 06 08 02 02 00 04 06 00
35x 00 04 16 00 08 04 00 04 04 04 00 08 00 04 04 04
36x 04 04 04 06 02 02 02 12 02 04 04 08 02 04 04 00
37x 04 02 02 02 04 02 00 08 02 02 02 12 06 02 08 06
38x 00 04 08 04 12 00 00 08 10 02 00 00 00 04 02 10
39x 00 08 12 00 02 02 02 02 12 04 00 08 00 04 04 04
3Ax 00 14 04 00 04 06 00 00 06 02 10 08 00 00 04 06
3Bx 00 02 02 02 04 04 08 06 08 02 02 02 06 14 02 00
3Cx 00 00 10 02 06 00 00 02 06 02 02 10 02 04 10 08
3Dx 00 06 12 02 04 08 00 08 08 02 02 00 02 02 04 04
3Ex 04 04 10 00 02 04 08 08 02 02 00 02 06 08 04 00
3Fx 08 06 06 00 04 02 02 04 04 02 08 06 02 04 06 00
0Cx 00 00 00 12 00 10 04 06 00 08 04 04 02 12 02 00
0Dx 12 00 02 10 06 04 04 02 04 02 06 00 02 06 00 04
0Ex 00 06 04 00 04 04 10 08 06 02 04 06 02 00 06 02
0Fx 02 02 02 02 06 02 06 02 10 04 08 02 06 04 04 02
10x 00 00 00 08 00 08 00 12 00 04 02 06 08 04 06 06
11x 06 02 06 04 06 02 06 04 06 06 04 02 04 00 06 00
12x 00 08 04 02 00 04 02 00 04 10 06 02 08 06 04 04
13x 06 06 12 00 12 02 00 06 06 02 00 04 00 02 04 02
14x 00 04 06 02 08 06 00 02 06 10 04 00 02 04 06 04
15x 02 02 06 06 04 04 02 06 02 06 08 04 04 00 04 04
16x 00 04 14 06 08 04 02 06 02 00 02 00 04 02 00 10
17x 02 06 08 00 00 02 00 02 02 06 00 08 08 02 12 06
18x 00 04 06 06 08 04 02 02 06 04 06 04 02 04 02 04
19x 02 06 00 02 04 04 04 06 04 08 06 04 02 02 06 04
1Ax 00 06 06 00 08 02 04 06 04 02 04 06 02 00 04 10
1Bx 00 04 10 02 04 04 02 06 06 06 02 02 06 06 02 02
1Cx 00 00 08 02 12 02 06 02 08 06 06 02 04 00 04 02
1Dx 02 04 00 06 08 06 00 02 06 08 06 00 02 04 00 10
1Ex 00 10 08 02 08 02 00 02 06 04 02 04 06 04 02 04
1Fx 00 06 06 08 06 04 02 04 04 02 02 00 02 04 02 12
20x 00 00 00 00 00 06 06 04 00 04 08 08 04 06 10 08
21x 02 08 06 08 04 04 06 06 08 04 00 04 00 02 02 00
22x 16 02 04 06 02 04 02 00 06 04 08 02 00 02 02 04
23x 00 04 00 04 04 06 10 04 02 02 06 02 04 06 06 04
24x 10 08 00 06 12 06 10 04 08 00 00 00 00 00 00 00
25x 00 02 04 02 00 04 04 00 04 00 10 10 04 10 06 04
26x 02 02 00 12 02 02 06 02 04 04 08 00 06 06 08 00
27x 08 04 00 08 02 04 02 04 00 06 02 04 04 08 02 06
28x 06 08 04 06 00 04 02 02 04 08 02 06 04 02 02 04
29x 02 04 04 00 08 08 06 08 06 04 00 04 04 04 02 00
2Ax 06 00 00 06 06 04 06 08 02 04 00 02 02 04 06 08
2Bx 12 00 04 00 00 04 02 02 02 06 10 06 10 02 04 00
2Cx 04 02 06 00 00 06 08 06 04 02 02 08 04 06 04 02
2Dx 06 02 02 06 06 04 04 02 06 02 04 08 04 02 04 02
2Ex 04 06 02 04 02 04 04 02 04 02 04 06 04 10 04 02
2Fx 10 00 04 08 00 06 06 02 00 04 04 02 06 02 02 08
30x 00 12 08 02 00 06 00 00 06 06 00 02 08 02 06 06
31x 02 06 10 04 02 02 02 04 06 00 02 06 00 02 04 12
32x 04 02 02 08 10 08 08 06 00 02 02 04 04 02 02 00
33x 04 02 02 02 06 00 04 00 10 06 06 04 00 04 08 06
34x 00 04 04 02 06 04 00 04 06 02 06 04 02 08 00 12
35x 06 12 04 02 04 02 02 04 08 02 02 00 06 04 04 02
36x 00 02 02 04 04 04 04 00 02 10 12 04 00 10 04 02
37x 10 02 02 06 14 02 02 06 02 00 04 06 02 00 04 02
38x 00 04 14 00 08 02 00 04 04 04 02 00 08 02 04 08
39x 02 04 08 00 06 02 00 06 02 06 04 02 08 06 02 06
3Ax 08 04 00 04 06 02 00 04 06 08 06 00 06 00 04 06
3Bx 00 04 06 06 02 02 02 14 00 12 00 04 02 02 08 00
3Cx 00 06 16 00 02 02 02 08 04 02 00 12 06 02 02 00
3Dx 00 06 02 02 02 06 08 02 04 02 06 02 06 02 04 10
3Ex 04 02 02 04 04 00 06 10 04 02 04 06 06 02 06 02
3Fx 00 04 06 06 04 08 04 00 04 08 04 00 04 08 02 02
01x 00 00 00 02 00 04 04 14 00 12 04 06 02 06 06 04
02x 00 00 00 00 00 12 02 02 00 04 00 04 08 12 06 14
03x 08 02 12 02 06 08 06 00 06 04 04 02 02 00 00 02
04x 00 00 00 08 00 04 04 08 00 08 08 12 02 06 02 02
05x 06 00 00 02 08 00 08 04 00 02 06 00 10 06 06 06
06x 00 02 12 00 08 04 08 02 04 04 04 02 06 00 06 02
07x 04 06 04 12 00 04 02 00 00 14 02 06 04 00 00 06
08x 00 00 00 08 00 00 06 10 00 04 12 04 06 06 00 08
09x 10 08 04 08 06 02 02 00 02 06 08 02 00 06 00 00
0Ax 00 10 06 02 12 02 04 00 04 04 06 04 04 00 00 06
0Bx 00 02 02 02 04 08 06 04 04 00 04 02 06 04 02 14
0Cx 00 00 00 04 00 04 08 04 00 02 06 00 14 12 08 02
0Dx 06 06 02 04 02 06 04 06 06 04 08 08 00 02 00 00
0Ex 00 12 10 10 00 02 04 02 08 06 04 02 00 00 02 02
0Fx 02 00 00 00 06 08 08 00 06 02 04 06 08 00 06 08
10x 00 00 00 04 00 02 08 06 00 06 04 10 08 04 08 04
11x 06 10 10 04 04 02 00 04 04 00 02 08 04 02 02 02
12x 00 00 08 08 02 08 02 08 06 04 02 08 00 00 08 00
13x 04 04 02 02 08 06 00 02 02 02 00 04 06 08 14 00
14x 00 08 06 02 08 08 02 06 04 02 00 02 08 06 00 02
15x 04 04 08 02 04 00 04 10 08 02 04 04 04 02 00 04
16x 00 06 10 02 02 02 02 04 10 08 02 02 00 04 10 00
17x 08 02 04 02 06 04 00 06 04 04 02 02 00 04 08 08
18x 00 16 02 02 06 00 06 00 06 02 08 00 06 00 02 08
19x 00 08 00 02 04 04 10 04 08 00 06 04 02 06 02 04
1Ax 00 02 04 08 12 04 00 06 04 04 00 02 00 06 04 08
1Bx 00 06 02 06 04 02 04 04 06 04 08 04 02 00 10 02
1Cx 00 08 04 04 02 06 06 06 06 04 06 08 00 02 00 02
1Dx 04 04 04 00 00 02 04 02 04 02 02 04 10 10 08 04
1Ex 00 00 02 02 12 06 02 00 12 02 02 04 02 06 08 04
1Fx 02 02 10 14 02 04 02 04 04 06 00 02 04 08 00 00
20x 00 00 00 14 00 08 04 02 00 04 02 08 02 06 00 14
21x 04 02 06 02 12 02 04 00 06 04 10 02 04 02 02 02
22x 10 06 00 02 04 04 10 00 04 00 12 02 08 00 00 02
23x 00 06 02 02 02 04 06 10 00 04 08 02 02 06 00 10
24x 04 02 00 06 08 02 06 00 08 02 02 00 08 02 12 02
25x 02 00 02 16 02 04 06 04 06 08 02 04 00 06 00 02
26x 06 10 00 10 00 06 04 04 02 02 04 06 02 04 02 02
27x 04 00 02 00 02 02 14 00 04 06 06 02 12 02 04 04
28x 14 04 06 04 04 06 02 00 06 06 02 02 04 00 02 02
29x 02 02 00 02 00 08 04 02 04 06 04 04 06 04 12 04
2Ax 02 04 00 00 00 02 08 12 00 08 02 04 08 04 04 06
2Bx 16 06 02 04 06 10 02 02 02 02 02 02 04 02 02 00
2Cx 02 06 06 08 02 02 00 06 00 08 04 02 02 06 08 02
2Dx 06 02 04 02 08 08 02 08 02 04 04 00 02 00 08 04
2Ex 02 04 08 00 02 02 02 04 00 02 08 04 14 06 00 06
2Fx 02 02 02 08 00 02 02 06 04 06 08 08 06 02 00 06
30x 00 06 08 02 08 04 04 00 10 04 04 06 00 00 02 06
31x 00 08 04 00 06 02 02 06 06 00 00 02 06 04 08 10
32x 02 04 00 00 06 04 10 06 06 04 06 02 04 06 02 02
33x 00 16 06 08 02 00 02 02 04 02 08 04 00 04 06 00
34x 00 04 14 08 02 02 02 04 16 02 02 02 00 02 00 04
35x 00 06 00 00 10 08 02 02 06 00 00 08 06 04 04 08
36x 02 00 02 02 04 06 04 04 02 02 04 02 04 16 10 00
37x 06 06 06 08 04 02 04 04 04 00 06 08 02 04 00 00
38x 00 02 02 02 08 08 00 02 02 02 00 06 06 04 10 10
39x 04 04 16 08 00 06 04 02 04 04 02 06 00 02 02 00
3Ax 16 06 04 00 02 00 02 06 00 04 08 10 00 00 04 02
3Bx 02 00 00 02 00 04 04 04 02 06 02 06 06 12 12 02
3Cx 00 00 08 00 12 08 02 06 06 04 00 02 02 04 06 04
3Dx 02 04 12 02 02 02 00 04 06 10 02 06 04 02 00 06
3Ex 04 06 06 06 02 00 04 08 02 10 04 06 00 04 02 00
3Fx 14 00 00 00 08 00 06 08 04 02 00 00 04 08 04 06
12x 00 04 00 08 06 02 08 04 02 04 04 06 02 04 10 00
13x 04 02 02 06 08 06 02 02 14 02 02 04 02 02 02 04
14x 00 16 04 02 06 00 02 06 04 00 04 06 04 06 04 00
15x 00 10 06 00 06 00 02 08 02 02 00 08 02 06 06 06
16x 00 12 06 04 06 00 00 00 08 06 06 02 02 06 04 02
17x 00 06 08 00 06 02 04 06 06 00 02 06 04 04 02 08
18x 00 12 02 02 08 00 08 00 10 04 04 02 04 02 00 06
19x 06 04 08 00 08 00 04 02 00 00 12 02 04 06 02 06
1Ax 00 04 06 02 08 08 00 04 08 00 00 00 06 02 00 16
1Bx 02 04 08 10 02 04 02 08 02 04 08 02 00 02 04 02
1Cx 00 12 06 04 06 04 02 02 06 00 04 04 02 10 02 00
1Dx 08 06 00 00 10 00 00 08 10 04 02 02 02 08 04 00
1Ex 00 04 08 06 08 02 04 04 10 02 02 04 02 00 06 02
1Fx 04 02 04 02 06 02 04 00 02 06 02 02 02 16 08 02
20x 00 00 00 16 00 04 00 00 00 14 06 04 02 00 04 14
21x 00 00 02 10 02 08 10 00 00 06 06 00 10 02 02 06
22x 08 00 06 00 06 04 10 02 00 06 08 00 04 04 02 04
23x 04 08 00 06 00 04 08 06 02 02 10 04 08 00 00 02
24x 04 00 04 08 04 06 02 04 08 06 02 00 00 04 04 08
25x 00 04 06 08 02 08 08 00 04 02 04 04 02 02 06 04
26x 02 06 00 06 04 04 04 06 06 00 04 04 10 04 02 02
27x 06 06 00 00 02 02 06 02 04 04 06 10 02 06 02 06
28x 10 02 06 02 04 12 12 00 02 02 04 00 00 00 02 06
29x 04 00 00 14 02 10 04 02 08 06 04 00 04 02 02 02
2Ax 08 08 00 02 00 02 04 00 02 06 08 14 02 08 00 00
2Bx 02 02 00 00 04 02 10 04 06 02 04 00 06 04 08 10
2Cx 02 06 06 02 04 06 02 00 02 06 04 00 06 04 10 04
2Dx 08 00 04 04 06 02 00 00 06 08 02 04 06 04 04 06
2Ex 06 02 02 04 02 02 06 12 04 00 04 02 08 08 00 02
2Fx 08 12 04 06 06 04 02 02 02 02 04 02 02 04 00 04
30x 00 04 06 02 10 02 02 02 04 08 00 00 08 04 06 06
31x 04 06 08 00 04 06 00 04 04 06 10 02 02 04 04 00
32x 06 06 06 02 04 06 00 02 00 06 08 02 02 06 06 02
33x 06 06 04 02 04 00 00 10 02 02 00 06 08 04 00 10
34x 00 02 12 04 10 04 00 04 12 00 02 04 02 02 02 04
35x 06 04 04 00 10 00 00 04 10 00 00 04 02 08 08 04
36x 04 06 02 02 02 02 06 08 06 04 02 06 00 04 10 00
37x 02 02 08 02 04 04 04 02 06 02 00 10 06 10 02 00
38x 00 04 08 04 02 06 06 02 04 02 02 04 06 04 04 06
39x 04 04 04 08 00 06 00 06 04 08 02 02 02 04 08 02
3Ax 08 08 00 04 02 00 10 04 00 00 00 04 08 06 08 02
3Bx 08 02 06 04 04 04 04 00 06 04 04 06 04 04 04 00
3Cx 00 06 06 06 06 00 00 08 08 02 04 08 04 02 04 00
3Dx 02 02 08 00 10 00 02 12 00 04 00 08 00 02 06 08
3Ex 06 04 00 00 04 04 00 10 06 02 06 12 02 04 00 04
3Fx 00 06 06 00 04 04 06 10 00 06 08 02 00 04 08 00
In each table, a row corresponds to an input value and a column to an output value (both in hexadecimal) of the S-box.
The algorithm is described in paragraph 3.6.2.
The C code used to generate these distribution tables can be consulted on the following pages.
15x 0 0 4 -4 -4 4 4 -4 10 2 2 2 -6 2 6 -2
16x 0 6 2 0 2 -4 0 2 4 2 2 0 -2 0 0 2
17x 0 2 6 -8 6 4 0 -2 -12 -2 -2 0 -6 0 0 -2
18x 0 2 8 2 0 6 4 2 4 -2 4 6 0 -2 -4 2
19x 0 -2 4 -6 0 -6 0 2 4 -6 8 6 0 2 0 -6
1Ax 0 0 -6 2 -2 -2 4 4 -2 -2 0 0 -4 4 2 2
1Bx 0 4 6 2 -10 2 -8 4 -2 -6 4 0 4 0 -2 2
1Cx 0 -4 2 2 2 -6 0 -4 -2 -2 4 0 0 4 2 2
1Dx 0 4 -2 -2 2 -6 -4 0 2 2 -4 0 -12 0 -6 -6
1Ex 0 2 0 -2 4 -2 0 -2 0 6 -4 -2 0 -2 0 2
1Fx 0 2 -4 2 -4 -2 4 2 4 -6 4 -2 -4 2 0 2
20x 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
21x 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
22x 0 2 -2 0 2 0 0 6 -2 0 -4 6 4 10 10 0
23x 0 2 -2 0 2 0 0 6 6 0 4 -10 -4 -6 2 0
24x 0 2 -6 -8 2 4 4 2 2 0 0 2 0 6 -10 0
25x 0 -2 2 4 2 0 -4 -2 -2 0 4 2 -4 6 -6 0
26x 0 4 4 -4 8 -8 4 0 0 -8 0 -4 0 4 0 0
27x 0 0 -4 -8 -8 4 -4 -4 4 -8 -4 -4 4 4 -4 0
28x 0 4 -2 -2 -2 -2 -4 0 4 4 2 6 -2 -6 12 4
29x 0 0 -2 -6 2 -2 -8 8 0 -4 -2 -2 6 -2 -4 0
2Ax 0 2 0 -2 0 2 0 -2 2 -8 -6 -4 2 0 2 -4
2Bx 0 -10 0 2 -4 2 4 6 -2 0 -10 4 2 -4 -6 0
2Cx 0 6 -4 2 8 2 4 6 -2 -4 -2 12 -2 -8 -2 0
2Dx 0 -2 -4 2 -4 -2 0 2 -2 -4 -2 4 -6 4 2 -4
2Ex 0 -4 2 6 6 -6 -8 4 -4 0 2 6 6 2 4 0
2Fx 0 -4 2 -2 2 -10 12 0 -4 0 2 -2 10 6 0 4
30x 0 -2 -2 0 -2 4 0 2 0 2 6 4 6 0 0 -2
31x 0 -2 2 4 2 0 0 -6 -4 -2 -2 -4 -2 0 -4 2
32x 0 -4 -4 -4 0 0 0 4 -2 -2 -6 -2 -6 6 2 2
33x 0 -4 0 0 4 -4 0 -4 -6 2 2 -2 2 -2 -2 -2
34x 0 8 -8 8 4 4 0 0 -2 -2 2 -6 6 6 -2 -2
35x 0 4 -4 0 -8 -4 0 -4 -2 2 -2 2 2 -2 -2 2
36x 0 6 2 -8 2 -4 8 2 4 -6 2 0 6 0 0 2
37x 0 2 -10 0 6 4 8 -2 4 6 -2 0 2 0 0 -2
38x 0 -10 4 2 -4 -2 4 -2 4 2 0 6 -4 6 -4 -2
39x 0 2 0 -6 -4 2 0 -2 -4 6 -4 -2 4 2 8 -2
3Ax 0 0 6 -2 6 6 0 0 2 2 0 0 8 0 2 2
3Bx 0 4 2 -2 -2 10 4 0 -14 -2 4 0 0 -4 -2 2
3Cx 0 0 10 -2 -6 -2 0 8 -6 6 0 -8 -4 4 -2 2
3Dx 0 8 -2 2 -6 -2 4 4 -2 -6 0 0 0 0 -2 2
3Ex 0 2 0 -2 -8 2 4 2 0 -2 4 -2 -4 2 4 -2
3Fx 0 -14 -12 -6 0 2 0 -2 -4 -6 12 -2 0 -2 4 -2
00x 01x 02x 03x 04x 05x 06x 07x 08x 09x 0Ax 0Bx 0Cx 0Dx 0Ex 0Fx
00x 32 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
01x 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
02x 0 0 0 4 0 0 -4 0 0 0 -4 0 0 0 0 4
03x 0 0 4 -8 0 8 0 -4 0 0 -8 -4 0 8 -4 8
04x 0 -2 2 4 2 0 4 6 0 6 -2 0 2 0 0 10
05x 0 2 2 0 2 -4 4 -6 0 -6 6 4 -6 4 0 -2
06x 0 -2 -2 -4 -2 -4 0 -2 -4 2 2 0 2 0 4 10
07x 0 2 2 -4 -2 0 4 -2 -4 6 6 0 -6 -4 0 2
08x 0 0 2 2 -2 2 4 0 -2 -6 -4 0 0 0 10 -6
09x 0 0 -2 -2 -2 2 8 4 -2 2 0 -4 0 8 -10 -2
0Ax 0 -4 2 2 2 2 -4 -8 2 2 4 0 0 4 -6 2
0Bx 0 4 2 10 2 2 4 0 2 2 4 0 0 -4 2 2
0Cx 0 -2 4 -2 0 2 0 6 -2 0 2 0 2 0 -6 -4
0Dx 0 -6 0 -2 0 6 4 -10 6 -4 6 0 2 -4 -2 4
0Ex 0 -6 0 2 0 -2 -8 6 -2 4 2 0 2 4 -2 0
0Fx 0 -2 0 -2 0 2 0 10 6 8 2 4 2 0 -2 4
10x 0 0 -4 4 0 0 0 0 0 0 4 4 4 -12 -4 -12
11x 0 0 0 0 0 -8 4 4 0 0 8 0 -4 4 8 0
12x 0 0 0 4 0 8 0 4 0 0 -4 8 -4 -12 0 12
13x 0 0 -8 4 0 8 8 4 0 0 -4 0 4 -4 8 -4
14x 0 -2 -6 -4 -2 4 -4 -2 -4 2 2 -4 6 -4 0 2
15x 0 10 -2 -4 -2 0 0 -2 4 6 6 -4 -2 0 4 2
16x 0 -2 -6 0 2 0 4 2 8 -2 2 0 6 4 0 -2
17x 0 10 2 4 2 4 -4 -2 0 2 2 -4 -2 0 0 2
18x 0 -4 2 -2 2 2 4 4 -2 -2 4 4 0 4 -2 2
19x 0 4 -6 6 2 2 4 -4 -2 -2 4 -4 8 4 -2 2
1Ax 0 8 -2 2 -2 2 0 0 2 6 0 0 0 0 2 -2
1Bx 0 -8 10 6 -2 2 4 -4 2 -2 -4 -4 -8 0 -2 -6
1Cx 0 2 8 -2 0 -2 0 2 2 0 -6 4 10 -4 2 0
1Dx 0 -2 0 2 0 -6 0 -2 2 4 2 0 10 0 2 4
1Ex 0 -2 0 6 0 2 4 -2 2 4 -2 0 2 0 2 0
1Fx 0 2 -4 6 0 -2 -8 -2 -14 0 2 0 2 4 -2 0
20x 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
21x 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
22x 0 0 0 4 0 -8 4 0 0 0 -4 -16 0 -8 -8 4
23x 0 0 -4 0 0 0 0 4 0 0 0 4 0 0 -4 0
24x 0 -2 -2 0 -2 -4 4 -10 0 6 -6 -4 6 4 8 2
25x 0 -6 -2 4 -2 0 4 2 0 2 2 -8 -2 0 8 -2
26x 0 6 -6 0 -14 0 0 -2 -4 -6 -2 4 -2 -4 -4 2
27x 0 2 6 0 2 -4 -4 -2 -4 6 -6 4 6 0 0 -6
28x 0 -4 2 -2 2 2 0 -8 2 2 0 8 8 -4 -6 -2
29x 0 -4 -2 -6 2 2 4 -4 -6 2 -4 -4 0 -4 -2 -6
2Ax 0 8 2 -2 6 -6 0 0 -2 2 0 0 0 0 -6 -2
2Bx 0 0 -6 -2 6 10 0 0 -10 10 0 0 -8 0 2 -2
2Cx 0 -6 0 6 0 -2 -4 -2 2 -8 2 4 -2 0 2 8
2Dx 0 -2 -4 -2 0 -6 0 -10 2 4 -2 4 -10 -4 -2 0
2Ex 0 -2 -4 2 8 2 4 -2 -6 -4 -6 4 -2 4 -2 4
2Fx 0 -6 4 -2 -8 -2 4 2 -6 0 10 0 6 0 -2 0
30x 0 0 -4 4 -4 -4 4 4 0 0 -4 -4 0 0 -8 0
31x 0 0 0 0 4 -4 0 0 0 0 0 -8 0 -8 -4 4
32x 0 0 0 4 4 4 4 0 0 0 4 0 0 0 -4 0
33x 0 0 0 -4 -4 -4 -4 0 0 0 -4 0 0 0 4 0
34x 0 -10 -2 -8 6 4 -8 2 4 2 6 -8 -2 -4 4 -2
35x 0 -6 2 0 -2 0 4 2 -4 -2 -6 0 -2 0 0 -2
36x 0 -2 -2 4 -6 0 8 -2 0 6 6 4 -2 4 -4 2
37x 0 2 -2 -8 -14 4 0 2 8 2 -2 0 -2 0 -4 -2
38x 0 0 2 2 -6 -2 -4 0 2 -2 0 -4 -4 -4 2 2
39x 0 8 10 -6 2 6 4 0 -6 -10 8 -4 4 -4 2 2
3Ax 0 -4 -2 6 -2 -2 -8 4 -2 -2 4 0 -4 0 -2 -2
3Bx 0 -4 2 2 -10 6 -4 0 -10 -2 0 -4 4 0 2 2
3Cx 0 -2 -4 -2 4 -2 0 -2 -2 0 2 0 2 -8 -2 0
3Dx 0 2 4 10 -4 10 -8 -6 6 4 2 -4 2 4 -2 -4
3Ex 0 2 -12 -2 4 2 -4 2 6 -12 -2 -4 2 4 -2 0
3Fx 0 -2 -8 -2 -4 -2 0 -6 -2 0 2 4 2 0 2 0
00x 01x 02x 03x 04x 05x 06x 07x 08x 09x 0Ax 0Bx 0Cx 0Dx 0Ex 0Fx
00x 32 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
01x 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
02x 0 2 -2 -4 0 -2 -6 4 0 6 -2 -8 4 -2 -2 -4
03x 0 -2 -2 0 4 -2 -2 -4 -4 -2 2 0 -4 2 -2 0
04x 0 -4 0 0 2 -2 -6 2 0 4 4 -4 -6 -10 6 -2
05x 0 0 0 -4 2 2 2 6 0 0 4 0 2 -6 6 2
06x 0 -2 2 0 2 -4 0 2 0 -6 -2 0 -2 4 8 -2
07x 0 -2 2 0 -2 8 4 6 -4 -2 2 -4 -10 4 0 -2
08x 0 0 6 -6 0 0 2 -2 -2 2 0 8 6 2 4 -4
09x 0 0 2 -2 0 -8 -2 -6 -2 2 -4 -4 -2 2 8 0
0Ax 0 -2 8 -2 0 2 -8 -6 2 0 6 4 -2 0 2 -4
0Bx 0 2 -4 6 4 2 0 -2 -2 0 -2 0 -2 -4 -2 4
0Cx 0 0 -2 -2 2 10 4 -4 2 -2 0 -4 4 -8 6 -6
0Dx 0 -4 2 -2 2 -2 0 12 2 2 4 4 4 4 2 2
0Ex 0 -2 -4 -2 2 -4 -2 -4 -2 4 2 4 4 -2 -8 -2
0Fx 0 -2 -8 2 -2 0 -2 -4 -6 -8 2 4 4 -2 4 2
10x 0 0 2 -2 2 2 8 4 2 6 -4 12 -4 -8 2 10
11x 0 0 -2 2 -2 -2 0 4 -2 2 4 -4 4 0 -10 6
12x 0 -2 4 2 -2 0 2 4 -2 -4 -6 0 0 -14 4 -2
13x 0 2 0 2 -2 4 -2 -12 -2 0 -2 8 -8 -2 0 -2
14x 0 4 -6 -2 -4 0 2 -2 2 -6 -8 0 6 -2 0 0
15x 0 8 -2 6 0 8 2 2 -2 2 -8 -4 -2 2 4 0
16x 0 2 0 -10 0 6 0 2 -2 0 2 0 2 0 -2 0
17x 0 10 4 -6 0 -2 -4 -2 -2 0 -2 4 2 0 2 -4
18x 0 0 4 -4 2 2 -2 -2 -4 4 4 4 -2 6 6 -2
35x 0 8 -2 -2 12 -4 -2 -2 2 -2 -4 0 -2 2 4 8
36x 0 -2 0 2 -4 -2 -12 2 -6 0 -2 0 -6 -4 -2 4
37x 0 6 -4 -2 -12 -2 0 -2 10 0 2 -4 2 4 2 0
38x 0 0 0 8 6 -2 6 -2 -4 4 0 0 2 2 -2 -2
39x 0 8 0 0 10 2 -6 2 0 0 4 -4 2 2 -2 -2
3Ax 0 6 -2 0 2 4 -4 2 -4 -2 -2 4 -6 0 0 2
3Bx 0 -14 -10 -4 10 0 -4 -2 4 2 -10 0 -6 4 0 -2
3Cx 0 0 0 4 -8 0 0 -4 -4 4 4 0 -4 4 4 0
3Dx 0 -4 0 8 4 0 4 -4 0 -4 8 0 -4 0 -4 -4
3Ex 0 6 2 0 -4 -2 -6 4 -4 -2 -2 0 -4 -6 2 0
3Fx 0 6 -6 8 4 -2 2 4 -12 -2 6 0 4 2 2 0
00x 01x 02x 03x 04x 05x 06x 07x 08x 09x 0Ax 0Bx 0Cx 0Dx 0Ex 0Fx
00x 32 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
01x 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
02x 0 0 4 0 4 0 0 0 0 0 0 4 0 4 0 0
03x 0 -4 0 8 0 0 8 -4 -4 -8 0 0 -8 0 -4 0
04x 0 -2 -2 0 -2 0 0 -10 2 0 0 -6 0 -6 10 0
05x 0 2 -2 -4 2 0 -4 6 2 4 0 -10 4 10 6 0
06x 0 -2 2 0 -2 -4 -4 2 -2 -4 4 2 0 -2 2 8
07x 0 -2 -2 4 -2 12 0 -2 2 0 12 2 4 2 2 0
08x 0 -4 0 0 0 0 0 4 4 0 0 0 0 0 -4 0
09x 0 0 -4 -8 4 0 8 0 0 -8 0 4 8 -4 0 0
0Ax 0 -4 -4 0 4 0 0 -4 -4 0 0 4 0 -4 -4 0
0Bx 0 4 -4 0 -4 0 0 -4 -4 0 0 -4 0 -4 4 0
0Cx 0 -2 2 0 -2 4 4 -6 -2 4 -4 10 0 -10 -6 -8
0Dx 0 -2 -2 -4 -2 4 0 -10 2 0 4 -6 -4 -6 10 0
0Ex 0 -2 -2 0 -2 0 0 -2 2 0 0 2 0 2 2 0
0Fx 0 2 -2 4 2 0 4 -2 2 -4 0 -2 -4 2 -2 0
10x 0 -2 -2 0 2 4 4 10 -2 4 -4 6 0 -6 10 8
11x 0 2 -2 -4 -2 -4 0 -6 -2 0 -4 10 -4 10 6 0
12x 0 2 -2 0 -2 0 0 -6 -2 0 0 10 0 10 6 0
13x 0 2 2 -4 -2 0 4 -10 2 -4 0 -6 4 6 -10 0
14x 0 4 -4 0 -4 0 0 4 -4 0 0 -4 0 -4 -4 0
15x 0 4 4 8 -4 0 0 4 4 0 0 4 -8 -4 4 0
16x 0 8 -4 0 4 8 -8 0 8 -8 -8 4 0 -4 0 0
17x 0 4 8 8 8 0 0 -4 -4 0 0 0 8 0 4 0
18x 0 2 -2 0 -2 0 0 2 -2 0 0 2 0 2 -2 0
19x 0 2 2 4 -2 0 -4 -2 2 4 0 2 -4 -2 -2 0
1Ax 0 -2 -2 0 2 -4 -4 2 -2 -4 4 -2 0 2 2 -8
1Bx 0 2 -2 -12 -2 4 0 2 -2 0 4 2 -12 2 -2 0
1Cx 0 -4 0 0 0 8 -8 4 -4 -8 -8 0 0 0 4 0
1Dx 0 0 -4 -8 -4 0 0 0 0 0 0 4 -8 4 0 0
1Ex 0 -8 0 0 0 0 0 0 8 0 0 0 0 0 0 0
1Fx 0 0 -8 8 8 0 0 0 0 0 0 0 -8 0 0 0
20x 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
21x 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
22x 0 4 0 0 0 -8 8 4 4 8 8 0 0 0 4 -16
23x 0 0 4 0 4 0 0 0 0 0 0 4 0 4 0 0
24x 0 2 2 0 -2 4 4 6 2 4 -4 2 0 -2 6 -8
25x 0 -2 2 4 2 -4 0 -2 2 0 -4 6 4 6 2 0
26x 0 -10 2 0 2 0 0 -2 10 0 0 -2 0 -2 2 0
27x 0 -2 -10 4 10 0 4 2 -2 -4 0 -2 -4 2 2 0
28x 0 0 4 0 -4 8 -8 0 0 -8 -8 -4 0 4 0 -16
29x 0 -4 0 0 0 0 0 4 4 0 0 0 0 0 -4 0
2Ax 0 4 -4 0 -4 0 0 -4 -4 0 0 -4 0 -4 4 0
2Bx 0 4 4 0 -4 0 16 4 4 -16 0 -4 0 4 4 0
2Cx 0 -2 2 0 2 0 0 -2 2 0 0 6 0 6 2 0
2Dx 0 -2 -2 -4 2 0 -4 -6 -2 4 0 -2 4 2 -6 0
2Ex 0 2 10 0 -10 -4 -4 -2 2 -4 4 2 0 -2 -2 8
2Fx 0 -10 2 -4 2 -12 0 -2 10 0 -12 -2 -4 -2 2 0
30x 0 -2 -2 0 -2 0 0 6 2 0 0 2 0 2 -6 0
31x 0 2 -2 -4 2 0 4 -2 2 -4 0 6 4 -6 -2 0
32x 0 -2 2 0 -2 4 4 2 -2 4 -4 -6 0 6 2 8
33x 0 -2 -2 4 -2 4 0 6 2 0 4 2 4 2 -6 0
34x 0 0 8 0 -8 8 8 0 0 8 -8 0 0 0 0 0
35x 0 -8 0 0 0 8 0 0 8 0 8 0 0 0 0 0
36x 0 0 -4 0 -4 0 0 0 0 0 0 4 0 4 0 0
37x 0 4 0 8 0 0 0 -4 4 0 0 0 -8 0 -4 0
38x 0 -10 2 0 -2 -4 -4 2 -10 -4 4 2 0 -2 2 -8
39x 0 -2 -10 12 -10 -4 0 -2 2 0 -4 2 12 2 2 0
3Ax 0 -2 -10 0 -10 0 0 -2 2 0 0 2 0 2 2 0
3Bx 0 10 -2 4 2 0 -4 -2 10 4 0 -2 -4 2 -2 0
3Cx 0 4 8 0 8 0 0 -4 -4 0 0 0 0 0 4 0
3Dx 0 -8 4 8 -4 0 0 0 -8 0 0 -4 -8 4 0 0
3Ex 0 -4 -4 0 4 8 8 -4 -4 8 -8 -4 0 4 -4 0
3Fx 0 4 -4 0 -4 -8 0 4 -4 0 -8 -4 0 -4 -4 0
00x 01x 02x 03x 04x 05x 06x 07x 08x 09x 0Ax 0Bx 0Cx 0Dx 0Ex 0Fx
00x 32 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
01x 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
02x 0 4 -2 2 -2 2 -4 0 4 0 2 -2 2 -2 0 -4
03x 0 0 -2 6 -2 -2 4 -4 0 0 -2 6 -2 -2 4 -4
04x 0 2 -2 0 0 2 -2 0 0 2 2 4 -4 -2 -2 0
05x 0 2 2 -4 0 10 -6 -4 0 2 -10 0 4 -2 2 4
06x 0 -2 -4 -6 -2 -4 2 0 0 -2 0 -2 -6 -8 2 0
07x 0 2 0 2 -2 8 6 0 -4 6 0 -6 -2 0 -6 -4
08x 0 0 2 6 0 0 -2 -6 -2 2 4 -12 2 6 -4 4
09x 0 -4 6 -2 0 -4 -6 -6 6 -2 0 -4 2 -6 -8 -4
0Ax 0 4 0 0 -2 -6 2 2 2 2 -2 2 4 -4 -4 0
0Bx 0 4 4 4 6 2 -2 -2 -2 -2 -2 2 0 -8 -4 0
0Cx 0 2 0 -2 0 2 4 10 -2 4 -2 -8 -2 4 -6 -4
0Dx 0 6 0 2 0 -2 4 -10 -2 0 -2 4 -2 8 -6 0
0Ex 0 -2 -2 0 -2 4 0 2 -2 0 4 2 -4 6 -2 -4
0Fx 0 -2 -2 8 6 4 0 2 2 4 8 -2 8 -6 2 0
10x 0 2 -2 0 0 -2 -6 -8 0 -2 -2 -4 0 2 10 -20
11x 0 2 -2 0 4 2 -2 -4 4 2 2 0 -8 -6 2 4
12x 0 -2 0 -2 2 -4 -2 -8 4 6 4 6 -2 4 -6 0
13x 0 -6 0 2 -2 4 2 0 4 -6 4 2 -6 4 -2 0
14x 0 4 -4 0 0 0 0 0 -4 -4 4 4 0 4 -4 0
15x 0 4 0 -4 -4 4 -8 -8 0 0 -4 4 8 4 0 4
16x 0 0 6 6 2 -2 4 0 4 0 6 2 2 2 0 0
17x 0 4 -6 -2 6 -2 -4 4 4 -4 -6 2 -2 2 0 4
18x 0 6 0 2 4 -10 -4 2 2 0 -2 0 2 4 -2 -4
19x 0 2 4 -6 0 -2 4 -2 6 8 6 4 10 0 2 -4
1Ax 0 2 2 -8 -2 4 0 2 -2 0 4 2 0 -2 -2 0
1Bx 0 2 6 -4 -6 0 0 2 6 8 0 -2 -4 -6 -2 0
1Cx 0 0 -2 2 4 0 -6 2 -2 6 -4 0 2 -2 0 0
1Dx 0 4 -2 6 -8 0 -2 2 10 -2 -8 -8 2 2 0 4
1Ex 0 -4 -8 0 -2 -2 -2 2 -2 2 -2 6 4 4 4 0
1Fx 0 -4 8 -8 2 -6 -6 -2 -2 2 -2 -2 -8 0 0 -4
20x 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
21x 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
22x 0 -4 -2 2 -2 2 -4 8 -4 0 -6 6 2 -2 -16 -12
23x 0 0 -2 -2 6 -2 -4 4 0 0 -2 -2 -2 6 4 -4
24x 0 -2 6 4 0 6 -2 4 4 -6 -2 4 0 14 2 0
25x 0 6 2 0 0 6 2 0 -4 -6 2 -8 0 -2 6 -4
26x 0 2 4 -2 -2 0 2 -4 4 -2 -4 -2 6 0 -2 0
27x 0 -10 0 -2 6 4 6 -4 0 6 -12 2 2 0 6 -4
28x 0 4 -2 -2 0 4 -6 2 2 -6 4 0 6 -2 -4 0
29x 0 0 2 6 0 0 6 2 2 -2 -8 0 -2 -6 0 0
2Ax 0 0 -4 -8 6 6 6 -6 6 2 -2 -2 -8 4 -4 4
2Bx 0 8 0 4 6 -2 -6 6 2 6 -2 6 -4 0 4 4
2Cx 0 2 4 -6 0 -6 0 6 -2 -4 2 -4 -2 4 6 0
2Dx 0 -2 -4 -2 0 -2 -8 2 -2 0 -6 -8 -2 0 -2 4
2Ex 0 6 2 -4 6 4 4 -2 -10 -8 0 -2 4 -2 2 0
2Fx 0 6 -6 -4 6 -4 4 -2 2 4 4 -6 0 2 -2 -4
30x 0 2 -2 0 -4 -6 -2 -4 4 2 2 0 0 2 2 4
31x 0 2 -2 0 0 -2 2 0 0 -2 -2 -4 0 2 2 4
32x 0 6 0 -2 -2 8 2 4 0 10 0 2 -2 4 2 0
33x 0 -6 0 10 2 0 -2 -4 0 6 0 -10 2 4 -2 0
34x 0 0 -12 4 -4 0 4 -8 -4 0 -4 0 -4 -4 0 0
35x 0 -8 0 0 8 -4 4 0 0 -4 -4 0 4 4 -4 4
36x 0 4 -2 -6 -2 -2 8 0 4 -4 -2 -2 6 2 -4 0
37x 0 -8 -6 -6 -6 6 0 4 12 0 2 -2 2 2 4 -4
38x 0 2 4 -6 0 -2 4 -2 -6 4 -6 0 6 4 -2 0
39x 0 -2 8 2 -4 6 -4 -6 -2 -4 2 4 -2 0 2 0
3Ax 0 6 -10 0 2 4 0 -2 6 -4 0 2 4 -2 -2 -4
3Bx 0 -2 -6 -4 -10 0 -8 -2 -10 4 4 -2 0 2 -2 4
3Cx 0 -8 -6 -2 0 -4 2 2 -6 2 4 0 10 -2 4 4
3Dx 0 4 2 2 4 4 -2 2 -2 10 0 0 2 2 4 0
3Ex 0 -4 4 -4 2 2 -2 2 2 -2 -2 -2 4 -4 0 4
3Fx 0 -4 -4 -4 14 6 -6 -2 2 -2 6 -2 0 0 -4 0
00x 01x 02x 03x 04x 05x 06x 07x 08x 09x 0Ax 0Bx 0Cx 0Dx 0Ex 0Fx
00x 32 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
01x 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
02x 0 0 0 0 2 -6 2 -6 2 6 -2 2 -4 0 0 4
03x 0 0 0 0 -2 -2 -2 -2 -2 -6 2 -2 4 8 0 4
04x 0 0 2 -2 2 -2 8 0 2 2 0 4 4 8 6 -2
05x 0 0 6 -6 2 6 4 -4 -2 -2 -8 4 8 4 -2 -10
06x 0 4 -2 -2 0 0 2 -2 0 0 -2 2 -8 4 2 2
07x 0 -4 2 2 4 -4 2 -2 0 0 2 -2 4 0 -6 2
08x 0 2 -2 0 -4 -2 -2 -8 2 0 8 6 6 -4 0 -2
09x 0 -2 6 4 0 6 2 0 -2 0 4 6 -2 0 0 10
0Ax 0 2 2 4 2 4 -8 2 0 2 -2 0 -6 -4 4 -2
0Bx 0 -2 -6 8 2 0 8 -2 0 -2 -10 4 2 0 -4 2
0Cx 0 -2 0 2 2 -4 2 8 -4 -2 0 -2 -2 -4 2 4
0Dx 0 2 4 2 -2 -4 2 4 4 2 -4 -2 2 12 10 0
0Ex 0 2 0 6 -4 2 -4 -2 -2 0 -2 -4 -6 8 2 4
0Fx 0 -2 4 -2 -4 6 0 -2 2 0 -2 0 -2 0 2 0
10x 0 2 0 -2 0 2 -4 -14 -4 -2 -4 -6 0 2 -12 10
11x 0 2 0 -2 4 -2 8 6 -4 -2 -4 -6 4 -2 0 -2
12x 0 2 0 -2 -2 0 2 -8 -6 8 -2 8 -4 2 4 -2
13x 0 2 0 -2 -2 0 -6 0 -2 4 -6 -4 0 -2 8 10
14x 0 -2 2 0 2 -4 -4 -2 -2 -4 -4 2 -4 -2 -6 -4
15x 0 6 -2 -4 6 -8 -4 -2 2 8 4 -6 -4 6 -2 0
16x 0 -6 6 0 4 2 -6 0 0 -2 -2 4 0 2 -2 0
17x 0 -6 2 4 -4 2 -2 4 0 6 -6 0 0 2 -6 4
18x 0 0 -2 2 0 0 -2 2 -2 2 4 -4 2 -2 0 0
19x 0 -12 -2 -10 -8 -4 6 -2 -6 -6 8 -4 -2 6 4 0
1Ax 0 8 2 -2 2 2 -4 0 0 -8 6 2 -2 -2 -4 0
1Bx 0 -4 2 2 6 2 0 0 -8 4 -2 -2 2 -2 0 0
1Cx 0 0 0 0 -2 -6 2 -2 0 -4 4 8 2 -6 2 2
1Dx 0 4 4 0 -2 -10 -2 -2 0 -8 -8 0 2 -2 -2 -6
1Ex 0 4 8 -4 -4 -4 0 0 -2 2 -2 -6 6 -2 2 2
1Fx 0 0 -4 4 0 -4 0 4 2 2 -2 -2 -2 2 -2 2
20x 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
21x 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
22x 0 4 0 -4 -2 2 -2 10 2 -6 -2 14 0 0 4 12
23x 0 4 0 -4 2 -2 2 6 -2 -2 2 -6 0 0 -4 4
24x 0 0 -2 2 -2 2 0 -8 -2 -2 0 -4 4 0 10 2
25x 0 0 2 -2 -2 -6 -4 4 2 2 0 4 0 4 -6 2
26x 0 8 2 6 0 4 6 6 -4 0 6 -2 4 4 -6 -2
27x 0 0 6 10 -4 -8 -2 -2 -12 8 2 2 0 0 2 -2
28x 0 2 2 -4 0 2 -2 0 -2 4 0 -2 6 -12 4 2
29x 0 -2 2 8 4 -6 -6 0 2 -4 -4 -2 6 0 4 -2
2Ax 0 6 -2 4 2 0 -4 2 -4 -6 -2 -4 -2 4 4 2
2Bx 0 2 -2 0 10 4 -4 -2 4 -2 6 0 6 8 4 -2
2Cx 0 -2 8 -6 2 4 2 0 4 -2 0 -10 -2 -4 2 4
2Dx 0 2 4 2 -2 4 -6 4 -4 2 4 -2 2 -4 2 -8
2Ex 0 6 -8 -6 0 -6 0 -2 6 4 -2 0 6 0 -2 4
2Fx 0 2 4 10 -8 6 4 -2 10 4 6 -4 2 0 -2 0
30x 0 2 0 -2 0 2 4 -6 0 2 0 -2 -4 -2 8 -2
31x 0 2 0 -2 4 -2 0 -2 0 2 0 -2 0 -6 4 2
32x 0 -2 0 2 2 -8 -2 0 -2 -8 2 0 -12 -2 4 -6
33x 0 -2 0 2 -6 0 -2 0 2 4 -2 4 0 2 0 -2
34x 0 -2 -2 4 -2 0 -4 -2 -2 -4 0 -2 8 2 2 4
35x 0 -10 10 0 2 -4 -4 -2 10 0 0 -2 0 2 -2 0
36x 0 -10 -6 0 -4 6 -2 0 0 -6 -6 -4 0 -2 2 0
37x 0 6 6 4 -4 -2 10 -4 -8 -6 -2 0 0 -2 -2 4
38x 0 0 -6 6 4 4 -2 -6 -2 -6 8 0 -2 2 0 0
39x 0 4 2 2 -4 0 -2 -2 2 -6 -4 0 2 2 4 0
3Ax 0 4 6 -2 -6 -2 -8 0 0 -4 2 2 6 2 0 0
3Bx 0 8 -2 -6 -10 6 -4 0 0 0 -6 -2 -6 2 -4 -8
3Cx 0 0 0 0 -2 2 2 6 -4 0 0 -4 -2 6 -2 -2
3Dx 0 4 -4 8 -2 -2 6 -2 12 -4 -4 -4 -2 -6 2 -2
3Ex 0 0 -8 0 8 4 -4 0 -6 2 -6 2 6 2 2 -2
3Fx 0 -4 -12 0 -12 -4 -4 4 -2 2 2 -2 6 -2 -2 -2
00x 01x 02x 03x 04x 05x 06x 07x 08x 09x 0Ax 0Bx 0Cx 0Dx 0Ex 0Fx
00x 32 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
01x 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
02x 0 0 2 -6 4 -4 2 2 2 2 0 0 -2 -2 0 0
03x 0 0 -2 6 0 0 2 -6 2 2 -4 -4 -6 2 0 8
04x 0 0 2 2 0 -4 -6 -2 4 4 10 2 4 0 -6 6
05x 0 0 2 10 0 4 -6 -2 0 -8 6 -2 0 -4 6 10
06x 0 4 0 -4 -8 0 4 -4 -2 -6 -2 2 6 -2 2 -6
07x 0 -4 4 0 4 4 -4 4 2 -2 6 2 6 -2 -2 -2
08x 0 -4 0 4 -2 -2 -2 6 -4 4 0 0 -2 2 2 -2
09x 0 4 -4 0 2 -6 -2 -10 0 0 0 0 -2 -6 -2 -6
0Ax 0 0 -2 -10 2 -2 -4 0 2 6 0 4 0 0 2 2
0Bx 0 0 -2 -2 10 -2 -4 0 6 -6 4 0 4 -4 -2 -2
0Cx 0 0 -2 -2 2 -6 0 0 0 -4 -2 2 -2 -6 4 0
0Dx 0 0 2 2 -2 -2 0 -8 0 4 2 -2 2 -2 -4 -8
0Ex 0 0 0 0 2 2 6 -2 -2 -6 -6 6 -4 -8 -4 0
0Fx 0 0 0 0 2 2 -2 6 6 -6 2 6 -4 0 4 0
10x 0 -2 2 4 0 -2 -2 0 -2 4 4 6 -2 -4 8 -14
11x 0 2 -2 4 -4 -2 -2 4 -2 0 0 -2 10 4 -8 -2
12x 0 -2 0 2 -4 2 -4 -10 0 -2 0 -14 4 -6 4 -2
13x 0 2 0 6 4 6 4 -6 0 -6 0 -2 -4 6 -4 -6
14x 0 2 8 6 0 -10 4 -2 6 8 -2 -4 -2 -4 2 4
15x 0 -2 -4 -2 4 -2 4 -6 2 0 -2 0 -2 0 -2 -4
16x 0 -2 2 -4 -8 2 2 0 0 -2 -2 0 0 -6 -2 4
17x 0 2 2 -8 -8 -10 2 -4 -12 6 2 0 4 2 2 4
18x 0 2 -2 -4 2 0 -4 6 2 8 0 -6 0 2 2 -8
19x 0 -2 6 -8 2 -4 -4 -6 6 0 12 2 -4 2 -2 0
1Ax 0 -2 0 2 -2 0 -2 4 0 2 -4 2 2 0 -2 0
1Bx 0 2 4 2 2 0 6 0 4 2 4 -2 2 4 2 0
1Cx 0 2 0 -2 -2 8 2 0 2 0 6 0 -4 2 4 -2
1Dx 0 -2 8 2 -2 -4 2 4 2 -4 -2 4 -12 -2 -4 -6
1Ex 0 2 6 -4 -2 0 4 2 8 -2 -2 0 2 0 0 2
1Fx 0 -2 2 4 2 0 4 -2 0 2 2 0 6 0 0 -2
20x 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
21x 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
22x 0 0 2 2 -4 4 -6 2 2 2 0 -8 -2 -2 -16 -8
23x 0 0 -2 -2 0 0 2 2 2 2 -4 4 2 -6 -8 8
24x 0 0 -2 6 0 4 -2 2 -4 4 -2 6 4 0 -2 2
25x 0 0 -2 -2 0 -4 -2 2 0 0 2 -6 -8 4 2 -2
26x 0 -4 -4 0 -8 0 -8 0 6 2 2 6 -2 -2 -2 -2
27x 0 -12 0 4 -4 -4 8 0 2 -2 2 -2 -2 -2 -6 2
28x 0 0 -4 4 -2 -6 -6 -2 4 -8 -4 8 6 14 -2 -2
29x 0 0 0 0 2 -2 2 -2 0 4 -4 0 -2 6 -6 2
2Ax 0 -4 2 -2 2 2 0 0 -6 2 4 4 0 -4 -2 2
2Bx 0 4 10 6 2 2 0 -8 6 6 -8 8 4 0 2 -2
2Cx 0 4 6 2 2 -2 0 12 0 0 -2 -2 6 -10 4 -4
2Dx 0 -4 2 6 -2 -6 -8 4 0 0 -6 -6 -6 2 4 4
2Ex 0 4 0 4 -6 -2 6 2 -2 -2 2 2 4 4 4 -4
2Fx 0 12 -8 4 2 -2 -2 2 6 6 2 2 -4 -4 -4 4
30x 0 2 2 0 4 6 2 0 -2 8 4 2 2 4 -4 2
31x 0 6 -2 0 0 6 2 4 -2 4 0 -6 -2 -4 -4 -2
32x 0 2 0 6 0 -6 0 -2 0 2 0 6 0 -6 0 -2
33x 0 6 0 -6 0 6 0 -6 0 -2 0 2 0 -2 0 2
34x 0 -2 4 -2 -4 6 4 2 -2 4 2 4 -6 4 2 0
35x 0 -6 -8 6 0 -2 4 -2 2 4 10 0 2 0 6 0
36x 0 2 -2 4 -4 2 -6 -4 -8 2 2 8 -4 -6 -2 0
37x 0 6 -2 0 4 -2 2 0 4 2 -2 0 0 2 2 0
38x 0 2 -6 0 -2 4 4 -2 2 0 4 -2 -4 -2 2 0
39x 0 6 -6 -4 -2 -8 -4 2 -2 -8 0 -2 0 -2 -2 0
3Ax 0 6 4 -2 2 4 -10 -4 0 2 -8 -2 -2 4 6 0
3Bx 0 2 0 -2 -18 4 -2 0 12 2 0 2 -2 0 2 0
3Cx 0 -6 8 -2 2 4 -10 -4 -6 0 -2 0 0 -2 0 2
3Dx 0 -2 -8 2 2 0 -2 0 -6 4 -2 4 -8 2 0 -2
3Ex 0 -6 6 -4 2 -4 0 -2 0 -2 -2 0 -2 4 -4 -2
3Fx 0 14 10 4 -2 -4 0 2 -8 -6 10 0 -6 4 -4 2
00x 01x 02x 03x 04x 05x 06x 07x 08x 09x 0Ax 0Bx 0Cx 0Dx 0Ex 0Fx
00x 32 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
01x 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
02x 0 2 0 -2 -2 0 2 0 -2 0 -2 4 -4 -2 0 6
03x 0 -2 0 2 -2 4 -6 4 -2 -4 6 0 -4 2 0 2
04x 0 -2 -2 0 0 2 -2 -4 2 0 4 -2 6 0 0 14
05x 0 -2 -2 0 0 2 6 4 -2 12 -8 2 -6 4 4 2
06x 0 0 2 2 2 6 0 -4 4 4 2 2 2 -2 4 -8
07x 0 -4 2 -10 -6 2 8 0 0 -4 -2 2 -2 -2 0 0
08x 0 0 2 -2 0 0 2 -2 -2 2 0 0 2 -2 4 -4
09x 0 4 -2 -2 4 0 -6 2 2 2 0 12 -6 -6 0 -4
0Ax 0 2 -2 0 2 4 -4 -2 0 -2 -2 4 6 -4 0 -2
0Bx 0 2 2 12 6 8 4 -2 4 -6 -2 4 -2 -4 4 2
0Cx 0 -2 0 -2 0 2 0 -6 0 2 -4 -10 0 6 4 -6
0Dx 0 2 -4 -2 4 2 0 6 0 -2 0 6 -4 6 4 -2
0Ex 0 0 0 4 -2 2 2 2 -2 -6 2 2 4 4 4 0
0Fx 0 0 4 0 -6 -2 -6 2 -2 2 6 6 8 0 4 0
10x 0 0 0 0 0 0 0 -8 0 0 0 0 0 -8 0 -16
11x 0 0 0 0 0 -8 0 0 0 8 0 -8 0 8 0 0
12x 0 2 0 -2 -2 0 2 8 -2 0 -2 4 -4 6 0 -10
13x 0 -2 0 2 -2 -4 10 4 -2 4 -10 8 -4 -6 0 2
14x 0 -2 -6 -4 4 -2 -2 -4 -2 4 -4 -2 -2 0 4 2
15x 0 -2 2 4 -12 6 -2 4 -6 -8 -8 2 2 4 0 -2
16x 0 0 6 -10 -2 -6 0 -4 8 0 -6 2 -6 -2 0 4
17x 0 -4 -2 2 6 -2 0 0 4 0 -2 2 6 -2 4 4
18x 0 -4 2 2 0 4 2 2 6 -2 0 -4 2 2 -4 -8
19x 0 0 6 10 -4 4 -6 -2 2 -2 0 0 -6 -2 0 0
1Ax 0 -2 6 -4 -6 0 -4 2 0 2 -2 0 6 0 0 2
1Bx 0 -2 2 0 -10 4 4 -6 -4 -2 -2 -8 -2 0 -4 -2
1Cx 0 -6 -4 -2 4 2 0 6 4 2 4 2 8 2 0 -6
1Dx 0 -2 -8 -2 0 2 -8 2 -4 -2 0 2 4 2 0 -2
1Ex 0 -4 -4 4 2 2 2 -2 2 -6 -6 -2 -4 0 0 0
1Fx 0 -4 0 0 -10 -2 2 -2 -6 2 6 2 0 -4 0 0
20x 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
21x 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
22x 0 -2 4 -2 2 0 -6 4 -2 -4 -6 -4 0 -2 16 2
23x 0 2 -4 2 2 -4 -6 8 -2 0 -6 -8 0 -6 -8 -2
24x 0 2 2 -8 0 6 2 4 -2 0 4 2 2 0 0 2
25x 0 2 2 -8 0 6 -6 -4 2 4 0 -2 -2 -4 -4 -2
26x 0 8 2 -6 -2 -6 -4 0 0 -8 6 -2 -6 -2 4 0
27x 0 -4 -6 -2 -10 -2 -4 4 4 0 2 6 -2 -2 0 0
28x 0 0 2 -2 0 0 -6 6 -6 -2 -4 -4 -2 -6 -8 0
29x 0 4 -2 -2 4 0 2 -6 -2 -2 -4 8 6 6 -12 0
2Ax 0 -2 2 0 -2 -4 4 2 4 -2 -2 0 6 -8 4 -2
2Bx 0 -10 -2 -4 2 8 4 2 -8 2 6 0 -2 0 0 2
2Cx 0 -6 4 -2 0 -2 -4 -14 0 -2 0 6 0 2 0 2
2Dx 0 -2 0 -2 4 -2 -4 -2 -8 2 -4 -2 4 -6 8 -2
2Ex 0 0 0 4 2 -10 -2 -2 -10 2 2 2 0 0 0 -4
2Fx 0 8 -4 0 -2 10 -2 -2 -2 10 6 -2 -4 -4 0 4
30x 0 -4 0 4 0 4 0 4 0 4 0 -4 0 -12 0 4
31x 0 4 0 -4 0 4 0 4 0 4 0 -4 0 -4 0 -4
32x 0 -6 -4 -6 2 4 2 0 -2 0 2 0 0 2 8 -2
33x 0 6 4 6 2 0 2 4 -2 4 2 -4 0 6 0 2
34x 0 -2 -10 0 -4 -2 2 0 2 0 -4 -2 10 -4 -4 2
35x 0 6 -2 0 -4 -2 2 0 6 4 0 2 6 0 0 -2
36x 0 4 6 2 2 -6 4 4 -4 0 6 2 2 -6 0 0
37x 0 0 6 6 -6 -2 -4 0 0 8 -6 2 6 2 4 0
38x 0 8 10 -2 0 8 2 -2 -6 6 -4 4 6 2 0 0
39x 0 4 -2 -2 -4 0 -6 2 6 -2 -4 0 -2 6 4 0
3Ax 0 -10 10 0 6 -4 4 2 -4 -2 6 0 -2 0 -4 -2
3Bx 0 6 -2 -4 2 0 4 2 -8 -6 -2 0 6 0 0 2
3Cx 0 2 0 2 -4 -6 4 2 4 2 8 -2 0 2 4 -2
3Dx 0 -2 12 -6 8 2 -4 6 4 -2 -4 -2 4 2 -4 2
3Ex 0 -8 4 0 -2 2 -2 6 10 6 2 2 0 0 -4 0
3Fx 0 -8 0 4 2 -2 -10 -2 -6 6 -2 6 -4 4 -4 0
In each array one reads the values relating the output differences of the S-table (over its y output bits) to the corresponding input differences. The entries T_ij are computed over the 64 possible inputs.
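The counts tabulated above are difference distribution tables of the DES S-boxes. As a minimal sketch of how such a table is computed, the following uses DES S-box 1 (values from the DES standard) and counts ordered input pairs, so each row sums to 64; the normalization used in the tables above may differ.

```python
# DES S-box 1, as a 4x16 table (rows selected by the outer input bits).
S1 = [
    [14, 4, 13, 1, 2, 15, 11, 8, 3, 10, 6, 12, 5, 9, 0, 7],
    [0, 15, 7, 4, 14, 2, 13, 1, 10, 6, 12, 11, 9, 5, 3, 8],
    [4, 1, 14, 8, 13, 6, 2, 11, 15, 12, 9, 7, 3, 10, 5, 0],
    [15, 12, 8, 2, 4, 9, 1, 7, 5, 11, 3, 14, 10, 0, 6, 13],
]

def sbox(x):
    # 6-bit input: the two outer bits select the row, the middle 4 bits the column.
    row = ((x >> 4) & 2) | (x & 1)
    col = (x >> 1) & 0xF
    return S1[row][col]

# ddt[di][do] = number of inputs x such that S(x) XOR S(x XOR di) = do.
ddt = [[0] * 16 for _ in range(64)]
for x in range(64):
    for di in range(64):
        ddt[di][sbox(x) ^ sbox(x ^ di)] += 1
```

A zero input difference always yields a zero output difference, so `ddt[0][0]` is 64; differential cryptanalysis exploits the rows whose counts deviate most from the uniform average of 4.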
5x 32 32 32 32 32 32 32 32 32 32 32 32 32 32 32 32
00x 01x 02x 03x 04x 05x 06x 07x 08x 09x 0Ax 0Bx 0Cx 0Dx 0Ex 0Fx
0x 32 32 32 32 32 32 32 32 32 32 32 32 32 32 32 32
1x 30 32 34 32 32 32 32 32 32 32 32 32 32 34 32 30
2x 25 37 36 29 35 30 27 37 34 29 32 33 31 34 35 28
3x 25 30 36 36 35 35 27 31 37 33 29 29 30 30 37 32
4x 20 40 34 32 40 24 34 32 42 24 32 32 22 40 32 32
5x 32 32 32 32 32 32 32 32 32 32 32 32 32 32 32 32
00x 01x 02x 03x 04x 05x 06x 07x 08x 09x 0Ax 0Bx 0Cx 0Dx 0Ex 0Fx
0x 32 32 32 32 32 32 32 32 32 32 32 32 32 32 32 32
1x 25 35 35 31 31 36 32 31 34 29 33 34 35 30 30 31
2x 23 33 30 37 37 33 34 29 37 30 35 30 29 34 27 34
3x 25 33 36 29 30 35 36 31 34 31 33 35 30 29 32 33
4x 17 40 39 29 34 34 33 29 41 21 30 38 35 33 27 32
5x 32 32 32 32 32 32 32 32 32 32 32 32 32 32 32 32
00x 01x 02x 03x 04x 05x 06x 07x 08x 09x 0Ax 0Bx 0Cx 0Dx 0Ex 0Fx
0x 32 32 32 32 32 32 32 32 32 32 32 32 32 32 32 32
1x 30 32 32 32 32 32 34 32 32 30 32 32 32 32 32 34
2x 23 34 38 32 38 32 28 30 34 36 32 26 32 26 30 41
3x 30 32 32 32 32 32 30 32 32 34 32 32 32 32 32 34
4x 19 36 40 32 40 30 26 32 36 34 30 28 32 28 32 37
5x 32 32 32 32 32 32 32 32 32 32 32 32 32 32 32 32
00x 01x 02x 03x 04x 05x 06x 07x 08x 09x 0Ax 0Bx 0Cx 0Dx 0Ex 0Fx
0x 32 32 32 32 32 32 32 32 32 32 32 32 32 32 32 32
1x 29 32 32 32 32 32 31 32 32 32 32 34 32 34 32 32
2x 30 32 32 32 32 33 33 32 33 32 32 33 32 31 31 32
3x 22 39 39 27 29 35 37 27 37 29 30 33 35 29 27 37
4x 12 43 51 22 50 22 14 42 45 23 19 41 20 40 44 24
5x 32 32 32 32 32 32 32 32 32 32 32 32 32 32 32 32
00x 01x 02x 03x 04x 05x 06x 07x 08x 09x 0Ax 0Bx 0Cx 0Dx 0Ex 0Fx
0x 32 32 32 32 32 32 32 32 32 32 32 32 32 32 32 32
1x 27 35 31 32 34 31 31 34 32 33 33 32 35 29 33 30
2x 25 31 33 36 37 34 32 27 34 29 35 32 31 34 29 33
3x 24 32 35 30 33 30 35 36 34 35 32 30 36 31 26 33
4x 15 37 47 27 44 28 19 35 35 39 30 26 33 25 32 40
5x 32 32 32 32 32 32 32 32 32 32 32 32 32 32 32 32
00x 01x 02x 03x 04x 05x 06x 07x 08x 09x 0Ax 0Bx 0Cx 0Dx 0Ex 0Fx
0x 32 32 32 32 32 32 32 32 32 32 32 32 32 32 32 32
1x 29 33 32 31 32 33 34 31 30 33 33 31 32 33 34 31
2x 23 28 37 36 32 30 32 34 37 35 29 31 35 35 31 27
3x 28 34 32 31 32 32 30 33 31 34 34 31 33 32 32 33
4x 20 38 41 28 38 29 26 35 42 27 24 36 25 36 39 28
5x 32 32 32 32 32 32 32 32 32 32 32 32 32 32 32 32
Distributions of the S-table 8
00x 01x 02x 03x 04x 05x 06x 07x 08x 09x 0Ax 0Bx 0Cx 0Dx 0Ex 0Fx
0x 32 32 32 32 32 32 32 32 32 32 32 32 32 32 32 32
1x 29 33 33 30 33 32 32 33 34 31 31 34 31 33 32 31
2x 23 37 38 25 39 26 27 40 39 28 27 39 26 37 36 25
3x 30 32 33 32 33 32 31 32 33 32 32 32 31 32 33 32
4x 20 44 40 24 44 20 24 40 40 24 28 36 24 40 36 28
5x 32 32 32 32 32 32 32 32 32 32 32 32 32 32 32 32
S-table 1
0x 1x 2x 3x 4x 5x 6x 7x 8x 9x Ax Bx Cx Dx Ex Fx
0x 0.496 0.493 0.493 0.493 0.494 0.492 0.494 0.493 0.494 0.493 0.492 0.492 0.493 0.492 0.493 0.493
1x 0.408 0.496 0.534 0.467 0.507 0.486 0.477 0.478 0.504 0.497 0.476 0.486 0.488 0.476 0.486 0.486
2x 0.326 0.505 0.572 0.423 0.511 0.414 0.403 0.522 0.518 0.443 0.413 0.471 0.441 0.517 0.494 0.412
3x 0.361 0.449 0.427 0.430 0.436 0.449 0.448 0.442 0.426 0.457 0.443 0.439 0.453 0.447 0.455 0.448
4x 0.191 0.523 0.516 0.271 0.541 0.277 0.248 0.533 0.530 0.267 0.295 0.483 0.278 0.483 0.472 0.266
5x 0.289 0.287 0.288 0.288 0.286 0.287 0.286 0.286 0.287 0.287 0.287 0.287 0.287 0.286 0.286 0.287
S-table 2
0x 1x 2x 3x 4x 5x 6x 7x 8x 9x Ax Bx Cx Dx Ex Fx
0x 0.497 0.494 0.494 0.493 0.495 0.492 0.491 0.493 0.495 0.493 0.494 0.493 0.495 0.494 0.494 0.493
1x 0.467 0.480 0.506 0.480 0.492 0.490 0.477 0.485 0.493 0.488 0.480 0.479 0.489 0.488 0.485 0.480
2x 0.361 0.541 0.542 0.461 0.524 0.433 0.400 0.524 0.524 0.428 0.454 0.462 0.437 0.512 0.505 0.436
3x 0.321 0.449 0.484 0.453 0.468 0.467 0.377 0.455 0.466 0.454 0.429 0.427 0.427 0.445 0.482 0.430
4x 0.230 0.523 0.415 0.382 0.509 0.263 0.400 0.407 0.537 0.269 0.387 0.406 0.259 0.513 0.400 0.381
5x 0.295 0.290 0.290 0.289 0.289 0.286 0.288 0.290 0.290 0.287 0.290 0.290 0.291 0.289 0.289 0.289
S-table 3
0x 1x 2x 3x 4x 5x 6x 7x 8x 9x Ax Bx Cx Dx Ex Fx
0x 0.495 0.493 0.494 0.494 0.494 0.494 0.494 0.494 0.495 0.493 0.494 0.493 0.494 0.493 0.492 0.494
1x 0.374 0.515 0.527 0.482 0.491 0.512 0.504 0.473 0.521 0.455 0.497 0.514 0.502 0.472 0.449 0.499
2x 0.326 0.507 0.502 0.480 0.516 0.490 0.484 0.465 0.525 0.429 0.483 0.481 0.476 0.493 0.430 0.477
3x 0.331 0.454 0.472 0.436 0.435 0.464 0.452 0.425 0.463 0.436 0.445 0.456 0.443 0.415 0.433 0.449
4x 0.187 0.492 0.480 0.379 0.437 0.410 0.420 0.363 0.513 0.255 0.398 0.462 0.400 0.411 0.312 0.401
5x 0.295 0.290 0.290 0.290 0.290 0.290 0.290 0.291 0.290 0.291 0.290 0.290 0.290 0.289 0.288 0.289
S-table 4
0x 1x 2x 3x 4x 5x 6x 7x 8x 9x Ax Bx Cx Dx Ex Fx
0x 0.498 0.493 0.495 0.493 0.495 0.493 0.494 0.494 0.494 0.493 0.494 0.494 0.493 0.495 0.494 0.495
1x 0.457 0.490 0.495 0.480 0.483 0.487 0.491 0.486 0.488 0.481 0.485 0.496 0.488 0.486 0.483 0.490
2x 0.311 0.499 0.565 0.489 0.580 0.471 0.374 0.446 0.522 0.511 0.460 0.367 0.463 0.379 0.467 0.574
3x 0.405 0.438 0.452 0.439 0.439 0.445 0.430 0.435 0.443 0.449 0.436 0.429 0.434 0.424 0.446 0.461
4x 0.201 0.440 0.520 0.390 0.510 0.378 0.305 0.361 0.455 0.401 0.368 0.314 0.377 0.313 0.372 0.476
5x 0.294 0.290 0.291 0.290 0.290 0.289 0.290 0.290 0.290 0.290 0.290 0.290 0.290 0.291 0.290 0.290
S-table 5
0x 1x 2x 3x 4x 5x 6x 7x 8x 9x Ax Bx Cx Dx Ex Fx
0x 0.496 0.494 0.495 0.494 0.493 0.493 0.494 0.493 0.494 0.493 0.494 0.493 0.493 0.493 0.493 0.493
1x 0.455 0.484 0.478 0.483 0.466 0.497 0.485 0.480 0.480 0.491 0.489 0.489 0.495 0.484 0.482 0.498
2x 0.440 0.472 0.475 0.462 0.481 0.475 0.473 0.464 0.475 0.472 0.467 0.464 0.474 0.456 0.465 0.481
3x 0.301 0.546 0.555 0.376 0.414 0.469 0.501 0.392 0.515 0.414 0.410 0.472 0.464 0.407 0.409 0.471
4x 0.157 0.609 0.707 0.272 0.647 0.285 0.184 0.538 0.627 0.283 0.247 0.524 0.260 0.496 0.543 0.331
5x 0.294 0.288 0.287 0.291 0.286 0.289 0.290 0.289 0.288 0.290 0.291 0.288 0.290 0.289 0.289 0.290
S-table 6
0x 1x 2x 3x 4x 5x 6x 7x 8x 9x Ax Bx Cx Dx Ex Fx
0x 0.499 0.493 0.495 0.494 0.493 0.493 0.493 0.493 0.494 0.494 0.493 0.493 0.493 0.494 0.496 0.494
1x 0.409 0.505 0.495 0.487 0.508 0.492 0.454 0.501 0.500 0.498 0.496 0.481 0.502 0.469 0.488 0.499
2x 0.373 0.478 0.468 0.487 0.528 0.478 0.503 0.464 0.486 0.463 0.479 0.471 0.472 0.479 0.430 0.469
3x 0.301 0.457 0.495 0.420 0.428 0.453 0.482 0.443 0.461 0.465 0.442 0.445 0.457 0.442 0.389 0.471
4x 0.153 0.477 0.680 0.341 0.622 0.345 0.194 0.449 0.453 0.513 0.373 0.316 0.414 0.289 0.414 0.518
5x 0.297 0.289 0.291 0.290 0.288 0.289 0.290 0.290 0.289 0.292 0.289 0.289 0.290 0.289 0.293 0.292
S-table 7
0x 1x 2x 3x 4x 5x 6x 7x 8x 9x Ax Bx Cx Dx Ex Fx
0x 0.497 0.494 0.495 0.494 0.493 0.493 0.493 0.494 0.495 0.493 0.493 0.493 0.493 0.494 0.493 0.494
1x 0.449 0.495 0.469 0.478 0.491 0.491 0.489 0.486 0.461 0.487 0.488 0.492 0.492 0.485 0.498 0.492
2x 0.351 0.455 0.520 0.520 0.485 0.456 0.462 0.479 0.553 0.469 0.456 0.453 0.464 0.506 0.443 0.456
3x 0.387 0.462 0.446 0.416 0.440 0.448 0.443 0.443 0.436 0.448 0.440 0.438 0.449 0.426 0.438 0.452
4x 0.221 0.462 0.524 0.341 0.495 0.317 0.290 0.414 0.524 0.336 0.291 0.432 0.308 0.442 0.469 0.335
5x 0.295 0.291 0.290 0.289 0.290 0.291 0.291 0.291 0.290 0.291 0.291 0.291 0.291 0.289 0.291 0.291
S-table 8
0x 1x 2x 3x 4x 5x 6x 7x 8x 9x Ax Bx Cx Dx Ex Fx
0x 0.495 0.494 0.494 0.494 0.494 0.492 0.494 0.494 0.495 0.493 0.494 0.494 0.494 0.494 0.494 0.495
1x 0.443 0.500 0.496 0.463 0.502 0.465 0.473 0.500 0.491 0.468 0.472 0.493 0.474 0.494 0.491 0.474
2x 0.338 0.499 0.519 0.400 0.535 0.382 0.399 0.535 0.533 0.412 0.410 0.499 0.402 0.512 0.486 0.404
3x 0.408 0.456 0.450 0.423 0.453 0.435 0.432 0.435 0.451 0.443 0.437 0.432 0.436 0.438 0.440 0.445
4x 0.222 0.497 0.486 0.265 0.512 0.238 0.269 0.487 0.458 0.296 0.303 0.429 0.279 0.435 0.419 0.322
5x 0.294 0.291 0.291 0.292 0.290 0.291 0.292 0.290 0.290 0.291 0.291 0.290 0.292 0.289 0.290 0.291
S-table 2
0x 1x 2x 3x 4x 5x 6x 7x 8x 9x Ax Bx Cx Dx Ex Fx
0x 0.500 0.518 0.546 0.513 0.504 0.498 0.505 0.507 0.498 0.496 0.502 0.512 0.496 0.498 0.519 0.524
1x 0.510 0.548 0.511 0.443 0.486 0.497 0.550 0.516 0.482 0.504 0.562 0.512 0.447 0.502 0.668 0.510
2x 0.482 0.532 0.605 0.467 0.617 0.583 0.526 0.627 0.537 0.365 0.669 0.471 0.567 0.474 0.628 0.311
3x 0.579 0.560 0.663 0.620 0.603 0.526 0.496 0.303 0.464 0.633 0.597 0.634 0.395 0.460 0.616 0.786
4x 0.672 0.657 0.630 0.509 0.660 0.767 0.686 0.567 0.690 0.772 0.543 0.554 0.742 0.385 0.391 0.377
5x 0.728 0.724 0.716 0.703 0.720 0.733 0.695 0.690 0.724 0.713 0.775 0.717 0.707 0.722 0.659 0.634
S-table 3
0x 1x 2x 3x 4x 5x 6x 7x 8x 9x Ax Bx Cx Dx Ex Fx
0x 0.494 0.528 0.550 0.510 0.503 0.507 0.506 0.511 0.501 0.495 0.506 0.505 0.505 0.503 0.498 0.512
1x 0.535 0.578 0.350 0.604 0.496 0.566 0.513 0.803 0.559 0.525 0.570 0.324 0.442 0.360 0.494 0.524
2x 0.542 0.437 0.544 0.467 0.492 0.518 0.586 0.695 0.536 0.463 0.396 0.123 0.722 0.357 0.832 0.502
3x 0.505 0.481 0.641 0.762 0.595 0.574 0.653 0.656 0.548 0.394 0.565 0.838 0.395 0.235 0.462 0.757
4x 0.505 0.572 0.616 0.583 0.608 0.627 0.398 0.563 0.556 0.367 0.617 0.880 0.819 0.368 0.784 0.629
5x 0.727 0.724 0.695 0.696 0.717 0.708 0.703 0.694 0.730 0.733 0.702 0.748 0.705 0.760 0.685 0.718
S-table 4
0x 1x 2x 3x 4x 5x 6x 7x 8x 9x Ax Bx Cx Dx Ex Fx
0x 0.502 0.527 0.543 0.521 0.492 0.497 0.512 0.500 0.500 0.495 0.503 0.516 0.498 0.491 0.504 0.507
1x 0.403 0.430 0.525 0.531 0.529 0.503 0.628 0.585 0.417 0.360 0.485 0.552 0.495 0.566 0.585 0.726
2x 0.570 0.480 0.518 0.573 0.522 0.566 0.447 0.272 0.622 0.637 0.526 0.454 0.514 0.469 0.783 0.260
3x 0.545 0.464 0.543 0.439 0.553 0.448 0.561 0.535 0.640 0.564 0.689 0.587 0.695 0.562 0.570 0.561
4x 0.641 0.689 0.649 0.524 0.762 0.669 0.448 0.632 0.597 0.502 0.536 0.610 0.659 0.241 0.740 0.623
5x 0.709 0.707 0.706 0.689 0.693 0.680 0.710 0.722 0.729 0.736 0.741 0.718 0.733 0.709 0.685 0.733
S-table 5
0x 1x 2x 3x 4x 5x 6x 7x 8x 9x Ax Bx Cx Dx Ex Fx
0x 0.498 0.531 0.547 0.515 0.499 0.495 0.507 0.520 0.502 0.513 0.504 0.485 0.512 0.507 0.499 0.509
1x 0.456 0.585 0.425 0.530 0.411 0.482 0.474 0.468 0.587 0.773 0.518 0.511 0.511 0.518 0.510 0.516
2x 0.531 0.614 0.493 0.540 0.508 0.541 0.524 0.527 0.530 0.547 0.409 0.534 0.648 0.538 0.542 0.524
3x 0.566 0.603 0.630 0.657 0.570 0.589 0.759 0.616 0.557 0.568 0.650 0.219 0.450 0.736 0.575 0.051
4x 0.604 0.596 0.469 0.618 0.533 0.715 0.650 0.451 0.587 0.720 0.514 0.344 0.440 0.627 0.725 0.949
5x 0.704 0.709 0.689 0.711 0.673 0.689 0.703 0.711 0.737 0.743 0.714 0.738 0.697 0.736 0.696 0.737
S-table 6
0x 1x 2x 3x 4x 5x 6x 7x 8x 9x Ax Bx Cx Dx Ex Fx
0x 0.505 0.526 0.547 0.520 0.505 0.504 0.499 0.496 0.500 0.512 0.506 0.514 0.509 0.485 0.505 0.520
1x 0.455 0.460 0.482 0.514 0.448 0.645 0.435 0.487 0.531 0.301 0.540 0.486 0.598 0.675 0.656 0.503
2x 0.423 0.442 0.537 0.606 0.616 0.628 0.426 0.440 0.588 0.411 0.593 0.602 0.506 0.604 0.453 0.786
3x 0.580 0.597 0.585 0.617 0.467 0.657 0.615 0.511 0.766 0.760 0.452 0.645 0.515 0.507 0.301 0.316
4x 0.679 0.676 0.527 0.726 0.610 0.642 0.790 0.659 0.450 0.510 0.582 0.738 0.462 0.742 0.595 0.071
5x 0.709 0.712 0.709 0.755 0.712 0.727 0.683 0.706 0.709 0.707 0.700 0.745 0.712 0.682 0.740 0.645
S-table 7
0x 1x 2x 3x 4x 5x 6x 7x 8x 9x Ax Bx Cx Dx Ex Fx
0x 0.502 0.526 0.543 0.509 0.495 0.517 0.506 0.514 0.496 0.498 0.500 0.504 0.499 0.490 0.518 0.520
1x 0.408 0.363 0.458 0.563 0.494 0.557 0.439 0.725 0.465 0.373 0.474 0.541 0.613 0.575 0.576 0.749
2x 0.437 0.458 0.535 0.484 0.377 0.534 0.736 0.773 0.727 0.568 0.496 0.460 0.543 0.625 0.440 0.277
3x 0.665 0.640 0.686 0.514 0.600 0.568 0.601 0.614 0.570 0.428 0.582 0.225 0.523 0.383 0.594 0.573
4x 0.609 0.603 0.713 0.613 0.553 0.697 0.707 0.579 0.640 0.520 0.551 0.285 0.636 0.323 0.714 0.822
5x 0.701 0.722 0.697 0.710 0.721 0.712 0.744 0.719 0.731 0.701 0.717 0.698 0.718 0.724 0.696 0.688
S-table 8
0x 1x 2x 3x 4x 5x 6x 7x 8x 9x Ax Bx Cx Dx Ex Fx
0x 0.502 0.531 0.543 0.525 0.504 0.493 0.503 0.522 0.500 0.497 0.502 0.515 0.498 0.502 0.507 0.497
1x 0.545 0.601 0.578 0.604 0.489 0.519 0.398 0.511 0.445 0.454 0.456 0.657 0.507 0.502 0.505 0.498
2x 0.538 0.520 0.516 0.489 0.607 0.508 0.612 0.565 0.681 0.624 0.517 0.536 0.523 0.573 0.569 0.080
3x 0.562 0.552 0.597 0.680 0.596 0.642 0.584 0.574 0.540 0.501 0.562 0.555 0.510 0.391 0.573 0.577
4x 0.631 0.572 0.594 0.661 0.587 0.705 0.674 0.434 0.608 0.670 0.645 0.597 0.674 0.444 0.624 0.643
5x 0.731 0.724 0.729 0.693 0.731 0.700 0.728 0.751 0.699 0.710 0.726 0.697 0.723 0.636 0.736 0.640
Presentation TSS Success OK
– 9027.00 98.24 87.38
– 3325.00 99.37 94.98
– 3087.00 99.42 95.33
10 1734.00 99.67 97.38
11 1390.00 99.74 97.91
12 1325.35 99.75 98.00
13 1197.76 99.77 98.19
14 1113.64 99.79 98.32
15 1051.14 99.80 98.41
16 676.32 99.87 98.98
17 396.19 99.93 99.40
18 315.32 99.98 99.53
19 178.82 99.99 99.73
20 91.81 100.00 99.87
21 41.69 100.00 99.94
22 26.25 100.00 99.97
23 20.38 100.00 99.97
24 15.22 100.00 99.98
25 10.81 100.00 99.99
26 8.13 100.00 100.00
These values were obtained experimentally, with random weight initialization and a learning rate epsilon equal to 0.5. The network has 16 binary inputs, 32 hidden neurons and 8 output neurons.
We note that after a single presentation, 87% of the output values are already correct, which is ample in cryptography.
Similarly, the number of presentations need not exceed 30, whereas simpler operations such as the 1-bit or 4-bit XOR require 200 to 500 presentations, depending on the random weight initialization.
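The experiment above can be sketched as follows. This is not the author's code: the 16-32-8 topology, sigmoid units and epsilon = 0.5 come from the text, while the training pairs here are arbitrary random binary vectors standing in for the cryptographic data, and the TSS is the total sum of squared errors per presentation.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid, n_out, eps = 16, 32, 8, 0.5

# Small random initial weights, as in the experiment.
W1 = rng.uniform(-0.5, 0.5, (n_in, n_hid))
W2 = rng.uniform(-0.5, 0.5, (n_hid, n_out))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Stand-in data set: 64 random binary input/output pairs.
X = rng.integers(0, 2, (64, n_in)).astype(float)
Y = rng.integers(0, 2, (64, n_out)).astype(float)

for presentation in range(30):          # one pass over every example
    tss = 0.0
    for x, y in zip(X, Y):
        h = sigmoid(x @ W1)             # forward pass, hidden layer
        o = sigmoid(h @ W2)             # forward pass, output layer
        err = y - o
        tss += float(err @ err)         # accumulate squared error
        d_o = err * o * (1 - o)         # output deltas (sigmoid derivative)
        d_h = (d_o @ W2.T) * h * (1 - h)  # back-propagated hidden deltas
        W2 += eps * np.outer(h, d_o)    # gradient weight updates
        W1 += eps * np.outer(x, d_h)
```

On data of this size the TSS drops sharply within the first few presentations, mirroring the fast convergence reported in the table.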
CM-5
This hybrid SIMD/MIMD machine was manufactured by Thinking Machines. It is thus an improved, synchronized MIMD machine with data parallelism.
It consists of 16,384 SPARC processors, with four Texas Instruments vector processors per node.
Each node has 32 MB of DRAM with a bandwidth of 1 GB/s.
[Figure: a node]
The communication links are structured as a fat-tree (a tree three levels deep), with wormhole routing.
User operations are of three kinds: distribution (for memory management), combination (for parallelism management) and global operations. The CMMD software is a library of functions callable from C, Lisp and Fortran.
It delivers 45 Gflops with 1024 processors.
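To relate this figure to the learning task above, here is a rough back-of-the-envelope estimate (my assumptions, not the author's: about 4 floating-point operations per weight per example for the forward and backward passes, and a presentation covering all 2^16 possible 16-bit inputs) of how many full presentations per second a 45 Gflops machine could sustain on the 16-32-8 network.

```python
# Network from the experiment: 16 inputs, 32 hidden neurons, 8 outputs.
weights = 16 * 32 + 32 * 8              # 768 weights in total
flops_per_example = 4 * weights         # assumed cost of forward + backward pass
examples = 2 ** 16                      # all possible 16-bit input patterns
flops_per_presentation = examples * flops_per_example

gflops = 45e9                           # CM-5 peak with 1024 processors
print(gflops / flops_per_presentation)  # presentations per second (estimate)
```

Even under these crude assumptions, a full presentation takes only milliseconds, which supports the thesis that hardware speed is the limiting factor only once the key and text spaces grow.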