
Fully Homomorphic Encryption - An Implementation over Integer Numbers with Compression Public Key using Genetic Algorithm

Joffre Gavinho Filho

NCE/IM Federal University of Rio de Janeiro (UFRJ). Rio de Janeiro, RJ, Brazil joffreufrj@gmail.com

Abstract—Fully Homomorphic Encryption (FHE) is an encryption technique that allows processing encrypted data without the need to decrypt it. This makes it suitable for use in untrusted environments, such as cloud computing platforms. Various methods have been proposed to implement this technique, but their greatest problem is that they require public keys of very large size, estimated in the order of O(λ^10), and as an immediate consequence such encryption schemes do not reach the desired runtime performance. This article aims to optimize public key reduction techniques by using Genetic Algorithms (GA) to calibrate the parameters of the primitives of the Coron test variants, in order to speed up the execution time of each of these primitives and, as a consequence, reduce the size of the generated public keys.

Keywords— Fully Homomorphic Encryption, Security, Genetic Algorithm.

I. INTRODUCTION

Technological advances have driven the exchange of knowledge and led to great development, especially in business. Information has thus become one of the most important assets for companies and, in most cases, information security is a critical requirement. Encryption has been one of the most widely used means of providing information security. Cryptographic techniques are not new [3]; even their computer version has been in use for a long time. However, technological advances, such as the increase in processing capabilities, threaten the security of the known cryptographic algorithms [13]. There are two basic models of computational cryptography: symmetric and asymmetric [16]. The fundamental difference between them is that in the symmetric model the same key is used for both encoding and decoding information, while the asymmetric model uses two keys: a public key, used to encrypt the data, and a secret key, used to decrypt the encrypted blocks. Both models are commonly used by security providers in the processing and storage of data hosted on cloud computing platforms [17]. However, when conventional encryption methods are used on such platforms, the data become fully vulnerable, since the processing, storage and manipulation of information require decrypting the data. This decryption is necessary because cloud servers cannot validate or even compare encrypted data, so validation must be done by accessing the original data. In short, the main problem with the shared cloud environment

Gabriel P. Silva, Mitre Dourado and Cláudio Miceli

NCE/IM Federal University of Rio de Janeiro (UFRJ). Rio de Janeiro, RJ, Brazil {gabriel.silva, mitredourado, cmicelifarias}@gmail.com

lies in the processing of encrypted data, because with traditional encryption the data cannot be changed while encrypted. This property is called "non-malleability" [6] and requires that all data be decrypted before being processed, even in the cloud environment; therefore, data security is compromised. This vulnerability is not present in homomorphic systems, because the data are manipulated in encrypted form, without the need to access the original information. A homomorphic encryption (HE) scheme is based on additive and multiplicative processing functions, i.e., encryption and decryption algorithms [16]. There is also an evaluation algorithm that takes as input an encryption of a message m and returns an encryption of f(m) [9]. HE schemes may be classified into two types. The first are the so-called partially homomorphic schemes, which, in addition to the encryption and decryption operations, also perform a sum or multiplication operation that takes as input encrypted messages m1 and m2 and returns an encryption of m1 + m2 or m1 · m2, respectively. If an HE scheme supports both addition and multiplication, it can evaluate any arithmetic circuit over encrypted data [9], and therefore we say it is a Fully Homomorphic Encryption (FHE) scheme. If E(m) denotes the encryption of a message m, an encryption model is fully homomorphic if: E(m1 + m2) = E(m1) + E(m2); and E(m1 · m2) = E(m1) · E(m2) [6]. Using such a scheme, any circuit can be evaluated homomorphically, allowing the construction of programs that can be run on encryptions of their inputs to produce an encryption of their output. As these programs never decrypt the information, they can be run by untrusted third parties without revealing their inputs or internal state. For example, one can add two encrypted numbers and, unless one can decrypt the result, there is no way to discover the values of the individual original numbers [9].
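The multiplicative half of this property can be illustrated with textbook RSA, which is partially homomorphic: multiplying two ciphertexts yields a ciphertext of the product of the messages. The sketch below is a toy illustration with deliberately tiny, insecure parameters; it is not the scheme discussed in this paper.

```python
# Toy demonstration of a *partially* homomorphic property: textbook RSA
# satisfies E(m1) * E(m2) mod n = E(m1 * m2 mod n). Tiny parameters,
# illustration only -- never use numbers this small in practice.
p, q = 61, 53
n = p * q                   # public modulus
phi = (p - 1) * (q - 1)
e = 17                      # public exponent, gcd(e, phi) = 1
d = pow(e, -1, phi)         # private exponent (modular inverse)

def encrypt(m): return pow(m, e, n)
def decrypt(c): return pow(c, d, n)

m1, m2 = 7, 11
c_prod = (encrypt(m1) * encrypt(m2)) % n   # operate on ciphertexts only
assert decrypt(c_prod) == (m1 * m2) % n    # decrypts to the product
```

Note that textbook RSA supports only multiplication, not addition, which is exactly why it is partially rather than fully homomorphic.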
The greatest problem of the proposed fully homomorphic schemes is that their running times and the size of the parameters used, especially the public key, grow with complexity of order O(λ^10) [6], where λ is the security parameter of the whole system, which defines the size, in bits, of the generated keys. Process improvements have been proposed. One proposal was made by Coron [6]: the authors proposed a fully homomorphic variant of the DGHV [8] public key scheme that reduces the size of the generated keys to about O(λ^7) [6]. Another, by Bilar et al. [1], optimizes the Coron test in order to reduce the scheme's runtime.

Our proposal is to apply the heuristics of a Genetic Algorithm (GA) to reduce the parameters used in the method of Coron [6], together with the optimizations performed by Bilar [1]. Our goal is to improve performance in comparison to the runtimes of the compression techniques. A genetic algorithm [10] is a heuristic optimization method that aims to find the solution corresponding to the maximum and/or minimum point of a given function, which suits the purpose of this study. This paper is organized as follows: Section 2 describes the mechanisms of Fully Homomorphic Encryption, public key compression methods and related work; Section 3 presents the proposed compression and optimization, as well as the experiments and analysis of results; finally, Section 4 presents the conclusions and proposals for future work.

II. BASIC CONCEPTS

A. Fully Homomorphic Encryption (FHE)

The need for an encryption algorithm that allows arbitrary computation on encrypted data was recognized by cryptographers in 1978, when Rivest, Adleman and Dertouzos [14] suggested the construction of secret homomorphisms - private homomorphisms - as a way of providing protection mechanisms for computation on sensitive data. However, their scheme, besides being only partially homomorphic (supporting only the multiplicative function, not the additive one), did not provide protection against chosen-plaintext attacks (Chosen Plaintext Attack - CPA) [12], meaning it was not semantically secure. In cryptography, a scheme is semantically secure if no probabilistic polynomial-time algorithm (Probabilistic Polynomial-Time Algorithm - PPTA) [11], given a ciphertext of any message and that message's size, can determine any information about the message with significantly better probability than random choice. In other words, knowledge of the ciphertext and of the size of an unknown message reveals no information about the message that could not already be extracted easily from the ciphertext alone. Following this research, the scientific community began to seek practical implementations of this theory, that is, algorithms able to perform the so-called homomorphic encryption. The problem remained unsolved until 2009, when Craig Gentry [9] solved it by suggesting the use of ideal lattices in the construction of a fully homomorphic cryptosystem. Unfortunately, due to the complexity of the evaluation of multiplications and the size of the public key, Gentry's proposal is not efficient enough to be used in practice. Also in 2009, N. P. Smart and F. Vercauteren [15] released their fully homomorphic encryption system, whose main influence was Gentry's scheme. The proposed system generates encrypted blocks with small errors, which propagate as homomorphic operations are performed.
The problem with this system is that, after a certain number of operations, the blocks accumulate very large errors, at some point making correct decryption impossible.

The first challenge in homomorphic encryption is its practicality. While Gentry's original construction [9] is seen as impractical, recent construction and implementation efforts have dramatically improved the efficiency of fully homomorphic encryption. Initial implementation efforts focused on Gentry's original proposal and its variations [6, 7, 8, 9] and improved its efficiency bottlenecks. Later implementations made use of recent advances in algorithms [10, 11, 12] and algebraic techniques [13, 14] to improve the concrete efficiency of fully homomorphic encryption schemes, with fewer restrictions on their use. The size of the public keys created by these methods, and the need for a structure to store them, is one of the biggest challenges in homomorphic encryption. In the Dijk, Gentry, Halevi and Vaikuntanathan (DGHV) [8] scheme, for example, the size of the public key is in the range of λ^10 bits and the size of the private key is around λ^2 bits, where λ is the security parameter.

The size of the private key is required to ensure the security of the scheme. Coron, Mandal and Naccache [6] proposed a modification in the key generation in order to reduce the size of the public key to λ^7 bits. The modification consists of using quadratic forms of the elements of the public key rather than linear forms, as is done in the DGHV scheme. The idea of Coron et al. is to store only a small subset of the public key and, when necessary, to generate the complete public key by multiplicatively combining the elements of the subset. This proposal maintains semantic security, as it is based on the partial approximate greatest common divisor problem [11], which consists in removing the error from the first term of the public key created during the process. This problem relies on the same basic security assumption as the approximate greatest common divisor problem, which is the basis of the security of the DGHV scheme.

B. DGHV Scheme

Dijk, Gentry, Halevi and Vaikuntanathan proposed in 2010 a fully homomorphic scheme (DGHV) using only modular algebra over a set of integers, which proved to be less complex than the lattice-based schemes. This scheme was analyzed by Coron, who proposed two variants of the method that optimize the computational cost of certain lower-level primitives and decrease the size of the public key of the original scheme using various reduction and compression techniques. The DGHV scheme E(KeyGen, Encrypt, Decrypt, Evaluate) consists of four algorithms, also called primitives: KeyGen, responsible for generating the scheme's key pair; Encrypt, responsible for generating the ciphertext; Decrypt, responsible for deciphering the ciphertext; and Evaluate, which publicly applies a logic circuit to a tuple of encrypted bits and returns a ciphertext equivalent to that circuit applied to the original data.
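The symmetric, somewhat-homomorphic core of the DGHV idea can be sketched in a few lines: a bit m is encrypted as c = q·p + 2r + m for a secret odd p and small noise r, and decrypted as (c mod p) mod 2, so ciphertext addition and multiplication act as XOR and AND on the plaintext bits. The parameter sizes below are toy values chosen for illustration only, far from the λ-derived sizes discussed in this paper.

```python
import random

# Minimal sketch of the symmetric somewhat-homomorphic DGHV idea:
# ciphertext c = q*p + 2*r + m, secret odd p, small noise r;
# decryption is (c mod p) mod 2. Toy bit-lengths, illustration only.
P_BITS, Q_BITS, NOISE = 64, 128, 8

def keygen():
    return random.getrandbits(P_BITS) | 1        # random odd secret key p

def encrypt(p, m):                               # m is a single bit (0 or 1)
    q = random.getrandbits(Q_BITS)
    r = random.randrange(2 ** NOISE)             # small non-negative noise
    return q * p + 2 * r + m

def decrypt(p, c):
    return (c % p) % 2

p = keygen()
c0, c1 = encrypt(p, 0), encrypt(p, 1)
assert decrypt(p, c0 + c1) == 1                  # addition acts as XOR
assert decrypt(p, c0 * c1) == 0                  # multiplication acts as AND
```

Decryption stays correct only while the accumulated noise 2r + m remains smaller than p, which is exactly the noise-growth limitation of somewhat-homomorphic schemes discussed above.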

C. Coron Schemes

The first variant of Coron, called DGHV with reduced key, adds new quadratic parameters to the scheme's primitives, storing only a small set of values related to the public key and then generating the complete public key at runtime. Using this technique, Coron demonstrated the reduction of the public key size from an order of O(λ^10) to O(λ^7) [1].

In the work entitled "Public Key Compression and Modulus Switching for Fully Homomorphic Encryption over the Integers", besides showing an attack on this system with complexity O(2^ρ), Coron obtained an implementation in SAGE whose public key occupied 10.1 MB without the use of the BGV framework and 18 MB using the BGV framework, in contrast to the 802 MB of his earlier work. In this work, the length of the public key was further reduced, from O(λ^7) to O(λ^5). The main innovation of the Coron scheme is that, instead of storing the encryption key elements, it stores only the correction value relative to a random number generator. Thus, the data to be stored are smaller, and the complete data are recovered "on-the-fly" by the primitives Encrypt, Recrypt, Decrypt and Expand. Furthermore, a modulus switching technique is described, which allows this scheme to work without using the bootstrapping framework proposed by Brakerski, Gentry and Vaikuntanathan. The original scheme over the integers is used by Coron as a basis for this work, as well as for the creation of a second variant. Gentry [9] defines the DGHV scheme over a set of integers of the form x_i = q_i · p + r_i, 0 ≤ i ≤ τ, where the integer p is secret. Given a security parameter λ, the following parameters must be used to compose the Reduced Homomorphic Encryption (RHE) scheme, which is then enhanced to generate the FHE over the integers:

γ is the length in bits of the x_i's.

η is the length in bits of the secret key.

ρ is the length in bits of the noise.

τ is the number of x_i's in the public key.

ρ' is a secondary noise parameter used to encrypt.

The scheme must obey the following restrictions:

ρ = ω(log λ), to protect against brute-force attacks on the noise;

η ≥ ρ · Θ(λ log² λ), so that homomorphic operations can evaluate the "Reduced Decryption Circuit" (RDC);

γ = ω(η² log λ), to thwart lattice-based attacks on the approximate Greatest Common Divisor (GCD) problem;

τ ≥ γ + ω(log λ), for the reduction to the approximate GCD problem;

ρ' = ρ + ω(log λ), for the secondary noise parameter.
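A hedged sketch of how a concrete candidate parameter set might be checked against these asymptotic restrictions, reading ω(f) loosely as "greater than a small constant times f". The function name and the constant c are our illustrative choices; they are not the paper's calibration procedure, which tunes these values with a genetic algorithm instead of closed formulas.

```python
import math

# Illustrative check of the DGHV-style constraints listed above for one
# concrete parameter set (lam, rho, eta, gamma, tau, rho2). The constant
# c is an assumption standing in for the asymptotic omega/Theta bounds.
def satisfies_constraints(lam, rho, eta, gamma, tau, rho2, c=2.0):
    log_lam = math.log2(lam)
    return (rho   >  c * log_lam                 # resist brute force on the noise
        and eta   >= rho * lam * log_lam ** 2    # evaluate the decryption circuit
        and gamma >  c * eta ** 2 * log_lam      # resist lattice attacks on approx-GCD
        and tau   >= gamma + c * log_lam         # reduction to approx-GCD
        and rho2  >= rho + c * log_lam)          # secondary noise parameter

# A set scaled up enough to pass, and a deliberately undersized one:
ok  = satisfies_constraints(42, 16, 20000, 5e9, 5e9 + 100, 30)
bad = satisfies_constraints(42, 16, 336, 61000, 195, 56)
```

The point of the sketch is only to show the direction of each inequality; real parameter selection trades these bounds against key size and runtime, which is what the GA calibration in Section III automates.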

D. Genetic Algorithm

According to [10], Genetic Algorithms (GA) are optimization and search methods inspired by the evolution mechanisms of populations of organisms, implemented as a computer simulation in which a population of abstract representations is selected in the search for better solutions. The evolution usually starts from a set of randomly created solutions and is carried through generations. In each generation, the fitness of each solution in the population is evaluated; some individuals are selected for the next generation and recombined or mutated to form a new population. The new population is then used as input for the next iteration of the algorithm. Among the various uses of such algorithms, we can use them to assign weights to sets, where the optimization consists of finding the solution that matches the maximum or minimum point of a particular function. Consider, for example, a function f(x) consisting of k elements to be maximized. Each of the elements x_k is assigned a weight by creating j weight vectors, defined as chromosomes (Equation 1).

f(x) = j(x_1) + … + j(x_k)    (1)

Each chromosome j possesses k positions, one for each of the k elements. Each position [j, k] contains a randomly chosen real number in the [0, 1] interval; these numbers are the genes of the chromosome. The j chromosomes are randomly initialized at the beginning of the weight-assignment process and form the first generation of the population used at this stage. Each chromosome is then processed, one by one, to evaluate its performance. The processing is performed as follows: chromosome j (gene_1, gene_2, …, gene_k) is evaluated by the fitness function of the genetic algorithm, represented in Equation 2.

fitness = (Total − FP − 2·FN) / Total    (2)

Where Total is the total amount of data analyzed, FP the number of false positives and FN the number of false negatives. After calculating the fitness of the j chromosomes of the first generation of the population, the evolutionary process of the genetic algorithm begins. The evolution of the population is carried out through selection, crossover and mutation of the chromosomes. The most widely used method in the selection phase is the roulette-wheel method [15]. In the roulette method, each chromosome is represented proportionally to its fitness relative to the sum of the fitnesses (Equation 2) of all chromosomes in the population. A random value is generated and the corresponding chromosome on the roulette wheel is selected to generate offspring. The number of selected chromosomes equals the original population size. The method is formalized as follows: (i) the fitnesses of all chromosomes are added (Tf); (ii) a random number n is generated, with 0 ≤ n ≤ Tf; (iii) the chromosome whose fitness, added to the fitnesses of the preceding chromosomes, is equal to or greater than n is selected. After distributing the chromosomes and calculating the sum of fitnesses (Tf), a random number is drawn and the corresponding chromosome is chosen. The roulette method is used to select two parent chromosomes, which then enter the crossover and mutation phase. Crossover basically consists of mixing genetic material from two individuals (parents) of the population, producing two new individuals (children) who inherit characteristics from their parents. Two-point crossover [15] is used, that is, two cut-off points are randomly set on the

chromosomes selected in the selection phase; one of the descendants receives the central part of one parent and the outer parts of the other parent. The children then replace the positions occupied by the parents. The mutation operation prevents premature convergence of the algorithm by introducing new regions of the solution space into the search. It consists of replacing some genes of the chromosomes with random values. A margin of Y% of the population is used to carry out the mutation in one of the child chromosomes: a random number between 1 and Y is drawn and, if it falls in the range between 1 and Y/10, the child chromosome undergoes mutation, i.e., a random number between 1 and n is chosen, representing the position of the gene to be replaced; then another random real number between 0 and 1 is drawn and the selected gene is replaced by this new number. An important observation at this point: if the fitness of the new chromosome created by the mutation is lower than the fitness of the chromosome undergoing the process, the mutation does not occur. The evolutionary process of the algorithm consists of a total of k generations, in which the phases mentioned above (selection, crossover and mutation) are repeatedly performed. At the end of the process, the chromosome (vector of genes) with the highest fitness, i.e., the one best adapted to the setting, is chosen as the vector of k weighted values, one for each element x of the function under analysis.
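The selection, crossover and mutation operators described above can be sketched as follows. The variable names and the toy fitness function are illustrative assumptions of ours, not the paper's implementation.

```python
import random

# Sketch of the three GA operators described in the text: roulette-wheel
# selection, two-point crossover, and fitness-guarded mutation.
# Genes are real numbers in [0, 1].
def roulette_select(population, fitnesses):
    tf = sum(fitnesses)                        # (i) total fitness Tf
    spin = random.uniform(0, tf)               # (ii) random n, 0 <= n <= Tf
    cumulative = 0.0
    for chrom, fit in zip(population, fitnesses):
        cumulative += fit
        if cumulative >= spin:                 # (iii) first chromosome reaching n
            return chrom
    return population[-1]                      # guard against float rounding

def two_point_crossover(parent_a, parent_b):
    n = len(parent_a)
    i, j = sorted(random.sample(range(1, n), 2))        # two random cut points
    return (parent_a[:i] + parent_b[i:j] + parent_a[j:],    # centre from b
            parent_b[:i] + parent_a[i:j] + parent_b[j:])    # centre from a

def mutate(chrom, fitness_fn):
    candidate = chrom[:]
    candidate[random.randrange(len(chrom))] = random.random()   # new random gene
    # the mutation is kept only if it does not lower the fitness
    return candidate if fitness_fn(candidate) >= fitness_fn(chrom) else chrom

pop = [[random.random() for _ in range(6)] for _ in range(10)]
fits = [sum(c) for c in pop]                   # toy fitness: sum of the genes
pa, pb = roulette_select(pop, fits), roulette_select(pop, fits)
child1, child2 = two_point_crossover(pa, pb)
child1 = mutate(child1, fitness_fn=sum)
```

Note the guard in `mutate`: as stated in the text, a mutation that lowers fitness is discarded, so mutation can only preserve or improve a child.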

III. PROPOSAL FOR OPTIMIZATION AND EXPERIMENTAL EVALUATION

The optimization process of the mechanism proposed by Coron described in this paper basically consists of calibrating the values of the parameters used in cryptographic primitives through the Genetic Algorithm.

Fig. 1. Main screen of the homomorphic encryption simulator.

To this end, the proposed engine, both for the encryption part and for the fully homomorphic encryption, is implemented in the Matlab/Simulink© mathematical software, along with all the cryptographic primitives described in [6] and [7]. Figure 1 shows the main screen of the homomorphic encryption simulator implemented in Simulink. In this module, most of the homomorphic schemes proposed in the literature are implemented. Figure 2 shows the calculations used as a base: primality tests, random number generation and modular arithmetic, as well as all basic numerical calculations.

Fig. 2. Basic Calculations Module.

Fig. 3. Fully Homomorphic Encryption Module.

In this module, some relevant cryptographic methods required by our proposal are implemented, among them the pioneering work of Gentry [9] along with the optimizations performed in DGHV, including the methods that are the basis of our work: Coron [6] and [7], and Bilar [1]. In particular, the following cryptographic primitives should be noted: KeyGen, Encrypt, Decrypt, Evaluate, Recrypt and Expand, which are coded and run through simulations of the algorithms proposed by Coron and to which we apply the analysis and variation of the parameter values calculated by the Genetic Algorithm, the basis of our proposal. Coron implemented his proposal for the DGHV scheme using the mathematical software SAGE ("System for Algebra and Geometry Experimentation") [18]. As a comparative basis, all the metrics and primitives of [6], found in Table I [1] and Table III, originally implemented in Python, were re-implemented and simulated in our simulator built in Matlab/Simulink©. The tests were performed and the results analyzed and compared with those obtained by the authors in [3]; we achieved the same results the authors had previously reported, corroborating the original papers. Our tests and simulations were performed on an Intel(R) Core(TM) Duo CPU E4500 platform with 2.20 GHz frequency, 3.00 GB of RAM and a 64-bit OS. For each generation of the algorithms, 1,000 rounds were performed for each value of λ, varying over the range of integers [λ − 2, λ + 3], as can be seen in Table I. For each experiment, 1,000 rounds of the process are performed for each safety parameter size: Toy (42 bits), Small (52 bits), Medium (62 bits) and Large (72 bits), following the proposal of Coron [2]. This totals 6,000 rounds for each parameter (Toy, Small, Medium and Large), 24,000 rounds for each generation and, finally, 24,000,000 rounds for the entire evolutionary process of the genetic algorithm.

TABLE I. PARAMETERS USED BY CORON.

Parameters   λ    ρ    η     ρ'   γ (×10^6)   Θ
Toy          42   16   336   56   0.061       195
Small        52   20   390   65   0.270       735
Medium       62   26   438   73   1.020       2925
Large        72   34   492   82   2.200       5700

After this initial phase of module calibration and GA training, the evolutionary process of the genetic algorithm is initiated, in which, for a total of 100 generations, the selection, crossover and mutation phases are repeatedly performed until the algorithm converges to a central value for each size of the security parameter λ.
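The generational loop just described can be sketched compactly as follows. For brevity this illustration uses single-point crossover and a placeholder fitness; the paper's procedure uses two-point crossover and the fitness of Equation 2, and runs far more rounds.

```python
import random

# Compact sketch of the evolutionary loop: for a fixed number of
# generations, select parents proportionally to fitness, cross them over,
# occasionally mutate, and keep the best chromosome at the end.
def evolve(pop_size=20, genes=6, generations=100):
    pop = [[random.random() for _ in range(genes)] for _ in range(pop_size)]
    fitness = sum                               # placeholder fitness function
    for _ in range(generations):
        new_pop = []
        while len(new_pop) < pop_size:
            a, b = random.choices(pop, weights=[fitness(c) for c in pop], k=2)
            cut = random.randrange(1, genes)    # single-point cross, for brevity
            child = a[:cut] + b[cut:]
            if random.random() < 0.1:           # mutate roughly 10% of children
                child[random.randrange(genes)] = random.random()
            new_pop.append(child)
        pop = new_pop
    return max(pop, key=fitness)                # best-adapted chromosome

best = evolve(generations=30)
```

In the paper's setting, each fitness evaluation is itself a batch of primitive runtime measurements, which is why the full process amounts to millions of rounds.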

TABLE II. RUNTIMES OBTAINED BY CORON.

Safety Parameters   KeyGen    Encrypt      Decrypt   Expand    Recrypt
Toy                 0.06 s    0.05 s       0.00 s    0.01 s    0.41 s
Small               1.00 s    1.00 s       0.00 s    0.15 s    4.50 s
Medium              28.00 s   21.00 s      0.01 s    2.70 s    51.00 s
Large               10 min    7 min 15 s   0.05 s    51.00 s   11 min 34 s

Table II illustrates the runtimes of the cryptographic primitives of the system obtained by Coron. The literature commonly uses the execution time of each primitive in order to quantify and evaluate its performance. The primitives are executed repeatedly and their runtime is measured by software; from the total runtime and the number of executions of each primitive, a simple arithmetic average is computed, yielding the mean execution time of the primitive, which can then be used to compare different implementations and different homomorphic schemes. Tables IV, V, VI and VII show the execution times of the primitives for each safety parameter λ and its sizes. The implementation process starts by producing a 500 MB plaintext data mass.
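The averaging methodology described above can be sketched as follows; `mean_runtime` and the toy workload are illustrative stand-ins of ours, not the instrumentation used in the simulator.

```python
import time

# Sketch of the averaging methodology: run a primitive repeatedly,
# accumulate wall-clock time in software, and divide by the number of
# rounds to obtain a comparable mean runtime. `primitive` is a stand-in
# for KeyGen/Encrypt/Decrypt/Expand/Recrypt/Evaluate.
def mean_runtime(primitive, rounds=1000):
    start = time.perf_counter()
    for _ in range(rounds):
        primitive()
    return (time.perf_counter() - start) / rounds

avg = mean_runtime(lambda: sum(range(1000)), rounds=100)   # toy workload
```

Using a monotonic clock such as `time.perf_counter` avoids distortions from system clock adjustments during long measurement runs.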

TABLE III. RUNTIMES OBTAINED BY BILAR.

Safety Parameters   KeyGen       Encrypt   Decrypt   Evaluate
Toy                 0.6 s        0.02 s    0.0 s     0.2 s
Small               3.6 s        0.6 s     0.0 s     1.9 s
Medium              1 min 48 s   55 s      0.0 s     14.7 s
Large               *            *         *         *

This mass is initially used for two purposes: i) calibration of the modules, analogously to the Coron primitives, mainly in the evaluation of the runtime calibration of each primitive; and ii) as training data for the genetic algorithm (GA).

TABLE IV. TEST PARAMETERS TOY USING GENETIC ALGORITHM.

λ          40       41       42       43       44       45
KeyGen     0.05 s   0.05 s   0.06 s   0.06 s   0.17 s   0.21 s
Encrypt    0.04 s   0.05 s   0.05 s   0.05 s   0.06 s   0.16 s
Decrypt    0.00 s   0.00 s   0.00 s   0.00 s   0.00 s   0.00 s
Expand     0.01 s   0.01 s   0.01 s   0.01 s   0.03 s   0.05 s
Recrypt    0.29 s   0.39 s   0.41 s   0.41 s   1.00 s   1.30 s
Evaluate   0.17 s   0.19 s   0.20 s   0.20 s   0.35 s   0.43 s

TABLE V. TEST PARAMETERS SMALL USING G.A.

λ          50       51       52       53           54
KeyGen     0.49 s   0.59 s   1.00 s   3.00 s       7.00 s
Encrypt    0.53 s   0.59 s   1.00 s   3.50 s       7.10 s
Decrypt    0.00 s   0.00 s   0.00 s   0.00 s       0.00 s
Expand     0.14 s   0.14 s   0.15 s   0.19 s       0.22 s
Recrypt    4.15 s   4.40 s   4.50 s   5.51 s       6.55 s
Evaluate   1.50 s   1.79 s   1.90 s   2 min 10 s   3 min 15 s

TABLE VI. TEST PARAMETERS MEDIUM USING G.A.

λ          60        61         62        63           64
KeyGen     27.00 s   27.50 s    28.00 s   31.00 s      58.0 s
Encrypt    20.00 s   20.045 s   21.00 s   22.12 s      24.10 s
Decrypt    0.01 s    0.01 s     0.01 s    0.01 s       0.02 s
Expand     2.30 s    2.60 s     2.70 s    4.80 s       7.30 s
Recrypt    48.00 s   50.00 s    51.00 s   1 min 00 s   2 min 10 s
Evaluate   11.50 s   13.20 s    14.70 s   30.15 s      1 min 00 s

After all the simulation rounds, the parameter values λ converged to the levels observed in Table VIII. Besides being one unit smaller than the parameters set by Coron [1], the proposed method also achieves a substantial reduction in the execution time of each cryptographic algorithm. However, it was found that, for even smaller values of λ, semantic security is no longer provided. This can be observed in Tables IV, V, VI and VII, which also report results computed with smaller values of λ.

TABLE VII. TEST PARAMETERS LARGE USING G.A.

λ          70            71            72            73            74
KeyGen     9 min 10 s    9 min 55 s    10 min 00 s   12 min 00 s   17 min 00 s
Encrypt    3 min 35 s    5 min 12 s    7 min 15 s    11 min 10 s   15 min 00 s
Decrypt    0.04 s        0.05 s        0.05 s        0.05 s        0.06 s
Expand     50.00 s       50.00 s       51.00 s       51.00 s       55.00 s
Recrypt    10 min 10 s   11 min 00 s   11 min 34 s   12 min 20 s   13 min 00 s
Evaluate   10 min 30 s   11 min 45 s   12 min 00 s   123´30´       14 min 05 s

Re-encryption tests were introduced to analyze cyclic running times, i.e., the encryption method performed recursively, level by level. These tests were not performed by Bilar [1], as can be observed in Table III. We have also obtained results for the Large parameter λ, not reached by Bilar [1], as seen in Table VII.

TABLE VIII. VALUES AFTER THE CALIBRATION RESULTS.

Parameters   Toy   Small   Medium   Large
λ            41    51      61       71

We observe in Figure 4 the graphs comparing the execution times of the cryptographic primitives for the Toy, Small, Medium and Large parameters, when performed by the three methods under consideration in this work.

Fig. 4. The execution times of the cryptographic primitives proposed by Joffre, Coron and Bilar. (a) Toy. (b) Small. (c) Medium. (d) Large.

IV. CONCLUSION

We demonstrated in this work that, when the genetic algorithm is used to calibrate the cryptographic mechanisms, we can reduce the size of the security parameters of the cryptographic algorithms by one bit while keeping the semantic security of the mechanisms, achieving as a consequence a reduction of the process runtime. Besides, we introduced re-encryption tests that were not performed by Bilar [1]. We have also found results for the parameter λ (Large), which were not achieved by Bilar [1]. As future work, we intend to experiment with other heuristics, such as ant colonies or other biologically inspired algorithms.

REFERENCES

[1] Bilar, G. R. "Implementação do esquema totalmente homomórfico sobre números inteiros utilizando Python com compressão de chave pública". Trabalho de Graduação, UNIVEM, 2014.
[2] Boneh, D., Halevi, S., Hamburg, M., et al. "Circular-secure encryption from decision Diffie-Hellman". In: Advances in Cryptology – CRYPTO 2008, Springer, 2008.
[3] Buchmann, J. A. "Introdução à Criptografia". Ed. Berkeley, São Paulo, 2002.
[4] Brakerski, Z., Gentry, C., Vaikuntanathan, V. "Fully homomorphic encryption without bootstrapping". ITCS 2012, 2012.
[5] Coron, J., Naccache, D., Tibouchi, M. "Optimization of Fully Homomorphic Encryption". Cryptology ePrint Archive, Report 2011/440, 2011.
[6] Coron, J., Mandal, A., Naccache, D., et al. "Fully homomorphic encryption over the integers with shorter public keys". Advances in Cryptology – CRYPTO 2011, pp. 487–504, 2011.
[7] CSA. "Security Guidance for Critical Areas of Focus in Cloud Computing v2.1". Cloud Security Alliance, 2009.
[8] Dijk, M. van, Gentry, C., Halevi, S., Vaikuntanathan, V. "Fully homomorphic encryption over the integers". In: H. Gilbert (Ed.), EUROCRYPT 2010, LNCS, vol. 6110, Springer, pp. 24–43, 2010.
[9] Gentry, C. "Fully homomorphic encryption using ideal lattices". In: Proceedings of the 41st Annual ACM Symposium on Theory of Computing, pp. 169–178, ACM, 2009.
[10] Lacerda, E. G. M., Carvalho, A. C. P. L. "Introdução aos algoritmos genéticos". In: Sistemas inteligentes: aplicações a recursos hídricos e ciências ambientais, Galvão, C. O., Valença, M. J. S. (Eds.), Ed. Universidade/UFRGS: ABRH, 1999.
[11] Rabin, M. O. "Probabilistic algorithm for testing primality". Journal of Number Theory, 12(1):128–138, 1980.
[12] Morris, C. "Navy Ultra's Poor Relations". In: Hinsley, F. H., Stripp, A. (Eds.), Codebreakers: The Inside Story of Bletchley Park, Oxford University Press, p. 235, 1993. ISBN 978-0-19-280132-6.
[13] NIST. "Cybersecurity Framework Development Overview. NIST's Role in Implementing Executive Order 13636, Improving Critical Infrastructure Cybersecurity". Presentation to ISPAB, 2013.
[14] Rivest, R. L., Adleman, L., Dertouzos, M. L. "On data banks and privacy homomorphisms". In: DeMillo, R. A., et al. (Eds.), Foundations of Secure Computation, Academic Press, 1978.
[15] Smart, N., Vercauteren, F. "Fully homomorphic encryption with relatively small key and ciphertext sizes". Cryptology ePrint Archive, Report 2009/571, 2009.
[16] Stallings, W. "Criptografia e Segurança de Redes: Princípios e Práticas", 4th ed., Prentice Hall Brasil, pp. 17–36, 2007.
[17] Sousa, F. R. C., Moreira, L. O., Machado, J. C. "Computação em Nuvem: Conceitos, Tecnologias, Aplicações e Desafios". Fortaleza, 2009.
[18] Stein, W. "SAGE: A Computer System for Algebra and Geometry Experimentation". 2012.