
By the fundamental theorem of arithmetic, every positive integer has a unique prime factorization. (By convention, 1 is the empty product.) If the integer is prime, then it can be recognized as such in polynomial time. If it is composite, however, the theorem gives no insight into how to obtain the factors.
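As an illustration (a minimal sketch, not an algorithm given in the text), repeated trial division realizes this decomposition directly, dividing out each prime factor as often as it occurs:

```python
def prime_factors(n):
    """Return the prime factorization of n as a list of primes, with repetition."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:   # divide out d as many times as it divides n
            factors.append(d)
            n //= d
        d += 1
    if n > 1:               # the remaining cofactor is prime
        factors.append(n)
    return factors

print(prime_factors(360))   # [2, 2, 2, 3, 3, 5]
```

By unique factorization, the multiset of primes returned does not depend on the order in which divisors are tried.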
Given a general algorithm for integer factorization, any integer can be factored down to its constituent prime factors simply by repeated application of this algorithm. The situation is more complicated with special-purpose factorization algorithms, whose benefits may not be realized as well, or even at all, with the factors produced during decomposition. For example, if N = 10·p·q where p < q are very large primes, trial division will quickly produce the factors 2 and 5 but will take p divisions to find the next factor. As a contrasting example, if N is the product of the primes 13729, 1372933, and 18848997161, where 13729 · 1372933 = 18848997157, Fermat's factorization method will start out with a = ⌈√N⌉ = 18848997159, which immediately yields b = √(a² − N) = √4 = 2 and hence the factors a − b = 18848997157 and a + b = 18848997161. While these are easily recognized as respectively composite and prime, Fermat's method will take much longer to factor the composite one, because the starting value ⌈√18848997157⌉ = 137292 for a is nowhere near 1372933.
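The method described above can be sketched in a few lines (this is a bare-bones version for odd N, not production code): search upward from ⌈√N⌉ for an a such that a² − N is a perfect square b², which gives the factorization N = (a − b)(a + b).

```python
import math

def fermat_factor(n):
    """Fermat's method for odd n: find a, b with a^2 - n = b^2, so n = (a-b)(a+b)."""
    a = math.isqrt(n)
    if a * a < n:
        a += 1              # start at ceil(sqrt(n))
    while True:
        b2 = a * a - n
        b = math.isqrt(b2)
        if b * b == b2:     # a^2 - n is a perfect square
            return a - b, a + b
        a += 1

# The example from the text: the first step succeeds immediately.
n = 13729 * 1372933 * 18848997161
print(fermat_factor(n))     # (18848997157, 18848997161)
```

Running it on the example N confirms the behavior described: the very first candidate a = 18848997159 gives b = 2, splitting off 18848997157 and 18848997161 at once, while the same routine applied to 18848997157 itself must increment a roughly a million times.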
In number theory, integer factorization is the decomposition of a composite number into a product of smaller integers. If these integers are further restricted to prime numbers, the process is called prime factorization.
When the numbers are very large, no efficient, non-quantum integer factorization algorithm is known. An effort by several researchers, concluded in 2009, to factor a 232-digit number (RSA-768) utilizing hundreds of machines took two years, and the researchers estimated that a 1024-bit RSA modulus would take about a thousand times as long.[1] However, it has not been proven that no efficient algorithm exists. The presumed difficulty of this problem is at the heart of widely used algorithms in cryptography such as RSA. Many areas of mathematics and computer science have been brought to bear on the problem, including elliptic curves, algebraic number theory, and quantum computing.
Not all numbers of a given length are equally hard to factor. The hardest instances of these problems (for currently known techniques) are semiprimes, the product of two prime numbers. When they are both large, for instance more than two thousand bits long, randomly chosen, and about the same size (but not too close, e.g., to avoid efficient factorization by Fermat's factorization method), even the fastest prime factorization algorithms on the fastest computers can take enough time to make the search impractical; that is, as the number of digits of the primes being factored increases, the number of operations required to perform the factorization on any computer increases drastically.
Many cryptographic protocols are based on the difficulty of factoring large composite integers or a related problem, for example, the RSA problem. An algorithm that efficiently factors an arbitrary integer would render RSA-based public-key cryptography insecure.
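A toy illustration of this dependence, using hypothetical textbook-RSA parameters far too small to be secure: anyone who can factor the public modulus n into p and q can recompute the private exponent and decrypt at will.

```python
# Hypothetical tiny parameters for illustration only; real moduli are 2048+ bits.
p, q, e = 61, 53, 17
n = p * q                        # public modulus; keeping p, q secret is the whole game
phi = (p - 1) * (q - 1)          # computable only from the factorization of n
d = pow(e, -1, phi)              # private exponent recovered from p and q
msg = 42
cipher = pow(msg, e, n)          # encryption with the public key (e, n)
print(pow(cipher, d, n))         # 42 -- factoring n yields full decryption
```

The modular inverse `pow(e, -1, phi)` requires Python 3.8+; the point is only that every step after factoring n is cheap.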
One way to classify composite numbers is by counting the number of prime factors. A composite number with two prime factors is a semiprime or 2-almost prime (the factors need not be distinct, hence squares of primes are included). A composite number with three distinct prime factors is a sphenic number. In some applications, it is necessary to differentiate between composite numbers with an odd number of distinct prime factors and those with an even number of distinct prime factors. For the latter,
μ(n) = (−1)^{2x} = 1

(where μ is the Möbius function and x is half the total of prime factors), while for the former

μ(n) = (−1)^{2x+1} = −1.
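To make the parity rule concrete, here is a small sketch (assuming the standard definition of the Möbius function, not code from the text) that computes μ(n) by trial division: 0 when a squared prime divides n, otherwise the sign flips once per distinct prime factor.

```python
def mobius(n):
    """Mobius function: 0 if a square divides n, else (-1)^(number of prime factors)."""
    result = 1
    d = 2
    while d * d <= n:
        if n % d == 0:
            n //= d
            if n % d == 0:  # d^2 divides the original n
                return 0
            result = -result
        d += 1
    if n > 1:               # one last prime factor remains
        result = -result
    return result

print(mobius(6))    # 1: two distinct prime factors (even count)
print(mobius(30))   # -1: three distinct prime factors (odd count)
print(mobius(12))   # 0: divisible by the square 4
```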
