Primality test

A primality test is an algorithm for determining whether an input number is prime. It is important to note the difference between primality testing and integer factorization: a primality test generally does not reveal the factors of a composite input. As of 2007, factorization is a computationally hard problem, whereas primality testing is comparatively easy.

Naïve methods

The simplest primality test is as follows: given an input number n, check whether any integer m from 2 to n − 1 divides n. If n is divisible by some such m, then n is composite; otherwise it is prime.

Rather than testing all m up to n − 1, we need only test m up to √n: if n is composite then it can be factored into two values, at least one of which is less than or equal to √n.
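A minimal sketch of this bounded trial division in Python (the function name is ours, chosen for illustration):

    def is_prime_trial_division(n: int) -> bool:
        """Trial division: test candidate divisors m with 2 <= m*m <= n."""
        if n < 2:
            return False
        m = 2
        while m * m <= n:        # only divisors up to sqrt(n) are needed
            if n % m == 0:
                return False     # m divides n, so n is composite
            m += 1
        return True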

We can also improve the efficiency by skipping all even m except 2, since if any even number divides n then 2 does. We can improve further by observing that, with the exceptions of 2 and 3, all primes are of the form 6k ± 1. This is because every integer can be expressed as 6k + i for some integer k and for i = −1, 0, 1, 2, 3, or 4; 2 divides 6k + 0, 6k + 2, and 6k + 4; and 3 divides 6k + 3. So we first test whether n is divisible by 2 or 3, and then run through all numbers of the form 6k ± 1 that are at most √n. This is 3 times as fast as the previous method. More generally, all primes greater than c are of the form c#·k + i, where c# is the primorial of c (the product of all primes up to c) and i runs over the numbers less than c# that are coprime to c#. As c → ∞, the proportion of candidates of the form c#·k + i in a given range decreases, and so the time needed to test n decreases; for this method it is also necessary to first check divisibility by all primes up to c. Observations analogous to these can be applied recursively, giving the Sieve of Eratosthenes.
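The 6k ± 1 observation translates into the following sketch, which handles 2 and 3 first and then tries only candidate divisors of the form 6k − 1 and 6k + 1 (again the function name is ours):

    def is_prime_6k(n: int) -> bool:
        """Trial division testing only 2, 3 and numbers of the form 6k ± 1."""
        if n < 2:
            return False
        if n in (2, 3):
            return True
        if n % 2 == 0 or n % 3 == 0:
            return False
        m = 5                                    # 5 = 6*1 - 1 is the first candidate
        while m * m <= n:
            if n % m == 0 or n % (m + 2) == 0:   # test 6k - 1 and 6k + 1
                return False
            m += 6
        return True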

A good way to speed up these methods (and all the others mentioned below) is to pre-compute and store a list of all primes up to a certain bound, say all primes up to 200. (Such a list can be computed with the Sieve of Eratosthenes). Then, before testing n for primality with a serious method, one first checks whether n is divisible by any prime from the list.
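A sketch of this pre-check; the bound of 200 and the names are merely illustrative:

    def sieve(bound: int) -> list:
        """Sieve of Eratosthenes: return all primes up to bound."""
        flags = [True] * (bound + 1)
        flags[0] = flags[1] = False
        for p in range(2, int(bound ** 0.5) + 1):
            if flags[p]:
                for multiple in range(p * p, bound + 1, p):
                    flags[multiple] = False
        return [p for p, is_p in enumerate(flags) if is_p]

    SMALL_PRIMES = sieve(200)

    def passes_small_prime_check(n: int) -> bool:
        """Return False if some small prime shows n to be composite;
        a True result only means n must still be given to a full test."""
        if n < 2:
            return False
        for p in SMALL_PRIMES:
            if n == p:
                return True
            if n % p == 0:
                return False
        return True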

Probabilistic tests

Most popular primality tests are probabilistic tests. These tests use, apart from the tested number n, some other numbers a chosen at random from some sample space; the usual randomized primality tests never report a prime number as composite, but it is possible for a composite number to be reported as prime. The probability of error can be reduced by repeating the test with several independently chosen values of a; for two commonly used tests, for any composite n at least half of the choices of a detect n's compositeness, so k repetitions reduce the error probability to at most 2^−k, which can be made arbitrarily small by increasing k.

The basic structure of randomized primality tests is as follows:

  1. Randomly pick a number a.
  2. Check some equality involving a and the given number n. If the equality fails to hold true, then n is a composite number, a is known as a witness for the compositeness, and the test stops.
  3. Repeat from step 1 until the required certainty is achieved.

After several iterations, if n is not found to be a composite number, then it can be declared probably prime.

The simplest probabilistic primality test is the Fermat primality test. It is only a heuristic test; some composite numbers (Carmichael numbers) will be declared "probably prime" no matter which witness is chosen. Nevertheless, it is sometimes used when a rapid screening of numbers is needed, for instance in the key generation phase of the RSA public-key cryptographic algorithm.
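A sketch of the Fermat test, following the generic structure above; the number of rounds k is a free parameter:

    import random

    def fermat_test(n: int, k: int = 20) -> bool:
        """Fermat test: n is 'probably prime' if a^(n-1) ≡ 1 (mod n) for k
        random bases a. Carmichael numbers fool this test for every base a
        coprime to n."""
        if n < 4:
            return n in (2, 3)
        for _ in range(k):
            a = random.randrange(2, n - 1)   # random candidate witness
            if pow(a, n - 1, n) != 1:        # Fermat's little theorem violated
                return False                 # a witnesses that n is composite
        return True                          # probably prime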

The Miller-Rabin primality test and Solovay-Strassen primality test are more sophisticated variants which detect all composites (once again, this means: for every composite number n, at least 3/4 (Miller-Rabin) or 1/2 (Solovay-Strassen) of the candidate numbers a are witnesses of the compositeness of n). They are often the methods of choice, as they are much faster than other general primality tests.
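A sketch of the Miller-Rabin test; since at least 3/4 of the bases a expose any composite n, k rounds bring the error probability down to at most 4^−k:

    import random

    def miller_rabin(n: int, k: int = 20) -> bool:
        """Miller-Rabin probabilistic primality test with k random bases."""
        if n < 4:
            return n in (2, 3)
        if n % 2 == 0:
            return False
        # Write n - 1 = 2^r * d with d odd.
        r, d = 0, n - 1
        while d % 2 == 0:
            r += 1
            d //= 2
        for _ in range(k):
            a = random.randrange(2, n - 1)
            x = pow(a, d, n)
            if x in (1, n - 1):
                continue
            for _ in range(r - 1):
                x = pow(x, 2, n)
                if x == n - 1:
                    break
            else:
                return False    # a is a witness: n is certainly composite
        return True             # probably prime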

Leonard Adleman and Ming-Deh Huang presented an errorless (but expected polynomial-time) variant of the elliptic curve primality test. Unlike the other probabilistic tests, this algorithm produces a certificate for primality, and thus can be used to prove that a number is prime. The algorithm is prohibitively slow in practice.

Fast deterministic tests

The first deterministic primality test significantly faster than the naïve methods was the cyclotomy test; its runtime can be proven to be O((log n)^(c·log log log n)), where n is the number to test for primality and c is a constant independent of n.

The elliptic curve primality test can be proven to run in O((log n)^6), but only if some still unproven (but widely assumed to be true) statements of analytic number theory are used. It is one of the most often used deterministic tests in practice.

The implementation of these two methods is rather difficult, creating a risk of programming errors; this is one reason they are not preferred.

If the generalized Riemann hypothesis is assumed, the Miller-Rabin test can be turned into a deterministic version with runtime Õ((log n)^4). In practice, this algorithm is slower than the other two for sizes of numbers that can be dealt with at all.
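A sketch of this derandomization, assuming the generalized Riemann hypothesis: by a result of Bach, every composite n then has a strong witness a ≤ 2(ln n)^2, so it suffices to test all bases up to that bound:

    import math

    def strong_probable_prime(n: int, a: int) -> bool:
        """One Miller-Rabin round: is n a strong probable prime to base a?"""
        r, d = 0, n - 1
        while d % 2 == 0:
            r += 1
            d //= 2
        x = pow(a, d, n)
        if x in (1, n - 1):
            return True
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                return True
        return False

    def miller_rabin_grh(n: int) -> bool:
        """Deterministic Miller-Rabin; correct if the generalized Riemann
        hypothesis holds. Tests every base a up to 2*(ln n)^2."""
        if n < 4:
            return n in (2, 3)
        if n % 2 == 0:
            return False
        bound = min(n - 2, math.floor(2 * math.log(n) ** 2))
        return all(strong_probable_prime(n, a) for a in range(2, bound + 1))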

In 2002, Manindra Agrawal, Nitin Saxena and Neeraj Kayal described a new deterministic primality test, the AKS primality test, which they proved runs in Õ((log n)^12). This was the first deterministic primality test with proven polynomial run-time. If a widely believed conjecture about the distribution of Sophie Germain primes is true, then it runs in Õ((log n)^6). In practice, this algorithm is slower than probabilistic methods.

Complexity

In computational complexity theory, the formal language corresponding to the prime numbers is denoted as PRIMES. It is easy to show that PRIMES is in coNP: its complement COMPOSITES is in NP because one can decide compositeness by nondeterministically guessing a factor.

In 1975, Vaughan Pratt showed that there existed a certificate for primality that was checkable in polynomial time, and thus that PRIMES was in NP, and therefore in NP ∩ coNP. See primality certificate for details.

The subsequent discovery of the Solovay-Strassen and Miller-Rabin algorithms put PRIMES in coRP. In 1992, the Adleman-Huang algorithm reduced the complexity to ZPP = RP ∩ coRP, which superseded Pratt's result.

The cyclotomy test of Adleman, Pomerance, and Rumely from 1983 put PRIMES in QP (quasi-polynomial time), which is not known to be comparable with the classes mentioned above.

Because of its tractability in practice, the existence of polynomial-time algorithms assuming the Riemann hypothesis, and other similar evidence, it was long suspected but not proven that primality could be solved in polynomial time. The existence of the AKS primality test finally settled this long-standing question and placed PRIMES in P. However, PRIMES is not known to be P-complete, and it is not known whether it lies in classes contained in P such as L or NC.

Number-theoretic methods

Certain number-theoretic methods exist for testing whether a number is prime, such as the Lucas-Lehmer test and Proth's test. These tests typically require factorization of n + 1, n − 1, or a similar quantity, which means that they are not useful for general-purpose primality testing, but they are often quite powerful when the tested number n is known to have a special form.
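For example, a minimal sketch of the Lucas-Lehmer test for Mersenne numbers M_p = 2^p − 1 (the test applies only to numbers of this special form; p is assumed to be an odd prime):

    def lucas_lehmer(p: int) -> bool:
        """Lucas-Lehmer test: M_p = 2^p - 1 is prime iff s_{p-2} ≡ 0 (mod M_p),
        where s_0 = 4 and s_{i+1} = s_i^2 - 2. Assumes p is an odd prime."""
        m = (1 << p) - 1          # the Mersenne number 2^p - 1
        s = 4
        for _ in range(p - 2):
            s = (s * s - 2) % m
        return s == 0

For instance, lucas_lehmer(7) returns True because M_7 = 127 is prime.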

The Lucas-Lehmer test relies on the fact that the multiplicative order of a number a modulo n is n − 1 for a prime n when a is a primitive root modulo n. If we can show that a is a primitive root modulo n (which in practice requires knowing the prime factors of n − 1), we can show that n is prime.
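A sketch of this n − 1 approach: given the distinct prime factors of n − 1, a base a proves n prime if a^(n−1) ≡ 1 (mod n) while a^((n−1)/q) ≢ 1 (mod n) for every prime factor q of n − 1 (the function name and the round limit are ours):

    import random

    def n_minus_1_test(n: int, prime_factors: list, rounds: int = 100) -> bool:
        """Try to exhibit a primitive root a modulo n, proving n prime.
        prime_factors must be the distinct prime factors of n - 1."""
        if n < 3:
            return n == 2
        for _ in range(rounds):
            a = random.randrange(2, n)
            if pow(a, n - 1, n) != 1:
                return False                  # Fermat condition fails: n is composite
            if all(pow(a, (n - 1) // q, n) != 1 for q in prime_factors):
                return True                   # a has order n - 1, so n is prime
        return False                          # no proof found: result is inconclusive

For example, n_minus_1_test(101, [2, 5]) proves 101 prime, since 100 = 2^2 · 5^2. Note that running out of rounds is inconclusive rather than a proof of compositeness.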
