Talk:Lenstra elliptic curve factorization

Try using a better browser for your edits, or use an offline editor like TextPad or NotePad so you won't lose what you type. --Ed Poor

The text "20 digits" should be replaced by "the cube root of the number to be factored".

Average and worst-case runtime? When was the method developed? Is it possible to give examples?

Most people who know this stuff would characterise ECM as "probabilistic" and the sieve methods MPQS and GNFS as "deterministic". It is possible to voice minor quibbles in both cases, but to assert exactly the opposite, as the article does, is totally wrong. The rest of the article looks mostly OK. Possible deliberate vandalism by a later editor? Or just momentary brainfade? 203.164.222.80 02:03, 30 October 2006 (UTC)


QS, MPQS and GNFS are classed as truly random algorithms (this is unquestionably so): when the final relation x^2 ≡ y^2 (mod n) [which means (x-y)(x+y) ≡ 0 (mod n)] is achieved, there is a fifty-fifty chance that it will yield a trivial factor when n is the product of two primes, and a fifty-fifty chance that it will give a nontrivial factor, so there is no way to make the method deterministic. If the first relation achieved yields a trivial factor, another relation must be generated. (This is analogous to the complexity class ZPP, where the expected running time is polynomial but the worst case is exponential.) It is exponentially unlikely but theoretically possible that one would have to generate an exponential number of relations x^2 ≡ y^2 (mod n) to get a nontrivial factor. I don't believe current theory, even by invoking the Riemann hypothesis and its generalizations and extensions, can provably get this down to even sub-exponential. It might be possible to provably get it down to n^c trials, where c is 1/2 or so for the quadratic sieve, but I highly doubt it. One might have better luck with Dixon's algorithm, but I doubt that as well.

The basic problem is that if (x - y) ≡ 0 or (x + y) ≡ 0 (mod n), one gets a trivial factor, and there is no known way to force x and y ahead of time not to be so. Being able to do so would be equivalent to factoring n.
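To make the fifty-fifty point concrete, here is a toy sketch (the tiny n and the values of x and y are purely illustrative, not from any real sieve run) of what happens once a relation x^2 ≡ y^2 (mod n) is in hand: the gcd either exposes a factor or comes back trivial, and nothing decides which ahead of time.

<syntaxhighlight lang="python">
from math import gcd

n = 91  # toy example: n = 7 * 13

def factor_from_relation(x, y, n):
    # Given x^2 ≡ y^2 (mod n), gcd(x - y, n) divides n,
    # but it may be the trivial divisor 1 or n itself.
    assert (x * x - y * y) % n == 0
    return gcd(x - y, n)

print(factor_from_relation(10, 3, n))   # 7  -- nontrivial, since x is not ±y (mod n)
print(factor_from_relation(94, 3, n))   # 91 -- trivial, since x ≡ y (mod n); a new relation is needed
</syntaxhighlight>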

Perhaps the most accurate and precise statement would be: "Currently there is no known way to make NFS deterministic. While no rigorous proof exists that NFS is a probabilistic algorithm, it is widely believed to be probabilistic."

The Pollard p-1 method and the EC factoring method may in practice be implemented in a random way, but nothing in the theory says they can't be implemented deterministically using pseudorandomness. In the case of the p-1 method, the base can be chosen to be two or any small number, and the exponent (which is a product of primes up to some limit) is chosen in advance and does not need randomness.
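For comparison, a minimal sketch of the plain p-1 method with the base fixed at 2 and a fixed smoothness bound (the bound and the tiny example number are mine, not from any particular implementation): no randomness appears anywhere.

<syntaxhighlight lang="python">
from math import gcd

def pollard_p_minus_1(n, bound=100):
    # Base fixed at 2, exponent built from all integers up to the bound:
    # after the k-th step, a = 2^(k!) mod n.  Entirely deterministic.
    a = 2
    for k in range(2, bound + 1):
        a = pow(a, k, n)
        d = gcd(a - 1, n)
        if 1 < d < n:
            return d          # nontrivial factor found
    return None               # p - 1 not smooth enough for this bound

print(pollard_p_minus_1(299))  # 299 = 13 * 23 and 13 - 1 = 12 is very smooth, so this prints 13
</syntaxhighlight>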

One can pick a seed non-randomly (say, a small number less than the log of the number to factor) and generate pseudorandom numbers from this seed to obtain the needed "pseudo-random" parameters for the curves and the needed points. Since the seed is very small, one has in effect derandomized the algorithm with at most a poly(log n) slowdown, so the EC factoring method can be derandomized and hence considered deterministic. My algorithmic number theory professor always referred to the p-1 method as deterministic, and I see no theoretical reason why the same logic doesn't apply to the EC factoring method.

Current theory on elliptic curves, I believe, does not give any asymptotic advantage to implementing the EC factoring method in a truly random manner. The seed could be very small: one could always pick, say, 2 and run it through a decent but unchanging pseudorandom number generator fixed for one's implementation. By unchanging I mean a generator that would not change no matter how many times it was run or which numbers were being factored. Thus one would have no real slowdown, not even a polynomial one as stated above, just the time to create the pseudorandom parameters and points, which would not change the asymptotic running time.
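A sketch of what that would look like in practice: the curve/point generation below is the usual "pick a, x, y and solve for b" trick, and everything else (the fixed seed 2, the particular generator, the example n) is only there to illustrate the derandomization argument, not to describe any real ECM code.

<syntaxhighlight lang="python">
import random

def deterministic_curve_parameters(n, how_many):
    # Fixed seed and a fixed pseudorandom generator: the same n always
    # yields the same sequence of curves, so the procedure is deterministic.
    rng = random.Random(2)
    curves = []
    for _ in range(how_many):
        a = rng.randrange(n)
        x = rng.randrange(n)
        y = rng.randrange(n)
        b = (y * y - x * x * x - a * x) % n   # force (x, y) onto y^2 = x^3 + a*x + b (mod n)
        curves.append((a, b, x, y))
    return curves

# The curves fed to the EC factoring stage are reproducible run after run.
print(deterministic_curve_parameters(455839, 3))
</syntaxhighlight>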

The above is how a complexity theorist would view the matter, since even a polynomial-time slowdown "doesn't count" in the sense of changing the basic complexity, which is sub-exponential in the smallest factor of the number being factored. I should check how algorithmic number theorists view the matter for the EC factoring method.

It is possible that some variant, optimized implementations of the EC factoring method may not be derandomized so easily, but the basic, plain-vanilla algorithm can be, and hence, like the Pollard p-1 method, should be considered deterministic in a strict technical sense. Depending on the implementation it may make sense to use true randomness, but I doubt it would make any difference in most implementations. 71.117.44.150 14:02, 19 June 2007 (UTC)