Talk:Logistic map
From Wikipedia, the free encyclopedia
- The logistic map is related to the Mandelbrot set by the equation c = (1 − (r − 1)²)/4.
I can't follow the algebra here. Mandelbrot is z[n+1] = z[n]^2 + c and logistic is x[n+1] = r x[n] (1-x[n]). I don't think the formula given above converts these formulas into each other. AxelBoldt 02:24 Sep 30, 2002 (UTC)
- I agree.
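For the record, the stated relation does come out of a direct change of variables. A sketch of the check, using the standard conjugacy substitution x_n = 1/2 − z_n/r (the substitution itself is the only assumption here):

```latex
% Substitute x_n = \tfrac12 - \tfrac{z_n}{r} into x_{n+1} = r x_n (1 - x_n):
\[
  r\,x_n(1-x_n)
  = r\left(\tfrac12 - \tfrac{z_n}{r}\right)\left(\tfrac12 + \tfrac{z_n}{r}\right)
  = \frac{r}{4} - \frac{z_n^2}{r}.
\]
% Equating this to x_{n+1} = \tfrac12 - \tfrac{z_{n+1}}{r} and solving for z_{n+1}:
\[
  z_{n+1} = z_n^2 + \frac{r}{2} - \frac{r^2}{4},
  \qquad\text{where}\qquad
  \frac{r}{2} - \frac{r^2}{4} = \frac{2r - r^2}{4} = \frac{1-(r-1)^2}{4}.
\]
```

So the logistic recurrence is conjugate to z_{n+1} = z_n² + c with exactly the c given above.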
Are there any other formulae that have this property?
I.e., you have the Mandelbrot set and the logistic map being a degree 2 polynomial. Are there any interesting examples with degree 3? Degree 4? Other types of functions?
Yes. If you iterate practically any function that's not linear (not a straight line), then there will be some starting values that give chaotic (although, of course, perfectly deterministic) results. The FractInt software comes with lots of them. -- DavidCary
(Should this answer go on the article page?)
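The sensitive dependence behind that chaotic behaviour is easy to see with the logistic map itself at r = 4 (a minimal sketch; the seed 0.3, the offset 10⁻⁹, and the iteration count are arbitrary choices):

```python
def step(r, x):
    # one iteration of the logistic map
    return r * x * (1 - x)

# Two seeds differing by 1e-9, iterated at r = 4 (a chaotic parameter value):
a, b = 0.3, 0.3 + 1e-9
gap = 0.0
for _ in range(60):
    a, b = step(4.0, a), step(4.0, b)
    gap = max(gap, abs(a - b))
print(gap)   # the separation grows to order 1: deterministic, yet seed-sensitive
```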
There seems to be a problem with this equation:
x_{n+1} = r x_n (1 - x_n).
One of the quantities X sub n on the right side of the equation should be an X sub 0. I'm not sure which one, though.
Nope. Both on the right are supposed to be x_n. -- DavidCary
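To make the recursion concrete: both right-hand occurrences are the previous iterate x_n, as in this minimal sketch (plain Python, nothing beyond the formula itself):

```python
def logistic_step(r, x):
    """One step of the logistic map x_{n+1} = r * x_n * (1 - x_n)."""
    return r * x * (1 - x)

def orbit(r, x0, n):
    """Return the first n iterates starting from the seed x0."""
    xs = [x0]
    for _ in range(n - 1):
        xs.append(logistic_step(r, xs[-1]))
    return xs

# Only the seed is x0; every later value feeds back in as x_n:
print(orbit(2.0, 0.2, 5))
```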
Regarding "also given access to perfect computation...". This statement is logically false, because a logistic map, in principle, operates on un-computable numbers. This means that even a theoretically ("in principle") "perfect" computer could not compute the vast majority of trajectories. And by vast majority, I mean 99.999999999...%. For a formal definition of computability, see Turing's essay "On Computable Numbers".
-- Kevin Baas 20:11, 26 Feb 2004 (UTC)
I was going to fix this paragraph myself, and I had much trouble, because whenever I encountered a "not the same as..." I thought for a minute and eventually answered "Yes it is." The proper way to speak of predictability/unpredictability is in information-theoretic terms. In information-theoretic terms, one speaks of probability, noise, and divergence. A stochastic system is probabilistic, but may still very well be a continuous stochastic system, that is, have an information dimension of one, and ultimately represent the same amount of information as a chaotic system. Indeed, one can have a stochastic chaotic system, with an arbitrary rate of information decay/divergence. Chaotic and stochastic are orthogonal classifications.

A linear (non-chaotic) and non-stochastic system simply evades the question of information, and therefore predictability, altogether. We must therefore exclude it from any concept of predictability or randomness. Indeed, randomness already excludes all non-stochastic systems. Where a system is "random", it has "divergence", times a scalar. A system may have divergence distributed evenly with respect to a real-valued parameter. In this case, one simply adds a "noise" term: + N. This is so-called "randomness". In a chaotic system, on the other hand, the divergence is concentrated at a point. Regardless, one has convergence and divergence, and in either case noise is noise is noise; i.e., randomness is divergence is noise. In terms of "causality" in the Markov sense, there is really no fundamental discriminatory factor. It still remains possible, of course, for one to make a (superficial) mathematical distinction, but one should not look for any ontological distinction - it is ontologically unnecessary for there to be a distinction.
But I have not yet been complete enough: one might still raise the objection that a chaotic system is fully specified by a finite set of symbols and is non-stochastic. However, my point is that this is completely irrelevant. The symbols are arbitrary.
A chaotic system is necessarily a non-equilibrium system; chaotic systems only exist where there is a flow of energy, and thus, also necessarily, a flow of information. The "unpredictability" of a chaotic system is part of this flow, just as the "noise" in a stochastic system is an influx of information. The difference is simply whether the distribution of this influx - this flow - is flat with respect to a given measurement.
But there is also the philosophical problem of "noise" as extrinsic "unknown" (determinism) - the scientific deterministic assumption - vs. noise as an in-itself (free will). But in either case there is, inextricably, noise, and thus noise, since it cannot be reduced, replaced, or removed, must ultimately be in-itself, regardless of whether it is "extrinsic" or "intrinsic". (The question was never what name we give to the difference of the being-of-noise, but rather the simply differential character of the difference itself.) The more fundamental question is not intrinsic vs. extrinsic, but in-itself (essential) vs. phenomenological. But the two positions are informationally equivalent/indistinguishable: one nonetheless has "noise", "information", and everything else. There is no way to statistically distinguish among the consequences beyond a mere quantification of the "noise rate". -- Kevin Baas 06:21, 28 Feb 2004 (UTC)
"With r between 3 and 1+√6 (approximately 3.45)..." It seems like there's a typo in the 1+6, but I'm not sure what it should be --Kevinatilusa
Why does it keep saying independent of initial seed? It seems like (r-1)/r is a solution no matter what r is (r ≥ 1). For example, if r was 4, you wouldn't get much chaotic behavior if you started with 0.75. Is the talk about period and chaotic behavior talking about a general seed, such as a transcendental one? Hiiiiiiiiiiiiiiiiiiiii 00:41, 23 May 2006 (UTC)
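Indeed, x = (r−1)/r is a fixed point for every such r; at r = 4 it is exactly 0.75, and an orbit started exactly there never moves. The statements about chaos concern typical seeds: at r = 4 that fixed point is unstable (the derivative r(1−2x) there is −2), so any seed even slightly off 0.75 wanders away. A quick check (plain Python; the offset 10⁻¹⁰ and iteration counts are arbitrary):

```python
def step(r, x):
    return r * x * (1 - x)

r = 4.0
fp = (r - 1) / r                 # 0.75 -- exactly representable in binary floats
x = fp
for _ in range(100):
    x = step(r, x)
print(x == fp)                   # True: the exact fixed point never moves

# The fixed point is unstable (|f'(0.75)| = 2), so a tiny offset escapes:
y = fp + 1e-10
worst = 0.0
for _ in range(80):
    y = step(r, y)
    worst = max(worst, abs(y - fp))
print(worst > 0.1)               # True: the perturbed orbit leaves the fixed point
```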
note
if r = 4, the logistic map appears to be solvable: set
x_n = sin²(π θ_n),
then if
θ_{n+1} = 2 θ_n,
we have
x_{n+1} = 4 x_n (1 − x_n),
which can be easily checked. Scythe33 20:55, 18 December 2006 (UTC)
Yes, that's correct. There are also explicit solutions for r = 2 and r = −2. More information can be found here: [1] Scribblesinmindscapes 16:29, 10 January 2007 (UTC)
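The r = 4 solution unrolls to the closed form x_n = sin²(2ⁿ π θ), which can be checked numerically against the iteration (a sketch; θ = 0.123 is an arbitrary seed parameter):

```python
import math

def closed_form(theta, n):
    """x_n = sin^2(2^n * pi * theta), the closed-form solution at r = 4."""
    return math.sin((2 ** n) * math.pi * theta) ** 2

theta = 0.123
x = closed_form(theta, 0)
for n in range(1, 6):
    x = 4 * x * (1 - x)                    # iterate the map
    direct = closed_form(theta, n)         # evaluate the closed form
    print(n, abs(x - direct) < 1e-6)       # the two agree up to rounding
```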
Chaos (or period 3) begins at precisely r = 1 + Sqrt(8)
There are methods for proving that period 3 begins at exactly r = 1 + Sqrt(8). I can provide a published proof if necessary, but also did the proof myself as a presentation in a chaos theory class. I think this section could benefit by mentioning this. 68.55.58.255 07:41, 23 May 2007 (UTC)
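The claim can be probed numerically: just above r = 1 + √8 ≈ 3.8284 the attractor is a period-3 cycle (three steps of the map return each point to itself), while just below it the orbit is still chaotic. A rough illustration, not a proof (the offsets of ±0.01, the warm-up length, and the sample count are arbitrary choices):

```python
import math

def step(r, x):
    return r * x * (1 - x)

def max_period3_gap(r, x=0.5, warmup=20000, samples=300):
    """Largest |f^3(x) - x| seen along the orbit after a long transient."""
    for _ in range(warmup):
        x = step(r, x)
    worst = 0.0
    for _ in range(samples):
        x3 = x
        for _ in range(3):
            x3 = step(r, x3)
        worst = max(worst, abs(x3 - x))
        x = step(r, x)
    return worst

r_c = 1 + math.sqrt(8)                   # = 3.8284..., the claimed period-3 onset
print(max_period3_gap(r_c + 0.01))       # tiny: the orbit settles on a period-3 cycle
print(max_period3_gap(r_c - 0.01))       # large: still chaotic just below onset
```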
Biology
It would be interesting to read about connections to biology. Are there interesting observations in nature predicted by this? (Most interesting seems the range 3-3.45) —Preceding unsigned comment added by 130.242.107.211 (talk) 02:55, 7 April 2008 (UTC)