The logistic map is a polynomial mapping (equivalently, recurrence relation) of degree 2, often referred to as an archetypal example of how complex, chaotic behaviour can arise from very simple nonlinear dynamical equations. The map, initially utilized by Edward Lorenz in the 1960s to showcase irregular solutions (e.g., Eq. 3 of [1]), was popularized in a 1976 paper by the biologist Robert May,[2] in part as a discrete-time demographic model analogous to the logistic equation written down by Pierre François Verhulst.[3]
Mathematically, the logistic map is written
xn+1 = r xn (1 − xn)    (1)
where xn is a number between zero and one, which represents the ratio of existing population to the maximum possible population.
This nonlinear difference equation is intended to capture two effects:
reproduction, where the population will increase at a rate proportional to the current population when the population size is small,
starvation (density-dependent mortality), where the growth rate will decrease at a rate proportional to the value obtained by taking the theoretical "carrying capacity" of the environment less the current population.
The usual values of interest for the parameter r are those in the interval [0, 4], so that xn remains bounded on [0, 1]. The r = 4 case of the logistic map is a nonlinear transformation of both the bit-shift map and the μ = 2 case of the tent map. If r > 4, this leads to negative population sizes. (This problem does not appear in the older Ricker model, which also exhibits chaotic dynamics.) One can also consider values of r in the interval [−2, 0], so that xn remains bounded on [−0.5, 1.5].[4]
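As a minimal illustration of the recurrence (1), the following Python sketch (the parameter values and the initial value are arbitrary choices for illustration) iterates the map for a few values of r and prints the long-run value:

def logistic(r, x):
    # one application of the map in equation (1)
    return r * x * (1 - x)

for r in (0.5, 1.5, 2.8, 3.2):
    x = 0.2                      # any initial value in (0, 1)
    for _ in range(1000):        # run past the transient
        x = logistic(r, x)
    # for r < 1 the population dies out; for 1 < r < 3 it settles at (r - 1)/r;
    # for r slightly above 3 it ends up alternating between two values
    print(f"r = {r}: x after 1000 steps = {x:.6f}")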
Characteristics of the map
Behavior dependent on r
The image below shows the amplitude and frequency content of some logistic map iterates for parameter values ranging from 2 to 4.
By varying the parameter r, the following behavior is observed:
With r between 0 and 1, the population will eventually die, independent of the initial population.
With r between 1 and 2, the population will quickly approach the value (r − 1)/r, independent of the initial population.
With r between 2 and 3, the population will also eventually approach the same value (r − 1)/r, but first will fluctuate around that value for some time. The rate of convergence is linear, except for r = 3, when it is dramatically slow, less than linear (see Bifurcation memory).
With r between 3 and 1 + √6 ≈ 3.44949 the population will approach permanent oscillations between two values. These two values depend on r and are given by[4] x± = (r + 1 ± √((r − 3)(r + 1)))/(2r) (a numerical check appears in the sketch below, after this list of cases).
With r between 3.44949 and 3.54409 (approximately), from almost all initial conditions the population will approach permanent oscillations among four values. The latter number is a root of a 12th degree polynomial (sequence A086181 in the OEIS).
With r increasing beyond 3.54409, from almost all initial conditions the population will approach oscillations among 8 values, then 16, 32, etc. The lengths of the parameter intervals that yield oscillations of a given length decrease rapidly; the ratio between the lengths of two successive bifurcation intervals approaches the Feigenbaum constant δ ≈ 4.66920. This behavior is an example of a period-doubling cascade.
At r ≈ 3.56995 (sequence A098587 in the OEIS) is the onset of chaos, at the end of the period-doubling cascade. From almost all initial conditions, we no longer see oscillations of finite period. Slight variations in the initial population yield dramatically different results over time, a prime characteristic of chaos.
Most values of r beyond 3.56995 exhibit chaotic behaviour, but there are still certain isolated ranges of r that show non-chaotic behavior; these are sometimes called islands of stability. For instance, beginning at 1 + √8[5] (approximately 3.82843) there is a range of parameters r that show oscillation among three values, and for slightly higher values of r oscillation among 6 values, then 12 etc.
The development of the chaotic behavior of the logistic sequence as the parameter r varies from approximately 3.56995 to approximately 3.82843 is sometimes called the Pomeau–Manneville scenario, characterized by a periodic (laminar) phase interrupted by bursts of aperiodic behavior. Such a scenario has an application in semiconductor devices.[7] There are other ranges that yield oscillation among 5 values etc.; all oscillation periods occur for some values of r.
A period-doubling window with parameter c is a range of r-values consisting of a succession of subranges. The kth subrange contains the values of r for which there is a stable cycle (a cycle that attracts a set of initial points of unit measure) of period 2^k c. This sequence of sub-ranges is called a cascade of harmonics.[8] In a sub-range with a stable cycle of period 2^(k*) c, there are unstable cycles of period 2^k c for all k < k*. The r value at the end of the infinite sequence of sub-ranges is called the point of accumulation of the cascade of harmonics. As r rises there is a succession of new windows with different c values. The first one is for c = 1; all subsequent windows involving odd c occur in decreasing order of c starting with arbitrarily large c.[8][9]
At r ≈ 3.678573, two chaotic bands of the bifurcation diagram intersect in the first Misiurewicz point for the logistic map. This value of r satisfies the cubic equation r³ − 2r² − 4r − 8 = 0.[10]
Beyond r = 4, almost all initial values eventually leave the interval [0,1] and diverge. The set of initial conditions which remain within [0,1] form a Cantor set and the dynamics restricted to this Cantor set is chaotic.[11]
For any value of r there is at most one stable cycle. If a stable cycle exists, it is globally stable, attracting almost all points.[12]: 13 Some values of r with a stable cycle of some period have infinitely many unstable cycles of various periods.
The bifurcation diagram at right summarizes this. The horizontal axis shows the possible values of the parameter r while the vertical axis shows the set of values of x visited asymptotically from almost all initial conditions by the iterates of the logistic equation with that r value.
The bifurcation diagram is self-similar: if we zoom in on the above-mentioned value r ≈ 3.82843 and focus on one arm of the three, the situation nearby looks like a shrunk and slightly distorted version of the whole diagram. The same is true for all other non-chaotic points. This is an example of the deep and ubiquitous connection between chaos and fractals.
We can also consider negative values of r:
For r between -2 and -1 the logistic sequence also features chaotic behavior.[4]
With r between -1 and 1 - √6 and for x0 between 1/r and 1-1/r, the population will approach permanent oscillations between two values, as with the case of r between 3 and 1 + √6, and given by the same formula.[4]
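The period-2 formula quoted above for 3 < r < 1 + √6 is easy to check against direct iteration. A minimal sketch (the particular r and initial value are arbitrary choices):

import math

r = 3.3                                   # any r with 3 < r < 1 + sqrt(6)
x = 0.4
for _ in range(10_000):                   # let the orbit settle onto the 2-cycle
    x = r * x * (1 - x)
cycle = sorted({round(x, 10), round(r * x * (1 - x), 10)})

disc = math.sqrt((r - 3) * (r + 1))
predicted = sorted([(r + 1 - disc) / (2 * r), (r + 1 + disc) / (2 * r)])
print("iterated :", cycle)
print("formula  :", [round(v, 10) for v in predicted])   # the two lists agree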
Chaos and the logistic map
The relative simplicity of the logistic map makes it a widely used point of entry into a consideration of the concept of chaos. A rough description of chaos is that chaotic systems exhibit a great sensitivity to initial conditions—a property of the logistic map for most values of r between about 3.57 and 4 (as noted above).[2] A common source of such sensitivity to initial conditions is that the map represents a repeated folding and stretching of the space on which it is defined. In the case of the logistic map, the quadratic difference equation describing it may be thought of as a stretching-and-folding operation on the interval (0,1).[13]
The following figure illustrates the stretching and folding over a sequence of iterates of the map. Figure (a), left, shows a two-dimensional Poincaré plot of the logistic map's state space for r = 4, and clearly shows the quadratic curve of the difference equation (1). However, we can embed the same sequence in a three-dimensional state space, in order to investigate the deeper structure of the map. Figure (b), right, demonstrates this, showing how initially nearby points begin to diverge, particularly in those regions of xt corresponding to the steeper sections of the plot.
This stretching-and-folding does not just produce a gradual divergence of the sequences of iterates, but an exponential divergence (see Lyapunov exponents), evidenced also by the complexity and unpredictability of the chaotic logistic map. In fact, exponential divergence of sequences of iterates explains the connection between chaos and unpredictability: a small error in the supposed initial state of the system will tend to correspond to a large error later in its evolution. Hence, predictions about future states become progressively (indeed, exponentially) worse when there are even very small errors in our knowledge of the initial state. This quality of unpredictability and apparent randomness led the logistic map equation to be used as a pseudo-random number generator in early computers.[13]
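The divergence just described is easy to observe numerically. A minimal sketch (the value r = 4 and the size of the perturbation are arbitrary choices) iterates two initial conditions that differ by 10^−10 and prints their separation:

r = 4.0
x, y = 0.3, 0.3 + 1e-10            # two nearby initial conditions
for n in range(1, 61):
    x = r * x * (1 - x)
    y = r * y * (1 - y)
    if n % 10 == 0:
        print(f"n = {n:2d}   |x - y| = {abs(x - y):.3e}")
# the separation grows roughly like 1e-10 * 2**n (the Lyapunov exponent at r = 4
# is ln 2) until it saturates at order one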
At r = 2, the fixed point 1 − 1/r = 1/2 coincides with the maximum of the map, where the derivative vanishes, so convergence to the equilibrium point is faster than geometric: the error is roughly squared at each step. Consequently, the equilibrium point is called "superstable". Its Lyapunov exponent is −∞. A similar argument shows that there is a superstable parameter value within each interval where the dynamical system has a stable cycle. This can be seen in the Lyapunov exponent plot as sharp dips.[14]
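The Lyapunov exponent can be estimated as the orbit average of ln|f′(xn)| = ln|r(1 − 2xn)|. The sketch below (transient length, orbit length and the sampled r values are arbitrary choices) shows the behaviour described above: a very large negative value near the superstable parameter r = 2, a value near zero at the onset of chaos, and positive values in the chaotic regime:

import math

def lyapunov(r, n_transient=1000, n_sum=10_000, x0=0.3):
    # estimate the Lyapunov exponent of the logistic map at parameter r
    x = x0
    for _ in range(n_transient):          # discard the transient
        x = r * x * (1 - x)
    total = 0.0
    for _ in range(n_sum):
        x = r * x * (1 - x)
        # derivative of the map is r*(1 - 2x); the tiny offset avoids log(0)
        total += math.log(abs(r * (1 - 2 * x)) + 1e-300)
    return total / n_sum

for r in (2.0, 3.2, 3.5, 3.56995, 3.7, 4.0):
    print(f"r = {r}: lambda ≈ {lyapunov(r):.4f}")
# near r = 2 the estimate is a very large negative number (the true value is -inf);
# at r = 4 it approaches ln 2 ≈ 0.693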
Since the map is confined to an interval on the real number line, its dimension is less than or equal to unity. Numerical estimates yield a correlation dimension of 0.500±0.005 (Grassberger, 1983), a Hausdorff dimension of about 0.538 (Grassberger 1981), and an information dimension of approximately 0.5170976 (Grassberger 1983) for r ≈ 3.5699456 (onset of chaos). Note: It can be shown that the correlation dimension is certainly between 0.4926 and 0.5024.
It is often possible, however, to make precise and accurate statements about the likelihood of a future state in a chaotic system. If a (possibly chaotic) dynamical system has an attractor, then there exists a probability measure that gives the long-run proportion of time spent by the system in the various regions of the attractor. In the case of the logistic map with parameter r = 4 and an initial state in (0,1), the attractor is also the interval (0,1) and the probability measure corresponds to the beta distribution with parameters a = 0.5 and b = 0.5. Specifically,[15] the invariant measure is
ρ(x) = 1/(π√(x(1 − x))).
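This density is easy to compare with a histogram of a long orbit at r = 4. A minimal sketch (orbit length, seed and bin count are arbitrary choices):

import numpy as np

r, n = 4.0, 200_000
x = np.empty(n)
x[0] = 0.123456                            # arbitrary initial condition in (0, 1)
for i in range(n - 1):
    x[i + 1] = r * x[i] * (1 - x[i])

hist, edges = np.histogram(x, bins=50, range=(0.0, 1.0), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
density = 1.0 / (np.pi * np.sqrt(centers * (1 - centers)))   # beta(1/2, 1/2) density
for c, h, d in zip(centers[5::10], hist[5::10], density[5::10]):
    print(f"x ≈ {c:.2f}: histogram {h:.3f}, 1/(pi*sqrt(x(1-x))) = {d:.3f}")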
Unpredictability is not randomness, but in some circumstances looks very much like it. Hence, and fortunately, even if we know very little about the initial state of the logistic map (or some other chaotic system), we can still say something about the distribution of states arbitrarily far into the future, and use this knowledge to inform decisions based on the state of the system.
import numpy as np
import matplotlib.pyplot as plt

interval = (2.8, 4)  # start, end
accuracy = 0.0001
reps = 600  # number of repetitions
numtoplot = 200
lims = np.zeros(reps)

fig, biax = plt.subplots()
fig.set_size_inches(16, 9)

lims[0] = np.random.rand()
for r in np.arange(interval[0], interval[1], accuracy):
    for i in range(reps - 1):
        lims[i + 1] = r * lims[i] * (1 - lims[i])

    biax.plot([r] * numtoplot, lims[reps - numtoplot:], "b.", markersize=0.02)

biax.set(xlabel="r", ylabel="x", title="logistic map")
plt.show()
Special cases of the map
Upper bound when 0 ≤ r ≤ 1
Although exact solutions to the recurrence relation are only available in a small number of cases, a closed-form upper bound on the logistic map is known when 0 ≤ r ≤ 1.[16] There are two aspects of the behavior of the logistic map that should be captured by an upper bound in this regime: the asymptotic geometric decay with constant r, and the fast initial decay when x0 is close to 1, driven by the (1 − xn) term in the recurrence relation. The following bound captures both of these effects:
Solution when r = 4
The special case of r = 4 can in fact be solved exactly, as can the case with r = 2;[17] however, the general case can only be predicted statistically.[18]
The solution when r = 4 is[17][19]
xn = sin²(2^n θ π),
where the initial condition parameter θ is given by
θ = (1/π) sin⁻¹(√x0).
For rational θ, after a finite number of iterations xn maps into a periodic sequence. But almost all θ are irrational, and, for irrational θ, xn never repeats itself – it is non-periodic. This solution equation clearly demonstrates the two key features of chaos – stretching and folding: the factor 2n shows the exponential growth of stretching, which results in sensitive dependence on initial conditions, while the squared sine function keeps xn folded within the range [0,1].
For r = 4 an equivalent solution in terms of complex numbers instead of trigonometric functions is[17]
xn = (2 − α^(2^n) − α^(−2^n))/4,
where α is either of the complex numbers
α = (1 − 2x0) ± √((1 − 2x0)² − 1)
with modulus equal to 1. Just as the squared sine function in the trigonometric solution leads to neither shrinkage nor expansion of the set of points visited, in the latter solution this effect is accomplished by the unit modulus of α.
By contrast, the solution in the solvable case r = 2 is
xn = 1/2 − 1/2 (1 − 2x0)^(2^n)
for x0 ∈ [0,1). Since (1 − 2x0) ∈ (−1,1) for any value of x0 other than the unstable fixed point 0, the term (1 − 2x0)^(2^n) goes to 0 as n goes to infinity, so xn goes to the stable fixed point 1/2.
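Both closed forms are easy to check against direct iteration of the recurrence; a minimal sketch (the initial value and number of steps are arbitrary, and n is kept small because 2^n grows quickly):

import math

x0, steps = 0.2, 5

# r = 4: xn = sin^2(2^n * theta * pi), theta = (1/pi) * asin(sqrt(x0))
theta = math.asin(math.sqrt(x0)) / math.pi
x = x0
for n in range(1, steps + 1):
    x = 4 * x * (1 - x)
    closed = math.sin(2 ** n * theta * math.pi) ** 2
    print(f"r = 4, n = {n}: iterated {x:.12f}   closed form {closed:.12f}")

# r = 2: xn = 1/2 - 1/2 * (1 - 2*x0)^(2^n)
x = x0
for n in range(1, steps + 1):
    x = 2 * x * (1 - x)
    closed = 0.5 - 0.5 * (1 - 2 * x0) ** (2 ** n)
    print(f"r = 2, n = {n}: iterated {x:.12f}   closed form {closed:.12f}")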
Finding cycles of any length when r = 4
For the r = 4 case, from almost all initial conditions the iterate sequence is chaotic. Nevertheless, there exist an infinite number of initial conditions that lead to cycles, and indeed there exist cycles of length k for all integers k > 0. We can exploit the relationship of the logistic map to the dyadic transformation (also known as the bit-shift map) to find cycles of any length. If x follows the logistic map xn+1 = 4xn(1 − xn) and y follows the dyadic transformation
yn+1 = 2yn mod 1,
then the two are related by xn = sin²(2π yn).
The reason that the dyadic transformation is also called the bit-shift map is that when y is written in binary notation, the map moves the binary point one place to the right (and if the bit to the left of the binary point has become a "1", this "1" is changed to a "0"). A cycle of length 3, for example, occurs if an iterate has a 3-bit repeating sequence in its binary expansion (which is not also a one-bit repeating sequence): 001, 010, 100, 110, 101, or 011. The iterate 001001001... maps into 010010010..., which maps into 100100100..., which in turn maps into the original 001001001...; so this is a 3-cycle of the bit shift map. And the other three binary-expansion repeating sequences give the 3-cycle 110110110... → 101101101... → 011011011... → 110110110.... Either of these 3-cycles can be converted to fraction form: for example, the first-given 3-cycle can be written as 1/7 → 2/7 → 4/7 → 1/7. Using the above translation from the bit-shift map to the logistic map gives the corresponding logistic cycle 0.611260467... → 0.950484434... → 0.188255099... → 0.611260467.... We could similarly translate the other bit-shift 3-cycle into its corresponding logistic cycle. Likewise, cycles of any length k can be found in the bit-shift map and then translated into the corresponding logistic cycles.
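This translation is easy to carry out numerically. A small sketch that takes the bit-shift 3-cycle 1/7 → 2/7 → 4/7, maps it through the relation x = sin²(2πy) to the logistic 3-cycle quoted above, and then verifies it under x ↦ 4x(1 − x):

import math
from fractions import Fraction

y = Fraction(1, 7)                               # 0.001001001... in binary
cycle = []
for _ in range(3):
    cycle.append(math.sin(2 * math.pi * float(y)) ** 2)   # x = sin^2(2*pi*y)
    y = (2 * y) % 1                                        # bit-shift (dyadic) map

print([f"{x:.9f}" for x in cycle])               # 0.611260467, 0.950484434, 0.188255099
for i, x in enumerate(cycle):
    # applying the r = 4 logistic map sends each point to the next one in the cycle
    print(f"4*x*(1-x) applied to point {i}: {4 * x * (1 - x):.9f}")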
However, since almost all numbers in [0,1) are irrational, almost all initial conditions of the bit-shift map lead to the non-periodicity of chaos. This is one way to see that the logistic r = 4 map is chaotic for almost all initial conditions.
The number of cycles of (minimal) length k = 1, 2, 3,… for the logistic map with r = 4 (tent map with μ = 2) is a known integer sequence (sequence A001037 in the OEIS): 2, 1, 2, 3, 6, 9, 18, 30, 56, 99, 186, 335, 630, 1161.... This tells us that the logistic map with r = 4 has 2 fixed points, 1 cycle of length 2, 2 cycles of length 3 and so on. This sequence takes a particularly simple form for prime k: 2 ⋅ (2^(k−1) − 1)/k. For example: 2 ⋅ (2^12 − 1)/13 = 630 is the number of cycles of length 13. Since this case of the logistic map is chaotic for almost all initial conditions, all of these finite-length cycles are unstable.
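These counts can be reproduced with the standard necklace-counting formula: the number of cycles of minimal length k equals (1/k) Σ_{d|k} μ(d) 2^(k/d), where μ is the Möbius function. A self-contained sketch:

def mobius(n):
    # Moebius function mu(n) by trial division
    result, p = 1, 2
    while p * p <= n:
        if n % p == 0:
            n //= p
            if n % p == 0:      # repeated prime factor
                return 0
            result = -result
        p += 1
    if n > 1:
        result = -result
    return result

def cycle_count(k):
    # number of cycles of minimal length k for the r = 4 logistic map (bit-shift map)
    return sum(mobius(d) * 2 ** (k // d) for d in range(1, k + 1) if k % d == 0) // k

print([cycle_count(k) for k in range(1, 15)])
# [2, 1, 2, 3, 6, 9, 18, 30, 56, 99, 186, 335, 630, 1161]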
Universality
Period-doubling route to chaos
In the logistic map, we have a function f(x) = rx(1 − x), and we want to study what happens when we iterate the map many times. The map might fall into a fixed point, a fixed cycle, or chaos. When the map falls into a stable fixed cycle of length k, we would find that the graph of the kth iterate f^k and the graph of the diagonal y = x intersect at the k points of the cycle, and the slope of the graph of f^k is bounded in (−1, 1) at those intersections.
For example, when 1 < r < 3, the graph of f crosses the diagonal at the fixed point 1 − 1/r with slope 2 − r, which is bounded in (−1, 1), indicating that it is a stable fixed point.
As r increases beyond 3, the fixed point becomes unstable and the intersection splits into two, which is a period doubling. For example, when 3 < r < 1 + √6, the graph of f² crosses the diagonal at three points near the old fixed point, with the middle one (the now-unstable fixed point of f) unstable, and the two others stable, forming a period-2 cycle.
As r approaches 1 + √6 ≈ 3.44949, another period-doubling occurs in the same way. The period-doublings occur more and more frequently, until at a certain r ≈ 3.56995, the period-doublings become infinite, and the map becomes chaotic. This is the period-doubling route to chaos.
Relationship between xn+1 and xn for r slightly below 3, before the period-doubling bifurcation occurs. The orbit converges to the fixed point x* = 1 − 1/r.
Relationship between xn+1 and xn for r = 3. The tangent slope at the fixed point x* = 2/3 has magnitude exactly 1, and a period-doubling bifurcation occurs.
Relationship between xn+1 and xn for r slightly above 3. The fixed point becomes unstable, splitting into a stable period-2 cycle.
When r = 3, the intersection at the fixed point has slope exactly −1, indicating that it is about to undergo a period-doubling.
When 3 < r < 1 + √6, there are three intersection points, with the middle one unstable, and the two others stable.
When r = 1 + √6 ≈ 3.44949, there are three intersection points, with the middle one unstable, and the two others having slope exactly −1, indicating that the cycle is about to undergo another period-doubling.
Looking at the images, one can notice that at the point of chaos r ≈ 3.56995, the curve of the many-times-iterated map looks like a fractal. Furthermore, as we repeat the period-doublings, the graphs of f, f², f⁴, … seem to resemble each other, except that they are shrunken towards the middle, and rotated by 180 degrees.
This suggests to us a scaling limit: if we repeatedly double the function (replace f by f ∘ f) and then rescale it by a certain constant α, then in the limit we would end up with a function g that satisfies the fixed-point equation g(x) = −α g(g(x/α)). This is a Feigenbaum function, which appears in most period-doubling routes to chaos (thus it is an instance of universality). Further, as the period-doubling intervals become shorter and shorter, the ratio between the lengths of two successive period-doubling intervals converges to a limit, the first Feigenbaum constant δ ≈ 4.66920.
The constant α can be numerically found by trying many possible values: for the wrong values, the rescaled iterates do not converge to a limit, but when it is α ≈ 2.50291, they converge. This is the second Feigenbaum constant.
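One way to watch δ emerge numerically is to locate the superstable parameters Rk at which the point x = 1/2 lies on the cycle of period 2^k (the parameters responsible for the sharp dips of the Lyapunov exponent mentioned earlier) and to take ratios of their successive spacings. The sketch below uses Newton's method on g(r) = f_r^(2^k)(1/2) − 1/2, seeded with an extrapolated initial guess; the starting values R0 = 2 and R1 = 1 + √5 are exact, the remaining choices are illustrative:

import math

def g_and_dg(r, n):
    # f_r^n(1/2) - 1/2 and its derivative with respect to r
    x, dx = 0.5, 0.0
    for _ in range(n):
        x, dx = r * x * (1 - x), x * (1 - x) + r * (1 - 2 * x) * dx
    return x - 0.5, dx

def superstable(r_guess, period):
    # Newton iteration for the parameter of the superstable cycle of the given period
    r = r_guess
    for _ in range(60):
        val, dval = g_and_dg(r, period)
        step = val / dval
        r -= step
        if abs(step) < 1e-14:
            break
    return r

R = [2.0, 1 + math.sqrt(5)]        # superstable parameters for periods 1 and 2
ratio = 4.7                        # rough first guess for the spacing ratio
for k in range(2, 12):
    guess = R[-1] + (R[-1] - R[-2]) / ratio
    R.append(superstable(guess, 2 ** k))
    ratio = (R[-2] - R[-3]) / (R[-1] - R[-2])
    print(f"period 2^{k}: R = {R[-1]:.10f}, spacing ratio ≈ {ratio:.5f}")
# the printed ratios approach the Feigenbaum constant delta ≈ 4.669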
Chaotic regime
In the chaotic regime, with r between the onset of chaos at r ≈ 3.56995 and 4, the set of points visited asymptotically by the iterates of the map becomes chaotic dark bands interspersed with non-chaotic bright bands.
Other scaling limits
When r approaches the accumulation point of the period-doubling cascade inside the period-3 window (which opens at r = 1 + √8 ≈ 3.82843), we have another period-doubling approach to chaos, but this time with periods 3, 6, 12, ... This again has the same Feigenbaum constants δ and α, and the limit of the rescaled iterates is also the same Feigenbaum function. This is an example of universality.
We can also consider a period-tripling route to chaos by picking a sequence of parameter values rn such that rn is the lowest value in the period-3^n window of the bifurcation diagram. The sequence converges to a limit, and this route to chaos has a different pair of Feigenbaum constants.[20] As another example, period-4-pling has a pair of Feigenbaum constants distinct from that of period-doubling, even though period-4-pling is reached by two period-doublings. In detail, define rn such that rn is the lowest value in the period-4^n window of the bifurcation diagram. Then this sequence again converges to its own limit, with its own, different pair of Feigenbaum constants.
In general, each period-multiplying route to chaos has its own pair of Feigenbaum constants. In fact, there is typically more than one such pair: for example, for period-7-pling, there are at least 9 different pairs of Feigenbaum constants.[20]
Generally, the two constants of each pair are approximately related, and the relation becomes exact as both constants increase to infinity.
The gradual increase of the parameter over its interval changes the dynamics from regular to chaotic,[23] with qualitatively the same bifurcation diagram as that of the logistic map.
Renormalization estimate
The Feigenbaum constants can be estimated by a renormalization argument (see Section 10.7 of [14]).
By universality, we can use another family of functions that also undergoes repeated period-doubling on its route to chaos, and even though it is not exactly the logistic map, it would still yield the same Feigenbaum constants.
Consider, for instance, a quadratic family such as fr(x) = x² − rx. The family has an equilibrium point at zero, and as r increases, it undergoes a period-doubling bifurcation at r = 1.
After this first bifurcation, we can solve for the period-2 orbit by solving fr(fr(x)) = x and factoring out the two fixed points, which yields a quadratic equation whose roots are the two points of the orbit. At some larger parameter value, r = √6 − 1 ≈ 1.4495, the period-2 stable orbit undergoes a period-doubling bifurcation again, yielding a period-4 stable orbit. In order to find out what the new stable orbit is like, we "zoom in" around one point of the period-2 orbit, using an affine change of variables. By routine algebra, the rescaled second iterate of the map has, up to higher-order terms, the same quadratic form with a renormalized parameter r′ = r² + 2r − 4; consistently, the second bifurcation (r ≈ 1.4495) occurs exactly where r′ = 1.
By self-similarity, the third bifurcation occurs where r′ ≈ 1.4495, i.e. where r² + 2r − 4 ≈ 1.4495, and so on. Iterating this relation, the bifurcation values accumulate at the fixed point r* of the renormalization map, r* = r*² + 2r* − 4, that is r* = (√17 − 1)/2 ≈ 1.5616, and the ratio of successive bifurcation gaps approaches the derivative of the renormalization map there, 2r* + 2 = 1 + √17.
Thus, we obtain the estimates δ ≈ 1 + √17 ≈ 5.12 and, from the rescaling factor of the zoom, α ≈ 2.24. These are within roughly 10% of the true values δ ≈ 4.669 and α ≈ 2.503.
Relation to logistic ordinary differential equation
The logistic map exhibits numerous characteristics of both periodic and chaotic solutions, whereas the logistic ordinary differential equation (ODE) exhibits only regular solutions, commonly referred to as the S-shaped sigmoid function. The logistic map can be seen as the discrete-time counterpart of the logistic ODE, and their relationship has been discussed extensively in the literature.[24]
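One common way to make the correspondence concrete is to discretize the logistic ODE dx/dt = x(1 − x) with Euler's method: the update x ← x + h·x(1 − x) is, after the change of variables y = h x/(1 + h), exactly the logistic map with r = 1 + h. A minimal sketch (step size and initial value are arbitrary choices) compares the discretization with the sigmoid solution:

import math

h = 0.1                        # Euler step; the equivalent logistic map has r = 1 + h
x, x0, t = 0.05, 0.05, 0.0     # initial condition of dx/dt = x(1 - x)
for _ in range(50):
    x = x + h * x * (1 - x)    # Euler step, algebraically a logistic map with r = 1 + h
    t += h
exact = 1 / (1 + (1 / x0 - 1) * math.exp(-t))     # sigmoid solution of the ODE
print(f"Euler value at t = {t:.1f}: {x:.6f}   exact sigmoid: {exact:.6f}")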
Occurrences
In a toy model for discrete laser dynamics,
xn+1 = G xn (1 − tanh(xn)),
where x stands for the electric field amplitude and the laser gain G[25] is the bifurcation parameter.
^Lorenz, Edward N. (1964-02-01). "The problem of deducing the climate from the governing equations". Tellus. 16 (1): 1–11. Bibcode:1964Tell...16....1L. doi:10.1111/j.2153-3490.1964.tb00136.x. ISSN 0040-2826.
^ abMay, Robert M. (1976). "Simple mathematical models with very complicated dynamics". Nature. 261 (5560): 459–467. Bibcode:1976Natur.261..459M. doi:10.1038/261459a0. hdl:10338.dmlcz/104555. PMID 934280. S2CID 2243371.
^Bechhoefer, John (1996-04-01). "The Birth of Period 3, Revisited". Mathematics Magazine. 69 (2): 115–118. doi:10.1080/0025570X.1996.11996402. ISSN 0025-570X.
^Jeffries, Carson; Pérez, José (1982). "Observation of a Pomeau–Manneville intermittent route to chaos in a nonlinear oscillator". Physical Review A. 26 (4): 2117–2122. Bibcode:1982PhRvA..26.2117J. doi:10.1103/PhysRevA.26.2117. S2CID 119466337.
^ abMay, R. M. (1976). "Simple mathematical models with very complicated dynamics". Nature. 261 (5560): 459–67. Bibcode:1976Natur.261..459M. doi:10.1038/261459a0. hdl:10338.dmlcz/104555. PMID 934280. S2CID 2243371.
^Little, M.; Heesch, D. (2004). "Chaotic root-finding for a small class of polynomials" (PDF). Journal of Difference Equations and Applications. 10 (11): 949–953. arXiv:nlin/0407042. doi:10.1080/10236190412331285351. S2CID 122705492.
^Lorenz, Edward (1964). "The problem of deducing the climate from the governing equations". Tellus. 16 (February): 1–11. Bibcode:1964Tell...16....1L. doi:10.3402/tellusa.v16i1.8893.
^ abDelbourgo, R.; Hart, W.; Kenny, B. G. (1985-01-01). "Dependence of universal constants upon multiplication period in nonlinear maps". Physical Review A. 31 (1): 514–516. doi:10.1103/PhysRevA.31.514. ISSN 0556-2791.
^Feigenbaum, M. J. (1976) "Universality in complex discrete dynamics", Los Alamos Theoretical Division Annual Report 1975-1976
^Feigenbaum, Mitchell (1978). "Quantitative universality for a class of nonlinear transformations". Journal of Statistical Physics. 19 (1): 25–52. Bibcode:1978JSP....19...25F. CiteSeerX10.1.1.418.9339. doi:10.1007/BF01020332. S2CID 124498882.
^Okulov, A Yu; Oraevskiĭ, A N (1984). "Regular and stochastic self-modulation in a ring laser with nonlinear element". Soviet Journal of Quantum Electronics. 14 (2): 1235–1237. Bibcode:1984QuEle..14.1235O. doi:10.1070/QE1984v014n09ABEH006171.
^Shen, Bo-Wen; Pielke, Roger A.; Zeng, Xubin (2023-08-12). "The 50th Anniversary of the Metaphorical Butterfly Effect since Lorenz (1972): Multistability, Multiscale Predictability, and Sensitivity in Numerical Models". Atmosphere. 14 (8): 1279. Bibcode:2023Atmos..14.1279S. doi:10.3390/atmos14081279. ISSN 2073-4433.
^Okulov, A Yu; Oraevskiĭ, A N (1986). "Space–temporal behavior of a light pulse propagating in a nonlinear nondispersive medium". J. Opt. Soc. Am. B. 3 (5): 741–746. Bibcode:1986JOSAB...3..741O. doi:10.1364/JOSAB.3.000741. S2CID 124347430.
References
Grassberger, P.; Procaccia, I. (1983). "Measuring the strangeness of strange attractors". Physica D. 9 (1–2): 189–208. Bibcode:1983PhyD....9..189G. doi:10.1016/0167-2789(83)90298-1.
Grassberger, P. (1981). "On the Hausdorff dimension of fractal attractors". Journal of Statistical Physics. 26 (1): 173–179. Bibcode:1981JSP....26..173G. doi:10.1007/BF01106792. S2CID 119833080.
Sprott, Julien Clinton (2003). Chaos and Time-Series Analysis. Oxford University Press. ISBN 978-0-19-850840-3.
Strogatz, Steven (2000). Nonlinear Dynamics and Chaos. Perseus Publishing. ISBN 978-0-7382-0453-6.
Tufillaro, Nicholas; Abbott, Tyler; Reilly, Jeremiah (1992). An Experimental Approach to Nonlinear Dynamics and Chaos. Addison-Wesley New York. ISBN 978-0-201-55441-0.
External links
Wikibooks has a book on the topic of: Fractals/Iterations_of_real_numbers/r_iterations#Logistic_map
The Chaos Hypertextbook. An introductory primer on chaos and fractals.
An interactive visualization of the logistic map as a Jupyter notebook
The Logistic Map and Chaos by Elmer G. Wiens
Complexity & Chaos (audiobook) by Roger White. Chapter 5 covers the Logistic Equation.