It is the opposite of determinism and related to chance. It is highly relevant to the philosophical problem of free will, particularly in the form of libertarianism. In science, most notably in quantum physics, indeterminism is the belief that no event is certain and that every outcome is probabilistic. Heisenberg's uncertainty principle and the Born rule, proposed by Max Born, are common starting points in support of the indeterministic nature of the universe. Indeterminism was also asserted by Sir Arthur Eddington and Murray Gell-Mann, and was promoted by the French biologist Jacques Monod in his essay "Chance and Necessity". The physicist and chemist Ilya Prigogine argued for indeterminism in complex systems.
Indeterminists do not have to deny that causes exist. Instead, they can maintain that the only causes that exist are of a type that do not constrain the future to a single course; for instance, they can maintain that only necessary and not sufficient causes exist. The necessary/sufficient distinction works as follows:
If x is a necessary cause of y, then the presence of y implies that x definitely preceded it. The presence of x, however, does not imply that y will occur.
If x is a sufficient cause of y, then the presence of x implies the subsequent occurrence of y. (However, another cause z may alternatively cause y. Thus the presence of y does not imply the presence of x.)
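The necessary/sufficient distinction above can be sketched as a minimal logical check over hypothetical "possible worlds" (an illustrative model, not part of the philosophical literature):

```python
def is_necessary(worlds):
    # x is necessary for y: y never occurs without x
    return all(x for (x, y) in worlds if y)

def is_sufficient(worlds):
    # x is sufficient for y: y always occurs when x does
    return all(y for (x, y) in worlds if x)

# Hypothetical set of possible worlds, as (x occurred, y occurred) pairs:
# y can fail even when x holds, but y never occurs without x.
worlds = [(True, True), (True, False), (False, False)]

print(is_necessary(worlds))   # True: every y-world is an x-world
print(is_sufficient(worlds))  # False: (True, False) shows x without y
```

The example shows why a world of only necessary causes leaves the future open: x preceding y is compatible with y never happening.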
It is possible for everything to have a necessary cause, even while indeterminism holds and the future is open, because a necessary condition does not lead to a single inevitable effect. Indeterministic (or probabilistic) causation is a proposed possibility, so that "everything has a cause" is not by itself a clear statement of determinism.
Interpreting causation as a deterministic relation means that if A causes B, then A must always be followed by B. In this sense, war does not cause deaths, nor does smoking cause cancer. As a result, many turn to a notion of probabilistic causation. Informally, A probabilistically causes B if A's occurrence increases the probability of B. This is sometimes interpreted to reflect the imperfect knowledge of a deterministic system but other times interpreted to mean that the causal system under study has an inherently indeterministic nature. (Propensity probability is an analogous idea, according to which probabilities have an objective existence and are not just limitations in a subject's knowledge).
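The informal definition of probabilistic causation, that A's occurrence raises the probability of B, can be illustrated with a toy simulation (the probabilities below are invented for illustration, not real epidemiology):

```python
import random

random.seed(0)

# Toy model: A (e.g. a risk factor) raises the chance of B without
# guaranteeing it, so A probabilistically but not deterministically causes B.
def simulate(n=100_000):
    a_and_b = a_total = b_total = 0
    for _ in range(n):
        a = random.random() < 0.3            # A occurs with probability 0.3
        p_b = 0.15 if a else 0.05            # A raises the probability of B
        b = random.random() < p_b
        a_total += a
        b_total += b
        a_and_b += a and b
    return a_and_b / a_total, b_total / n    # estimates of P(B|A) and P(B)

p_b_given_a, p_b = simulate()
print(p_b_given_a > p_b)  # True: P(B|A) > P(B)
```

Note that A is not always followed by B here, which is exactly why deterministic causation would say "A does not cause B" while probabilistic causation says it does.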
It can be shown that realizations of any probability distribution can be produced by applying a deterministic function (namely, the inverse of its distribution function) to a random variable following the uniform distribution (i.e. an "absolutely random" one); the shape of the distribution is thus contained in the deterministic element. A simple way to demonstrate this is to shoot randomly within a square and then deterministically interpret a relatively large subsquare as the more probable outcome.
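This construction is the standard inverse-transform sampling technique; a minimal sketch for the exponential distribution, whose inverse distribution function is F⁻¹(u) = −ln(1 − u)/λ:

```python
import math
import random

random.seed(42)

# Inverse-transform sampling: a deterministic function applied to
# uniform ("absolutely random") draws yields exponential samples.
lam = 2.0
uniform_draws = [random.random() for _ in range(100_000)]
exp_samples = [-math.log(1.0 - u) / lam for u in uniform_draws]

# The sample mean should approach the exponential mean 1/lam = 0.5.
mean = sum(exp_samples) / len(exp_samples)
print(mean)
```

All the randomness lives in the uniform draws; the exponential shape is supplied entirely by the deterministic inverse function.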
A distinction is generally made between indeterminism and the mere inability to measure the variables (limits of precision). This is especially the case for physical indeterminism (as proposed by various interpretations of quantum mechanics). Yet some philosophers have argued that indeterminism and unpredictability are synonymous.
"The cosmos, then, became like a spherical form in this way: the atoms being submitted to a casual and unpredictable movement, quickly and incessantly".
Aristotle described four possible causes (material, efficient, formal, and final). Aristotle's word for these causes was αἰτίαι (aitiai, as in aetiology), which translates as causes in the sense of the multiple factors responsible for an event. Aristotle did not subscribe to the simplistic "every event has a (single) cause" idea that was to come later.
In his Physics and Metaphysics, Aristotle said there were accidents (συμβεβηκός, sumbebekos) caused by nothing but chance (τύχη, tukhe). He noted that the early physicists had found no place for chance among their causes.
We have seen how far Aristotle distances himself from any view which makes chance a crucial factor in the general explanation of things. And he does so on conceptual grounds: chance events are, he thinks, by definition unusual and lacking certain explanatory features: as such they form the complement class to those things which can be given full natural explanations.— R.J. Hankinson, "Causes" in Blackwell Companion to Aristotle
Aristotle opposed his accidental chance to necessity:
Nor is there any definite cause for an accident, but only chance (τυχόν), namely an indefinite (ἀόριστον) cause.
It is obvious that there are principles and causes which are generable and destructible apart from the actual processes of generation and destruction; for if this is not true, everything will be of necessity: that is, if there must necessarily be some cause, other than accidental, of that which is generated and destroyed. Will this be, or not? Yes, if this happens; otherwise not.
...if the arguments which show the existence of causes are plausible, and if those, too, are plausible which prove that it is incorrect to assert the existence of a cause, and if there is no way to give preference to any of these over others – since we have no agreed-upon sign, criterion, or proof, as has been pointed out earlier – then, if we go by the statements of the Dogmatists, it is necessary to suspend judgment about the existence of causes, too, saying that they are no more existent than non-existent
Epicurus argued that as atoms moved through the void, there were occasions when they would "swerve" (clinamen) from their otherwise determined paths, thus initiating new causal chains. Epicurus argued that these swerves would allow us to be more responsible for our actions, something impossible if every action was deterministically caused. For Epicureanism, the occasional interventions of arbitrary gods would be preferable to strict determinism.
In 1729 the Testament of Jean Meslier states:
"The matter, by virtue of its own active force, moves and acts in blind manner".
Soon after, Julien Offray de La Mettrie in his L'Homme Machine (1748, published anonymously) wrote:
"Perhaps, the cause of man's existence is just in existence itself? Perhaps he is by chance thrown in some point of this terrestrial surface without any how and why".
In his Anti-Sénèque [Traité de la vie heureuse, par Sénèque, avec un Discours du traducteur sur le même sujet, 1750] we read:
"Then, the chance has thrown us in life".
In the 19th century the French philosopher Antoine-Augustin Cournot theorized chance in a new way, as a series of non-linear causes. He wrote in Essai sur les fondements de nos connaissances (1851):
"It is not because of rarity that the chance is actual. On the contrary, it is because of chance they produce many possible others."
Tychism (Greek: τύχη "chance") is a thesis proposed by the American philosopher Charles Sanders Peirce in the 1890s. It holds that absolute chance, also called spontaneity, is a real factor operative in the universe. It may be considered both the direct opposite of Albert Einstein's oft-quoted dictum that "God does not play dice with the universe" and an early philosophical anticipation of Werner Heisenberg's uncertainty principle.
Peirce does not, of course, assert that there is no law in the universe. On the contrary, he maintains that an absolutely chance world would be a contradiction and thus impossible. Complete lack of order is itself a sort of order. The position he advocates is rather that there are in the universe both regularities and irregularities.
In 1931, Arthur Holly Compton championed the idea of human freedom based on quantum indeterminacy and invented the notion of amplification of microscopic quantum events to bring chance into the macroscopic world. In his somewhat bizarre mechanism, he imagined sticks of dynamite attached to his amplifier, anticipating the Schrödinger's cat paradox.
Reacting to criticisms that his ideas made chance the direct cause of our actions, Compton clarified the two-stage nature of his idea in an Atlantic Monthly article in 1955. First there is a range of random possible events, then one adds a determining factor in the act of choice.
A set of known physical conditions is not adequate to specify precisely what a forthcoming event will be. These conditions, insofar as they can be known, define instead a range of possible events from among which some particular event will occur. When one exercises freedom, by his act of choice he is himself adding a factor not supplied by the physical conditions and is thus himself determining what will occur. That he does so is known only to the person himself. From the outside one can see in his act only the working of physical law. It is the inner knowledge that he is in fact doing what he intends to do that tells the actor himself that he is free.
Compton welcomed the rise of indeterminism in 20th century science, writing:
In my own thinking on this vital subject I am in a much more satisfied state of mind than I could have been at any earlier stage of science. If the statements of the laws of physics were assumed correct, one would have had to suppose (as did most philosophers) that the feeling of freedom is illusory, or if [free] choice were considered effective, that the laws of physics ... [were] unreliable. The dilemma has been an uncomfortable one.
Together with Arthur Eddington in Britain, Compton was one of those rare distinguished physicists in the English-speaking world of the late 1920s and 1930s arguing for the "liberation of free will" with the help of Heisenberg's indeterminacy principle, but their efforts were met not only with physical and philosophical criticism but, above all, with fierce political and ideological campaigns.
In his essay Of Clouds and Clocks, included in his book Objective Knowledge, Popper contrasted "clouds", his metaphor for indeterministic systems, with "clocks", meaning deterministic ones. He sided with indeterminism, writing:
I believe Peirce was right in holding that all clocks are clouds to some considerable degree — even the most precise of clocks. This, I think, is the most important inversion of the mistaken determinist view that all clouds are clocks.
Popper was also a promoter of propensity probability.
Kane is one of the leading contemporary philosophers on free will. Advocating what is termed within philosophical circles "libertarian freedom", Kane argues that "(1) the existence of alternative possibilities (or the agent's power to do otherwise) is a necessary condition for acting freely, and (2) determinism is not compatible with alternative possibilities (it precludes the power to do otherwise)". It is important to note that the crux of Kane's position is grounded not in a defense of alternative possibilities (AP) but in the notion of what Kane refers to as ultimate responsibility (UR). Thus, AP is a necessary but insufficient criterion for free will. It is necessary that there be (metaphysically) real alternatives for our actions, but that is not enough; our actions could be random without being in our control. The control is found in "ultimate responsibility".
What allows for ultimate responsibility of creation in Kane's picture are what he refers to as "self-forming actions" or SFAs — those moments of indecision during which people experience conflicting wills. These SFAs are the undetermined, regress-stopping voluntary actions or refrainings in the life histories of agents that are required for UR. UR does not require that every act done of our own free will be undetermined and thus that, for every act or choice, we could have done otherwise; it requires only that certain of our choices and actions be undetermined (and thus that we could have done otherwise), namely SFAs. These form our character or nature; they inform our future choices, reasons and motivations in action. If a person has had the opportunity to make a character-forming decision (SFA), he is responsible for the actions that are a result of his character.
Mark Balaguer, in his book Free Will as an Open Scientific Problem, argues similarly to Kane. He believes that, conceptually, free will requires indeterminism, and that the question of whether the brain behaves indeterministically is open to further empirical research. He has also written on this matter in "A Scientifically Reputable Version of Indeterministic Libertarian Free Will".
In probability theory, a stochastic process, or sometimes random process, is the counterpart to a deterministic process (or deterministic system). Instead of dealing with only one possible reality of how the process might evolve over time (as is the case, for example, for solutions of an ordinary differential equation), in a stochastic or random process there is some indeterminacy in its future evolution, described by probability distributions. This means that even if the initial condition (or starting point) is known, there are many possible paths the process might take, though some paths may be more probable than others.
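A minimal sketch of this idea, using a simple random walk: the initial condition is fixed and identical for every run, yet the realized paths differ.

```python
import random

random.seed(1)

# Several realizations of a simple random walk from the same initial
# condition: the start is fixed, yet the futures diverge.
def random_walk(steps=10):
    position, path = 0, [0]
    for _ in range(steps):
        position += random.choice([-1, 1])  # each step is +1 or -1 at random
        path.append(position)
    return path

paths = [random_walk() for _ in range(3)]
for p in paths:
    print(p)
```

Every path starts at 0, but knowing that starting point determines only a probability distribution over futures, not a single trajectory.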
The idea that Newtonian physics proved causal determinism was highly influential in the early modern period. "Thus physical determinism [..] became the ruling faith among enlightened men; and everybody who did not embrace this new faith was held to be an obscurantist and a reactionary". However: "Newton himself may be counted among the few dissenters, for he regarded the solar system as imperfect, and consequently as likely to perish".
John Earman has argued that most physical theories are indeterministic. For instance, Newtonian physics admits solutions in which particles accelerate continuously, heading out towards infinity. By the time-reversibility of the laws in question, particles could also head inwards, unprompted by any pre-existing state. He calls such hypothetical particles "space invaders".
Branching space-time is a theory uniting indeterminism and the special theory of relativity. The idea was originated by Nuel Belnap. The equations of general relativity admit of both indeterministic and deterministic solutions.
Ludwig Boltzmann was one of the founders of statistical mechanics and the modern atomic theory of matter. He is remembered for his discovery that the second law of thermodynamics is a statistical law stemming from disorder. He also speculated that the ordered universe we see is only a small bubble in a much larger sea of chaos. The Boltzmann brain is a similar idea.
Darwinian evolution has an enhanced reliance on the chance element of random mutation compared to the earlier evolutionary theory of Herbert Spencer. However, the question of whether evolution requires genuine ontological indeterminism is open to debate.
In the essay Chance and Necessity (1970), Jacques Monod rejected the role of final causation in biology, instead arguing that a mixture of efficient causation and "pure chance" leads to teleonomy, or merely apparent purposefulness.
The Japanese theoretical population geneticist Motoo Kimura emphasised the role of indeterminism in evolution. According to his neutral theory of molecular evolution: "at the molecular level most evolutionary change is caused by random drift of gene mutants that are equivalent in the face of selection."
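Random drift of selectively equivalent alleles can be sketched with a minimal neutral Wright-Fisher simulation (illustrative parameters; not taken from Kimura's work): with no selection at all, allele frequencies wander and eventually fix or vanish purely by chance.

```python
import random

random.seed(7)

# Neutral drift: two selectively equivalent alleles; each generation the
# population of gene copies is resampled from the current frequency.
def drift(pop_size=100, generations=200, p0=0.5):
    p = p0
    for _ in range(generations):
        # Each of the pop_size gene copies inherits the focal allele
        # with probability equal to its current frequency.
        count = sum(random.random() < p for _ in range(pop_size))
        p = count / pop_size
        if p in (0.0, 1.0):   # fixation or loss, purely by chance
            break
    return p

outcomes = [drift() for _ in range(20)]
print(outcomes)
```

Different runs with identical starting conditions end with the allele fixed, lost, or still drifting, even though no variant is fitter than the other.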
In his 1997 book, The End of Certainty, Prigogine contends that determinism is no longer a viable scientific belief. "The more we know about our universe, the more difficult it becomes to believe in determinism." This is a major departure from the approach of Newton, Einstein and Schrödinger, all of whom expressed their theories in terms of deterministic equations. According to Prigogine, determinism loses its explanatory power in the face of irreversibility and instability.
Prigogine traces the dispute over determinism back to Darwin, whose attempt to explain individual variability according to evolving populations inspired Ludwig Boltzmann to explain the behavior of gases in terms of populations of particles rather than individual particles. This led to the field of statistical mechanics and the realization that gases undergo irreversible processes. In deterministic physics, all processes are time-reversible, meaning that they can proceed backward as well as forward through time. As Prigogine explains, determinism is fundamentally a denial of the arrow of time. With no arrow of time, there is no longer a privileged moment known as the "present," which follows a determined "past" and precedes an undetermined "future." All of time is simply given, with the future as determined or undetermined as the past. With irreversibility, the arrow of time is reintroduced to physics. Prigogine notes numerous examples of irreversibility, including diffusion, radioactive decay, solar radiation, weather and the emergence and evolution of life. Like weather systems, organisms are unstable systems existing far from thermodynamic equilibrium. Instability resists standard deterministic explanation. Instead, due to sensitivity to initial conditions, unstable systems can only be explained statistically, that is, in terms of probability.
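The sensitivity to initial conditions mentioned above can be illustrated with a minimal sketch (the logistic map is a standard textbook example, not one of Prigogine's): a fully deterministic rule still defeats long-run prediction, because an arbitrarily small difference in the starting point is amplified exponentially.

```python
# Deterministic logistic map in its chaotic regime (r = 4): two nearly
# identical initial conditions diverge, so long-run prediction fails even
# though every single step is deterministic.
def logistic(x, r=4.0):
    return r * x * (1.0 - x)

x, y = 0.2, 0.2 + 1e-10   # initial conditions differing by 10^-10
max_gap = 0.0
for _ in range(60):
    x, y = logistic(x), logistic(y)
    max_gap = max(max_gap, abs(x - y))

print(max_gap > 0.1)  # True: the tiny initial difference has been amplified
```

This is why, as the passage notes, such unstable systems can in practice only be described statistically.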
Prigogine asserts that Newtonian physics has now been "extended" three times, first with the use of the wave function in quantum mechanics, then with the introduction of spacetime in general relativity and finally with the recognition of indeterminism in the study of unstable systems.
At one time, it was assumed in the physical sciences that if the behavior observed in a system cannot be predicted, the problem is due to lack of fine-grained information, so that a sufficiently detailed investigation would eventually result in a deterministic theory ("If you knew exactly all the forces acting on the dice, you would be able to predict which number comes up").
However, the advent of quantum mechanics removed the underpinning from that approach, with the claim that (at least according to the Copenhagen interpretation) the most basic constituents of matter at times behave indeterministically. This comes from the collapse of the wave function, in which the state of a system upon measurement cannot in general be predicted. Quantum mechanics only predicts the probabilities of possible outcomes, which are given by the Born rule. Non-deterministic behavior in wave function collapse is not only a feature of the Copenhagen interpretation, with its observer-dependence, but also of objective collapse and other theories.
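The Born rule itself is simple to state: the probability of each measurement outcome is the squared magnitude of the corresponding amplitude in the quantum state. A minimal sketch for a two-level system:

```python
import math

# Born rule: outcome probabilities are squared magnitudes of amplitudes.
# Example state: |psi> = (1/sqrt(3))|0> + sqrt(2/3)|1>
amplitudes = [complex(1 / math.sqrt(3)), complex(math.sqrt(2 / 3))]

probabilities = [abs(a) ** 2 for a in amplitudes]
print([round(p, 4) for p in probabilities])  # [0.3333, 0.6667]
print(round(sum(probabilities), 10))         # 1.0 (normalization)
```

Which outcome actually occurs on any given measurement is, on the indeterministic reading, not fixed by anything; only these probabilities are.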
Opponents of quantum indeterminism suggested that determinism could be restored by formulating a new theory in which additional information, so-called hidden variables, would allow definite outcomes to be determined. For instance, in 1935, Einstein, Podolsky and Rosen wrote a paper titled "Can Quantum-Mechanical Description of Physical Reality Be Considered Complete?" arguing that such a theory was in fact necessary to preserve the principle of locality. In 1964, John S. Bell was able to define a theoretical test for these local hidden variable theories, which was reformulated as a workable experimental test through the work of Clauser, Horne, Shimony and Holt. The negative result of the 1980s tests by Alain Aspect ruled such theories out, provided certain assumptions about the experiment hold. Thus any interpretation of quantum mechanics, including deterministic reformulations, must either reject locality or reject counterfactual definiteness altogether. David Bohm's theory is the main example of a non-local deterministic quantum theory.
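The CHSH form of Bell's test can be sketched numerically. For a singlet state, standard quantum mechanics predicts the correlation E(a, b) = −cos(a − b) between measurements at angles a and b, while any local hidden-variable theory must satisfy |S| ≤ 2 for the CHSH combination below; the angles chosen here are the standard maximally violating ones.

```python
import math

# Quantum-mechanical correlation for a singlet state measured at angles a, b.
def E(a, b):
    return -math.cos(a - b)

# Standard CHSH measurement angles.
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4

# CHSH combination: local hidden variables require |S| <= 2.
S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(round(abs(S), 4))  # 2.8284, i.e. 2*sqrt(2) > 2
```

The experimentally confirmed value 2√2 exceeds the local bound of 2, which is what rules out the local hidden-variable theories described above.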
The many-worlds interpretation is said to be deterministic, but experimental results still cannot be predicted: experimenters do not know which 'world' they will end up in. Technically, counterfactual definiteness is lacking.
A notable consequence of quantum indeterminism is the Heisenberg uncertainty principle, which prevents the simultaneous accurate measurement of all a particle's properties.
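In standard notation, the principle bounds the product of the standard deviations of a particle's position and momentum:

```latex
\Delta x \, \Delta p \ge \frac{\hbar}{2}
```

where ℏ is the reduced Planck constant: the more precisely position is determined, the less precisely momentum can be, and vice versa.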
Primordial fluctuations are density variations in the early universe which are considered the seeds of all structure in the universe. Currently, the most widely accepted explanation for their origin is in the context of cosmic inflation. According to the inflationary paradigm, the exponential growth of the scale factor during inflation caused quantum fluctuations of the inflaton field to be stretched to macroscopic scales, and, upon leaving the horizon, to "freeze in". At the later stages of radiation- and matter-domination, these fluctuations re-entered the horizon, and thus set the initial conditions for structure formation.
Neuroscientists such as Björn Brembs and Christof Koch believe thermodynamically stochastic processes in the brain are the basis of free will, and that even very simple organisms such as flies have a form of free will. Similar ideas are put forward by some philosophers such as Robert Kane.
Despite recognizing indeterminism to be a necessary low-level prerequisite, Björn Brembs says that it is not even close to being sufficient for addressing things like morality and responsibility. Edward O. Wilson does not extrapolate from bugs to people, and Corina E. Tarnita cautions against drawing parallels between people and insects, since human selflessness and cooperation are of a different sort, involving the interaction of culture and sentience, not just genetics and environment.
Against Einstein and others who advocated determinism, indeterminism—as championed by the English astronomer Sir Arthur Eddington—says that a physical object has an ontologically undetermined component that is not due to the epistemological limitations of physicists' understanding. The uncertainty principle, then, would not necessarily be due to hidden variables but to an indeterminism in nature itself.
Determinism and indeterminism are examined in Causality and Chance in Modern Physics by David Bohm. He speculates that, since determinism can emerge from underlying indeterminism (via the law of large numbers), and that indeterminism can emerge from determinism (for instance, from classical chaos), the universe could be conceived of as having alternating layers of causality and chaos.
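The first half of Bohm's speculation, determinism emerging from underlying indeterminism via the law of large numbers, can be sketched minimally: individual random events are unpredictable, yet their aggregate behaviour is almost perfectly regular.

```python
import random

random.seed(3)

# Law of large numbers: single coin flips are unpredictable, but the
# average of many flips settles near a fixed value, so near-deterministic
# regularity emerges from an indeterministic substrate.
def mean_of_flips(n):
    return sum(random.random() < 0.5 for _ in range(n)) / n

for n in (10, 1_000, 100_000):
    print(n, mean_of_flips(n))  # the mean approaches 0.5 as n grows
```

The converse half of the picture, indeterminism emerging from determinism, corresponds to classical chaos, where deterministic dynamics produce effectively unpredictable behaviour.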
Indeterminism—or, more precisely, physical indeterminism—is merely the doctrine that not all events in the physical world are predetermined with absolute precision.