Method of Fluxions (Latin: De Methodis Serierum et Fluxionum)[1] is a mathematical treatise by Sir Isaac Newton, which served as the earliest written formulation of modern calculus. The book was completed in 1671 and published posthumously in 1736.[2]
| Author | Isaac Newton |
| --- | --- |
| Language | English |
| Genre | Mathematics |
| Publisher | Henry Woodfall |
| Publication date | 1736 |
| Pages | 339 |
Fluxion is Newton's term for a derivative. He originally developed the method at Woolsthorpe Manor during the closure of Cambridge due to the Great Plague of London from 1665 to 1667. Newton chose not to make his findings known (similarly, the findings that eventually became the Philosophiae Naturalis Principia Mathematica were developed at this time and remained hidden in Newton's notes for many years). Gottfried Leibniz developed his form of calculus independently around 1673, seven years after Newton had developed the basis for differential calculus, as seen in surviving documents such as "the method of fluxions and fluents..." from 1666. Leibniz, however, published his discovery of differential calculus in 1684, nine years before Newton formally published his fluxion-notation form of calculus, part of which appeared in 1693.[3]
The calculus notation in use today is mostly that of Leibniz, although Newton's dot notation for differentiation is frequently used to denote derivatives with respect to time.
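For concreteness, the contrast between the two notations can be written out explicitly. In the sketch below, the fluent x and the time variable t are generic symbols chosen for illustration rather than taken from the treatise itself; each dotted form on the left corresponds to the Leibniz form on the right.

```latex
% Newton's dot notation writes the fluxion (time derivative) of a fluent x
% with a dot over the symbol; the equivalent Leibniz forms appear on the right
% and are the notation most commonly used today.
\[
  \dot{x} = \frac{dx}{dt}, \qquad \ddot{x} = \frac{d^{2}x}{dt^{2}}
\]
```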
Newton's Method of Fluxions was formally published only posthumously, but following Leibniz's publication of the calculus, a bitter rivalry erupted between the two mathematicians over who had developed the calculus first, provoking Newton to reveal his work on fluxions.
For a period of time encompassing Newton's working life, the discipline of analysis was a subject of controversy in the mathematical community. Although analytic techniques provided solutions to long-standing problems, including problems of quadrature and the finding of tangents, the proofs of these solutions were not known to be reducible to the synthetic rules of Euclidean geometry. Instead, analysts were often forced to invoke infinitesimal, or "infinitely small", quantities to justify their algebraic manipulations. Some of Newton's mathematical contemporaries, such as Isaac Barrow, were highly skeptical of such techniques, which had no clear geometric interpretation. Although in his early work Newton also used infinitesimals in his derivations without justifying them, he later developed something akin to the modern definition of limits in order to justify his work.[4]
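As a rough illustration of the kind of manipulation at issue (a reconstruction in modern symbols, not a passage from Newton's text), consider finding the rate of change of x^2 by giving x an "infinitely small" increment o, forming the ratio of the changes, and then discarding the terms that still contain o:

```latex
% Illustrative infinitesimal-style argument for the rate of change of y = x^2:
% increment x by a small quantity o and form the ratio of the resulting changes.
\[
  \frac{(x + o)^{2} - x^{2}}{o} \;=\; \frac{2xo + o^{2}}{o} \;=\; 2x + o
\]
% Discarding the remaining term in o, treated as "infinitely small", leaves 2x,
% the modern derivative of x^2. It is precisely this discarding step, with no
% clear geometric justification, that skeptics such as Barrow objected to, and
% that the later notion of a limit made rigorous.
```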