Mathematical notation consists of using symbols for representing operations, unspecified numbers, relations and any other mathematical objects, and assembling them into expressions and formulas. Mathematical notation is widely used in mathematics, science, and engineering for representing complex concepts and properties in a concise, unambiguous and accurate way.
Mathematical notation was first introduced by François Viète at the end of the 16th century, and largely expanded during the 17th and 18th centuries by René Descartes, Isaac Newton, Gottfried Wilhelm Leibniz, and, above all, Leonhard Euler.
The use of many symbols is the basis of mathematical notation. They play a role similar to that of words in natural languages, and they may play different roles in mathematical notation much as verbs, adjectives, and nouns play different roles in a sentence.
Letters are typically used for naming (in mathematical jargon, one says representing) mathematical objects. Letters of the Latin and Greek alphabets are typically used, but some letters of the Hebrew alphabet are sometimes used as well. Uppercase and lowercase letters are considered different symbols. For the Latin alphabet, different typefaces also provide different symbols. For example, a single letter rendered in three different typefaces, each in uppercase and lowercase, could theoretically appear in the same mathematical text with six different meanings. Normally, the upright roman typeface is not used for symbols, except for symbols that are formed of several letters, such as the symbol "sin" of the sine function.
To obtain more symbols, and to allow related mathematical objects to be represented by related symbols, diacritics, subscripts, and superscripts are often used. For example, f̂′ may denote the Fourier transform of the derivative of a function called f.
Some symbols are similar to Latin or Greek letters, some are obtained by deforming letters, some are traditional typographic symbols, but many have been specially designed for mathematics.
An expression is a finite combination of symbols that is well-formed according to rules that depend on the context. In general, an expression denotes or names a mathematical object, and therefore plays, in the language of mathematics, the role that a noun phrase plays in natural language.
An expression often contains some operators, and may therefore be evaluated by the action of the operators in it. For example, 8 × 3 is an expression in which the operator × can be evaluated to give the result 24. So, 8 × 3 and 24 are two different expressions that represent the same number. This is the meaning of the equality 8 × 3 = 24.
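This idea, that an expression is evaluated by the action of the operators it contains, and that two different expressions can denote the same number, can be sketched in Python using the standard `ast` module. The `evaluate` helper below is a hypothetical minimal evaluator written for illustration, not a standard library function.

```python
import ast
import operator

# Map AST operator node types to the arithmetic functions they denote.
OPS = {
    ast.Add: operator.add,
    ast.Sub: operator.sub,
    ast.Mult: operator.mul,
    ast.Div: operator.truediv,
    ast.Pow: operator.pow,
}

def evaluate(source: str):
    """Evaluate a well-formed arithmetic expression by applying its operators."""
    def walk(node):
        if isinstance(node, ast.Expression):
            return walk(node.body)
        if isinstance(node, ast.Constant):      # a literal number
            return node.value
        if isinstance(node, ast.BinOp):         # apply the operator to its operands
            return OPS[type(node.op)](walk(node.left), walk(node.right))
        raise ValueError("unsupported expression")
    return walk(ast.parse(source, mode="eval"))

# "8 * 3" and "24" are two different expressions representing the same number.
print(evaluate("8 * 3") == evaluate("24"))  # True
```

Note that `evaluate` rejects any input that is not well-formed arithmetic, mirroring the requirement that an expression follow the formation rules of its context.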
A more complicated example is given by the expression ∫ₐᵇ x² dx, which can be evaluated to (b³ − a³)/3. Although the resulting expression contains the operators of division, subtraction, and exponentiation, it cannot be evaluated further because a and b denote unspecified numbers.
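As an illustration of this kind of partial evaluation (assuming the integral ∫ₐᵇ x² dx as the example), the steps can be written out in LaTeX notation; the result still contains the unspecified numbers a and b:

```latex
\[
  \int_a^b x^2 \, dx
  = \left[ \frac{x^3}{3} \right]_a^b
  = \frac{b^3}{3} - \frac{a^3}{3}
  = \frac{b^3 - a^3}{3}.
\]
```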
It is believed that a mathematical notation to represent counting was first developed at least 50,000 years ago—early mathematical ideas such as finger counting have also been represented by collections of rocks, sticks, bone, clay, stone, wood carvings, and knotted ropes. The tally stick is a way of counting dating back to the Upper Paleolithic. Perhaps the oldest known mathematical texts are those of ancient Sumer. The Census Quipu of the Andes and the Ishango Bone from Africa both used the tally mark method of accounting for numerical concepts.
The development of zero as a number is one of the most important developments in early mathematics. It was used as a placeholder by the Babylonians and Greek Egyptians, and then as an integer by the Mayans, Indians and Arabs (see the history of zero for more information).
The earliest mathematical viewpoints in geometry did not lend themselves well to counting. The natural numbers, their relationship to fractions, and the identification of continuous quantities actually took millennia to take form, and even longer to allow for the development of notation.
In fact, it was not until the invention of analytic geometry by René Descartes that geometry became more subject to a numerical notation. Some symbolic shortcuts for mathematical concepts came to be used in the publication of geometric proofs. Moreover, the power and authority of geometry's theorem and proof structure greatly influenced non-geometric treatises, such as Isaac Newton's Principia Mathematica.
The 18th and 19th centuries saw the creation and standardization of mathematical notation as used today. Leonhard Euler was responsible for many of the notations currently in use: the use of a, b, c for constants and x, y, z for unknowns, e for the base of the natural logarithm, sigma (Σ) for summation, i for the imaginary unit, and the functional notation f(x). He also popularized the use of π for the Archimedes constant (due to William Jones' proposal for the use of π in this way based on the earlier notation of William Oughtred).
In addition, many fields of mathematics bear the imprint of their creators for notation: the differential operator of Leibniz, the cardinal infinities of Georg Cantor (in addition to the lemniscate (∞) of John Wallis), the congruence symbol (≡) of Gauss, and so forth.
Theorem-proving software comes with its own notations for mathematics; the OMDoc project seeks to provide an open commons for such notations; and the MMT language provides a basis for interoperability between other notations.
Western notation uses Arabic numerals, but Arabic notation additionally replaces Latin letters and related symbols with Arabic script.
In addition to Arabic notation, mathematics also makes use of the Greek alphabet to denote a wide variety of mathematical objects and variables. On some occasions, certain Hebrew letters are also used (such as aleph, ℵ, in the context of infinite cardinals).
Descartes' great accomplishment in mathematics is invariably described as the arithmetization of geometry.