The inch (symbol: in or ″) is a unit of length in the British imperial and the United States customary systems of measurement. It is equal to 1/36 of a yard or 1/12 of a foot. Derived from the Roman uncia ("twelfth"), the word inch is also sometimes used to translate similar units in other measurement systems, usually understood as deriving from the width of the human thumb.
Unit system: Imperial/US units
Symbol: in or ″ (the double prime)
1 in is equal to:
  Imperial/US units: 1/36 yd or 1/12 ft
  Metric (SI) units: 25.4 mm
Standards for the exact length of an inch have varied in the past, but since the adoption of the international yard during the 1950s and 1960s the inch has been based on the metric system and defined as exactly 25.4 mm.
The English word "inch" (Old English: ynce) was an early borrowing from Latin uncia ("one-twelfth; Roman inch; Roman ounce"). The vowel change from Latin /u/ to Old English /y/ (which became Modern English /ɪ/) is known as umlaut. The consonant change from the Latin /k/ (spelled c) to English /tʃ/ is palatalisation. Both were features of Old English phonology; see Phonological history of Old English § Palatalization and Germanic umlaut § I-mutation in Old English for more information.
In many other European languages, the word for "inch" is the same as or derived from the word for "thumb", as a man's thumb is about an inch wide (and this was even sometimes used to define the inch). Examples include Catalan: polzada ("inch") and polze ("thumb"); Czech: palec ("thumb"); Danish and Norwegian: tomme ("inch") and tommel ("thumb"); Dutch: duim (whence Afrikaans: duim and Russian: дюйм); French: pouce; Hungarian: hüvelyk; Italian: pollice; Portuguese: polegada ("inch") and polegar ("thumb"); Slovak: palec ("thumb"); Spanish: pulgada ("inch") and pulgar ("thumb"); and Swedish: tum ("inch") and tumme ("thumb").
The inch is a commonly used customary unit of length in the United States, Canada, and the United Kingdom. It is also used in Japan for electronic parts, especially display screens. In most of continental Europe, the inch is also used informally as a measure for display screens. For the United Kingdom, guidance on public sector use states that, since 1 October 1995, without time limit, the inch (along with the foot) is to be used as a primary unit for road signs and related measurements of distance (with the possible exception of clearance heights and widths), and may continue to be used as a secondary or supplementary indication following a metric measurement for other purposes.
Inches are commonly used to specify the diameter of vehicle wheel rims, and the corresponding inner diameter of tyres in tyre codes.
The international standard symbol for inch is in (see ISO 31-1, Annex A), but traditionally the inch is denoted by a double prime, which is often approximated by a double quote symbol, and the foot by a prime, which is often approximated by an apostrophe. For example, three feet, two inches can be written as 3′ 2″. (This is akin to how the first and second "cuts" of the hour, and likewise of the degree, are indicated by prime and double prime symbols.)
Subdivisions of an inch are typically written using dyadic fractions with odd numerators; for example, two and three-eighths of an inch would be written as 2+3/8″ and not as 2.375″ nor as 2+6/16″. However, for engineering purposes dimensions are commonly given as decimals to three or four places, and have been for many years.
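The two conventions above can be checked with a short sketch using Python's standard fractions module (illustrative only):

```python
from fractions import Fraction

# Two and three-eighths inches, written as a dyadic fraction
length_in = 2 + Fraction(3, 8)

# The equivalent decimal form preferred for engineering purposes
print(float(length_in))  # 2.375

# 6/16 reduces to 3/8, which is why the odd-numerator form is preferred
print(Fraction(6, 16) == Fraction(3, 8))  # True
```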
1 international inch is equal to:
- 1/12 foot
- 1/36 yard
- 25.4 millimetres exactly
- 2.54 centimetres exactly
The earliest known reference to the inch in England is from the Laws of Æthelberht dating to the early 7th century, surviving in a single manuscript, the Textus Roffensis from 1120. Paragraph LXVII sets out the fine for wounds of various depths: one inch, one shilling; two inches, two shillings, etc.[m]
An Anglo-Saxon unit of length was the barleycorn. After 1066, 1 inch was equal to 3 barleycorns, which continued to be its legal definition for several centuries, with the barleycorn being the base unit. One of the earliest such definitions is that of 1324, when a statute of Edward II of England defined the inch as "three grains of barley, dry and round, placed end to end, lengthwise".
Similar definitions are recorded in both English and Welsh medieval law tracts. One, dating from the first half of the 10th century, is contained in the Laws of Hywel Dda which superseded those of Dyfnwal, an even earlier definition of the inch in Wales. Both definitions, as recorded in Ancient Laws and Institutes of Wales (vol i., pp. 184, 187, 189), are that "three lengths of a barleycorn is the inch".
King David I of Scotland in his Assize of Weights and Measures (c. 1150) is said to have defined the Scottish inch as the width of an average man's thumb at the base of the nail, even including the requirement to calculate the average of a small, a medium, and a large man's measures. However, the oldest surviving manuscripts date from the early 14th century and appear to have been altered with the inclusion of newer material.
In 1814, Charles Butler, a mathematics teacher at Cheam School, recorded the old legal definition of the inch to be "three grains of sound ripe barley being taken out of the middle of the ear, well dried, and laid end to end in a row", and placed the barleycorn, not the inch, as the base unit of the English Long Measure system, from which all other units were derived. John Bouvier similarly recorded in his 1843 law dictionary that the barleycorn was the fundamental measure. Butler observed, however, that "[a]s the length of the barley-corn cannot be fixed, so the inch according to this method will be uncertain", noting that a standard inch measure was now [i.e. by 1843] kept in the Exchequer chamber, Guildhall, and that this was the legal definition of the inch.
This was a point also made by George Long in his 1842 Penny Cyclopædia, observing that standard measures had since superseded the barleycorn definition of the inch, and that to recover the inch measure from its original definition, in case the standard measure were destroyed, would involve the measurement of large numbers of barleycorns and taking their average length. He noted that this process would not perfectly recover the standard, since it might introduce errors of anywhere between one hundredth and one tenth of an inch in the definition of a yard.
Before the adoption of the international yard and pound, various definitions were in use. In the United Kingdom and most countries of the British Commonwealth, the inch was defined in terms of the Imperial Standard Yard. The United States adopted the conversion factor 1 metre = 39.37 inches by an act in 1866. In 1893, Mendenhall ordered the physical realization of the inch to be based on the international prototype metres numbers 21 and 27, which had been received from the CGPM, together with the previously adopted conversion factor.
As a result of the definitions above, the U.S. inch was effectively defined as 25.4000508 mm (with a reference temperature of 68 degrees Fahrenheit) and the UK inch as 25.399977 mm (with a reference temperature of 62 degrees Fahrenheit). When Carl Edvard Johansson started manufacturing gauge blocks in inch sizes in 1912, his compromise was to manufacture gauge blocks with a nominal size of 25.4 mm, with a reference temperature of 20 degrees Celsius, accurate to within a few parts per million of both official definitions. Because Johansson's blocks were so popular, they became the de facto standard for manufacturers internationally, with other manufacturers of gauge blocks following Johansson's definition by producing blocks designed to be equivalent to his.
In 1930, the British Standards Institution adopted an inch of exactly 25.4 mm. The American Standards Association followed suit in 1933. By 1935, industry in 16 countries had adopted the "industrial inch" as it came to be known, effectively endorsing Johansson's pragmatic choice of conversion ratio.
In 1946, the Commonwealth Science Congress recommended a yard of exactly 0.9144 metres for adoption throughout the British Commonwealth. This was adopted by Canada in 1951; the United States on 1 July 1959; Australia in 1961, effective 1 January 1964; and the United Kingdom in 1963, effective on 1 January 1964. The new standards gave an inch of exactly 25.4 mm, 1.7 millionths of an inch longer than the old imperial inch and 2 millionths of an inch shorter than the old US inch.
The United States retains the 1/39.37-metre definition for surveying, producing a 2 millionth part difference between standard and US survey inches. This is approximately 1/8 inch per mile; 12.7 kilometres is exactly 500,000 standard inches and exactly 499,999 survey inches. This difference is substantial when doing calculations in State Plane Coordinate Systems with coordinate values in the hundreds of thousands or millions of feet.
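The 12.7-kilometre figure above can be confirmed exactly with rational arithmetic, since both inch definitions are exact ratios of the metre (a sketch using Python's standard fractions module):

```python
from fractions import Fraction

# Exact definitions, in metres:
# international inch = 25.4 mm; US survey inch = 1/39.37 m
INTL_INCH = Fraction(254, 10000)
SURVEY_INCH = Fraction(100, 3937)

distance = Fraction(12700)  # 12.7 km in metres

print(distance / INTL_INCH)    # 500000
print(distance / SURVEY_INCH)  # 499999

# The survey inch is 2 parts per million longer than the international inch
print(float(SURVEY_INCH / INTL_INCH - 1))  # about 2.0e-06
```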
In 2020, the U.S. NIST announced that the U.S. survey foot would "be phased out" on 1 January 2023 and be superseded by the international foot (also known simply as the foot), equal to 0.3048 metres exactly, for all further applications, and by implication the survey inch with it.
Before the adoption of the metric system, several European countries had customary units whose name translates into "inch". The French pouce measured roughly 27.0 mm, at least when applied to describe the calibre of artillery pieces. The Amsterdam foot (voet) consisted of 11 Amsterdam inches (duim). The Amsterdam foot is about 8% shorter than an English foot.