A Turing machine is a theoretical device that manipulates symbols contained on a strip of tape. Despite its simplicity, a Turing machine can be adapted to simulate the logic of any computer algorithm, and is particularly useful in explaining the functions of a CPU inside a computer. The Turing machine was described by Alan Turing in 1937,^{[1]} who called it an "a-machine" (automatic machine). Turing machines are not intended as a practical computing technology, but rather as a thought experiment representing a computing machine. They help computer scientists understand the limits of mechanical computation.
A succinct definition of the thought experiment was given by Turing in his 1948 essay, "Intelligent Machinery". Referring back to his 1936 publication, Turing writes that the Turing machine, here called a Logical Computing Machine, consisted of:
...an infinite memory capacity obtained in the form of an infinite tape marked out into squares on each of which a symbol could be printed. At any moment there is one symbol in the machine; it is called the scanned symbol. The machine can alter the scanned symbol and its behavior is in part determined by that symbol, but the symbols on the tape elsewhere do not affect the behavior of the machine. However, the tape can be moved back and forth through the machine, this being one of the elementary operations of the machine. Any symbol on the tape may therefore eventually have an innings^{[2]}. (Turing 1948, p. 61)
A Turing machine that is able to simulate any other Turing machine is called a Universal Turing machine (UTM, or simply a universal machine). A more mathematically oriented definition with a similar "universal" nature was introduced by Alonzo Church, whose work on lambda calculus intertwined with Turing's in a formal theory of computation known as the Church–Turing thesis. The thesis states that Turing machines indeed capture the informal notion of effective method in logic and mathematics, and provide a precise definition of an algorithm or "mechanical procedure".
Studying their abstract properties yields many insights into computer science and complexity theory.
The Turing machine mathematically models a machine that mechanically operates on a tape on which symbols are written, and which it can read and write one at a time using a tape head. Operation is fully determined by a finite set of elementary instructions such as "in state 42, if the symbol seen is 0, write a 1; if the symbol seen is 1, shift to the right, and change into state 17; in state 17, if the symbol seen is 0, write a 1 and change to state 6;" etc. In the original article ("On computable numbers, with an application to the Entscheidungsproblem", see also references below), Turing imagines not a mechanical machine, but a person whom he calls the "computer", who executes these deterministic mechanical rules slavishly (or as Turing puts it, "in a desultory manner").
More precisely, a Turing machine consists of:
 A tape divided into cells, one next to another; each cell contains a symbol from some finite alphabet, which includes a special blank symbol.
 A head that can read and write symbols on the tape and move the tape left and right one (and only one) cell at a time.
 A state register that stores the state of the Turing machine, one of finitely many.
 A finite table of instructions that, given the current state and the symbol being read, tells the machine what symbol to write, how to move the head, and what the next state will be.
Note that every part of the machine—its state and symbol collections—and its actions—printing, erasing and tape motion—is finite, discrete and distinguishable; it is the potentially unlimited amount of tape that gives it an unbounded amount of storage space.
To see examples of the following models, see Turing machine examples:
Hopcroft and Ullman (1979, p. 148) formally define a (one-tape) Turing machine as a 7-tuple M = ⟨Q, Γ, b, Σ, δ, q_{0}, F⟩ where
 Q is a finite, non-empty set of states;
 Γ is a finite, non-empty set of tape alphabet symbols;
 b ∈ Γ is the blank symbol;
 Σ ⊆ Γ \ {b} is the set of input symbols;
 δ : (Q \ F) × Γ → Q × Γ × {L, R} is a partial function called the transition function, where L is left shift and R is right shift;
 q_{0} ∈ Q is the initial state;
 F ⊆ Q is the set of final or accepting states.
Anything that operates according to these specifications is a Turing machine.
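The 7-tuple can be sketched as a small Python structure. This is an illustrative rendering only; the field names are assumptions chosen to mirror the Hopcroft–Ullman components, not a standard API:

```python
from dataclasses import dataclass

@dataclass
class TuringMachine:
    """A Hopcroft-Ullman style 7-tuple (field names are illustrative)."""
    states: frozenset          # Q, a finite set of states
    tape_alphabet: frozenset   # Gamma, the finite tape alphabet
    blank: str                 # b in Gamma, the blank symbol
    input_alphabet: frozenset  # Sigma, a subset of Gamma \ {b}
    delta: dict                # partial function (q, s) -> (s', move, q')
    start: str                 # q0 in Q, the initial state
    accepting: frozenset       # F, a subset of Q

# A trivial instance: one state that immediately halts on a blank.
m = TuringMachine(
    states=frozenset({"q0", "halt"}),
    tape_alphabet=frozenset({"0", "1"}),
    blank="0",
    input_alphabet=frozenset({"1"}),
    delta={("q0", "0"): ("0", "N", "halt")},
    start="q0",
    accepting=frozenset({"halt"}),
)
```

Any concrete Turing machine in this article can be written down by filling in these seven components.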
The 7-tuple for the 3-state busy beaver looks like this (see more about this busy beaver at Turing machine examples):
 Q = {A, B, C, HALT};
 Γ = {0, 1};
 b = 0 ("blank");
 Σ = {1};
 δ = see the state table below;
 q_{0} = A (the initial state);
 F = {HALT} (the one-element set of final states).
Initially all tape cells are marked with 0.
Tape symbol   Current state A            Current state B            Current state C
              Write  Move  Next state    Write  Move  Next state    Write  Move  Next state
0             1      R     B             1      L     A             1      L     B
1             1      L     C             1      R     B             1      R     HALT
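The table above can be simulated in a few lines of Python. This is a sketch under stated assumptions: the tape is represented sparsely as a dictionary, and unwritten cells read as the blank symbol 0:

```python
# Transition table of the 3-state, 2-symbol busy beaver above:
# (current state, scanned symbol) -> (write, move, next state)
delta = {
    ("A", 0): (1, "R", "B"), ("A", 1): (1, "L", "C"),
    ("B", 0): (1, "L", "A"), ("B", 1): (1, "R", "B"),
    ("C", 0): (1, "L", "B"), ("C", 1): (1, "R", "HALT"),
}

tape = {}            # sparse tape; unwritten cells read as blank (0)
pos, state, steps = 0, "A", 0
while state != "HALT":
    # look up the rule, write, then move the head one cell
    write, move, state = delta[(state, tape.get(pos, 0))]
    tape[pos] = write
    pos += {"L": -1, "R": 1, "N": 0}[move]
    steps += 1

ones = sum(tape.values())
print(steps, ones)   # 13 steps, six 1s left on the tape
```

Running it confirms the machine halts after 13 transitions with six 1s printed, which is the busy-beaver behavior this example is known for.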
In the words of van Emde Boas (1990), p. 6: "The set-theoretical object [his formal seven-tuple description similar to the above] provides only partial information on how the machine will behave and what its computations will look like."
For instance, definitions in the literature sometimes differ slightly, to make arguments or proofs easier or clearer, but this is always done in such a way that the resulting machine has the same computational power. For example, changing the set {L,R} to {L,R,N}, where N ("None" or "No-operation") would allow the machine to stay on the same tape cell instead of moving left or right, does not increase the machine's computational power.
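The equivalence can be made concrete with the standard construction: every stay-put ("N") instruction is replaced by a detour through a fresh intermediate state that steps right and then immediately steps back left without changing the symbol. The following sketch (names are illustrative) performs that transformation on a transition table:

```python
def eliminate_stays(delta, symbols):
    """Rewrite a {L,R,N} transition table into an equivalent {L,R} one.

    delta: (state, symbol) -> (write, move, next_state), move in {L,R,N}.
    Each N-move is replaced by: move R into a fresh state, then move L
    back (leaving the symbol unchanged) into the intended next state.
    """
    out, fresh = {}, 0
    for (q, s), (w, move, q2) in delta.items():
        if move != "N":
            out[(q, s)] = (w, move, q2)
            continue
        t = f"_return_{fresh}"          # fresh intermediate state
        fresh += 1
        out[(q, s)] = (w, "R", t)       # write, then step right into t ...
        for x in symbols:               # ... then step back left, unchanged
            out[(t, x)] = (x, "L", q2)
    return out

delta = {("A", 0): (1, "N", "B")}
new = eliminate_stays(delta, symbols=[0, 1])
```

The transformed machine visits one extra cell and takes one extra step per N-instruction, but computes exactly the same function, which is why the change of move set does not change computational power.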
The most common convention represents each "Turing instruction" in a "Turing table" by one of nine 5-tuples, per the convention of Turing/Davis (Turing (1936) in Undecidable, pp. 126–127 and Davis (2000) p. 152):
Other authors (Minsky (1967) p. 119, Hopcroft and Ullman (1979) p. 158, Stone (1972) p. 9) adopt a different convention, with the new state q_{m} listed immediately after the scanned symbol S_{j}.
For the remainder of this article "definition 1" (the Turing/Davis convention) will be used.
Current state   Scanned symbol   Print symbol   Move tape   Final (i.e. next) state   5-tuple
A               0                1              R           B                         (A, 0, 1, R, B)
A               1                1              L           C                         (A, 1, 1, L, C)
B               0                1              L           A                         (B, 0, 1, L, A)
B               1                1              R           B                         (B, 1, 1, R, B)
C               0                1              L           B                         (C, 0, 1, L, B)
C               1                1              N           H                         (C, 1, 1, N, H)
In the following table, Turing's original model allowed only the first three lines, which he called N1, N2, N3 (cf Turing in Undecidable, p. 126). He allowed for erasure of the "scanned square" by naming a 0th symbol S_{0} = "erase" or "blank", etc. However, he did not allow for non-printing, so every instruction line includes "print symbol S_{k}" or "erase" (cf footnote 12 in Post (1947), Undecidable, p. 300). The abbreviations are Turing's (Undecidable, p. 119). Subsequent to Turing's original paper in 1936–1937, machine models have allowed all nine possible types of five-tuples:
     Current m-configuration   Tape     Print-         Tape-     Final m-configuration
     (Turing state)            symbol   operation      motion    (Turing state)          5-tuple                           5-tuple comments                    4-tuple
N1   q_{i}                     S_{j}    Print(S_{k})   Left L    q_{m}                   (q_{i}, S_{j}, S_{k}, L, q_{m})   "blank" = S_{0}, 1 = S_{1}, etc.
N2   q_{i}                     S_{j}    Print(S_{k})   Right R   q_{m}                   (q_{i}, S_{j}, S_{k}, R, q_{m})   "blank" = S_{0}, 1 = S_{1}, etc.
N3   q_{i}                     S_{j}    Print(S_{k})   None N    q_{m}                   (q_{i}, S_{j}, S_{k}, N, q_{m})   "blank" = S_{0}, 1 = S_{1}, etc.    (q_{i}, S_{j}, S_{k}, q_{m})
4    q_{i}                     S_{j}    None N         Left L    q_{m}                   (q_{i}, S_{j}, N, L, q_{m})                                           (q_{i}, S_{j}, L, q_{m})
5    q_{i}                     S_{j}    None N         Right R   q_{m}                   (q_{i}, S_{j}, N, R, q_{m})                                           (q_{i}, S_{j}, R, q_{m})
6    q_{i}                     S_{j}    None N         None N    q_{m}                   (q_{i}, S_{j}, N, N, q_{m})       Direct "jump"                       (q_{i}, S_{j}, N, q_{m})
7    q_{i}                     S_{j}    Erase          Left L    q_{m}                   (q_{i}, S_{j}, E, L, q_{m})
8    q_{i}                     S_{j}    Erase          Right R   q_{m}                   (q_{i}, S_{j}, E, R, q_{m})
9    q_{i}                     S_{j}    Erase          None N    q_{m}                   (q_{i}, S_{j}, E, N, q_{m})                                           (q_{i}, S_{j}, E, q_{m})
Any Turing table (list of instructions) can be constructed from the above nine 5-tuples. For technical reasons, the three non-printing or "N" instructions (4, 5, 6) can usually be dispensed with. For examples see Turing machine examples.
Less frequently, the use of 4-tuples is encountered: these represent a further atomization of the Turing instructions (cf Post (1947), Boolos & Jeffrey (1974, 1999), Davis–Sigal–Weyuker (1994)); also see more at Post–Turing machine.
The word "state" used in the context of Turing machines can be a source of confusion. Most commentators after Turing have used "state" to mean the name/designator of the current instruction to be performed—i.e. the contents of the state register. But Turing (1936) made a strong distinction between a record of what he called the machine's "m-configuration" and the machine's (or person's) "state of progress" through the computation. What Turing called "the state formula" includes both the current instruction and all the symbols on the tape:
Thus the state of progress of the computation at any stage is completely determined by the note of instructions and the symbols on the tape. That is, the state of the system may be described by a single expression (sequence of symbols) consisting of the symbols on the tape followed by Δ (which we suppose not to appear elsewhere) and then by the note of instructions. This expression is called the 'state formula'.—Undecidable, pp. 139–140, emphasis added
Earlier in his paper Turing carried this even further: he gives an example where he places a symbol of the current "m-configuration"—the instruction's label—beneath the scanned square, together with all the symbols on the tape (Undecidable, p. 121); this he calls "the complete configuration" (Undecidable, p. 118). To print the "complete configuration" on one line, he places the state-label/m-configuration to the left of the scanned symbol.
A variant of this is seen in Kleene (1952), where Kleene shows how to write the Gödel number of a machine's "situation": he places the "m-configuration" symbol q_{4} over the scanned square in roughly the center of the 6 non-blank squares on the tape (see the Turing-tape figure in this article) and puts it to the right of the scanned square. But Kleene refers to "q_{4}" itself as "the machine state" (Kleene, pp. 374–375). Hopcroft and Ullman call this composite the "instantaneous description" and follow the Turing convention of putting the "current state" (instruction-label, m-configuration) to the left of the scanned symbol (p. 149).
Example: total state of the 3-state 2-symbol busy beaver after 3 "moves" (taken from the example "run" in the figure below):
This means: after three moves the tape has ... 000110000 ... on it, the head is scanning the rightmost 1, and the state is A. Blanks (in this case represented by "0"s) can be part of the total state, as shown here: B01; the tape has a single 1 on it, but the head is scanning the 0 ("blank") to its left and the state is B.
"State" in the context of Turing machines should be clarified as to which is being described: (i) the current instruction, or (ii) the list of symbols on the tape together with the current instruction, or (iii) the list of symbols on the tape together with the current instruction placed either to the left or to the right of the scanned symbol.
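Convention (iii) with the state to the left of the scanned symbol (the Turing and Hopcroft–Ullman "instantaneous description") is easy to mechanize. The following sketch renders such descriptions; the function name and tape representation are illustrative assumptions, not taken from any of the cited texts:

```python
def instantaneous_description(tape, head, state, blank="0"):
    """Render a 'complete configuration': the non-blank span of the tape
    with the state label inserted to the left of the scanned symbol."""
    if tape:
        lo = min(min(tape), head)
        hi = max(max(tape), head)
    else:
        lo = hi = head
    cells = [str(tape.get(i, blank)) for i in range(lo, hi + 1)]
    cells.insert(head - lo, state)      # state goes left of the scanned cell
    return "".join(cells)

# The second situation described above: a single 1 on the tape, the head
# scanning the blank ("0") to its left, machine in state B.
desc = instantaneous_description({0: "0", 1: "1"}, head=0, state="B")
print(desc)   # → "B01"
```

A sequence of such strings, one per move, records the entire course of a computation, which is exactly how the "complete configurations" are used later in this article to restart a computation mid-run.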
Turing's biographer Andrew Hodges (1983: 107) has noted and discussed this confusion.
Tape symbol   Current state A            Current state B            Current state C
              Write  Move  Next state    Write  Move  Next state    Write  Move  Next state
0             P      R     B             P      L     A             P      L     B
1             P      L     C             P      R     B             P      R     HALT
To the right: the above TABLE as expressed as a "state transition" diagram.
Usually large TABLES are better left as tables (Booth, p. 74). They are more readily simulated by computer in tabular form (Booth, p. 74). However, certain concepts—e.g. machines with "reset" states and machines with repeating patterns (cf Hill and Peterson p. 244ff)—can be more readily seen when viewed as a drawing.
Whether a drawing represents an improvement on its TABLE must be decided by the reader for the particular context. See Finite state machine for more.
The reader should again be cautioned that such diagrams represent a snapshot of their TABLE frozen in time, not the course ("trajectory") of a computation through time and/or space. While every time the busy beaver machine "runs" it will always follow the same state-trajectory, this is not true for the "copy" machine that can be provided with variable input "parameters".
The diagram "Progress of the computation" shows the 3-state busy beaver's "state" (instruction) progress through its computation from start to finish. On the far right is the Turing "complete configuration" (Kleene "situation", Hopcroft–Ullman "instantaneous description") at each step. If the machine were to be stopped and cleared to blank both the "state register" and entire tape, these "configurations" could be used to rekindle a computation anywhere in its progress (cf Turing (1936) Undecidable, pp. 139–140).
Many machines that might be thought to have more computational capability than a simple universal Turing machine can be shown to have no more power (Hopcroft and Ullman p. 159, cf Minsky (1967)). They might compute faster, perhaps, or use less memory, or their instruction set might be smaller, but they cannot compute more powerfully (i.e. more mathematical functions). (Recall that the Church–Turing thesis hypothesizes this to be true for any kind of machine: that anything that can be "computed" can be computed by some Turing machine.)
A Turing machine is equivalent to a pushdown automaton that has been made more flexible and concise by relaxing the last-in-first-out requirement of its stack. (Interestingly, this seemingly minor relaxation enables the Turing machine to perform such a wide variety of computations that it can serve as a clearer model for the computational capabilities of all modern computer software.)
At the other extreme, some very simple models turn out to be Turingequivalent, i.e. to have the same computational power as the Turing machine model.
Common equivalent models are the multitape Turing machine, multitrack Turing machine, machines with input and output, and the nondeterministic Turing machine (NDTM) as opposed to the deterministic Turing machine (DTM) for which the action table has at most one entry for each combination of symbol and state.
Read-only, right-moving Turing machines are equivalent to NDFAs (as well as DFAs, by the NDFA-to-DFA conversion algorithm).
For practical and didactic purposes, the equivalent register machine can be used as a conventional assembly-style programming language.
Early in his paper (1936) Turing makes a distinction between an "automatic machine"—whose "motion ... [is] completely determined by the configuration"—and a "choice machine":
...whose motion is only partially determined by the configuration ... When such a machine reaches one of these ambiguous configurations, it cannot go on until some arbitrary choice has been made by an external operator. This would be the case if we were using machines to deal with axiomatic systems.—Undecidable, p. 118
Turing (1936) does not elaborate further except in a footnote in which he describes how to use an a-machine to "find all the provable formulae of the [Hilbert] calculus" rather than use a choice machine. He "suppose[s] that the choices are always between two possibilities 0 and 1. Each proof will then be determined by a sequence of choices i_{1}, i_{2}, ..., i_{n} (i_{1} = 0 or 1, i_{2} = 0 or 1, ..., i_{n} = 0 or 1), and hence the number 2^{n} + i_{1}2^{n−1} + i_{2}2^{n−2} + ... + i_{n} completely determines the proof. The automatic machine carries out successively proof 1, proof 2, proof 3, ..." (Footnote ‡, Undecidable, p. 138)
This is indeed the technique by which a deterministic Turing machine (i.e. an a-machine) can be used to mimic the action of a nondeterministic Turing machine; Turing solved the matter in a footnote and appears to dismiss it from further consideration.
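Turing's footnote amounts to a systematic enumeration of choice sequences. The sketch below makes that concrete for an abstract "choice machine" given as a step function; the names and the toy machine at the end are illustrative assumptions, and the search is bounded so that the sketch always terminates:

```python
from itertools import count, product

def simulate_choices(step, accepts, start, max_len=12):
    """Deterministically simulate a 'choice machine', as in Turing's
    footnote: try every binary choice sequence i_1..i_n in order.

    step(config, bit) -> next configuration after choosing bit
    accepts(config)   -> True if the configuration is a success
    Returns the first accepting choice sequence, or None if none of
    length <= max_len succeeds (a bound added to keep the sketch total).
    """
    for n in count(1):
        if n > max_len:
            return None
        for choices in product((0, 1), repeat=n):
            config = start
            for bit in choices:
                config = step(config, bit)
            if accepts(config):
                return choices

# Toy machine (illustrative only): configurations are integers, each
# choice bit appends a binary digit; accept on reaching 5 (binary 101).
found = simulate_choices(step=lambda c, b: 2 * c + b,
                         accepts=lambda c: c == 5,
                         start=0)
print(found)   # → (1, 0, 1)
```

Enumerating sequences in order of increasing length is what makes the simulation exhaustive: any accepting run of the choice machine corresponds to some finite choice sequence, which the deterministic machine will eventually try.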
An oracle machine or o-machine is a Turing a-machine that pauses its computation at state "o" while, to complete its calculation, it "awaits the decision" of "the oracle"—an unspecified entity "apart from saying that it cannot be a machine" (Turing (1939), Undecidable pp. 166–168). The concept is now actively used by mathematicians.
As Turing wrote in Undecidable, p. 128 (italics added):
It is possible to invent a single machine which can be used to compute any computable sequence. If this machine U is supplied with the tape on the beginning of which is written the string of quintuples separated by semicolons of some computing machine M, then U will compute the same sequence as M.
This finding is now taken for granted, but at the time (1936) it was considered astonishing. The model of computation that Turing called his "universal machine"—"U" for short—is considered by some (cf Davis (2000)) to have been the fundamental theoretical breakthrough that led to the notion of the stored program computer.
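The idea of U can be sketched directly: a single program that reads a description of some machine M from its input and then carries out M's computation. The semicolon-separated encoding below is an illustrative assumption (it is not Turing's actual "standard description"), and this toy U assumes M's start state is named "A":

```python
def run_encoded(description, max_steps=10_000):
    """A toy 'universal' routine: parse a machine M given as quintuples
    'state,scanned,write,move,next' separated by semicolons, then carry
    out M's computation on an initially blank tape.  The encoding and
    the hardcoded start state 'A' are illustrative simplifications."""
    delta = {}
    for quintuple in description.split(";"):
        q, s, w, move, q2 = quintuple.strip().split(",")
        delta[(q, s)] = (w, move, q2)
    tape, pos, state = {}, 0, "A"
    for _ in range(max_steps):
        if (state, tape.get(pos, "0")) not in delta:
            break                         # no applicable rule: halt
        w, move, state = delta[(state, tape.get(pos, "0"))]
        tape[pos] = w
        pos += {"L": -1, "R": 1, "N": 0}[move]
    return tape

# The 3-state busy beaver from earlier in this article, fed to U as text:
M = "A,0,1,R,B; A,1,1,L,C; B,0,1,L,A; B,1,1,R,B; C,0,1,L,B; C,1,1,N,H"
tape = run_encoded(M)
print(sum(v == "1" for v in tape.values()))   # → 6
```

The point of the construction is that `run_encoded` itself is a fixed program: all the variability lives in the data on the tape, which is precisely the stored-program idea credited to this result.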
Turing's paper ... contains, in essence, the invention of the modern computer and some of the programming techniques that accompanied it.—Minsky (1967), p. 104
In terms of computational complexity, a multitape universal Turing machine need only be slower by a logarithmic factor compared to the machines it simulates. This result was obtained in 1966 by F. C. Hennie and R. E. Stearns. (Arora and Barak, 2009, theorem 1.9)
Finding small UTMs has recently become a popular endeavor. For a long time the smallest known UTM was due to Marvin Minsky, who in 1962 discovered a 7-state, 4-symbol universal Turing machine using 2-tag systems. As part of his book A New Kind of Science, Stephen Wolfram announced his discovery of a 2-state, 5-symbol universal Turing machine. Wolfram also announced a US$25,000 prize for a proof or disproof of the conjecture that an even simpler, 2-state, 3-symbol Turing machine is universal. The prize was awarded in 2007 to Alex Smith, an undergraduate studying Electronic and Computer Engineering at the University of Birmingham, UK (Giles 2007). Later, Vaughan Pratt of Stanford University announced he had discovered a flaw in the proof (Pratt 2007). Wolfram Research (Rowland 2007) and Smith himself disputed Pratt's interpretation. Pratt's main point was that the same argument that would make Wolfram's 2,3 Turing machine universal would also make a linear bounded automaton (LBA) universal. Smith explained that the LBA would need to be restarted externally during the computation, while the 2,3 Turing machine restarts automatically, so the proof does not make an LBA universal as Pratt first thought.
It is often said that Turing machines, unlike simpler automata, are as powerful as real machines, and are able to execute any operation that a real program can. What is missed in this statement is that, because a real machine can only be in finitely many configurations, in fact this "real machine" is nothing but a deterministic finite automaton. On the other hand, Turing machines are equivalent to machines that have an unlimited amount of storage space for their computations. In fact, Turing machines are not intended to model computers, but rather they are intended to model computation itself; historically, computers, which compute only on their (fixed) internal storage, were developed only later.
There are a number of ways to explain why Turing machines are nevertheless useful models of real computers.
One way in which Turing machines are a poor model for programs is that many real programs, such as operating systems and word processors, are written to receive unbounded input over time, and therefore do not halt. Turing machines do not model such ongoing computation well (but can still model portions of it, such as individual procedures).
A limitation of Turing machines is that they do not model the strengths of a particular arrangement well. For instance, modern stored-program computers are actually instances of a more specific form of abstract machine known as the random access stored program machine or RASP machine model. Like the universal Turing machine, the RASP stores its "program" in "memory" external to its finite-state machine's "instructions". Unlike the universal Turing machine, the RASP has an infinite number of distinguishable, numbered but unbounded "registers"—memory "cells" that can contain any integer (cf Elgot and Robinson (1964), Hartmanis (1971), and in particular Cook–Reckhow (1973); references at random access machine). The RASP's finite-state machine is equipped with the capability for indirect addressing (e.g. the contents of one register can be used as an address to specify another register); thus the RASP's "program" can address any register in the register sequence. The upshot of this distinction is that there are computational optimizations that can be performed based on the memory indices, which are not possible in a general Turing machine; thus when Turing machines are used as the basis for bounding running times, a "false lower bound" can be proven on certain algorithms' running times (due to the false simplifying assumption of a Turing machine). An example of this is binary search, an algorithm that can be shown to perform more quickly when using the RASP model of computation rather than the Turing machine model.
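Binary search illustrates the gap concretely: on a RASP/RAM each probe is a single indirect memory access, so searching n sorted cells takes O(log n) probes, whereas a single-tape Turing machine must physically move its head to each probed cell, changing the cost model. A minimal sketch (the probe counter is an illustrative addition):

```python
def binary_search(sorted_list, target):
    """Standard binary search, counting random-access probes.
    On a RASP/RAM each probe below is one O(1) indirect access; on a
    single-tape Turing machine the head would have to travel to each
    probed cell, so the same algorithm is charged very differently."""
    lo, hi, probes = 0, len(sorted_list) - 1, 0
    while lo <= hi:
        mid = (lo + hi) // 2
        probes += 1                     # one random-access probe
        if sorted_list[mid] == target:
            return mid, probes
        if sorted_list[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return None, probes

# 1024 elements: found in at most 11 probes on a random-access model.
index, probes = binary_search(list(range(1024)), 700)
```

The algorithm is identical in both models; only the cost of a probe differs, which is exactly the point of the "false lower bound" remark above.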
They were described in 1936 by Alan Turing.
Robin Gandy (1919–1995)—a student of Alan Turing (1912–1954) and his lifelong friend—traces the lineage of the notion of "calculating machine" back to Babbage (circa 1834) and actually proposes "Babbage's Thesis":
That the whole of development and operations of analysis are now capable of being executed by machinery.—(italics in Babbage as cited by Gandy, p. 54)
Gandy's analysis of Babbage's Analytical Engine describes the following five operations (cf pp. 52–53):
Gandy states that "the functions which can be calculated by (1), (2), and (4) are precisely those which are Turing computable." (p. 53). He cites other proposals for "universal calculating machines", including those of Percy Ludgate (1909), Leonardo Torres y Quevedo (1914), M. d'Ocagne (1922), Louis Couffignal (1933), Vannevar Bush (1936), and Howard Aiken (1937). However:
... the emphasis is on programming a fixed iterable sequence of arithmetical operations. The fundamental importance of conditional iteration and conditional transfer for a general theory of calculating machines is not recognized ...—Gandy p. 55
With regard to Hilbert's problems, posed by the famous mathematician David Hilbert in 1900, an aspect of problem #10 had been floating about for almost 30 years before it was framed precisely. Hilbert's original expression for #10 is as follows:
10. Determination of the solvability of a Diophantine equation. Given a Diophantine equation with any number of unknown quantities and with rational integral coefficients: To devise a process according to which it can be determined in a finite number of operations whether the equation is solvable in rational integers.
The Entscheidungsproblem [decision problem for first-order logic] is solved when we know a procedure that allows for any given logical expression to decide by finitely many operations its validity or satisfiability ... The Entscheidungsproblem must be considered the main problem of mathematical logic.
—quoted, with this translation and the original German, in Dershowitz and Gurevich, 2008
By 1922, this notion of "Entscheidungsproblem" had developed a bit, and H. Behmann stated that
... most general form of the Entscheidungsproblem [is] as follows:
 A quite definite generally applicable prescription is required which will allow one to decide in a finite number of steps the truth or falsity of a given purely logical assertion ...
—Gandy p. 57, quoting Behmann
Behmann remarks that "... the general problem is equivalent to the problem of deciding which mathematical propositions are true."—ibid.
If one were able to solve the Entscheidungsproblem then one would have a "procedure for solving many (or even all) mathematical problems".—ibid., p. 92
By the 1928 international congress of mathematicians, Hilbert "made his questions quite precise. First, was mathematics complete ... Second, was mathematics consistent ... And thirdly, was mathematics decidable?" (Hodges p. 91, Hawking p. 1121). The first two questions were answered in 1930 by Kurt Gödel at the very same meeting where Hilbert delivered his retirement speech (much to the chagrin of Hilbert); the third—the Entscheidungsproblem—had to wait until the mid-1930s.
The problem was that an answer first required a precise definition of "definite general applicable prescription", which Alonzo Church would come to call "effective calculability", and in 1928 no such definition existed. But over the next 6–7 years Emil Post developed his definition of a worker moving from room to room writing and erasing marks per a list of instructions (Post 1936), as did Princeton professor Church and his two students Stephen Kleene and J. B. Rosser by use of Church's lambda-calculus and Gödel's recursion theory (1934). Church's paper (published 15 April 1936) showed that the Entscheidungsproblem was indeed "undecidable" and beat Turing to the punch by almost a year (Turing's paper was submitted 28 May 1936 and published January 1937). In the meantime, Emil Post submitted a brief paper in the fall of 1936, so Turing at least had priority over Post. While Church refereed Turing's paper, Turing had time to study Church's paper and add an Appendix where he sketched a proof that Church's lambda-calculus and his machines would compute the same functions.
But what Church had done was something rather different, and in a certain sense weaker. ... the Turing construction was more direct, and provided an argument from first principles, closing the gap in Church's demonstration.—Hodges p. 112
And Post had only proposed a definition of calculability and criticized Church's "definition", but had proved nothing.
In the spring of 1935, Turing, then a young Master's student at King's College, Cambridge, took on the challenge; he had been stimulated by the lectures of the logician M. H. A. Newman "and learned from them of Gödel's work and the Entscheidungsproblem ... Newman used the word 'mechanical' ...". In his 1955 obituary of Turing, Newman writes:
To the question 'what is a "mechanical" process?' Turing returned the characteristic answer 'Something that can be done by a machine' and he embarked on the highly congenial task of analysing the general notion of a computing machine.—Gandy, p. 74
Gandy states that:
I suppose, but do not know, that Turing, right from the start of his work, had as his goal a proof of the undecidability of the Entscheidungsproblem. He told me that the 'main idea' of the paper came to him when he was lying in Grantchester meadows in the summer of 1935. The 'main idea' might have either been his analysis of computation or his realization that there was a universal machine, and so a diagonal argument to prove unsolvability.—ibid., p. 76
While Gandy believed that Newman's statement above is "misleading", this opinion is not shared by all. Turing had a lifelong interest in machines: "Alan had dreamt of inventing typewriters as a boy; [his mother] Mrs. Turing had a typewriter; and he could well have begun by asking himself what was meant by calling a typewriter 'mechanical'" (Hodges p. 96). While at Princeton pursuing his PhD, Turing built a Booleanlogic multiplier (see below). His PhD thesis, titled "Systems of Logic Based on Ordinals", contains the following definition of "a computable function":
It was stated above that 'a function is effectively calculable if its values can be found by some purely mechanical process'. We may take this statement literally, understanding by a purely mechanical process one which could be carried out by a machine. It is possible to give a mathematical description, in a certain normal form, of the structures of these machines. The development of these ideas leads to the author's definition of a computable function, and to an identification of computability with effective calculability. It is not difficult, though somewhat laborious, to prove that these three definitions [the 3rd is the λcalculus] are equivalent.—Turing (1939) in The Undecidable, p. 160
When Turing returned to the UK he ultimately became jointly responsible for breaking the German secret codes created by encryption machines called "the Enigma"; he also became involved in the design of the ACE (Automatic Computing Engine): "[Turing's] ACE proposal was effectively self-contained, and its roots lay not in the EDVAC [the USA's initiative], but in his own universal machine" (Hodges p. 318). Arguments still continue concerning the origin and nature of what Kleene (1952) named Turing's Thesis. But what Turing did prove with his computational-machine model appears in his paper "On Computable Numbers, with an Application to the Entscheidungsproblem" (1937):
[that] the Hilbert Entscheidungsproblem can have no solution ... I propose, therefore to show that there can be no general process for determining whether a given formula U of the functional calculus K is provable, i.e. that there can be no machine which, supplied with any one U of these formulae, will eventually say whether U is provable.—from Turing's paper as reprinted in The Undecidable, p. 145
Turing's example (his second proof): If one is to ask for a general procedure to tell us: "Does this machine ever print 0", the question is "undecidable".
In 1937, while at Princeton working on his PhD thesis, Turing built a digital (Boolean-logic) multiplier from scratch, making his own electromechanical relays (Hodges p. 138). "Alan's task was to embody the logical design of a Turing machine in a network of relay-operated switches ..." (Hodges p. 138). While Turing might have been just curious and experimenting, quite earnest work in the same direction was going on in Germany (Konrad Zuse (1938)) and in the United States (Howard Aiken and George Stibitz (1937)); the fruits of their labors were used by both the Axis and Allied militaries in World War II (cf Hodges pp. 298–299). In the early to mid-1950s Hao Wang and Marvin Minsky reduced the Turing machine to a simpler form (a precursor to the Post–Turing machine of Martin Davis); simultaneously European researchers were reducing the newfangled electronic computer to a computer-like theoretical object equivalent to what was now being called a "Turing machine". In the late 1950s and early 1960s, the coincidentally parallel developments of Melzak and Lambek (1961), Minsky (1961), and Shepherdson and Sturgis (1961) carried the European work further and reduced the Turing machine to a more friendly, computer-like abstract model called the counter machine; Elgot and Robinson (1964), Hartmanis (1971), and Cook and Reckhow (1973) carried this work even further with the register machine and random access machine models—but basically all are just multitape Turing machines with an arithmetic-like instruction set.
Today the counter, register and randomaccess machines and their sire the Turing machine continue to be the models of choice for theorists investigating questions in the theory of computation. In particular, computational complexity theory makes use of the Turing machine:
Depending on the objects one likes to manipulate in the computations (numbers like nonnegative integers or alphanumeric strings), two models have obtained a dominant position in machinebased complexity theory:
 the offline multitape Turing machine..., which represents the standard model for string-oriented computation, and
 the random access machine (RAM) as introduced by Cook and Reckhow ..., which models the idealized Von Neumann style computer.
—van Emde Boas 1990:4
Only in the related area of analysis of algorithms this role is taken over by the RAM model.—van Emde Boas 1990:16


Automata theory: formal languages and formal grammars

Chomsky hierarchy   Grammars                     Languages                    Minimal automaton
Type-0              Unrestricted                 Recursively enumerable       Turing machine
n/a                 (no common name)             Recursive                    Decider
Type-1              Context-sensitive            Context-sensitive            Linear-bounded
n/a                 Indexed                      Indexed                      Nested stack
n/a                 Tree-adjoining etc.          (Mildly context-sensitive)   Embedded pushdown
Type-2              Context-free                 Context-free                 Nondeterministic pushdown
n/a                 Deterministic context-free   Deterministic context-free   Deterministic pushdown
n/a                 (no common name)             Visibly pushdown             Visibly pushdown
Type-3              Regular                      Regular                      Finite
n/a                 n/a                          Star-free                    Aperiodic finite

Each category of languages or grammars is a proper subset of the category directly above it, and any automaton in each category has an equivalent automaton in the category directly above it.
Turing machine is a term from computer science. A Turing machine is a system of rules, states and transitions rather than a real machine. It was first described by Alan Turing. A Turing machine serves one of two purposes: it can be used to decide a formal language, or it can compute mathematical functions. Turing machines are one of the most important formal models in the study of computer science.
A Turing machine consists of the following components (simplified):
 an infinite tape divided into cells;
 a read/write device that can read and overwrite one cell at a time and move along the tape;
 a finite set of states, one of which is the current state.
Furthermore, a working alphabet (a set of characters) has to be defined.
When a Turing machine is started, a word (over the working alphabet) must be present on the infinite tape of the machine. The read/write device starts on the first character and reads it; depending on the current state of the Turing machine, the read/write device then overwrites the character with a new one or moves one cell to the left or to the right. Furthermore, the current state of the machine can change.
A Turing machine is said to decide a language if it is always able to determine whether a given word is contained in a certain language or not. For this purpose the machine usually has two special states marked Accept and Reject. After a while one of the two states will be reached (depending on the input word) and the machine halts. If the machine is only guaranteed to halt for words in the language (and may run forever otherwise), the Turing machine is said to semi-decide the language.
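As a minimal sketch (the machine, state names and blank symbol are illustrative assumptions), here is a decider for a very simple language: the words over {"1"} that contain an even number of 1s. It scans right, tracking parity in its state, and moves to Accept or Reject at the first blank:

```python
# Illustrative decider: accept exactly the words with an even number
# of 1s.  "_" plays the role of the blank symbol.
RULES = {
    ("even", "1"): ("1", "R", "odd"),
    ("odd",  "1"): ("1", "R", "even"),
    ("even", "_"): ("_", "N", "Accept"),
    ("odd",  "_"): ("_", "N", "Reject"),
}

def decide(word):
    """Run the machine on `word`; return True iff it reaches Accept."""
    tape = dict(enumerate(word))
    pos, state = 0, "even"
    while state not in ("Accept", "Reject"):
        write, move, state = RULES[(state, tape.get(pos, "_"))]
        tape[pos] = write
        pos += {"L": -1, "R": 1, "N": 0}[move]
    return state == "Accept"

print(decide("1111"), decide("111"))   # → True False
```

Because every input drives the machine to Accept or Reject in finitely many steps, this machine decides its language (rather than merely semi-deciding it).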
If a Turing machine is used for the computation of functions, it has only one end state. When the machine reaches that state it halts, and the result of the function (depending on the input) can be found on the tape.
Turing machines were not invented to be built in reality, but they are very important for theoretical computer science because they are one of the simplest models of a computer. The Church–Turing thesis states that all computers are only as powerful as Turing machines. This can be used to determine whether a problem is solvable by a computer or not.
