The languages that are accepted by a Turing machine are exactly those that are generated by formal grammars. The lambda calculus is a way of defining functions; the functions that can be computed in the lambda calculus are exactly those that can be computed by a Turing machine. These three formulations (Turing machines, formal grammars, and the lambda calculus) look very different and were developed by different people, yet they are all equivalent and have the same problem-solving power.
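The idea that the lambda calculus defines functions can be illustrated with Church numerals, which encode natural numbers as functions. The following is a sketch using Python lambdas as a stand-in for lambda-calculus terms (the names `zero`, `succ`, `add`, `mul`, and `to_int` are illustrative, not part of the formal calculus):

```python
# A Church numeral n is a function that applies f to x, n times.
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))

# Arithmetic defined purely in terms of function application.
add = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))
mul = lambda m: lambda n: lambda f: m(n(f))

# Convert a Church numeral back to a Python int for inspection.
to_int = lambda n: n(lambda k: k + 1)(0)

two = succ(succ(zero))
three = succ(two)
print(to_int(add(two)(three)))  # 5
print(to_int(mul(two)(three)))  # 6
```

Everything here is built from single-argument functions, which is all the lambda calculus provides; that such a spare system computes exactly the Turing-computable functions is the substance of the equivalence.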
This is generally taken as evidence for the Church-Turing thesis, which is the claim that our intuitive notion of an algorithm or an effective procedure is captured by the mathematical definition of a Turing machine.
Electronic computers, and even quantum computers, are exactly equivalent to Turing machines, if they have access to an unbounded supply of memory. As a corollary, all implementable programming languages are at best equivalent in power to a Turing machine (in practice, very few are less powerful). Such languages are said to be Turing-complete.
Systems equivalent to a Turing machine include:
Turing machine with several tapes
Turing machine with a 2-dimensional "tape" (an infinite number of linear tapes)
Turing machine with a limited number of states and tape symbols
States×symbols can be any of 2×18, 3×10, 4×6, 5×5, 7×4, 10×3, 22×2
Turing machine with 2 states, always reversing the tape value
Turing machine with 2 states, with 3 possible value changes: 0->0, 0->1, and 1->1 (the machine can never erase a written 1; proposed by Hao Wang)
Non-deterministic Turing machine
Probabilistic Turing machine
Quantum Turing machine
The last three examples use a slightly different definition of accepting a language: a non-deterministic machine accepts a string if any computation path accepts, while probabilistic and quantum machines accept if most computation paths accept. Given these definitions, those machines have the same language-accepting power as a deterministic Turing machine.
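All of these variants extend the same basic single-tape model: a finite control reading and writing one cell at a time. A minimal sketch of that model follows, run on a machine that increments a binary number (the head starts on the least significant bit; the names `run`, `inc`, and the state labels are illustrative, not from the source):

```python
def run(program, tape, state, pos=0, blank="_", halt="HALT", max_steps=10_000):
    """Execute a Turing machine given as a dict:
    (state, symbol) -> (write, move, next_state), with move in {-1, 0, +1}."""
    tape = dict(enumerate(tape))  # sparse tape, unbounded in both directions
    for _ in range(max_steps):
        if state == halt:
            break
        symbol = tape.get(pos, blank)
        write, move, state = program[(state, symbol)]
        tape[pos] = write
        pos += move
    cells = range(min(tape), max(tape) + 1)
    return "".join(tape.get(i, blank) for i in cells).strip(blank)

# Binary increment: propagate a carry leftward until a 0 (or blank) absorbs it.
inc = {
    ("carry", "1"): ("0", -1, "carry"),
    ("carry", "0"): ("1", 0, "HALT"),
    ("carry", "_"): ("1", 0, "HALT"),
}

print(run(inc, "1011", "carry", pos=3))  # prints 1100 (11 + 1 = 12)
```

The equivalence results above say that adding tapes, dimensions, or non-determinism to this model changes efficiency at most, never the set of computable functions.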
The Chomsky hierarchy defines four classes of languages, each accepted by a corresponding class of machines. Each machine consists of a non-deterministic finite state machine combined with some form of memory. If the memory is an infinite tape, the machine has the full power of a Turing machine and accepts exactly those languages generated by unrestricted grammars. If it is given only an amount of memory proportional to the size of the input (a linear bounded automaton), it recognizes exactly those languages generated by context-sensitive grammars. If its memory is a single stack (a pushdown automaton), it recognizes exactly those languages generated by context-free grammars. If it has no additional memory at all (a finite automaton), it accepts exactly those languages generated by regular grammars.
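The jump from no memory to a stack can be made concrete. The language of strings a^n b^n (n a's followed by exactly n b's) is context-free but not regular: a finite automaton cannot count unboundedly, while a single stack suffices. A sketch of the stack-based recognizer (the function name is illustrative):

```python
def accepts_anbn(s):
    """Recognize { a^n b^n : n >= 0 } using only a stack as auxiliary memory."""
    stack = []
    seen_b = False
    for ch in s:
        if ch == "a":
            if seen_b:          # an 'a' after any 'b' is never valid
                return False
            stack.append(ch)    # push one marker per 'a'
        elif ch == "b":
            seen_b = True
            if not stack:       # more b's than a's
                return False
            stack.pop()         # match one 'a' per 'b'
        else:
            return False
    return not stack            # counts must match exactly

print(accepts_anbn("aaabbb"))  # True
print(accepts_anbn("aabbb"))   # False
```

Similar separating examples exist at each level of the hierarchy; for instance, a^n b^n c^n is context-sensitive but not context-free.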
Other restrictions on memory, time, or other resources have also been studied. The results for those restrictions are usually considered part of complexity theory rather than computability theory.