Recursion is a way of specifying a process by means of itself. More precisely (and to dispel the appearance of circularity in the definition), "complicated" instances of the process are defined in terms of "simpler" instances, and the "simplest" instances are given explicitly.
Mathematical linguist Noam Chomsky produced evidence that unlimited extension of a language such as English is possible only by the recursive device of embedding sentences in sentences. Thus, a talky little girl may say, "Dorothy, who met the Wicked Witch of the West in Munchkin Land where her wicked Witch sister was killed, liquidated her with a pail of water." Clearly, two simple sentences — "Dorothy met the Wicked Witch of the West in Munchkin Land" and "Her sister was killed in Munchkin Land" — can be embedded in a third sentence, "Dorothy liquidated her with a pail of water," to obtain a very talky sentence.
Niels K. Jerne, the 1984 Nobel Prize laureate in Physiology or Medicine, used Chomsky's transformational-generative grammar model to explain the human immune system, equating "components of a generative grammar ... with various features of protein structures." The title of Jerne's Stockholm Nobel lecture was The Generative Grammar of the Immune System.
Here is another, perhaps simpler way to understand recursive processes:
Are we done yet? If so, return the results. Without such a termination condition a recursion would go on forever.
If not, simplify the problem, solve the simpler problem(s), and assemble the results into a solution for the original problem. Then return that solution (a small sketch of this pattern follows below).
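As a minimal, illustrative sketch of this two-step pattern (the factorial example and the function name are choices made here, not taken from the text above), the termination condition and the simplification step look like this in Python:

    def factorial(n):
        # Are we done yet? 0! is given explicitly; this is the termination condition.
        if n == 0:
            return 1
        # If not: simplify the problem to (n - 1)!, solve that simpler problem
        # recursively, and assemble the result into a solution for n!.
        return n * factorial(n - 1)

    print(factorial(5))  # prints 120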
A more humorous illustration goes: "In order to understand recursion, one must first understand recursion." Or perhaps more accurate is the following due to Andrew Plotkin: "If you already know what recursion is, just remember the answer. Otherwise, find someone who is standing closer to Douglas Hofstadter than you are; then ask him or her what recursion is."
A classic example of a recursive definition is the set of all 'true reachable' propositions in an axiomatic system:
if a proposition is an axiom, it is a true reachable proposition.
if a proposition can be obtained from true reachable propositions by means of inference rules, it is a true reachable proposition.
The set of true reachable propositions is the smallest set of propositions satisfying these conditions.
This set is called the set of 'true reachable propositions' because, in nonconstructive approaches to the foundations of mathematics, the set of true propositions is larger than the set recursively constructed from the axioms and rules of inference. See also Gödel's incompleteness theorems.
(Note that determining whether a certain object is in a recursively defined set is not, in general, an algorithmic task.)
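As a hedged sketch of such a recursively defined set, the following Python fragment computes the smallest set containing some "axioms" and closed under a one-step derivation rule, by fixed-point iteration. The names true_reachable and rules are hypothetical helpers introduced here for illustration, and the loop terminates only because the toy universe of values is finite; as the note above says, membership in a recursively defined set is not, in general, decidable by any such procedure.

    def true_reachable(axioms, rules):
        # Smallest set containing the axioms and closed under the rules,
        # computed by repeatedly applying the rules until nothing new appears.
        reachable = set(axioms)
        while True:
            new = rules(reachable) - reachable
            if not new:          # fixed point reached: nothing new is derivable
                return reachable
            reachable |= new

    # Toy example: "axioms" 0 and 1, one rule "from m and n, derive (m + n) mod 5".
    closure = true_reachable({0, 1}, lambda s: {(m + n) % 5 for m in s for n in s})
    print(sorted(closure))  # prints [0, 1, 2, 3, 4]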
Virtually all programming languages in use today allow the direct specification of recursive functions and procedures. When such a function is called, the language implementation keeps track of the various instances of the function (on most languages and architectures, by using a call stack, although other methods may be used). Conversely, every recursive function can be transformed into an iterative function by using an explicit stack.
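As an illustration of that last point (a sketch only; the function name and the factorial example are choices made here), a recursive factorial can be rewritten iteratively by keeping the deferred work on an explicit stack instead of the call stack:

    def factorial_iterative(n):
        # The multiplications a recursive version would leave pending on the
        # call stack are kept on an explicit stack instead.
        stack = []
        while n > 0:        # "descend": push the work each recursive call would defer
            stack.append(n)
            n -= 1
        result = 1          # base case
        while stack:        # "unwind": perform the deferred multiplications
            result *= stack.pop()
        return result

    print(factorial_iterative(5))  # prints 120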
Any function that can be evaluated by a computer can be expressed in terms of recursive functions, without use of iteration, and conversely.
Some languages designed for logic programming and functional programming provide recursion as the only means of repetition directly available to the programmer. Such languages generally make tail recursion as efficient as iteration, letting programmers express other repetition structures (such as Scheme's map and for) in terms of recursion.
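To illustrate the style (a sketch, written in Python only for readability: Python itself does not eliminate tail calls, whereas languages such as Scheme do), a loop that sums a list can be expressed as a tail-recursive function, in which the recursive call is the very last thing the function does:

    def sum_list(items, accumulator=0):
        # A loop expressed as tail recursion: in a language with tail-call
        # elimination this runs in constant stack space.
        if not items:                                        # termination condition
            return accumulator
        return sum_list(items[1:], accumulator + items[0])   # tail call

    print(sum_list([1, 2, 3, 4]))  # prints 10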
Recursion is deeply embedded in the theory of computation, with the theoretical equivalence of recursive functions and Turing machines at the foundation of ideas about the universality of the modern computer.
In set theory, the recursion theorem guarantees that recursively defined functions exist. Given a set X, an element a of X and a function f : X -> X, the theorem states that there is a unique function F : N -> X (where N denotes the set of natural numbers) such that
F(0) = a
F(n + 1) = f(F(n)) for any natural number n.
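To make the statement concrete, here is a small illustrative sketch (the names F, a and f are simply the symbols from the statement above) of evaluating the unique function the theorem describes, obtained by iterating f starting from a:

    def F(n, a, f):
        # F(0) = a and F(n + 1) = f(F(n)): apply f to a exactly n times.
        value = a
        for _ in range(n):
            value = f(value)
        return value

    # With X the natural numbers, a = 1 and f(x) = 2 * x, F(n) = 2 ** n.
    print([F(n, 1, lambda x: 2 * x) for n in range(5)])  # prints [1, 2, 4, 8, 16]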