Quantum mechanics is a physical theory that describes the behavior of physical systems at short distances. Quantum mechanics provides a mathematical framework, derived from a small set of basic principles, capable of producing experimental predictions for three types of phenomena that classical mechanics and classical electrodynamics cannot account for: quantization, wave-particle duality, and quantum entanglement. The related terms quantum physics and quantum theory are sometimes used as synonyms of quantum mechanics, but also to denote a superset of theories that includes the pre-quantum-mechanics old quantum theory (see the History section), or, when quantum mechanics is used in a more restricted sense, theories such as quantum field theory.
Wave functions can change as time progresses. For example, a particle moving in empty space may be described by a wave function that is a wave packet centered around some mean position. As time progresses, the center of the wave packet changes, so that the particle becomes more likely to be located at a different position.
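The spreading of such a wave packet can be made quantitative. For a free particle of mass m prepared in a Gaussian wave packet of initial width σ₀, a standard textbook calculation (not derived in the text above) gives the width at a later time t:

```latex
% Free-particle Gaussian wave packet: initial profile and spreading width
\psi(x, 0) \propto e^{-x^2 / 4\sigma_0^2},
\qquad
\sigma(t) = \sigma_0 \sqrt{1 + \left( \frac{\hbar t}{2 m \sigma_0^2} \right)^2 }
```

Narrower packets (smaller σ₀) spread faster, reflecting the larger momentum uncertainty of a more sharply localized particle.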
Some wave functions describe probability distributions that are constant in time. Many systems that would be treated dynamically in classical mechanics are described by such static wave functions. For example, an electron in an unexcited atom is pictured classically as a particle circling the atomic nucleus, whereas in quantum mechanics it is described by a static, spherically symmetric probability cloud surrounding the nucleus.
The actual measurement of an observable of the system always modifies the system and its wave function: immediately after a measurement is performed, the wave function becomes one of the wave functions compatible with the measured value. This process is known as wave function collapse. The probability of collapsing into a given wave function depends on the type of measurement and can be computed from the wave function just before the collapse. Consider the above example of a particle moving in empty space. If we measure the particle's position, we obtain a random value x. In general, we cannot predict with certainty which value of x we will obtain, although we are likely to obtain one near the center of the wave packet, where the amplitude of the wave function is large. After the measurement has been performed, the wave function of the particle collapses into one that is sharply concentrated around the observed position x. Measuring the speed of the particle instead would result in a totally different wave function.
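The position-measurement scenario above can be sketched numerically. The following is a minimal illustration of the Born rule and collapse for a discretized wave packet; the grid, packet width, detector resolution, and random seed are all illustrative assumptions, not values from the text.

```python
import numpy as np

# Discretize space and prepare a normalized Gaussian wave packet.
x = np.linspace(-10.0, 10.0, 2001)
dx = x[1] - x[0]
psi = np.exp(-x**2 / 4.0).astype(complex)
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)

# Born rule: the probability of finding the particle in each grid cell
# is |psi|^2 dx. Sample a random measurement outcome from it.
p = np.abs(psi)**2 * dx
rng = np.random.default_rng(0)            # seed chosen for reproducibility
observed_x = rng.choice(x, p=p / p.sum())

# Collapse: after the measurement, the state is sharply concentrated
# around observed_x (0.05 is an assumed detector resolution).
width = 0.05
collapsed = np.exp(-(x - observed_x)**2 / (4 * width**2)).astype(complex)
collapsed /= np.sqrt(np.sum(np.abs(collapsed)**2) * dx)
```

Outcomes near x = 0, where the packet's amplitude is largest, are the most likely, but any individual outcome is random, as described above.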
The time evolution of wave functions is deterministic in the sense that, given a wavefunction at an initial time, it makes a definite prediction of what the wavefunction will be at any later time. During a measurement, the change of the wavefunction into another one is probabilistic, not deterministic. The probabilistic nature of quantum mechanics thus stems from the act of measurement.
One of the consequences of wave function collapse is that measuring observable A immediately after observable B will generally not give the same result as measuring observable B immediately after A. This helps explain wave-particle duality: measuring a particle-like observable, such as position, impacts the wave-like behavior, such as interference, of the system, and vice versa. While any system has both particle- and wave-like properties, we can observe only one of them at a time.
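The order dependence of measurements can be seen in the simplest quantum system, a two-level system described by Pauli matrices. This is a standard textbook example, not one taken from the text above: the two observables fail to commute, and a system prepared in an eigenstate of one gives a random result when the other is measured.

```python
import numpy as np

# Two non-commuting observables on a two-level system.
sx = np.array([[0, 1], [1, 0]], dtype=complex)   # "x" observable
sz = np.array([[1, 0], [0, -1]], dtype=complex)  # "z" observable

# Nonzero commutator: measuring sx then sz differs from sz then sx.
commutator = sx @ sz - sz @ sx
noncommuting = not np.allclose(commutator, 0)    # True

# A definite sz outcome ("up") gives a 50/50 random sx outcome.
up = np.array([1, 0], dtype=complex)             # sz eigenstate
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)  # sx eigenstate
p_plus_given_up = abs(np.vdot(plus, up))**2      # 0.5
```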
In some cases, the wave function of a two-particle system cannot be separated into two independent wave functions, one for each particle: the two particles are entangled. This implies, if the theory is correct, that when a measurement is made on one of them, the wave function collapses and the second particle is instantaneously affected, even if it is far away.
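Whether a two-particle state can be separated into independent wave functions can be checked directly. For two qubits, the state is separable exactly when its 2x2 coefficient matrix has rank 1 (a single nonzero Schmidt coefficient). The sketch below uses the standard Bell state as the entangled example; the states chosen are illustrative, not from the text.

```python
import numpy as np

# Coefficient matrices c[i, j] of two-qubit states sum(c[i,j] |i>|j>).
bell = np.array([[1, 0], [0, 1]], dtype=complex) / np.sqrt(2)  # (|00> + |11>)/sqrt(2)
product = np.outer([1, 0], [0, 1]).astype(complex)             # |0> (x) |1>

# Schmidt coefficients = singular values of the coefficient matrix.
schmidt_bell = np.linalg.svd(bell, compute_uv=False)
schmidt_product = np.linalg.svd(product, compute_uv=False)

entangled = np.count_nonzero(schmidt_bell > 1e-12) > 1      # True: rank 2
separable = np.count_nonzero(schmidt_product > 1e-12) == 1  # True: rank 1
```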
Each observable is represented by a densely defined Hermitian linear operator acting on the state space. Each eigenstate of an observable corresponds to an eigenvector of the operator, and the associated eigenvalue corresponds to the value of the observable in that eigenstate. If the operator's spectrum is discrete, the observable can only attain those discrete eigenvalues. During a measurement, the probability that a system collapses to each eigenstate is given by the absolute square of the inner product between the eigenstate vector and the state vector just before the measurement. The possible results of a measurement are the eigenvalues of the operator, which explains the choice of Hermitian operators: all their eigenvalues are real. We can therefore find the probability distribution of an observable in a given state by computing the spectral decomposition of the corresponding operator. Heisenberg's uncertainty principle is represented by the statement that the operators corresponding to certain observables do not commute.
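In finite dimensions, this recipe can be carried out directly: diagonalize the Hermitian operator and apply the Born rule to each eigenvector. The operator and state below are illustrative choices, not drawn from the text.

```python
import numpy as np

# An illustrative 2x2 Hermitian observable (equals its conjugate transpose).
A = np.array([[2, 1 - 1j],
              [1 + 1j, 3]], dtype=complex)

# Spectral decomposition: real eigenvalues, orthonormal eigenvectors.
eigvals, eigvecs = np.linalg.eigh(A)

# A normalized state just before the measurement.
state = np.array([1, 1j], dtype=complex)
state /= np.linalg.norm(state)

# Born rule: P(lambda_i) = |<e_i | state>|^2 for each eigenvector e_i.
probs = np.abs(eigvecs.conj().T @ state)**2

# Consistency check: <state|A|state> = sum_i P(lambda_i) * lambda_i.
expectation = np.real(np.vdot(state, A @ state))
```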
Early attempts to merge quantum mechanics with special relativity involved the replacement of the Schrödinger equation with a covariant equation such as the Klein-Gordon equation or the Dirac equation. While these theories were successful in explaining many experimental results, they had certain unsatisfactory qualities stemming from their neglect of the relativistic creation and annihilation of particles. A fully relativistic quantum theory required the development of quantum field theory, which applies quantization to a field rather than a fixed set of particles. The first complete quantum field theory, quantum electrodynamics, provides a fully quantum description of the electromagnetic interaction.
The full apparatus of quantum field theory is often unnecessary for describing electrodynamic systems. A simpler approach, one employed since the inception of quantum mechanics, is to treat charged particles as quantum mechanical objects being acted on by a classical electromagnetic field. For example, the elementary quantum model of the hydrogen atom describes the electric field of the hydrogen atom using a classical 1/r Coulomb potential. This "semi-classical" approach fails if quantum fluctuations in the electromagnetic field play an important role, such as in the emission of photons by charged particles.
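As a concrete instance of this semi-classical model, the classical Coulomb potential in the Schrödinger equation yields the familiar hydrogen bound-state energies (standard textbook values, not stated in the text above):

```latex
% Coulomb potential and the resulting hydrogen energy levels
V(r) = -\frac{e^2}{4\pi\varepsilon_0 r},
\qquad
E_n = -\frac{m_e e^4}{2 (4\pi\varepsilon_0)^2 \hbar^2} \, \frac{1}{n^2}
    \approx -\frac{13.6~\text{eV}}{n^2},
\quad n = 1, 2, 3, \ldots
```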
It has proven difficult to construct quantum models of gravity, the remaining fundamental force. Semi-classical approximations are workable, and have led to predictions such as Hawking radiation. However, the formulation of a complete theory of quantum gravity is hindered by apparent incompatibilities between general relativity, the most accurate theory of gravity currently known, and some of the fundamental assumptions of quantum theory. The resolution of these incompatibilities is an area of active research, and theories such as string theory are among the possible candidates for a future theory of quantum gravity.
Semi-classical approximations are techniques that make it possible to formulate a quantum problem with some physical quantities replaced by their classical analogues, in an effort to reduce the complexity of the model. Even within non-relativistic quantum mechanics, a fully microscopic treatment generally requires large-scale numerical computations. Analytic quantum solutions that describe the system behavior in terms of known mathematical functions are available only for a small class of systems, of which the harmonic oscillator and the hydrogen atom are the most important representatives.
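For reference, the analytic solutions mentioned here take a simple closed form. The energy levels of the quantum harmonic oscillator, for instance, are (a standard textbook result):

```latex
% Harmonic oscillator spectrum: evenly spaced levels above a zero-point energy
E_n = \hbar \omega \left( n + \tfrac{1}{2} \right),
\qquad n = 0, 1, 2, \ldots
```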
Even the helium atom, containing just one more electron than hydrogen, defies all attempts at a fully analytic treatment in quantum mechanics.
In such a situation, approximate semi-classical results can provide valuable insights. The necessary methods rely on a detailed understanding of the corresponding classical mechanics, allowing in particular for the existence of chaos. The study of these approximations belongs to the field of quantum chaos.
Researchers are currently seeking robust methods of directly manipulating quantum states. Efforts are being made to develop quantum cryptography, which will allow guaranteed secure transmission of information. A more distant goal is the development of quantum computers, which are expected to perform certain computational tasks with much greater efficiency than classical computers. Another active research topic is quantum teleportation, which deals with techniques to transmit quantum states over arbitrary distances.
The Copenhagen interpretation, due largely to Niels Bohr, was the standard interpretation of quantum mechanics when it was first formulated. According to it, the probabilistic nature of quantum mechanical predictions cannot be explained in terms of some other deterministic theory and does not simply reflect our limited knowledge. Quantum mechanics provides probabilistic results because the physical universe is itself probabilistic rather than deterministic.
Albert Einstein, himself one of the founders of quantum theory, disliked this loss of determinism in measurement. He held that quantum mechanics must be incomplete, and produced a series of objections to the theory. The most famous of these was the EPR paradox. John Stewart Bell's theoretical analysis of the EPR paradox, and its later experimental verification, disproved a large class of such hidden-variable theories and persuaded the majority of physicists that quantum mechanics is not an approximation to an underlying classical hidden-variable theory.
The many worlds interpretation, formulated in 1956, holds that all the possibilities described by quantum theory simultaneously occur in a "multiverse" composed of mostly independent parallel universes. While the multiverse is deterministic, we perceive non-deterministic behavior governed by probabilities because we can observe only the universe we inhabit.
The Bohm interpretation postulates the existence of a non-local, universal wave function, evolving according to the Schrödinger equation, which allows distant particles to interact instantaneously. It is not popular among physicists, largely because it is considered very inelegant.
These theories, though successful, were strictly phenomenological: there was no rigorous justification for quantization. They are collectively known as the old quantum theory.
The phrase "quantum physics" was first used in Johnston's Planck's Universe in Light of Modern Physics.
Modern quantum mechanics was born in 1925, when Heisenberg developed matrix mechanics and Schrödinger invented wave mechanics and the Schrödinger equation. Schrödinger subsequently showed that the two approaches were equivalent.
Heisenberg formulated his uncertainty principle in 1927, and the Copenhagen interpretation took shape at about the same time.