The quantum world is probabilistic, whereas the classical world, which is where all of our measurements happen, contains only unique outcomes.
Quantum theory tells us with unflagging reliability what to expect. What more do you want? The principle of complementarity seemed a deeply unsatisfying compromise to many physicists, since it not only evaded difficult questions about the nature of reality but essentially forbade them. Still, complementarity had at least the virtue of pinpointing where the problems lay: in understanding what we mean by measurement.
It is through measurement that objects become things rather than possibilities — and furthermore, they become things with definite states, positions, velocities and other properties. What we needed to unite the quantum and classical views, then, was a proper theory of measurement. There things languished for a long time. Now we have that theory. The boundary between quantum and classical turns out not to be a chasm after all, but a sensible, traceable path.
A ball has a position, or a speed, or a mass. I can measure those things, and the things I measure are the properties of the ball. What more is there to say?
In the quantum world, the position of a particle is nothing more than a whole set of possible positions until the moment when it is observed. The same holds true for any other aspect of the particle. How does the multitude of potential properties in a quantum object turn into one specific reading on a measuring device? What is it about the object that caused the device to point to that precise answer?
Quantum objects have a wave nature — which is to say, the theory tells us that they can be described as if they were waves, albeit waves of a peculiar sort. The waves do not move through any physical substance, as do waves in air or water, but are encoded in a purely mathematical object called a wave function that can be converted to probabilities of values of observable quantities. As a result, quantum particles such as photons of light, electrons, atoms, or even entire molecules can exhibit interference, a classical property of waves in which two peaks reinforce each other when they overlap, whereas when a peak coincides with a trough the two can cancel each other out.
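What interference means can be made concrete with a little arithmetic. The sketch below is a toy calculation of my own, not anything drawn from the experiments discussed here; the wave-packet shapes and phases are arbitrary. When the two amplitudes are added coherently, a cross term appears that can either reinforce or cancel; when only the probabilities are added, it is absent.

```python
import numpy as np

x = np.linspace(-10, 10, 2001)

def packet(x, x0, k):
    """A Gaussian wave packet centred at x0 with phase winding k (toy parameters)."""
    return np.exp(-(x - x0)**2) * np.exp(1j * k * x)

psi1 = packet(x, -1.0, 5.0)
psi2 = packet(x, 1.0, -5.0)

# Coherent addition: amplitudes add first, then we square.
coherent = np.abs(psi1 + psi2)**2
# Incoherent addition: only the probabilities add; no interference.
incoherent = np.abs(psi1)**2 + np.abs(psi2)**2

# The difference between the two is the oscillating interference term,
# which swings both positive (reinforcement) and negative (cancellation).
cross = 2 * np.real(psi1 * np.conj(psi2))
print(np.allclose(coherent, incoherent + cross))  # True
```

The cross term is the whole of the "wave" phenomenology: destroy it, and the two packets behave like independent classical alternatives.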
Asking whether these quantum objects really are particles or waves misses the point, because both of those are classical concepts. Quantum effects such as interference rely on the wave functions of different entities being coordinated (the technical term is coherent) with one another. That sort of coherence is what permits the quantum property of superposition, in which particles are said to be in two or more states at once.
But if the wave functions of those states are coherent, then both states remain possible outcomes of a measurement. If their wave functions are not coherent, two states cannot interfere, nor maintain a superposition. The process called decoherence therefore destroys these fundamentally quantum properties, and the states behave more like distinct classical systems.
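In the standard mathematical language, that loss of coherence shows up as the decay of the off-diagonal elements of a density matrix. Here is a minimal sketch using the textbook phase-damping model; the decay rate gamma is an arbitrary illustrative number, not one drawn from any real environment.

```python
import numpy as np

# Equal superposition (|0> + |1>)/sqrt(2), written as a density matrix.
psi = np.array([1, 1]) / np.sqrt(2)
rho = np.outer(psi, psi.conj())

def decohere(rho, gamma, t):
    """Damp the off-diagonal (coherence) terms by exp(-gamma * t).

    This is the standard phase-damping form from textbooks, not a claim
    about any specific physical environment.
    """
    out = rho.astype(complex).copy()
    decay = np.exp(-gamma * t)
    out[0, 1] *= decay
    out[1, 0] *= decay
    return out

for t in [0.0, 1.0, 10.0]:
    r = decohere(rho, gamma=1.0, t=t)
    print(t, abs(r[0, 1]))  # coherence: 0.5, then ~0.18, then ~0
```

Note that the diagonal entries, the classical probabilities of the two outcomes, are untouched; only the capacity to interfere drains away.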
Decoherence, and not sheer size per se, is the fundamental dividing line between what we think of as quantum behaviour and familiar classical behaviour. What, though, causes decoherence? It arises from a long-neglected aspect of quantum entities: their environment. Physicists usually analyse a quantum system as if it were isolated from everything around it. Usually that works fine. But not if we want to observe anything about the quantum world. The foundations of decoherence theory were laid in the 1970s by the German physicist H. Dieter Zeh.
Much of its subsequent development is due to Wojciech Zurek of Los Alamos National Laboratory in New Mexico. Polish by birth and exuberantly curly haired, Zurek displays a laconic calm in the face of the mind-boggling aspects of quantum mechanics that he has uncovered. That composure makes sense once you appreciate that he studied under John Wheeler, the near-legendary American physicist who himself worked with Bohr and had a rare talent for the wry epigram.
He coined the term wormhole and popularised the term black hole. Zurek has become one of the key architects and advocates of decoherence theory, helping to establish it as the central concept connecting the quantum and classical worlds. This connection comes from the fact that quantum coherence is contagious. If one quantum object interacts with another, they become linked into a composite superposition: in some sense, they become a single system.
This is, in fact, the only thing that can happen in such an interaction, according to quantum mechanics. The two objects are then said to be entangled. It might sound spooky, but this is merely what happens when a quantum system interacts with its environment — as a photon of light or an air molecule bounces off it, say. As a result, coherence spreads into the environment. In theory, there is no end to this process. An entangled air molecule hits another, and the second molecule gets drawn into the entangled state.
Meanwhile, other particles hit the initial quantum system, too. This spreading of entanglement is what destroys the manifestation of coherence in the original quantum system. Decoherence is not actually a loss of superposition and coherence, but rather a loss of our ability to detect these things in the original system: to recover them, we would have to gather up every scrap of coherence that has leaked into the surroundings. With or without us, the Universe is always looking. And how can we possibly hope to do that, to monitor every photon that bounces off the original system, every air molecule that collided with it and then subsequently with others?
The pieces of the puzzle have been scattered so widely that they are lost, for all practical purposes, even though in principle they are still out there, and remain so, as far as quantum mechanics tells us, indefinitely. Decoherence is a gradual and real process that occurs at a particular rate. Quantum mechanics allows us to calculate that rate, so that we can put the theory of decoherence to the test.
In one such test, the loss of interference between states of an atom owing to decoherence, as calculated from quantum theory, matched the experimental observations perfectly. And a team at the University of Vienna led by Anton Zeilinger and Markus Arndt watched interference vanish between the quantum waves of large molecules, as they altered the rate of decoherence by gradually admitting a background gas into the chamber where the interference took place, so that the gas molecules would collide with those in the matter waves.
Again, theory and experiment tallied well.
Decoherence is a phenomenally efficient process, probably the most efficient one known to science. For a dust grain a fraction of a millimetre across floating in air, it takes roughly a millionth of the time a photon of light needs to cross a single proton! But if we watch nature carefully enough, we can see how the trick is done. Notice that this effect of decoherence has nothing to do with observation in the normal sense.
The decay of quantum superposition and interference by decoherence is only the first element in a quantum theory of measurement, however. We also have to explain why classical measuring instruments register the values they do. Exactly how we define a superposition state depends on how we choose to write the maths. From the quantum perspective, all states are equally valid solutions to the equations. Why do we see the common-sense states but not the imponderable superpositions? There are two parts to the answer.
The first is that the environment does not degrade quantum states at random. It specifically selects states that have particular mathematical properties of symmetry, and trashes the others. Zurek calls this environment-induced selection, or einselection. Survival means that the state is measurable in principle, but we still have to get at that information to detect the state. So we need to ask how that information becomes available to an experimenter. If we were able, with some amazing instrument, to record the trajectories of all the air molecules bouncing off a speck of dust, we could figure out where the speck is without looking at it directly; we could just monitor the imprint it leaves on its environment.
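Einselection can be seen in a two-qubit toy model. The sketch below is a standard textbook-style illustration, not Zurek's actual calculations: the environment "reads" the system through a CNOT gate, so states in the pointer basis pass through unscathed, while a superposition of them comes out entangled and degraded.

```python
import numpy as np

# CNOT: the environment qubit flips iff the system qubit is |1>.
# This toy interaction "reads out" the system in the {|0>, |1>} basis.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

def purity_after_interaction(system):
    """Purity Tr(rho^2) of the system after one CNOT with a fresh |0> qubit."""
    env = np.array([1, 0])
    joint = CNOT @ np.kron(system, env)
    amps = joint.reshape(2, 2)      # rows: system states, cols: environment states
    rho = amps @ amps.conj().T      # trace out the environment
    return np.real(np.trace(rho @ rho))

pointer0 = np.array([1, 0])
pointer1 = np.array([0, 1])
superpos = np.array([1, 1]) / np.sqrt(2)

print(purity_after_interaction(pointer0))  # 1.0: pointer state survives intact
print(purity_after_interaction(pointer1))  # 1.0
print(purity_after_interaction(superpos))  # 0.5: superposition becomes a mixture
```

The pointer states |0> and |1> are the survivors: the interaction copies them without disturbing them, whereas any superposition of them is dragged into entanglement and loses its purity.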
Just as coupling the object to its environment sets decoherence in train, so too it imprints information about the object onto the environment, creating a kind of replica. A measurement of that object then amounts to acquiring this information from the replica. A detailed theoretical analysis of decoherence carried out by Zurek and his colleagues shows that some quantum states are better than others at producing these replicas: they leave a more robust footprint, which is to say, more copies.
These robust states are the ones that we can measure, and that ultimately produce a unique classical signature from the underlying quantum morass.
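That footprint idea can also be put in miniature code. The toy state below is my own construction, with arbitrary weights: the system's pointer value has been copied into every environment qubit, so that interrogating any single one of them already yields the full classical record.

```python
import numpy as np

# GHZ-like "branching" state a|0 00...0> + b|1 11...1>: the system's pointer
# value has been copied into each of n_env environment qubits.
a, b = 0.8, 0.6          # arbitrary pointer weights with a**2 + b**2 == 1
n_env = 6
n_total = n_env + 1      # system qubit plus environment

state = np.zeros(2**n_total)
state[0] = a             # system |0>, environment |00...0>
state[-1] = b            # system |1>, environment |11...1>

def reduced_qubit(state, k, n_total):
    """Density matrix of qubit k after tracing out all the other qubits."""
    t = state.reshape(2**k, 2, 2**(n_total - 1 - k))
    t = np.moveaxis(t, 1, 0).reshape(2, -1)
    return t @ t.conj().T

# Every environment qubit, read on its own, carries the same record of the
# pointer probabilities a**2 and b**2: the footprint is redundant.
for k in range(1, n_total):
    print(k, np.diag(reduced_qubit(state, k, n_total)).real)
```

Redundancy is what makes the record objective: many observers can each sample a different fragment of the environment and still agree on what the system is doing.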