I think I ought to talk more carefully about the largest-scale theoretical structure of quantum mechanics. A state in quantum mechanics does not fully determine the outcome of a measurement on a system, but it does exactly determine the probabilities of the various possible outcomes. The possible outcomes of a measurement are called “eigenvalues” of the observable being measured, and the possible states of the system in which it will definitely have a given eigenvalue are called the “eigenstates” associated with that eigenvalue. For example, a particle might be in the state |here〉 (I’m using Dirac’s notation for states), which means that a measurement of the particle’s location will definitely find that it’s “here”. States that are eigenstates of one observable are not necessarily eigenstates of other observables. One well-known example is that position eigenstates are not simultaneously momentum eigenstates - in the most extreme situation, if the particle is in the state |here〉 its momentum is entirely undetermined by the state, and a measurement of momentum could return any value at all. This is the origin of the position-momentum version of Heisenberg’s famous Uncertainty Principle.
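To make the eigenvalue/eigenstate language concrete, here’s a minimal numpy sketch of the Born rule. (A true position eigenstate like |here〉 lives in an infinite-dimensional space, so I’m using spin observables as a finite-dimensional stand-in; the particular observable and state are my own arbitrary choices for illustration.)

```python
import numpy as np

# An observable is a Hermitian matrix; its eigenvalues are the possible
# measurement outcomes, and its eigenvectors are the associated eigenstates.
# Pauli-X ("sideways" spin) as an example observable:
X = np.array([[0, 1],
              [1, 0]], dtype=complex)

eigenvalues, eigenstates = np.linalg.eigh(X)  # eigenstates are the columns

# A state that is an eigenstate of spin-z ("definitely up") but not of X:
state = np.array([1, 0], dtype=complex)

# Born rule: P(outcome) = |<eigenstate|state>|^2
for val, vec in zip(eigenvalues, eigenstates.T):
    prob = abs(np.vdot(vec, state)) ** 2
    print(f"outcome {val:+.0f} with probability {prob:.2f}")
# Both outcomes come out 0.50: a z-eigenstate is maximally uncertain
# in x, the finite-dimensional cousin of position vs. momentum.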
Now, the machinery of quantum mechanics is divided into two quite different parts. We might call them “unitary advance” and “measurement”. The unitary advance part is what happens when we aren’t measuring the properties of the system: the state then changes smoothly and entirely predictably with time, as I’ve described elsewhere. When we make a measurement, however, something radically different happens. The state of the system determines the probabilities of the various possible results of the measurement, and after the measurement the system is left in the eigenstate corresponding to the result we got. If we repeat the measurement immediately, before the state has had any time to change, we will find exactly the same result. If we wait longer between measurements, the unitary advance part kicks in again and we can use it to predict new probabilities. This means that we can always predict the probabilities of outcomes of the next measurement, but after that all bets are off, because we aren’t certain of the new start point from which to wind the state further into the future. (We can use statistical mechanics to handle our ignorance if we want to. We then have two sources of probability: our uncertainty about the exact state, and the quantum randomness. Increasing our knowledge of the system can reduce the former but never the latter.)
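Here’s a toy simulation of that two-part machinery. The Hamiltonian and the measured observable are arbitrary 2×2 choices of mine, just to have something concrete; this is a sketch of the postulates, not of any particular physical system.

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)

# "Unitary advance": the state evolves as |psi(t)> = exp(-iHt)|psi(0)>.
# H is an arbitrary Hermitian matrix I made up for the demo.
H = np.array([[1.0, 0.5],
              [0.5, -1.0]], dtype=complex)

def advance(state, t):
    return expm(-1j * H * t) @ state

def measure(state, observable):
    """Born-rule measurement: pick an eigenvalue at random with the
    right probability, then collapse onto the matching eigenstate."""
    vals, vecs = np.linalg.eigh(observable)
    probs = np.abs(vecs.conj().T @ state) ** 2
    probs /= probs.sum()             # guard against floating-point drift
    k = rng.choice(len(vals), p=probs)
    return vals[k], vecs[:, k]       # collapsed state is the eigenstate

Z = np.diag([1.0, -1.0]).astype(complex)
psi = np.array([1, 0], dtype=complex)

r1, psi = measure(psi, Z)
r2, psi = measure(psi, Z)     # repeat immediately: same result, always
psi = advance(psi, t=1.0)     # wait a while: unitary advance kicks in
r3, _ = measure(psi, Z)       # now the outcome is probabilistic again
print(r1, r2, r3)
```

Note how the collapse inside `measure` is exactly what guarantees that an immediate repeat gives the same answer.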
All of the above leads us to ask what exactly constitutes a measurement. Physicists are somewhat embarrassed about this, because we have no idea! In any actual experiment it’s perfectly clear when a measurement has occurred, but beyond that we really don’t know. It seems to have something to do with the system interacting with something big, but that’s horribly vague. We don’t even know how big the measuring apparatus has to be! There are quite a few theories, but none of them is very satisfactory:
- There’s one theory that says that sometimes a measurement (or rather a “state reduction”) just spontaneously happens. The people who like this theory then try to figure out how often reductions happen and stuff like that (I have no idea how they decide which observable the reduction corresponds to, given that there are infinitely many choices).
- Another theory is that it’s something to do with gravity: different possible outcomes of the measurement would give different arrangements of matter and hence different gravitational fields, and when this strain on reality gets too big (perhaps about as large as one graviton, if such things exist) then POP! it snaps to one of the possible geometries.
- Other people seem to think that the whole universe is divided into quantum and classical realms and the measurement happens when information crosses from the former to the latter. This theory seems too absurd for me to take seriously, but for most of the history of quantum mechanics it was the standard interpretation!
- Recently the idea of decoherence has become popular. On this view, measurement/reduction is a sort of illusion. What really happens is that the system and the measuring apparatus evolve into a superposition, and then we all get caught up in the superposition too. From this it’s possible to show that what looks like a quantum superposition from the “outside” looks like a statistical mixture when viewed from the “inside” (there’s a tiny numerical sketch of this just after the list).
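Here’s that sketch, under some heavy simplifications of my own: one qubit plays the “system”, one qubit plays the “environment”, and a CNOT gate stands in for the measurement-like interaction by which the environment records the system’s state.

```python
import numpy as np

# The system starts in a superposition; the off-diagonal terms of its
# density matrix are the hallmark of "quantum-ness".
psi_sys = np.array([1, 1], dtype=complex) / np.sqrt(2)
psi_env = np.array([1, 0], dtype=complex)   # one "environment" qubit

rho_before = np.outer(psi_sys, psi_sys.conj())
print(rho_before.round(2))   # off-diagonals = 0.5: a genuine superposition

# The measurement-like interaction: the environment records the system's
# state (a CNOT gate, the simplest toy model of decoherence).
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
joint = CNOT @ np.kron(psi_sys, psi_env)

# "Viewed from the inside": trace out the environment.
rho_joint = np.outer(joint, joint.conj())
rho_after = rho_joint.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)
print(rho_after.round(2))    # off-diagonals gone: a statistical mixture
```

The surviving diagonal entries are exactly the probabilities the Born rule would have assigned, which is why, from the inside, decoherence looks like a measurement.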
I’m not really sure I have much of an opinion when it comes to the measurement problem. None of the solutions feel quite right to me. I think we might be looking at the whole thing from the wrong point of view and tying ourselves into horrible knots…