"Quantum Theory: If a tree falls in forest…"
by
Jim Baggott
February 14th, 2011
Oxford University Press
If a tree falls in the forest, and there’s nobody around to hear, does it make a sound?
For centuries, philosophers have been teasing our intellects with such questions. Of course, the answer depends on how we choose to interpret the word ‘sound’. If by sound we mean compressions and rarefactions in the air which result from the physical disturbances caused by the falling tree and which propagate through the air at audio frequencies, then we might not hesitate to answer in the affirmative.
Here the word ‘sound’ is used to describe a physical phenomenon – the wave disturbance. But sound is also a human experience, the result of physical signals delivered by the human sense organs and synthesized in the mind as a form of perception.
Now, to a large extent, we can interpret the actions of human sense organs in much the same way we interpret mechanical measuring devices. The human auditory apparatus simply translates one set of physical phenomena into another, leading eventually to stimulation of those parts of the brain cortex responsible for the perception of sound. It is here that the distinction arises. Everything to this point is explicable in terms of physics and chemistry, but the process by which we turn electrical signals in the brain into human perception and experience in the mind remains, at present, unfathomable.
Philosophers have long argued that sound, colour, taste, smell and touch are all secondary qualities which exist only in our minds. We have no basis for our common-sense assumption that these secondary qualities reflect or represent reality as it really is. So, if we interpret the word ‘sound’ to mean a human experience rather than a physical phenomenon, then when there is nobody around there is a sense in which the falling tree makes no sound at all.
This business about the distinction between ‘things-in-themselves’ and ‘things-as-they-appear’ has troubled philosophers for as long as the subject has existed, but what does it have to do with modern physics, specifically the story of quantum theory? In fact, such questions have dogged the theory almost from the moment of its inception in the 1920s. Ever since it was discovered that atomic and sub-atomic particles exhibit both localised, particle-like properties and delocalised, wave-like properties, physicists have been embroiled in a debate about what we can and can’t know about the ‘true’ nature of physical reality.
Albert Einstein once famously declared that God does not play dice. His complaint concerned the role that chance plays in quantum theory’s account of measurement. In essence, a quantum particle such as an electron may be described in terms of a delocalised ‘wavefunction’, with probabilities for appearing ‘here’ or ‘there’. When we look to see where the electron actually is, the wavefunction is said to ‘collapse’ instantaneously, and the electron appears ‘here’ with a frequency consistent with the probability predicted by quantum theory. But there is no predicting precisely where an individual electron will be found. Chance is inherent in the collapse of the wavefunction, and it was this feature of quantum theory that got Einstein so upset. To make matters worse, if the collapse is instantaneous then this implies what Einstein called a ‘spooky action-at-a-distance’ which, he argued, appeared to violate a key postulate of his own special theory of relativity.
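To see where the probabilities enter, it helps to write down the textbook Born rule for a simple two-state superposition. This is a minimal sketch in standard notation, not Baggott’s own formulation:

```latex
% A quantum particle in a superposition of 'here' and 'there',
% with complex amplitudes alpha and beta:
\psi \;=\; \alpha\,\lvert \text{here} \rangle \;+\; \beta\,\lvert \text{there} \rangle ,
\qquad |\alpha|^2 + |\beta|^2 = 1 .

% On measurement the wavefunction is said to collapse onto one outcome.
% The Born rule supplies only the probabilities of each result:
P(\text{here}) = |\alpha|^2 , \qquad P(\text{there}) = |\beta|^2 .
```

Which outcome actually occurs in any single measurement is, on the standard view, irreducibly a matter of chance.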
So what evidence do we have for this mysterious collapse of the wavefunction? Well, none actually. We postulate the collapse in an attempt to explain how a quantum system with many different possible outcomes before measurement transforms into a system with one and only one result after measurement. To Irish physicist John Bell this seemed to be at best a confidence-trick, at worst a fraud. ‘A theory founded in this way on arguments of manifestly approximate character,’ he wrote some years later, ‘however good the approximation, is surely of provisional nature.’
When Bell devised his famous inequality in 1964 these questions returned with a vengeance. Bell sought a way to discriminate between conventional quantum theory and a whole class of alternative, so-called local hidden variable theories which do not need to assume a collapse of the wavefunction. He deduced a mathematical relationship for which, at certain combinations of measurement settings, local hidden variable theories predict results that are manifestly contradicted by the predictions of conventional quantum theory, providing a direct laboratory test. Some exquisite experiments performed subsequently proved beyond any doubt that quantum theory, with all its apparent ‘spookiness’, is correct.
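For readers who want to see the shape of such a relationship, the version most often tested in the laboratory is the CHSH form of Bell’s inequality (due to Clauser, Horne, Shimony and Holt, building on Bell’s 1964 result). It is sketched here for illustration and is not quoted from the book:

```latex
% Two entangled particles are measured at detector settings a or a' on
% one side and b or b' on the other; E(a,b) is the measured correlation
% between the two outcomes for a given pair of settings.
S \;=\; E(a,b) \;-\; E(a,b') \;+\; E(a',b) \;+\; E(a',b') .

% Any local hidden variable theory must satisfy
|S| \;\le\; 2 ,

% whereas quantum theory, for suitably entangled pairs and settings,
% predicts values up to
|S| \;=\; 2\sqrt{2} \;\approx\; 2.83 .
```

Experiments have found values comfortably above 2, in line with the quantum prediction.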
In 2003, English physicist Tony Leggett took the debate to another level. Local hidden variable theories are characterized by a couple of key assumptions. In one of these, it is assumed that the outcome of a measurement on a quantum particle can in no way be affected by the setting of the device used to make measurements on a second particle with which it is ‘entangled’ (in other words, both particles are described by a single wavefunction). Leggett chose to drop this assumption to see what would happen.
He went on to deduce a further inequality. For a specific combination of measurement settings, quantum theory predicts results which violate this inequality, implying that the outcomes of measurements on distant particles can be affected by some unspecified non-local influence of the device settings. The result we get depends on how we set up another device, even though this may be halfway across the universe. Spooky, indeed.
The results of experiments to test Leggett’s inequality were reported in 2007. Once again, quantum theory was proved to be correct. This kind of result cannot be reconciled with any theory which ascribes fixed properties to the particles prior to measurement. This means that we can no longer assume that the properties we measure necessarily reflect or represent the properties of the particles as they really are. These properties are like secondary qualities – they exist only in relation to our measuring devices. This does not mean that quantum particles are not real. What it does mean is that we can ascribe to them only an empirical reality, a reality that depends on our method of questioning.
Without a measuring device to record it, there is a sense in which the recognisable properties of quantum particles such as electrons do not exist, just as the falling tree makes no sound at all.
‘Reality is merely an illusion,’ Einstein once admitted, ‘albeit a very persistent one.’
The Quantum Story: A History in 40 Moments
ISBN-10: 0199566844
ISBN-13: 978-0199566846
Latest book...
Farewell to Reality: How Modern Physics Has Betrayed the Search for Scientific Truth
ISBN-10: 1605984728
ISBN-13: 978-1605984728
From superstrings and black holes to dark matter and multiverses, modern theoretical physics revels in the bizarre. Now it’s wandered into the realm of “fairy-tale,” says science writer and former “practicing” physicist Baggott (A Beginner’s Guide to Reality). Quantum theory led scientists to create a Standard Model of physics in the mid-20th century, but that model is really an amalgam of distinct individual quantum theories necessary to describe a diverse array of forces and particles. Meanwhile, astronomical observations have revealed that 90% of our universe is made of something we can’t see (dark matter); some mysterious “dark energy” is pushing all of it apart at an accelerating rate, and physicists are gambling on a “supersymmetry” theory in hopes that it could be the holy grail, a Grand Unified Field Theory that might lend coherence to the Standard Model while explaining some of the phenomena the latter fails to account for—despite the fact, Baggott says, that for “every standard model problem it resolves, another problem arises that needs a fix.” In consistently accessible and intelligent prose, Baggott sympathetically captures the frustrations of physicists while laying out a provocative—and very convincing—plea for a reality check in a field that he feels is now too “meta” for its own good.