The Universe is not Locally Real… Wait, What?

In this level-0 post, we discuss the latest Nobel Prize in physics, awarded to Alain Aspect, John Clauser, and Anton Zeilinger “for experiments with entangled photons, establishing the violation of Bell inequalities and pioneering quantum information science,” the significance of their experiments, and what it means for the universe to not be locally real.

It has been a little while since I last wrote a blog post, though I do have a couple of ideas in the works. This post was a request from a friend (thank you) who had seen news about the recent Nobel Prize in physics and wanted to know what people meant when they said that the prize-winning work shows that the universe is not “locally real.” To get there, we are going to have to review a little bit of history and quantum mechanics.

Some elements of Quantum Mechanics

Much of physics is all about particles. In classical mechanics, you start by treating any object as a point particle. If you care about the size and shape of an object, then you can construct it out of many point particles. Thermodynamics and fluid mechanics are about the collective properties and motion of particles. Quantum mechanics also deals with particles. What separates these theories is how they describe their systems: position and velocity in classical mechanics; pressure, temperature, and volume (among others) in thermodynamics; state vectors in quantum mechanics.

If state vectors are new to you, don’t worry, we will cover everything about them that we need to know. We can think of a state vector as a mathematical object that contains the likelihood of all possible results of all possible measurements of a particle in that state. If you want to find out where your favorite electron is, its state vector will tell you the probability that it is in your pocket, or in your sandwich, or wherever else. What is really unique about quantum states, versus classical ones, is that you can add quantum states together into a superposition. So let’s say you have a particle that can be measured as being either up (denoted as |\!\uparrow\rangle) or down (denoted as |\!\downarrow\rangle). A particle in the |\!\uparrow\rangle state will always be measured as being up. However, we could also make the state

\frac{1}{\sqrt{2}} |\!\uparrow\rangle + \frac{1}{\sqrt{2}} |\!\downarrow\rangle ,

which would be measured as being up half the time and down the other half of the time. The factors of 1/\sqrt{2} are there because the measurement probability goes as the square of the coefficient on the state, i.e. each measurement outcome has a chance of (1/\sqrt{2})^2=1/2.
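As a quick sanity check, this square-the-coefficient rule (the Born rule) can be verified in a few lines of Python. A minimal sketch, with variable names of my own choosing:

```python
import math

# Equal superposition: (1/sqrt(2))|up> + (1/sqrt(2))|down>
amp_up = 1 / math.sqrt(2)
amp_down = 1 / math.sqrt(2)

# The probability of each outcome is the square of its coefficient
p_up = amp_up ** 2
p_down = amp_down ** 2

print(p_up, p_down)   # 0.5 each (up to floating-point rounding)
print(p_up + p_down)  # probabilities over all outcomes sum to 1
```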

Importantly, once a measurement is taken, another measurement taken quickly after the first always agrees with the first measurement. One way to think about this is that the act of measuring the particle changes the state to whichever state corresponds to the measurement.
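This "measurement changes the state" picture can be captured in a toy simulation. The `measure` function below is a hypothetical sketch of my own, treating a state as a pair of real coefficients:

```python
import random

def measure(state):
    # state is a pair (amp_up, amp_down) of real coefficients.
    # The outcome probability is the squared coefficient, and the
    # state collapses to whichever outcome was observed.
    amp_up, amp_down = state
    if random.random() < amp_up ** 2:
        return "up", (1.0, 0.0)
    return "down", (0.0, 1.0)

state = (1 / 2 ** 0.5, 1 / 2 ** 0.5)  # equal superposition
first, state = measure(state)
second, state = measure(state)  # measuring again right away
print(first == second)  # True: the second measurement always agrees
```

The first measurement is a 50/50 coin flip, but because it replaces the state with the collapsed one, the second measurement is deterministic.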

Now, we can talk about a very interesting consequence of superposition and the effect of measurement, which is at the heart of the Nobel Prize work: entanglement. Entanglement occurs when the states of two particles cannot be described separately. Let’s take two particles (1 and 2) that can each be in either the up or down states from before. The full state of our system could be |\!\uparrow_1\rangle |\!\downarrow_2\rangle, which would mean that particle 1 is always measured as up, and particle 2 always as down. This is not an entangled state because you can consider each particle state separately (you can factor the state, if you will). An entangled state requires a superposition of these, like

\frac{1}{\sqrt{2}}|\!\uparrow_1\rangle|\!\downarrow_2\rangle + \frac{1}{\sqrt{2}}|\!\downarrow_1\rangle|\!\uparrow_2\rangle .

In the above state, the particles will always be measured with opposite outcomes, but there is no description that treats each particle separately. A similar, but not entangled, state would be if we simply flip the |\!\downarrow_1\rangle to a |\!\uparrow_1\rangle:

\frac{1}{\sqrt{2}}|\!\uparrow_1\rangle|\!\downarrow_2\rangle + \frac{1}{\sqrt{2}}|\!\uparrow_1\rangle|\!\uparrow_2\rangle =  |\!\uparrow_1\rangle\,\Big(\frac{1}{\sqrt{2}}|\!\downarrow_2\rangle +\frac{1}{\sqrt{2}}|\!\uparrow_2\rangle\Big).

The above state can be factored into the up state for particle 1 and an equal superposition of up and down states for particle 2. Therefore, it is not entangled. Now you may wonder what the big deal is, and why we bother to differentiate between entangled and non-entangled states to begin with. So let’s talk about the EPR paradox.
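Before moving on, the "can you factor it?" test has a neat computational form for two-level particles: write the four coefficients as a 2×2 table, and the state factors exactly when that table has zero determinant. A small sketch (the function name is my own):

```python
import math

s = 1 / math.sqrt(2)

# Coefficient table c[a][b] for the state sum over a, b of c[a][b] |a_1>|b_2>,
# with index 0 meaning "up" and index 1 meaning "down".
entangled_state = [[0.0, s],  # (|up_1>|down_2> + |down_1>|up_2>) / sqrt(2)
                   [s, 0.0]]
product_state = [[s, s],      # |up_1> (|up_2> + |down_2>) / sqrt(2)
                 [0.0, 0.0]]

def is_entangled(c):
    # A two-particle state factors into separate one-particle states
    # exactly when its 2x2 coefficient table has determinant zero.
    det = c[0][0] * c[1][1] - c[0][1] * c[1][0]
    return not math.isclose(det, 0.0, abs_tol=1e-12)

print(is_entangled(entangled_state))  # True
print(is_entangled(product_state))    # False
```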

A little history

Quantum mechanics represented a fundamental shift in how we perceive the universe. Instead of the models of ideal classical particles developed in the 18th and 19th centuries, with their definite positions and velocities, physicists of the early 20th century were tasked with reckoning with a different, quantum particle that could be in superpositions of states, with indefinite positions and velocities. It can be easy sometimes to forget that these laws of physics were not just written in some eternal textbook, but that real human beings had to struggle with themselves and others to figure out what best describes experimental results. Many decades-long debates sprang up during this early period over whether all these proposed quantum effects made any sense at all.

Einstein, a major contributor to the development of quantum mechanics himself, was also a vociferous critic of many parts of the theory. In 1935, Einstein and two other physicists, Boris Podolsky and Nathan Rosen, published a paper critiquing the idea of entanglement. Their issue was a fundamental one, as evidenced by the title of their paper, “Can Quantum-Mechanical Description of Physical Reality Be Considered Complete?” They proposed what is now known as the Einstein-Podolsky-Rosen (EPR) paradox, a thought experiment that goes as follows. Prepare two particles in an entangled state. For concreteness, we will consider the entangled state we constructed in the previous section, where the particles are always measured in opposite states. Now, separate these particles by some distance, and measure each one in quick succession. Quantum mechanics says that the two particles will be measured in opposite states. However, the theory of relativity says that information (or anything else, for that matter) cannot travel faster than the speed of light, and yet somehow the outcome of the measurement of particle 1 is known by particle 2 before that information could possibly reach it. This is the EPR paradox. The resolution EP&R proposed was that there must be some more fundamental property of quantum particles that isn’t described by quantum mechanics, a hidden variable, which determines the results of measurements, rather than pure chance. The principles they argued to be necessary for any true description of the universe are locality (no event can instantaneously affect a faraway particle) and realism (if the result of a measurement is known, there must be some “element of reality” that determines that result, often called a hidden variable).

[Aside: It is worth noting that, while quantum mechanics by itself disobeys the principle of locality proposed by EP&R, it is not incompatible with special relativity. No actual information is being transferred. Since the result of the first measurement is random, and there is not enough time for that information to reach the second experimenter, the result of the second experiment is also effectively random (i.e. each has a 50/50 shot of measuring up or down).]

Nearly 30 years later, in 1964, John Stewart Bell published a paper on the EPR paradox. In that paper, Bell showed that there are measurable differences between standard quantum mechanics and any local hidden variable theory. One way to think about a local hidden variable theory is that in an experiment, like our previous thought experiment, the entangled particles pre-decide the outcomes of every possible measurement before they become separated. However, no matter how the particles decide, Bell found that there is a minimum probability for, say, measuring the two particles in opposite states. Since local hidden variable theories have to lie on or above this bound, we often refer to this result as a Bell inequality. Quantum mechanics violates this inequality, which is why it is experimentally distinguishable from local hidden variable theories. For a concrete example, I highly recommend the recent Sixty Symbols video embedded below, which goes a little more into the details of what kind of experiment can test a Bell inequality.
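To make this concrete numerically, here is a sketch using the CHSH combination of correlations, a practical form of Bell's inequality. Brute-forcing every deterministic "pre-decided answers" strategy shows that local hidden variables can never push the combination past 2, while the quantum prediction for an entangled pair, with correlation E(α, β) = −cos(α − β) at analyzer angles α and β, reaches 2√2:

```python
import math
from itertools import product

# CHSH combination: S = E(a,b) - E(a,b') + E(a',b) + E(a',b'),
# where each E is the average product of the two outcomes (each +1 or -1).

# Local hidden variables: each particle pre-decides its +1/-1 answer to
# both possible detector settings. Enumerate every deterministic strategy.
best_local = max(
    abs(a0 * b0 - a0 * b1 + a1 * b0 + a1 * b1)
    for a0, a1, b0, b1 in product([+1, -1], repeat=4)
)

# Quantum mechanics (entangled pair): E(alpha, beta) = -cos(alpha - beta)
def E(alpha, beta):
    return -math.cos(alpha - beta)

a, a2 = 0.0, math.pi / 2               # Alice's two analyzer angles
b, b2 = math.pi / 4, 3 * math.pi / 4   # Bob's two analyzer angles

S_quantum = abs(E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2))

print(best_local)  # 2: the Bell/CHSH bound for local hidden variables
print(S_quantum)   # about 2.828, i.e. 2*sqrt(2) > 2
```

Randomized local strategies can't do better than the best deterministic one, since they are just averages over deterministic strategies, so the bound of 2 holds for all local hidden variable theories.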

The Nobel Prize

Alain Aspect (with whom I share a birthday), John Clauser, and Anton Zeilinger all conducted significant experiments demonstrating violation of Bell’s inequality in photons between 1972 and 2013. There have been other experiments during that time and to the present day by other physicists (you can find a list of notable experiments here), which raises a couple of questions. One, why do we need so many experiments to answer what seems to be a simple yes-or-no question: is Bell’s inequality violated or not? Two, why were Aspect, Clauser, and Zeilinger selected for the Nobel and not any other physicists?

Let’s address why we needed so many experiments to find a violation of Bell’s inequality. Whether the universe is locally real is one of the most fundamental questions we have meaningfully asked as a species. This might make it seem precise and straightforward to answer, especially since it is a yes-or-no question. The wrinkle is that we don’t know what a hidden variable theory might look like. For example, what if the True Underlying Hidden Variable Theory™ allowed communication between the particles and the detectors while the particles were in flight? Then any experiment that allowed sufficient time for the particles and detectors to communicate would be unable to rule out this type of theory. These loopholes must be closed before we can confidently say that we live in a universe governed by quantum mechanics and not a hidden variable theory. At present, all of the major loopholes have been closed. There are, of course, always going to be loopholes like superdeterminism, the most extreme kind of hidden variable theory, in which all measurement outcomes are predetermined. However, these sorts of loopholes are untestable, so they are typically ignored. As they say, if the universe walks like it’s quantum, and quacks like it’s quantum, then it’s quantum.

Now let’s talk about our second question: why these physicists? There are a couple of rules to the Nobel Prize that are relevant here. The first is that the prize can go to no more than three people. Despite the number of physicists who have done excellent work on these experiments, they can’t all get the prize. This tends to erase the work of those other physicists, but I’ll save my critiques of the Nobel Prize for later. The second is that the recipients of the Prize must be alive. This can be a problem, since the Nobel is typically awarded decades after the prize-winning work was completed. So with this in mind, why Aspect, Clauser, and Zeilinger?

John Clauser was the first to experimentally detect a violation of Bell’s inequality, in 1972, with his collaborator Stuart J. Freedman. Both in preparation for this experiment and afterward, Clauser helped develop the theory around Bell’s inequality to make it more practical for experiment. In fact, you won’t find many experiments testing Bell’s inequality directly, but rather the CHSH inequality (the C stands for Clauser), a generalization of Bell’s original inequality that is better suited to real experiments.

Chronologically, Alain Aspect was next to show a violation of Bell’s inequality. Between 1980 and 1982, Aspect and his group conducted three experiments testing two of Clauser’s inequalities. Importantly, the last of these mostly closed an important loophole, called the locality loophole, by changing the settings on the measurement devices rapidly, preventing any slower-than-light communication between the particles and detectors. The reason this loophole wasn’t fully closed is that the process used to switch the detector settings was deterministic; had it instead been random, the loophole would have been closed. One other incredible result of these experiments was that they showed a violation of the CHSH inequality to a significance of over 200 standard deviations. For reference, the standard to claim that you discovered a new particle is 5 standard deviations, and that is considered quite a high standard.

Anton Zeilinger’s group ran further experiments in 1998 and 2013. In these experiments, the locality loophole was fully closed in the way I alluded to earlier. They also closed the detection loophole, where the detected photons could be unrepresentative of the total produced photons. A hidden variable theory could even exacerbate this issue by making photons more detectable when they are more consistent with a Bell inequality violation. By using highly efficient detectors, Zeilinger’s group was able to detect enough photons that no hidden variable theory can “hide” using the detection loophole.

The Nobel Prize in physics is given once the physics community at large accepts a result, and the violation of Bell’s inequality and the triumph of quantum mechanics over hidden variable theories are now accepted in large part due to the experiments from these pioneering physicists and their research teams. It is because of them that we can say with confidence that the universe, on its most fundamental level, is not locally real.
