The Sleeping Beauty Problem

[Image: Sleeping Beauty illustration. Credit: http://yourpsychotherapist.deviantart.com/]

There’s an old fairy tale from Probabilia. Like all good Probabilian fairy tales, it has fair coins, fair maidens, and fair(y) godmothers to save us from monsters.

Sleeping Beauty volunteers for an experiment. On Sunday night she goes to bed. On Monday, a fair coin is flipped. If the coin comes up heads, Sleeping Beauty is woken on Tuesday. If the coin comes up tails, she is woken on both Tuesday and Wednesday. Whenever she is woken, she is given a drug that prevents her from forming any new memories, so whether it’s Tuesday or Wednesday, it will feel like the first time she’s woken up. On Thursday, the experiment ends and she continues on her way.

Whenever she is awoken, Sleeping Beauty is asked, “What is the probability the coin came up heads?”

If you were Sleeping Beauty, how would you answer?

That’s confusing, so here’s a diagram:

[Diagram: the experiment’s timeline, showing one awakening after heads and two after tails]

There’s one school of thought, “the halfer position,” which claims that she should always answer 50%. That was her belief before the experiment, and she receives no new information on Tuesday or Wednesday morning. (See David Lewis’s paper “Sleeping Beauty: reply to Elga,” Trigger Warning: PDF.) To guess anything other than 50% feels like getting something for nothing.

But in a very real way, the halfer position is wrong. Two out of every three awakenings happen after a tails flip, so two times out of three, the right answer is tails. If Sleeping Beauty bet on the outcome of the coin toss at every awakening, she would lose money following the halfer position, and if probability theory doesn’t help us win money, then what’s the point?
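
That claim is easy to check. Here’s a minimal Monte Carlo sketch (Python, with names of my own invention) of the experiment, tallying awakenings and tracking Sleeping Beauty’s bankroll if she bets a dollar on heads, at even odds, every time she’s asked:

```python
import random

def simulate(trials=100_000):
    heads_awakenings = 0
    total_awakenings = 0
    winnings = 0  # she bets $1 on heads at every awakening, at even odds

    for _ in range(trials):
        heads = random.random() < 0.5   # the fair coin flip
        awakenings = 1 if heads else 2  # heads: one awakening; tails: two
        total_awakenings += awakenings
        if heads:
            heads_awakenings += awakenings
            winnings += awakenings      # the bet wins at each awakening
        else:
            winnings -= awakenings      # the bet loses at each awakening

    print(f"fraction of awakenings after heads: {heads_awakenings / total_awakenings:.3f}")
    print(f"average winnings per awakening:     {winnings / total_awakenings:+.3f}")

simulate()  # prints roughly 0.333 and -0.333
```

Two-thirds of her awakenings follow tails, and betting on heads costs her about 33 cents per awakening.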

“To tell us about the world,” an exasperated halfer might reply. “From an outside perspective, the probability is still 50%, and even if tails wakes Sleeping Beauty up 1,000,000 times, that doesn’t change. The events are still independent. Only new, relevant evidence should change a credence!”

“But there is new evidence!” a thirder might reply. “The probability of the coin coming up heads is 1/2, but the probability that any given awakening follows a heads flip depends on the number of awakenings, which in turn depends on the coin. The awakening itself is evidence, because it isn’t independent of the outcome.”

I think this is a situation where both sides are giving correct answers to different questions.

The halfer is answering the question as it was asked of Sleeping Beauty. “What is the probability the coin came up heads?” The thirder is answering a slightly different question: “What is the probability I will experience a coin coming up heads?”

For the first question, the only thing that matters is what actually happens to the coin. For the second question, you weight each possibility by the number of times you will experience it. If you don’t, you will lose money to a thirder betting partner.
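
To make the split concrete, here’s a sketch of both calculations side by side (the framing is mine, not part of the original puzzle):

```python
# Each outcome: (probability of the outcome, awakenings experienced under it)
outcomes = {"heads": (0.5, 1), "tails": (0.5, 2)}

# Question 1: "What is the probability the coin came up heads?"
# Each outcome counts once, no matter how often it is experienced.
p_heads = outcomes["heads"][0]  # 0.5, the halfer's answer

# Question 2: "What is the probability I will experience heads right now?"
# Weight each outcome by the number of times it is experienced.
total = sum(p * n for p, n in outcomes.values())  # 0.5*1 + 0.5*2 = 1.5
p_heads_exp = (outcomes["heads"][0] * outcomes["heads"][1]) / total  # 1/3, the thirder's answer

print(p_heads, p_heads_exp)  # 0.5 0.3333...
```

Same coin, same facts; the answers differ only in whether each possibility is counted once or once per experience.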

That’s an interesting thought experiment. But does it have real-life impact? And where’s the monster?

The idea that we should act differently than our beliefs suggest is a weird one, and I don’t like it. Fortunately, it’s often not relevant: most theories don’t suggest that you will undergo the same experience multiple times. However, many people theorize that you will experience the universe many times.

There’s a sizable minority of scientists (who all know far more about physics than me) who subscribe to the many worlds interpretation (MWI) of quantum mechanics. It was recently pushed back into the limelight by MIT physicist Max Tegmark’s book Our Mathematical Universe, which claims that there are infinitely many parallel universes (see the NYT review here). To be fair, there are other scientists (who also know far more about physics than me) who dismiss this interpretation outright, so this isn’t a completely effective argument from authority.

Even if your prior for the MWI is low, it’s probably nonzero. And if the MWI is true, you experience each moment many times over, once per branch. If you take the lesson of Sleeping Beauty at face value, that experience-weighting means you should act as though the MWI is likely true, even if you believe it probably isn’t.
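
As a toy illustration (the numbers are made up, and a clean “number of branches” is exactly the sort of thing the MWI doesn’t straightforwardly hand you), here’s what that experience-weighting does to a small prior:

```python
def experience_weighted_credence(prior, branches):
    """Experience-weighted credence in MWI, under the loaded assumption
    that MWI multiplies each experience by `branches` while a
    single-world theory leaves the count at 1."""
    mwi_weight = prior * branches
    single_weight = (1 - prior) * 1
    return mwi_weight / (mwi_weight + single_weight)

print(experience_weighted_credence(0.01, 1))     # 0.01: no branching, just the prior
print(experience_weighted_credence(0.01, 1000))  # ~0.91: a 1% prior now dominates
```

On this (very loaded) accounting, any nonzero prior gets swamped as the branch count grows; that is the sense in which you would act as though the MWI were likely true.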

So before I start treating quantum immortality like a fairy godmother, please let me know if I’m making any mistakes. Does this fairy tale have a ring of truth?

Left as an exercise for the reader:

  1. Should your beliefs about the probability of the MWI being correct change if you witness yourself surviving an event that had some chance of killing you?
  2. Should your beliefs in the MWI change if you become aware of more things that made your life unlikely?
  3. Does believing in the MWI have any impact on other outstanding philosophical debates?
  4. How does the Sleeping Beauty argument affect the simulation argument?
  5. Does quantum Russian roulette have positive expected value?