Conundrum: Two Boxes
Tuesday, April 17th, 2007

Researchers in Germany are working on a way to predict the intentions of human subjects by observing their brain activity. Damn!
For some reason it’s a little disturbing to me that something as personal and ephemeral as an intention can have a physiological manifestation that can be measured. Or maybe I’m just disturbed that they are now starting to measure it. What new “mind reading” technologies might be developed from this science? Could it become prosecutable to merely intend to commit a crime? Intent is already used as a legal concept, and attempted murder is considered a crime, even if nobody is hurt as a result. Could market researchers measure the intent of potential consumers? Will we one day have little handheld devices that can measure intent at a poker table or when our daughter’s date arrives to pick her up?
It all reminds me of a thought experiment made popular by Robert Nozick, which will be this week’s Conundrum. Before we get to it, though, it might be helpful to consider another thought experiment known as Kavka’s Toxin.
Let’s say I offer you $100,000 if you can form an intention to drink a particular toxin. This toxin will make you violently ill for about five or six hours, after which you will be perfectly fine. You’d drink it for the money, but you’re not being asked to drink it. You’re being asked to intend to drink it. After you have the money, you are free to change your mind and not drink it. The question is: can you actually form a genuine intention to do something unpleasant when you know that, once the money is yours, you will have no motivation to do it?
Turn that one over in your mind for a few moments before moving on to this week’s Conundrum, Newcomb’s Problem.
Imagine there are two boxes, Box A and Box B. You may take both boxes, or Box B alone, and you keep what you find inside. Box A is transparent and contains one thousand dollars. Box B is opaque. A super-intelligent alien scientist with a proven track record of accurately predicting human behavior has analyzed you and has secretly made a prediction about which you will choose. If he believes you will choose Box B alone, he has put one million dollars inside. If he believes you will take both boxes, then he has left Box B empty. Which do you choose?
The super-intelligent scientist has run this trial with several hundred other humans, and has made a correct prediction each time. The only people who have ended up with the million are the ones who chose Box B alone. On the other hand, our alien friend has already made his prediction and left. Your choice can no longer affect the amounts that are in the boxes. You may as well take them both, right?
Fans of game theory might recognize this as a variation of the Prisoner’s Dilemma. A game theorist might suggest randomizing your choice by flipping a coin, so we’re going to disallow that option. You must rely on reasoning alone.
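For anyone who wants to see the two competing arguments laid out as plain arithmetic, here’s a minimal sketch in Python. The 99% predictor accuracy is purely an illustrative assumption on my part; the problem itself only tells us the predictor has been right every time so far.

```python
# Newcomb's Problem, reduced to expected-value arithmetic.
# ACCURACY is an illustrative assumption; the puzzle only says the
# predictor has been right every time so far.

ACCURACY = 0.99        # assumed probability the predictor calls your choice correctly
BOX_A = 1_000          # visible cash in the transparent box
BOX_B = 1_000_000      # what Box B holds if the predictor expected one-boxing

def expected_value(choice: str, p: float = ACCURACY) -> float:
    """Expected winnings for 'one-box' (Box B alone) or 'two-box' (both boxes)."""
    if choice == "one-box":
        # Box B contains the million only if the predictor foresaw one-boxing.
        return p * BOX_B
    if choice == "two-box":
        # You always pocket Box A; Box B pays off only if the predictor was wrong.
        return BOX_A + (1 - p) * BOX_B
    raise ValueError(choice)

for choice in ("one-box", "two-box"):
    print(f"{choice}: ${expected_value(choice):,.0f}")
# Prints roughly: one-box: $990,000   two-box: $11,000
#
# The dominance argument ignores the probability entirely: whatever is
# already sitting in Box B, taking both boxes nets you exactly $1,000 more.
```

Under that expected-value reading, one-boxing comes out ahead whenever the predictor is right more than about 50.05% of the time, while the dominance argument favors two-boxing no matter what the probability is. The two answers never reconcile, which is exactly the point.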
Unlike last week’s math puzzler, this one doesn’t have a right or wrong answer. It’s a thought experiment designed to test your conceptions of free will vs. determinism.
Or as Nozick put it:
To almost everyone, it is perfectly clear and obvious what should be done. The difficulty is that these people seem to divide almost evenly on the problem, with large numbers thinking that the opposing half is just being silly.
It will be interesting to hear how people answer this.
Will you take both boxes, or Box B alone?
Feel free to answer the question, or continue the discussion of any of the topics covered above.