During my internship at IBM Research this summer, my office mate Renbo and I discussed the following extremely important research problem.

There are twelve men on an island. Eleven weigh exactly the same amount, but one of them is slightly lighter or heavier. You must figure out which. The island has no scales, but there is a see-saw. The exciting catch: you can only use it three times.

To clarify: the riddle is asking us to find the odd-weight islander, but we do not need to determine whether the odd-weight islander is lighter or heavier.

Instead of solving this riddle, we’re going to take the riddle-writer’s perspective: what is the maximum number of islanders such that the riddle is solvable? Along the way, we’ll uncover some hints about how to approach the original riddle.

The Problem

Suppose we have \(n\) islanders and are allowed \(w\) weighings on the see-saw. We ask: under what conditions on \(n\) and \(w\) is the riddle solvable? (In the original riddle, \(n = 12\) and \(w = 3\).) There are two sorts of conditions we might look for.

  • A sufficient condition on \(n\) and \(w\) is one such that if the condition holds, the riddle is solvable.
  • A necessary condition on \(n\) and \(w\) is one such that if the condition fails, the riddle is unsolvable.

The way to prove that a condition is sufficient is pretty intuitive: given the condition, find a solution! For example, solving the original riddle shows that \(n = 12, w = 3\) is a sufficient condition. Of course, as those who paused the Brooklyn 99 episode to try out the riddle quickly found out, finding the solution can still be tricky. But the overall approach to proving a sufficient condition is clear.

Necessary conditions are more difficult. To prove that a condition is necessary, we have to show that if the condition fails, then there does not exist any islander-weighing strategy that solves the riddle. For example, suppose we want to prove that the puzzle is impossible for \(n = 13\) islanders in \(w = 3\) weighings. It’s not enough to take a solution that finds the odd-weight islander for \(n = 12, w = 3\) and demonstrate that it fails for \(n = 13, w = 3\). Instead, we need to somehow consider every islander-weighing strategy.

In this post, we will find a necessary condition on \(n\) and \(w\) for the puzzle to be solvable. Specifically, we’ll prove that the puzzle is solvable only if

\[2n - 1 < 3^w.\]

To do so, we’ll use information theory, a branch of mathematics that helps us prove necessary conditions like this without explicitly considering every possible islander-weighing strategy.

The Short Version

We’re going to start by proving a slightly weaker necessary condition: the puzzle is solvable only if \(2n - 1 \leq 3^w\) (note the non-strict inequality). Proving this weaker condition introduces all the key ideas without using any information theory.

We begin by observing that if the riddle is solvable, then it is solvable with a deterministic strategy (see Appendix, Lemma 1), so we can restrict our attention to deterministic strategies.

Suppose the riddle asked us to both find the odd-weight islander and determine whether they are lighter or heavier. To solve this modified riddle, we need to distinguish between the \(2n\) possible scenarios: there are \(n\) possibilities for the odd-weight islander, and they are either lighter or heavier. Each see-saw weighing has three possible outcomes: the left side is heavier, the right side is heavier, or the sides are balanced. So there are \(3^w\) possible outcomes of a sequence of \(w\) weighings. After observing one of these \(3^w\) outcomes, we should be able to unambiguously determine which of the \(2n\) scenarios we are in. We can’t do this if there are fewer outcomes than scenarios. This means \(2n \leq 3^w\) is a necessary condition to solve the modified riddle.
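
If it helps to see this counting spelled out, here is a minimal Python sketch (the function names are mine, purely for illustration):

    # Counting for the modified riddle: find the odd islander AND whether
    # they are lighter or heavier.
    def count_scenarios_modified(n):
        return 2 * n          # n candidates, each either lighter or heavier

    def count_outcomes(w):
        return 3 ** w         # each weighing tips left, tips right, or balances

    # Original parameters n = 12, w = 3:
    print(count_scenarios_modified(12), count_outcomes(3))   # 24 27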

The original riddle asks us just to find the odd-weight islander, so it seems like we only have to distinguish between \(n\) possible scenarios. But it turns out that, even if we’re not trying to, we almost always end up figuring out whether the odd-weight islander is lighter or heavier. To see why, suppose we figure out that Charles is the odd-weight islander. If we ever put Charles on the see-saw in the process, that weighing must tip (everyone else on it weighs the same), and once we know Charles is the odd one out, the direction it tipped tells us whether he is lighter or heavier. This means the only way we can find the odd-weight islander without finding out whether they are lighter or heavier is if we never weigh the odd-weight islander.

Given an islander-weighing strategy, we say an islander is “lonely” if whenever they are the odd-weight islander, our strategy never weighs them. It turns out that any successful strategy can have at most one lonely islander (see Appendix, Lemma 2). This means that to solve the riddle, we must distinguish between at least \(2n - 1\) possible scenarios:

  • a single scenario in which the odd-weight islander is the lonely islander, and
  • \(2n - 2\) scenarios in which the odd-weight islander is one of the \(n - 1\) non-lonely islanders.

There are still \(3^w\) possible outcomes of \(w\) weighings, so \(2n - 1 \leq 3^w\) is a necessary condition to solve the riddle.
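
To make the scenario count concrete, here is a small sketch that builds the \(2n - 1\) scenarios explicitly, arbitrarily treating islander 0 as the lonely one:

    # The 2n - 1 scenarios: one for the lonely islander, plus
    # (islander, lighter-or-heavier) for each of the other n - 1 islanders.
    def scenarios(n, lonely=0):
        scens = [(lonely, "odd")]   # we never learn lighter vs. heavier here
        for islander in range(n):
            if islander != lonely:
                scens.append((islander, "lighter"))
                scens.append((islander, "heavier"))
        return scens

    # For n = 12: 23 scenarios against 3^3 = 27 weighing outcomes.
    print(len(scenarios(12)), 3 ** 3)        # 23 27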

A Stricter Condition

Let’s think about what the \(2n - 1 \leq 3^w\) condition says about the original \(n = 12, w = 3\) riddle. Could we solve it with fewer weighings? How about more islanders?

  • Given \(n = 12\), we must have \(23 \leq 3^w\), which means we need \(w \geq 3\), so the riddle cannot be solved in fewer weighings.
  • Given \(w = 3\), we must have \(2n - 1 \leq 27\), which means \(n \leq 14\). So the riddle might be solvable with more islanders.
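
Both bullet points are easy to check numerically; here is a quick sketch:

    # Smallest w with 2*12 - 1 <= 3^w, and largest n with 2n - 1 <= 3^3.
    w = 0
    while 3 ** w < 2 * 12 - 1:
        w += 1
    print(w)       # 3: at least three weighings are needed

    n = 1
    while 2 * (n + 1) - 1 <= 3 ** 3:
        n += 1
    print(n)       # 14: the condition allows at most 14 islanders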

It turns out that the riddle is solvable for \(n = 13\) but not for \(n = 14\). And we can rule out \(n = 14\) using the necessary condition \(2n - 1 < 3^w\) (note the strict inequality) that we promised in the introduction. Proving this stricter condition is where the information theory comes in.

How Much Information is in Each Weighing?

One of the key insights of information theory is to draw a connection between information and randomness. Information theory views anything we don’t know as a random variable. The entropy of a random variable tells us, roughly speaking, how much information we expect to gain by learning the outcome of that random variable. For brevity, I’m not going to explain in detail how to define entropy, instead explaining just the bits we need for this post. (If you’re curious, sources like Wikipedia have pretty good explanations.)

In our context, the main random variable we want to know is which of the \(2n - 1\) scenarios we are in. Let’s give it a name:

\[S = \text{``the scenario we're in''.}\]

If each of the scenarios is equally likely, then \(S\) has entropy \(H(S) = \log_2(2n - 1)\).
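
For example, assuming the \(2n - 1\) scenarios are equally likely, a quick sketch gives the entropy of \(S\):

    import math

    # Entropy of S when all 2n - 1 scenarios are equally likely.
    def scenario_entropy(n):
        return math.log2(2 * n - 1)

    print(scenario_entropy(12))   # log2(23) ≈ 4.52 bits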

Solving the riddle entails learning the outcome of \(S\), but we never observe \(S\) directly. Instead, we observe the results of each weighing. These weighing outcomes are also random variables, so let’s write

\[T_i = \text{``the outcome of the $i$th weighing''.}\]

Let’s call the possible outcomes of a weighing \(\ell\), \(r\), and \(b\) for tipping left, tipping right, and staying balanced, respectively. Writing \(p_i\) for the probability mass function of \(T_i\) (meaning, for example, that \(p_3(\ell)\) is the probability that the see-saw tips left on the third weighing), we can write the entropy of \(T_i\) as

\[H(T_i) = p_i(\ell) \log_2\biggl(\frac{1}{p_i(\ell)}\biggr) + p_i(r) \log_2\biggl(\frac{1}{p_i(r)}\biggr) + p_i(b) \log_2\biggl(\frac{1}{p_i(b)}\biggr).\]
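
Here is the same formula as code, with the probability mass function passed in as a dictionary over the three outcomes (a sketch, not tied to any particular strategy):

    import math

    # Entropy of one weighing, given its probability mass function
    # p over the outcomes "l" (tips left), "r" (tips right), "b" (balanced).
    def weighing_entropy(p):
        return sum(q * math.log2(1 / q) for q in p.values() if q > 0)

    # Example: a weighing that balances half the time and otherwise tips
    # either way with equal probability carries 1.5 bits.
    print(weighing_entropy({"l": 0.25, "r": 0.25, "b": 0.5}))   # 1.5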

To solve the riddle, it must be that the amount of information we get from the weighings is at least the amount of information we would get by directly learning which scenario we are in. That is, a necessary condition for the riddle to be solvable is

\[H(S) \leq \sum_{i = 1}^w H(T_i). \qquad (*)\]

(To be precise, the amount of information we learn from the \(i\)th weighing is never more than \(H(T_i)\), but might be less if later weighings tell us information we already learned from earlier weighings. Look up conditional entropy for details.)

Our question thus becomes: how do we maximize \(H(T_i)\), the entropy of the \(i\)th weighing? It turns out that the entropy of a random variable is maximized when all of its outcomes are equally likely. In the case of \(T_i\), this happens when each outcome has probability \(1/3\), so \(H(T_i) \leq \log_2 3\). Plugging this bound into \((*)\), we get the necessary condition

\[\log_2(2n - 1) \leq w\log_2(3)\]

… which is equivalent to the weaker necessary condition \(2n - 1 \leq 3^w\) we already derived. What’s missing?
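
As a quick numerical sanity check of the maximization claim, here are a few probability mass functions run through the weighing_entropy sketch from above; only the uniform one reaches \(\log_2(3)\):

    import math

    def weighing_entropy(p):
        return sum(q * math.log2(1 / q) for q in p.values() if q > 0)

    print(math.log2(3))                                        # ≈ 1.585
    print(weighing_entropy({"l": 1/3, "r": 1/3, "b": 1/3}))    # ≈ 1.585 (the maximum)
    print(weighing_entropy({"l": 0.4, "r": 0.4, "b": 0.2}))    # ≈ 1.522
    print(weighing_entropy({"l": 0.45, "r": 0.45, "b": 0.1}))  # ≈ 1.369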

To get the stricter condition, we need one last observation: we can’t actually make the three outcomes of the first weighing equally likely! Suppose that in the first weighing we put \(k\) islanders on each side of the see-saw. The probability of each outcome is

\[p_1(\ell) = \frac{2k}{2n - 1} \quad p_1(r) = \frac{2k}{2n - 1} \quad p_1(b) = \frac{2n - 1 - 4k}{2n - 1}.\]

That is, of the \(2n - 1\) scenarios, there are \(2k\) that would make the see-saw tip left: \(k\) where someone on the left is heavier and \(k\) where someone on the right is lighter. (The lonely islander cannot be on the first weighing: a deterministic strategy’s first weighing is the same in every scenario, so if the lonely islander were on it, they would be weighed in the scenario where they are the odd one out.) But we cannot possibly have \(2k = 2n - 1 - 4k\), because \(2k\) is even and \(2n - 1 - 4k\) is odd. So we can’t make the three outcomes of the first weighing equally likely, which means \(H(T_1) < \log_2(3)\). Combining this strict bound with \(H(T_i) \leq \log_2(3)\) for the remaining weighings and plugging into \((*)\), we get the necessary condition

\[\log_2(2n - 1) < w\log_2(3),\]

which is equivalent to \(2n - 1 < 3^w\), as desired.
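
To see the whole argument numerically, here is a sketch that computes the first-weighing entropy for every admissible \(k\) when \(n = 14\) (the case the weaker bound could not rule out):

    import math

    def weighing_entropy(p):
        return sum(q * math.log2(1 / q) for q in p.values() if q > 0)

    # First-weighing pmf when k islanders sit on each side (see the formula above).
    def first_weighing_pmf(n, k):
        total = 2 * n - 1
        return {"l": 2 * k / total, "r": 2 * k / total, "b": (total - 4 * k) / total}

    # The lonely islander never sits on the first weighing, so k is at most (n - 1) // 2.
    n = 14
    best = max(weighing_entropy(first_weighing_pmf(n, k))
               for k in range(1, (n - 1) // 2 + 1))
    print(best, math.log2(3))   # ≈ 1.568 vs ≈ 1.585: strictly less

Even if the second and third weighings were perfectly informative, the three weighings together would carry at most about \(1.568 + 2\log_2(3) \approx 4.74\) bits, short of the \(\log_2(27) \approx 4.75\) bits needed to pin down the scenario when \(n = 14\).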


Appendix

Here are a few of the details we skipped above. The proofs don’t use any fancy techniques, so they make good exercises if you can resist peeking.

Lemma 1.

If the riddle is solvable, then it is solvable with a deterministic islander-weighing strategy.

Proof.

Suppose a nondeterministic islander-weighing strategy solves the riddle. That means that at each nondeterministic step, we choose one of several options for how to proceed. But the strategy always works, which means it must work no matter which options we pick. In particular, it still works if we always pick the first option at every step, and doing so makes the strategy deterministic. \(\square\)

Lemma 2.

Any deterministic islander-weighing strategy that solves the riddle has at most one lonely islander.

Proof.

Recall that an islander is “lonely” for a given strategy if whenever they are the odd-weight islander, they are not placed on the see-saw. We’re going to show that if a strategy has a lonely islander, then none of the other islanders are lonely. Say that Jake is the lonely islander and consider Charles, another islander. We need to show that if Charles is the odd-weight islander, then we have put Charles on the see-saw at least once.

The key is this: because Jake is lonely, we can’t put Jake on the see-saw until we get at least one unbalanced weighing. This is because until we see an unbalanced weighing, Jake could still be the odd-weight islander.

Suppose now that Charles is the odd-weight islander. If we never put Charles on the see-saw, then all the weighings are balanced. But because all the weighings are balanced, we must never have put Jake on the see-saw. Because neither Jake nor Charles have been weighed, we cannot possibly tell the difference between them. Therefore, to solve the riddle, if Charles is the odd-weight islander, our strategy must at some point put Charles on the see-saw. \(\square\)