Are You Smarter Than Google? — Part Two

Today I’ll offer the “official” solution to yesterday’s puzzle — that is, the solution that Google has apparently expected from its job candidates. This is also the solution I gave when I first saw the puzzle, and the solution I usually get from my best students, and the solution given yesterday by some astute commenters.

But this solution has a gaping hole in it. Can you find it?

We have, as you recall, 10 ferocious pirates in possession of 100 gold coins. The most ferocious pirate proposes a division, which is put up for a vote. If half or more of the pirates approve, the plan is implemented and the game is over. Otherwise, the most ferocious pirate is thrown overboard and the game begins again, with the second most ferocious pirate proposing a division. A pirate’s first priority is to stay alive, his second is to amass as many gold coins as possible, and his third is to see others thrown overboard.

Let’s start with some simpler problems:

Problem 1. Suppose there’s just one ferocious pirate. Call him James. Obviously, James allocates 100 coins to himself, votes for his own plan, and wins.

Problem 2: Suppose there are two ferocious pirates — Igor and James, with Igor the most ferocious. Now Igor has half the votes and can therefore impose any plan he wants to. He therefore allocates 100 coins to himself, votes for his own plan, and wins. James gets nothing.

Problem 3 adds a third pirate — Howard, who is more ferocious even than Igor. If Howard wants to avoid being thrown overboard, he needs at least two votes for his plan. He’s got his own vote, so he’s got to buy a vote from either Igor or James — both of whom realize that if Harold goes overboard, they’ll be playing the game from Problem 2. Igor likes that game so much that he can’t be bought. James, on the other hand, knows he gets zero in that game, and can therefore be bought for a single coin. So Howard’s proposal is: 99 for Howard, 0 for Igor, 1 for James. Howard and James vote yes and the plan is approved.

Problem 4: Now let’s add a fourth pirate, the even more ferocious George. To pass a plan, George needs two votes — his own and someone else’s. If his plan fails, we’ll be playing Problem 3, where Howard gets 99 coins, Igor gets 0, and James gets 1. Therefore Howard won’t sell his vote for less than 100 coins, while Igor will sell for just 1, and James will sell for 2. Igor’s vote is the cheapest, so that’s the one George buys. So George’s proposal is: 99 for George, 0 for Howard, 1 for Igor, 0 for James. George and Igor vote yes and the plan is approved.

Problem 5: If there’s a fifth and even more ferocious pirate — call him Freddy — then Freddy needs three out of five votes to pass his plan. In addition to his own vote, he’ll buy the two cheapest votes available. In view of what happens in Problem 4, George’s price is 100, Howard’s is 1, Igor’s is 2, and James’s is 1. Therefore Freddy buys Howard’s and James’s votes for 1 coin each. Freddy gets 98, George gets 0, Howard gets 1, Igor gets 0, James gets 1.

And so on (you can fill in Problems 6, 7, 8 and 9 yourself!) until we reach:

Problem 10: With 10 pirates, Arlo (the most ferocious) gets 96, Bob gets 0, Charlie gets 1, Dave gets 0, Eeyore gets 1, Freddy gets 0, George gets 1, Howard gets 0, Igor gets 1 and James gets 0. Arlo, Charlie, Eeyore, George and Igor all vote yes, because if they vote no, Arlo will be thrown overboard and Bob will propose a division in which Charlie, Eeyore, George and Igor all get nothing.
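The backward induction above is mechanical enough to check by computer. Here is a short Python sketch (my own restatement of the argument — the function name and the tie-breaking are invented, not part of the puzzle) that fills in Problems 6 through 9 and reproduces Problems 3, 5 and 10. The price rule is the one used above: buying a pirate’s vote costs one coin more than he would receive if the current proposer went overboard.

```python
def official_division(n_pirates, coins=100):
    """Backward induction per the 'official' argument: each proposer
    buys the cheapest votes, where a pirate's price is one coin more
    than he would get if the proposer were thrown overboard."""
    alloc = [coins]  # Problem 1: a lone pirate keeps everything
    for n in range(2, n_pirates + 1):
        votes_needed = (n + 1) // 2   # half or more passes the plan
        to_buy = votes_needed - 1     # the proposer votes for himself
        # buy the cheapest votes (those who would otherwise get 0)
        cheapest = sorted(range(n - 1), key=lambda i: alloc[i])[:to_buy]
        new = [0] * n
        for i in cheapest:
            new[i + 1] = alloc[i] + 1   # outbid next round's payoff
        new[0] = coins - sum(new[1:])   # proposer keeps the rest
        alloc = new
    return alloc

print(official_division(3))   # -> [99, 0, 1]
print(official_division(5))   # -> [98, 0, 1, 0, 1]
print(official_division(10))  # -> [96, 0, 1, 0, 1, 0, 1, 0, 1, 0]
```

Of course, this computes only the outcome the argument claims is unique; the puzzles below ask why that claim fails.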

Conclusion: The only possible outcome is 96-0-1-0-1-0-1-0-1-0. You’ll find the same solution, with pretty much the same argument, in yesterday’s comments. But the conclusion is wrong.

It’s partly right, in the sense that 96-0-1-0-1-0-1-0-1-0 is one possible outcome. But the argument above seems to prove it’s the only possible outcome. That’s wrong. Many other outcomes are also possible.

So here are today’s puzzles:

1) What other outcomes are possible?

2) What’s wrong with the above argument, which appears to rule those outcomes out?

I’ll follow up later in the week.


76 Responses to “Are You Smarter Than Google? — Part Two”


  1. Rasmus Faber

    One interpretation of the problem as posed is to find a set of strategies in which no single pirate can change their strategy to achieve a better outcome (a Nash equilibrium).

    The strategies described above form one such set.

    Another such set of strategies is: all pirates vote no to any allocation in which they do not receive all 100 coins. All pirates except the fourth weakest (George) propose all 100 coins for themselves; George proposes all 100 coins for the third weakest (Howard). It is a bad outcome for everyone but Howard, yet no single pirate can change his strategy to achieve a better outcome.

    There are many other sets of strategies that have the same property.

  2. Rasmus Faber

    I believe that the problem with the argument above is, that it forces the pirates to decide their strategies in a particular order: first they must decide their strategy in the one pirate game, then they must decide their strategies in the two pirate game with the restriction that all pirates use the strategy from the one pirate game in the relevant sub-game etc.

    The problem does not specify that the pirates should choose their strategies in that way.

  3. Hendrik Lipka

    I think that 96-0-1-0-1-0-1-0-1-0 is not even an optimal outcome. All pirates except the two most ferocious will vote no, since then the outcome (by the above logic) would be t-t-98-0-1-0-1-0, which means they get the same number of coins (or more) but enjoy seeing two pirates thrown overboard.

  4. Mike H

    @RasmusFaber looks good to me…

  5. Nathan

    It seems to hinge on what is meant by the phrase “taking the other pirates’ strategies as given.”

    Steve – can you clarify what that means?

  6. Ernold Same

    Can pirates make credible promises to follow ex post sub-optimal strategies that are ex ante optimal?

    For example, can’t pirate 2 simply credibly promise every pirate two coins if they vote pirate 1’s offer down?

    (Obviously that’s not the answer: just an example of concept)

  7. Harold

    Not sure I like the idea of throwing Harold overboard in the Problem 3 paragraph when Howard is the pirate involved.

  8. Nawaaz

    Yeah, like I discussed before, the big problem is subgame-perfect Nash equilibrium: if we consider subgames unreached in the “100 or reject” solution, not everyone is playing a best response (even though in the entire game everyone is playing a best response), which, I think, seems inappropriate given the way we typically make decisions and question the validity of others’ threats.

  9. Keshav Srinivasan

    I think insofar as there’s a flaw, the flaw would be in problem 3: James could precommit to rejecting 99-0-1 in the hope that Howard will make an offer that will give him more than 1 gold coin.

  10. Harold

    Problem 1 has only one outcome.
    Problem 2 could have more than one outcome: Pirate 9 could, if he wished, offer some coins to pirate 10. With only two players there is no need, so as a two-player game this is the best strategy.

    But it is not a 2 player game. In order to get to a 2 player game, pirates 1-8 must have been thrown overboard.

    Pirate 9 realises that if it gets to a 3 player game he will get nothing. Therefore his original strategy is a bad one, since it will never progress to a 2 player game. The strategy he initially considered – i.e. to offer pirate 10 nothing if it gets to a two player game is a bad one, since it inevitably ends up with him getting nothing.

    However, if his strategy is to offer pirate 10 TWO coins even in a two player game, then when we are in a three player game, pirate 8’s offer of a single coin for pirate 10 will be rejected in favor of the two coins from pirate 9 after pirate 8 is ejected.

    Pirate 8 can foresee this, so he must offer more than 2 coins to pirate 10 – in effect we are in a bidding war to give more coins to pirate 10.

    Taken to its conclusion, all coins would end up with pirate 10.

    I think Ken B had this solution yesterday. I did not see it then, but it seems reasonable now I have worked it through. I may have it totally wrong.

  11. Sub Specie Æternitatis

    At this point, the only thing I have to add is that this “gaping hole” better be good or there is going to be a riot in the comments section.

  12. Sub Specie Æternitatis

    I still think that our solution is correct, unless you assume credible pre-commitment (and how credible are the words of pirates?) or impute some fancy meaning to “given strategies” (such as the exchange of robot pirate’s source code at the beginning of the game).

    This and the other pirate puzzles bring up a meta-pirate puzzle, though. Given how much effort pirates must have spent mastering game theory, how did they ever find the time to rob all that gold?

  13. Steve Landsburg

    Sub Specie Aeternitatis:

    I still think that our solution is correct, unless you assume credible pre-commitment (and how credible are the words of pirates?) or impute some fancy meaning to “given strategies” (such as the exchange of robot pirate’s source code at the beginning of the game).

    The gaping hole does not rely on credible pre-commitment or on any fancy meanings.

  14. Steve Landsburg

    Ernold Same:

    Can pirates make credible promises to follow ex post sub-optimal strategies that are ex ante optimal?

    For example, can’t pirate 2 simply credibly promise every pirate two coins if they vote pirate 1’s offer down?

    No, they cannot.

  15. Steve Landsburg

    Nathan: “Taking the other pirates’ strategies as given” means that there is no collusion, no side bargains, etc. It means that Pirate A knows how Pirate B would vote on Distribution X, and that nothing Pirate A does can change that vote.

  16. jgreene39

    Can the pirates “buy” votes by proposing divisions that allow the other pirates to throw someone overboard without actually having to give them coins?

  17. Alex

    #3 Hendrik Lipka, wouldn’t the pirates realize that they, too, would eventually be thrown overboard? There’s no reason to iterate only once.

  18. Joe

    I suppose the following may be outside of the parameters of “taking each of the strategies as given” but is it possible for a pirate to credibly commit to a strategy that is different from the 96-0 etc. outcome:

    If pirate 9 says “I will vote against any offer where I only get 0 or 1 coin”. In the conventional answer, Pirate 1 would not see this as credible and put in the 96… bid. But another pirate (say pirate 10) could, in turn, voice a similar strategy. Or Pirate 2 could announce that he will offer others greater than 0/1.

    To Pirate 1, those strategies might not seem credible, but he can see the payoffs that all remaining pirates get by throwing him in the water. He may therefore choose a bid that splits the coins differently depending on what the announced strategies are. I suppose the announcement of strategies may be an iterative process that I am struggling to work through.

  19. David J

    There may not be enough information. You do stipulate that pirates like seeing other pirates thrown overboard less than they like getting a single extra coin, but do they like seeing two other pirates thrown overboard more or less than getting a single extra coin? Also, how is the order decided? “Ferocity” is not really a tangible thing. If “ferocity” is analogous to the number of coins a pirate would allocate to himself, then this solution cannot work because the most “ferocious” pirate suggests allotting himself less than less “ferocious” pirates further down the line. If ferocity is not analogous to either the size of the bounty a pirate allocates to himself (or the amount he enjoys seeing other pirates thrown overboard), then how do you determine order? This solution only works if each pirate knows their exact place in the order of ferocity. You also have to assume that all the pirates (at least the ones counted on to vote yes at the 10 pirate iteration) are smart enough to come up with this solution. The pirates are certainly ferocious, but are they intelligent and/or are they familiar with game theory? In my opinion, there are still too many unknowns to be confident in this solution.

  20. Keshav Srinivasan

    Steve, I’m confused. In comments 13 and 14 you say pirates can’t make credible pre-commitments, but then you say in comment 15 that “It means that Pirate A knows how Pirate B would vote on Distribution X, and that nothing Pirate A does can change that vote.” Doesn’t that involve Pirate B making a credible pre-commitment about how he would vote on Distribution X?

    Am I misunderstanding something?

  21. Cameron

    There are other equilibria. One way to see this is that player 8 doesn’t have to offer player 10 a whole coin. He only needs to make him an offer that leaves player 10 at least indifferent between accepting and rejecting. Suppose that player 10 gets utility W from seeing another pirate get thrown overboard; then any offer q with q + W ≥ 1 would leave player 10 willing to accept, so q doesn’t have to be 1.

  22. Steve Landsburg

    Keshav: The pirates cannot make credible precommitments to do anything against their own interests. Given distribution X, pirate B will cast a vote that serves his own interests, and Pirate A knows what that vote will be.

  23. maznak

    David J: I guess that “ferocity” is just an arbitrary preset order. And you have a point, I think, in wondering whether seeing 2 mates go overboard might be worth more than one coin.

  24. Ken B

    @Keshav 20.
    Imagine all decision trees are published before the first vote occurs. The voting can all be done by a computer that takes the trees as input. Once the trees are published the decision is logically made and the ceremony of actually casting votes is only one implementation of how the solution can be revealed. Imagine a robot who wanders along ejecting pirates and distributing coins.

  25. Sub Specie Æternitatis

    @Cameron That possibility is precluded by the premise that while each pirate receives utility from seeing other pirates tossed, the utility of any number of tossings is less than one coin. Moreover, the coins are indivisible as was clarified. So q does have to be 1.

  26. Joseph

    We know that the pirates value seeing someone thrown overboard, but we do not know what fraction of a coin in value they get from seeing someone thrown overboard. As others have mentioned, a pirate may be willing to forego 1 coin to see 2 pirates thrown overboard.

    So, given the first solution, anyone offered 1 coin in any round should be willing to play a mixed strategy where they will vote to throw a pirate overboard some percentage of the time.

    This is a riskier strategy for the 3rd pirate than for the 9th pirate, because the 3rd pirate has a greater chance of getting 0 or being thrown overboard under a mixed strategy. Therefore the probability that pirate 3 defects is less than the probability that pirate 5 defects, which is less than the probability that pirate 7 defects, and so on.

    So, all pirates that receive 0 should vote to throw the first pirate overboard and all pirates that receive 1 in the first solution should play a mixed strategy based on their position in the pecking order.

    The first pirate realizes this, and knowing that he values his own life more than 100 gold coins, he has to divide the coins such that the probability he will be thrown overboard is 0.

    We know the probability of the 9th and 10th pirates being thrown overboard is 0, so they will be the most expensive to pay off.

    This is where I get stuck.

    Given a mixed strategy, it should be cheapest to pay off pirates 2 through 5 rather than 6 through 10. So, pirates 6 through 10 are offered 0 and pirates 2 through 5 are offered some number of coins greater than 1 based on the value of watching someone thrown overboard and the probability that they will be thrown overboard.

  27. roystgnr

    Perhaps saying that each pirate’s vote “serves his own immediate interests” might be a less ambiguous wording?

    One could say that two-boxing players in Newcomb’s paradox are serving their own immediate interests, for example, but if you think they are serving their own interests then I can find 999,000 reasons to disagree.

  28. Keshav Srinivasan

    @Ken B Steve is saying more than that; he’s putting a constraint on what kind of trees are permissible. He’s saying that a tree is not allowed to contain any suboptimal decisions in any of its branches, even as a means of making others avoid those branches.

  29. Keshav Srinivasan

    @Ken B So what you said in the previous thread wouldn’t work: “So pirate 9 will have already filed the proposal ‘99 1’ by the time the voting occurs on pirate 8’s proposal, should that proposal be ‘99 0 1’. (i.e. his fully specified set of strategies includes the statement ‘if there are three left and the proposal of 8 is “99 0 1” then my proposal is …’)” Regardless of what 8’s proposal is, 9 is not allowed to make any proposal other than 100-0. So he has no means of influencing 10’s vote on 8’s proposal of 99-0-1.

    So the case for 8 offering 99-0-1 seems pretty airtight to me.

  30. Steve Landsburg

    Keshav:

    @Ken B Steve is saying more than that; he’s putting a constraint on what kind of trees are permissible. He’s saying that a tree is not allowed to contain any suboptimal decisions in any of its branches, even as a means of making others avoid those branches.

    Correct.

  31. Ken B

    Keshav:
    “Regardless of what 8’s proposal is, 9 is not allowed to make any proposal other than 100-0. So he has no means of influencing 10’s vote on 8’s proposal of 99-0-1.”

    No, I don’t think that is right. A strategy contains a choice for each contingency, meaning in this case each proposal. We are way down on a branch of a tree. At this point in the tree I can offer 99 1 while on a separate branch I can offer 98 2 or on a third 100 0.

    Let’s play NIM. You go first. Here’s my strategy: [If Keshav plays d4 I play e6; if Keshav plays e4 I play d5; if Keshav plays Nf3 I play e6 ..] I am not commited to a move regardless of your play.

  32. Nick

    The game has to occur the way it does for 1 pirate remaining and 2 pirates remaining, but when there are three pirates remaining, Howard can award himself 100, and everyone votes for it. If either Igor or James vote against the plan, it still wins with 2 out of 3, so they are fine with voting for the plan.

    If Howard awards himself 40, and the other two 30, and they all vote for it, then again he still wins if any single pirate changes his vote. But Howard might want to give himself 99, and James 1, in which case James will still vote for it. So Howard won’t award himself less than 99, and with three pirates there are two possibilities: 100-0-0 and 99-0-1.

    With four pirates, 100-0-0-0 is again a possible outcome. And 99-0-1-0 is a possible outcome. If the proposed division is 99-1-0-0, and everyone votes for it, then the most ferocious pirate might want to switch to 100-0-0-0, in which case the remaining three vote against the plan. So, I think 99-1-0-0 and 99-0-0-1 are also possible outcomes.

    I can’t think past 4 but my guess is, every outcome where the first pirate gets at least 96, and the remaining coins are divided in any way among the remaining pirates, is a possible outcome.

  33. Ken B

    Assume that there are several different solutions, that is different Nash Equilibria. Then there can be an ordering issue here. I might prefer NE 2 but if I am presented with NE 1 first, and cannot communicate or collude, I will vote for NE 1 providing it is not reachable in one play or forced series of moves. (Especially as we are told pirates value their lives above all things so will not risk drowning for a chance of gold).

  34. John W

    I just posted this in part I, before I saw that there was a part II:

    Relying on the logic in post #89, how about this:

    If there is a possibility that one or more of P5,P7,P8,P9,P10 will not vote for any offer that gives them less than 100 coins, then P2,P3,P4,P6 will vote yes every time — regardless of the offer — since they do not want to risk being thrown overboard. So P1 says that he gets 100 coins for himself, and P1,P2,P3,P4,P6 vote for it, and it passes with 50%.

  35. Keshav Srinivasan

    @Ken B “No, I don’t think that is right. A strategy contains a choice for each contingency, meaning in this case each proposal. We are way down on a branch of a tree. At this point in the tree I can offer 99 1 while on a separate branch I can offer 98 2 or on a third 100 0.” Steve agreed with my statement that you’re not allowed to put a suboptimal decision with a branch in an attempt to make others avoid that branch. In any branch where there are two pirates left, an offer of 99-1 in an attempt to influence people’s earlier decisions wouldn’t be allowed, because 99-1 is suboptimal to 100-0.

  36. Sub Specie Æternitatis

    @Joseph “a pirate may be willing to forego 1 coin to see 2 pirates thrown overboard”

    No, that is precluded by the premise: “each pirate enjoys seeing other pirates thrown overboard, but not as much as he enjoys receiving one additional coin” Note the plural on “pirates”

  37. Keshav Srinivasan

    Typo: comment 35 should say “suboptimal decision in a branch”, not with a branch.

  38. Ken B

    Keshav 37
    I don’t see who is making a suboptimal choice to make someone else *avoid* that same branch.

  39. Keshav Srinivasan

    @John W In my post #89, P6 has no reason to vote for P1’s offer, because if P1’s offer fails, P5’s offer will succeed, so P6 has no risk of death.

    In any case, it’s now a moot point, because Steve has clarified that credible promises to do sub-optimal things in order to influence the decisions of others are not allowed. So the logic of comment 89 doesn’t apply.

  40. Keshav Srinivasan

    Steve, one more clarification. You agreed with my statement “He’s saying that a tree is not allowed to contain any suboptimal decisions in any of its branches, even as a means of making others avoid those branches.”

    You would still agree with the statement if I replaced “avoid those branches” with “choose those branches”, right?

    I’m almost certain that your answer is yes, but Ken B is unclear on this point.

  41. John W

    #39:

    How can P6 be sure that P5’s offer will succeed, if there is a possibility that, say two or more of P7,P8,P9,P10 have a strategy to vote no unless they receive 100 coins?

    Also, why do you say that there must be promises to do suboptimal things for there to be a pirate whose strategy is to accept only 100 coins? Must a strategy of accepting only 100 coins be suboptimal for every pirate? If P9’s strategy is to vote no unless he is offered 100 coins, what is his expected value (in coins), assuming that there is a possibility that other pirates may also follow a strategy of voting no unless they are offered 100 coins?

  42. Keshav Srinivasan

    @John W P5’s offer is guaranteed to go through because P5 and P6 are guaranteed to vote for it, so all P5 needs to do is offer 100 coins to one of the remaining pirates and that will give him the third vote he needs to win.

    “Also, why do you say that there must be promises to do suboptimal things for there to be a pirate whose strategy is to accept only 100 coins?” Consider the case where there are three pirates left. For P10 to reject P8’s offer of 99-0-1 would be sub-optimal, because if he rejects it then he’ll receive no gold coins. So P10 has no ability to get P8 to offer him a better deal than 99-0-1. It’s only if he can credibly commit to rejecting offers of less than 100 gold coins that he can get P8 to offer him a better deal.

  43. Ken B

    Keshav wrote: “because Steve has clarified that credible promises to do sub-optimal things in order to influence the decisions of others are not allowed.”

    I think this is an uncertain assertion. It is much stronger than your earlier one, where I wasn’t allowed to make a suboptimal choice on a branch to make you *avoid* the same branch. But if I cause you to *choose* the branch, then what counts as “suboptimal” isn’t so clear.

  44. John W

    #42:

    I still don’t see why you say that P6 can be sure that P5’s offer will be accepted. How can P6 know that P5 will offer 100 coins to one of the remaining pirates?

    Furthermore, you claim that it would be suboptimal for P10 to reject an offer of less than 100 coins on the 8th round, but I was not talking about the 8th round, I was discussing the 1st round (or possibly the 5th round). It is not at all clear to me that it would be suboptimal for, say P9, to reject all offers less than 100 coins on the early rounds (1-5).

  45. nd

    I’d suggest Pirate 1 should offer: 85-0-0-0-0-0-0-5-5-5.

    My reasoning is as follows:

    A flaw in the 96-0-1-0-1-0-1-0-1-0 conclusion is the assumption that pirate 7 will vote for a deal that only gives him one coin. Pirate 7 knows he can get a 50% vote in favor of a 98-0-0-2 division, so there is no reason to accept anything less than 98. The 98-0-0-2 division is sure to pass because, as noted by other commentators, by that point the best Pirate 10 can do is two coins. With two pirates, he gets nothing and with three pirates he only gets one.

    Pirate 6 also knows that he can definitely get 98 coins, with a 98-0-1-1-0 offer. Pirates 8 and 9 will vote for this, as they will get nothing if Pirate 7 gets control. Pirate 6 will therefore also vote against anything that gives him less than 98 coins.

    Pirate 5 has to get two other pirates’ support to get a proposal passed, as there will be six pirates left when he has control. As pirate 6 and 7 will reject anything less than 98 coins, he has to buy off two others. Pirates 8 and 9 will get one coin and the pleasure of seeing Pirate 5 thrown overboard if they reject an offer of one coin from Pirate 5, so Pirate 5 has to offer them two coins each. That means Pirate 5 should offer 96-0-0-2-2-0. (Buying Pirate 10 will require three coins, so it’s more expensive.)

    Pirate 4 needs three supporting votes, so he has to offer 91-0-0-0-3-3-3. Pirate 5 knows he can get 96, and Pirates 6 and 7 know they can get 98, so they will reject anything less. Pirates 8 to 10 won’t get anything more than two coins from Pirates 5, 6 or 7, so they will accept three from Pirate 4.

    Pirate 3 also needs three votes to get 50%. He can’t just match Pirate 4’s future offer, as pirates 8, 9 and 10 will reject three coins from him when they can get three coins from Pirate 4 and the pleasure of seeing Pirate 3 thrown overboard. That means Pirate 3 has to offer 88-0-0-0-0-4-4-4. Again, Pirates 4, 5, 6 and 7 know they can get more coins if they have control of the pot, so they will reject anything less than their maximums.

    Pirate 2 is in an impossible situation, if he has control, as he needs to win agreement from four other pirates to get a majority. He can only get the support of pirates 8, 9 and 10 by offering five coins each. (Four isn’t enough, as they can get four coins and the pleasure of seeing Pirate 2 thrown overboard from Pirate 3.) That then leaves Pirate 2 with 85 coins, which isn’t enough to secure the support of any other pirate. They will all get more if they have control of the pot, so they will reject anything Pirate 2 offers. As such, Pirate 2 can’t avoid being thrown overboard even if he offers to take zero coins.

    Knowing all this, Pirate 1 should offer: 85-0-0-0-0-0-0-5-5-5. Pirates 8, 9 and 10 will support it as it’s the most coins they can get. Pirate 2 will also have to support it, as it’s the only way to avoid being thrown overboard. Pirate 1 will vote for it, as it’s the best he can do. Pirates 3, 4, 5, 6 and 7 will reject it, but that’s not enough for a majority.
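Whatever one makes of nd’s argument, his proposed divisions can at least be checked for feasibility. A small Python sketch (my own; the dictionary below just restates the divisions from the comment, reading the final division as one entry per remaining pirate):

```python
# each entry: (pirate in control, nd's proposed division of 100 coins)
proposals = {
    7: [98, 0, 0, 2],              # 4 pirates left
    6: [98, 0, 1, 1, 0],           # 5 pirates left
    5: [96, 0, 0, 2, 2, 0],        # 6 pirates left
    4: [91, 0, 0, 0, 3, 3, 3],     # 7 pirates left
    3: [88, 0, 0, 0, 0, 4, 4, 4],  # 8 pirates left
    1: [85, 0, 0, 0, 0, 0, 0, 5, 5, 5],  # all 10 pirates
}
for proposer, division in proposals.items():
    assert sum(division) == 100          # all coins allocated
    assert len(division) == 11 - proposer  # one share per pirate left
print("all divisions feasible")
```

This checks only that each proposal spends exactly 100 coins among the right number of pirates, not that the voting logic behind it is sound.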

  46. Keshav Srinivasan

    @John W “I still don’t see why you say that P6 can be sure that P5′s offer will be accepted. How can P6 know that P5 will offer 100 coins to one of the remaining pirates?” It’s simple: if P5 doesn’t offer 100 coins to one of the remaining pirates, he’ll die. So since he wants to avoid death at all costs, he will offer 100 coins to one of the remaining pirates.

    “Furthermore, you claim that it would be suboptimal for P10 to reject an offer of less than 100 coins on the 8th round, but I was not talking about the 8th round, I was discussing the 1st round (or possibly the 5th round). It is not at all clear to me that it would be suboptimal for, say P9, to reject all offers less than 100 coins on the early rounds (1-5).” You can’t really analyze the early rounds without reference to the later rounds.

    The reason why it would be suboptimal for P9 to reject P5’s offer of 98-0-1-0-1-0 is because if P5’s offer fails, then P6 will make a successful offer of 98-0-1-0-1. And the reason that P6’s offer would be successful is that P8 and P10 will support it, because if it fails then P7 will make a successful offer of 99-0-1-0. And the reason P7’s offer will be successful is that P9 will vote for it, because if P7’s offer fails then P8 will make a successful offer of 99-0-1.

    So ultimately the reason why it would be suboptimal for P9 to demand 100 coins in the 5th round is because P8’s offer of 99-0-1 will be successful.

    And I can similarly show that it would be suboptimal for P9 to demand 100 coins in rounds earlier than the 5th round as well.

  47. Keshav Srinivasan

    @Ken B We’ll see whether Steve answers yes to my comment #40. I’m confident that he will.

  48. Xan

    I think I’ve got it. An example should make it clear. One alternative equilibrium is that the most ferocious pirate, Arlo, proposes 100 coins for himself, and 0 for everyone else; meanwhile, everyone else votes yes.

    The reason this is a (subgame perfect) Nash equilibrium is that *no individual pirate can change the outcome by deviating to vote no*. A unilateral deviation would change the vote to 9 vs 1, but the measure would *still pass*. Thus, the equilibrium is supported by the fact that, given the voting strategies of the other pirates, each pirate is indifferent between voting yes and no.
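Xan’s indifference point can be checked with a few lines of arithmetic. A sketch in Python (mine, not the commenter’s), using the puzzle’s rule that a plan passes when half or more approve:

```python
def passes(yes_votes, n_pirates):
    # the puzzle's voting rule: half or more approvals implement the plan
    return 2 * yes_votes >= n_pirates

# Unanimous yes on Arlo's 100-for-himself proposal passes...
print(passes(10, 10))  # True
# ...and a single pirate switching to no cannot change the outcome:
print(passes(9, 10))   # True
```

Since no unilateral switch to “no” changes whether the plan passes, each pirate is indifferent, which is exactly what supports the alternative equilibrium.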

  49. John W

    #45:

    How can P6 be sure that P5 will not offer something like 98,0,1,0,1,0 ? Certainly a lot of commenters here think that is a reasonable offer for P5 to make, including you (later in your #45 comment). But if P6 thinks there is any chance that P7 or P9 will reject any offer less than 100, then P6 will conclude that there is some chance to be thrown overboard even if P6 votes in favor of P5’s offer. So, if P6 cannot be certain of P5’s offer being accepted, then P6 should vote in favor of every offer, including P1’s.

    You seem to hold contradictory opinions. On one hand, you say that P6 can be sure that P5 will offer 100 coins to one of the other pirates. But then you say that if P5 offers 98,0,1,0,1,0 that it is sure to be accepted.

    To me, there is enough uncertainty here that I think P6 (and P2,P3,P4) will just always vote yes, to minimize their chances of going overboard.

  50. John W

    I meant to reference comment #46 in my comment #49 (not #45).

  51. Keshav Srinivasan

    @nd I think there are many problems with your reasoning, but let’s start with this: why are you talking about 7 offering 98-0-0-2? He’d be better off offering 99-0-1-0. So 6 should offer 98-0-1-0-1. And thus 5 should offer 98-0-1-0-1-0. And 7 would be a fool to turn down 5’s offer, because if he does, then 6’s offer will be successful and 7 will get nothing.

    So there’s no point for 7 to demand high numbers of coins.
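For what it’s worth, the whole chain of offers can be generated by a short backward-induction sketch (my own code, not part of the puzzle; it assumes buying a vote costs one coin more than that pirate’s next-round payoff, since the pirates also like seeing others thrown overboard, and that ties among equally cheap voters are broken toward the more ferocious pirate):

```python
import math

def division(n):
    """Standard backward-induction split for n pirates, proposer first.
    Priorities: stay alive, then coins, then seeing others thrown overboard."""
    if n == 1:
        return [100]
    prev = division(n - 1)                 # payoffs if the proposer drowns
    votes_to_buy = math.ceil(n / 2) - 1    # majority minus the proposer's own vote
    # Buy the cheapest votes: a pirate must do strictly better than next round,
    # so each bought vote costs (next-round payoff) + 1.
    cheapest = sorted(range(n - 1), key=lambda i: prev[i])[:votes_to_buy]
    offer = [0] * n
    for i in cheapest:
        offer[i + 1] = prev[i] + 1
    offer[0] = 100 - sum(offer[1:])
    return offer

# division(3)  -> [99, 0, 1]
# division(10) -> [96, 0, 1, 0, 1, 0, 1, 0, 1, 0]
```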

  52. Keshav Srinivasan

    @Xan Why in the world would pirates 2-10 ever vote yes in the first place?

  53. Keshav Srinivasan

    @John W “How can P6 be sure that P5 will not offer something like 98,0,1,0,1,0 ? Certainly a lot of commenters here think that is a reasonable offer for P5 to make, including you (later in your #45 comment)”. Well, in the context of comment #89, we’re assuming that everyone is making a precommitment to reject any offer of less than 100 (unless rejection kills them). So P5 would be foolish to offer something like 98-0-1-0-1-0, because he would surely die. The only way for him to survive is to offer one of the remaining pirates all 100 gold coins, so that’s what he’s guaranteed to do.

    “You seem to hold contradictory opinions. On one hand, you say that P6 can be sure that P5 will offer 100 coins to one of the other pirates. But then you say that if P5 offers 98,0,1,0,1,0 that it is sure to be accepted.” Let’s keep two things separate: One thing is the assumption of comment #89, where people make credible precommitments to reject offers of less than 100 coins, even when such rejection is sub-optimal. In that context, P5 has to offer someone 100 coins.

    The other thing is Steve’s actual problem, which he’s clarified does not allow credible precommitments to do sub-optimal things. So in Steve’s actual problem, P5 will offer 98-0-1-0-1-0 and that’s guaranteed to be accepted.

    “To me, there is enough uncertainty here that I think P6 (and P2,P3,P4) will just always vote yes, to minimize their chances of going overboard.” There’s actually no uncertainty at all, so there’s no reason for P6 to vote yes.

  54. Capt. J Parker

    I think the gaping hole is the assumption that the weakest pirate – James – would ever need more than 1 coin to secure his yes vote. In Problem 4, the argument is that James must be offered 2 coins to guarantee a yes vote, since if he were offered only 1 coin he would be indifferent between taking that coin and letting the game move on to Problem 3. But James has a strong preference for stopping the game as long as he gets one coin, because by letting it proceed he runs some risk of getting nothing and has no chance of doing better than 1 coin. So the standard solution to game 4 is 99-0-1-0, but 99-0-0-1 is also a solution. The standard solution in Problem 10 is 96-0-1-0-1-0-1-0-1-0, but 96-0-1-0-1-0-1-0-0-1 is a solution too, and any combination that takes a coin away from one of the one-coin pirates in the standard solution and gives it to John the weakest pirate is also a solution.

  55. Capt. J Parker

    Sorry, Last line of comment 54 should be “James the weakest” not “John the weakest.”

  56. Keshav Srinivasan

    @Capt. J Parker 99-0-0-1 wouldn’t work. James would reject it, because then George would get thrown overboard and James could accept Howard’s offer of 99-0-1. He would prefer that, because remember these pirates like to see other pirates thrown overboard, at least if it doesn’t cost them anything.

  57. Roger

    If they were real human pirates, I think that James would be unlikely to accept 99-0-1 for Problem 3. The difference between 0 and 1 coin is not that much, so he would tell Howard: “Give me at least 20 coins, or I will vote against you.” Howard might think that James is bluffing, but he would rather keep 80 coins than risk getting thrown overboard.

    You could argue that this strategy of James is against the rules, but I would say that he is staying alive and getting more coins for himself, so it conforms to the rules.

  58. Xan

    My solution (48) reminds me of another interesting pair of problems.

    Problem 1. Consider a linear “beach” represented by the interval [0, 1]. Customers are distributed uniformly along [0, 1] and patronize the closest hot dog stand, breaking ties at random. Three hot dog vendors must decide where to locate on the beach. Taking the location of other vendors as given, a vendor’s goal is to maximize the number of customers who patronize his stand. Can you find an equilibrium?

    Problem 2. This is similar to Problem 1, but instead of a beach with vendors, we have a political ideology spectrum and voters. Consider a linear “political ideology spectrum” represented by the interval [0, 1]. Voters are distributed uniformly along [0, 1] and vote for the closest politician, breaking ties at random. Three politicians must decide where to locate on the ideology spectrum. Taking the location of other politicians as given, a politician’s goal is to maximize the chance of winning the election, which is accomplished by receiving the most votes (with ties broken randomly if multiple politicians receive the same measure). Can you find an equilibrium?

    It is well-known that with only two hot dog vendors or politicians, there is a unique equilibrium in which both vendors/politicians locate at 1/2. However, with three, it becomes more interesting. I believe there is actually no equilibrium for three vendors. But for three politicians, an equilibrium exists!
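A quick numerical check of the three-vendor case (my own sketch; the helper name `shares` is hypothetical): with all three vendors at 1/2 each serves a third of the beach, but a slight move to one side captures nearly half of it, so the symmetric profile is not an equilibrium.

```python
def shares(locations):
    """Market share of each vendor on the beach [0, 1]: uniformly distributed
    customers patronize the nearest stand; coincident vendors split evenly."""
    points = sorted(set(locations))
    segment = {}
    for k, p in enumerate(points):
        left = 0.0 if k == 0 else (p + points[k - 1]) / 2
        right = 1.0 if k == len(points) - 1 else (p + points[k + 1]) / 2
        segment[p] = right - left
    return [segment[p] / locations.count(p) for p in locations]

# All three vendors at 1/2: each serves 1/3 of the beach.
# But moving one vendor slightly left raises his share to almost 1/2,
# so the symmetric profile is not an equilibrium.
print(shares([0.5, 0.5, 0.5]))
print(shares([0.49, 0.5, 0.5]))
```

The same computation applies to the politicians, except that a politician cares only about having the *largest* share, not its size, which is why the two cases come apart.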

  59. Zachary

    I think this has been mentioned before (@Harold @Ken B), but perhaps the gaping hole in the conventional solution is it doesn’t distinguish between a three-pirate game and the sub-game where only P8, P9 and P10 remain, which is a situation that presupposes P1 and others could not make an acceptable offer (otherwise they would have in order to survive). Imagine programming this game such that, in the sub-games, the machine will have learned rules including [P1 could not make an acceptable offer] and [P2 could not make an acceptable offer despite knowing that P1 could not make an acceptable offer]. Is it suboptimal for P9 (or others) to veto P1’s proposal and let that logic play out? (What rules of the game does that violate? P9 survives to the next round, where new rules/info has been learned.) It seems to me the only way for P1 to ensure his own survival is not to propose anything and stall out the entire process.

  60. Xan

    To follow up on my solution (comment 48), I should probably answer Steve’s actual stated puzzles for today:

    1) In addition to the unique outcome with only two pirates remaining (Steve’s Problem 2), ANY outcome with more than two pirates is possible. For instance, the outcome where five pirates remain and pirate #2 gets all the gold is supported by equilibrium strategies in which everyone votes “no” in the first five rounds, and in the sixth round everyone votes yes if (0, 100, 0, 0, 0) is proposed, but no if any other allocation is proposed. Because there is no swing voter, an individual pirate cannot change the voting outcome at any stage leading to this outcome.

    Once we get to only two remaining pirates, though, the proposing pirate is a swing voter, so he can have it any way he wants, and thus there is just the (100, 0) outcome.

    2) I think comment #48 makes it clear what the problem is with the Google logic.

  61. Tom Stewart

    I think there is a restriction on the possible solution. So we know 96-0-1-0-1-0-1-0-1-0 is an equilibrium.

    This means that if the most ferocious pirate offers it, it will be accepted, and he will be happy with what he offered. Therefore, there is no equilibrium that gives Arlo more than 96, since he would have chosen that equilibrium over this one. Also, there is no equilibrium that gives him less than 96, since this equilibrium dominates that one for him, and he can in effect choose it by offering it.

    The only equilibria in this game give Arlo 96. Are there other offers he can make that distribute the single coins differently? I can’t see it. Offer a coin to someone who would be offered a coin by Bob and that’s a waste: it won’t buy their vote.

    So I’m pretty confused where the other equilibria come from.

  62. Keshav Srinivasan

    @Xan I don’t think your solution is what the problem intends at all. I think we’re assuming when a person votes, he votes for his preferred outcome. So people will vote to reject any deal that they don’t want. Any individual person might be indifferent if everyone else was indifferent. But I think the reasonable situation would be for everyone to try to influence the outcome of the vote as best they can.

  63. Ken B

    Keshav re 62
    Imagine all the outcomes are arranged in some order, as proposals. They must be voted on in this order. I am a pirate and my preferred choice comes in position j. In position i, less than j, is an equilibrium I dislike most, and in h, less than i, an equilibrium I prefer to i but not to j. How will I vote?

  64. dictum

    I actually have a pretty distinct memory of sitting in Landsburg’s 208 class 10 years ago and listening to someone offer up Xan’s solution as an alternative equilibrium to this problem. This seems like a nice example of a subgame perfect NE.

  65. Bob Murphy

    Wow this shows The Big Questions blog at its finest–cool puzzles posted by the host, and excellent discussion in the comments.

    At first, I was absolutely sure that Steve was alluding to the distinction between the unique (sic) subgame perfect equilibrium and plain old Nash equilibria. (I think that’s the hill where Ken B. staked his flag.) Then I was thinking Steve was being a bit sloppy when he was clarifying the acceptable strategies to Keshav, since it seemed Steve was insisting on subgame perfection.

    But then Xan blew my mind by pointing out that you can support weird outcomes even along the equilibrium path, so that even weird stuff can survive in a subgame perfect Nash equilibrium.

  66. Bob Murphy

    Xan and Ken B., do you endorse what I’m saying in #65? I want to make sure I summarized everything correctly.

  67. Zachary H

    To answer the two questions posed:

    1) The other possible outcomes are those in which Arlo’s proposal is rejected.

    2) What’s wrong with “Google’s” argument is this part of Problem 10: “Arlo, Charlie, Eeyore, George and Igor all vote yes, because if they vote no, Arlo will be thrown overboard and Bob will propose a division in which Arlo is thrown overboard while Charlie, Eeyore, George and Igor all get nothing.” They don’t all HAVE to vote yes, because it’s not necessarily the case that Bob’s subsequent proposal will offer nothing to Charlie, Eeyore, George and Igor, or that Bob’s proposal will be accepted. The reason Bob’s subsequent proposal will not necessarily be the same as his proposal in “Problem 9” is that Problem 9 is not equivalent to the situation in which Arlo has been thrown overboard: we have NEW INFORMATION that is not present in Problem 9. Namely, the remaining pirates now know that Arlo proposed the “Google” solution and it was rejected. This doesn’t bode well for anyone who continues to propose the “Google” solutions to Problems N<10, because the pirates no longer live in those particular information bubbles.

    Try as I might, I can't explain why it would violate the priorities of the other pirates to defect from how "Google" envisions the first vote, or why defecting somehow would *not* send the pirates into a new game that's different from Problem 9, or why that new information wouldn't matter. So I'm guessing that the other possible outcomes are those in which Arlo's proposal is rejected.

  68. Capt. J Parker

    Ok, I know this is weak, but here goes: If 96-0-1-0-1-0-1-0-1-0 is one solution, then the 8 weaker pirates should realize that their best deal is 1 coin, and they should not risk that 1 coin now against the possibility of a pirate overboard now and no coins later. So, any solution that gives one coin each to four of the eight weakest pirates is a solution. By similar reasoning, for the 9-pirate game any solution that gives one coin each to four of the weakest 7 pirates is a solution. So, in the 10-pirate game none of the 7 weakest pirates is assured of being offered a coin should the game move to 9 pirates, so any of the 8 weakest pirates offered a coin in the 10-pirate game should say yes.

  69. Xan

    @Bob Murphy (65 & 66): Yes. At first I got the Google-endorsed solution and could not see anything else that was subgame perfect, which is clearly the right solution concept here. And I thought, gosh, I sure hope the answer is *not* that I should be using a different solution concept. But I had faith because if you can’t trust a Landsburg logic puzzle, what can you trust? Then I remembered that I have been burned before on game theory questions that specifically involve voting (for example, see comment 58).

    The key is that, while individuals can affect the fraction of votes for a given alternative, they can’t necessarily change which alternative gets the *most* votes, and the latter is what they actually care about.

  70. Mike H

    Steve said:

    “A strategy consists of a full contingency plan for how to behave in any situation that could arise (i.e. how to vote on any possible plan that could be offered, and a plan to offer in any possible situation where one might be called on to do so). Each player takes each other player’s strategy as given.”

    So, can pirate I’s “full contingency plan” include “reject H’s offer, then if J also does, offer J 1 gold coin”?

    Note that: “The pirates cannot make credible precommitments to do anything against their own interests. Given distribution X, pirate B will cast a vote that serves his own interests, and Pirate A knows what that vote will be.”

    Such a “strategy” is certainly not “against I’s own interests”. If, by contrast, I’s plan is “…. offer J nothing”, it permits H to offer J a 99-0-1 split, which is the worst possible outcome for I when there are 3 pirates left.

    Question 1: Can pirate I commit to a particular distribution before votes have been taken on H’s distribution, as part of his “full contingency plan”?

    Question 2: Can pirates offer negative numbers of coins to other pirates? This is another way the logic of the standard answer could break down.

  71. Keshav Srinivasan

    @Ken B You vote yes on h if you think a majority of people would vote for i, and no otherwise. What’s your point?

  72. Keshav Srinivasan

    @Capt. J Parker “So, any solution that gives one coin each to four of the eight weakest pirates is a solution.” No, pirates 8 and 10 would reject it, because if 1’s offer fails, they can accept 2’s offer of 96-0-1-0-1-0-1-0-1. So they get to see a pirate thrown overboard, and they’re still guaranteed to get the same number of coins (1 coin).

    “By similar reasoning for the 9 pirate game any solution that gives one coin each to four of weakest 7 pirates is a solution.” No, 7 and 9 would reject it, because they would prefer to see 9 thrown overboard and then accept 8’s offer of 97-0-1-0-1-0-1-0.

  73. Ken B

    @Bob Murphy 66
    Yes, I think so. Xan’s solution is a lot like my “last pirate gets all” proposal.

    There is something important here about weakly dominated strategies too I think, but I can’t put my finger on it.

  74. Ken B

    @Keshav 71
    That the order of presentation matters and hence that there will not be a unique solution unless you can demonstrate the order in which proposals are presented. This raises the possibility that your solution would be voted down if further down the list is a NE that appeals more to at least one of the 5 yes voters in your proposal. Hole in your proof.

  75. Keshav Srinivasan

    @Ken B I don’t understand what you’re saying. There’s no way for any of the yes voters on P1’s proposal to get a better offer down the road. They’ll all get 0 if P1’s proposal fails, because P2’s offer will succeed. And the reason it will succeed is that there’s no way for any of the yes voters on P2’s offer to get a better offer down the road, because if P2’s offer fails then they’ll all get 0 because P3’s offer will succeed. Etc. What am I missing?

  76. Roger

    @Steve(83): 1) No, 2 pirates, 100-0 solution is good.
    2) 3 pirates, I disagreed with 99-0-1 yesterday.

    With 10 pirates, I think that the most obvious and rational strategy for pirates 5-10 is to vote against any deal that gives them less than 10 coins each. Pirate 1 will realize this, and offer them 10 each. If not, then he dies, and pirate 2 gets a similar choice. Watching a couple of pirates get thrown overboard can be very convincing. Pirates 5-10 will all get their 10+ coins.

    Following the Google-Landsburg solution only gets them 0 or 1 coin each. Explain to me how that is more rational than the above strategy, which gets them 10+ coins each?

    This whole puzzle is based on pirates anticipating what others will do, but you refuse to consider what seems like the most obvious scenario to me.
