Are You Smarter Than Google?

I’m not sure where this problem originated. I heard it first from John Conley, and have often assigned it to my classes. Google has used it to weed out job candidates. The answer that Google expects is the same answer I gave John Conley, and the answer I usually get from my best students. That answer is wrong. (Long-time readers might feel a sense of déjà vu.)

Can you get it right?

Here’s the problem:

Ten ferocious pirates have come into possession of 100 gold coins, which they must divide among themselves. The procedure for dividing them is as follows:

  1. The most ferocious pirate proposes a division (such as: “I’ll take 50, the second most ferocious pirate gets 10, and everyone else gets 5 apiece”.)
  2. The pirates vote on whether to accept the division. As long as half or more of the pirates vote yes, the division is accepted, the coins are divided, and the game is over.
  3. If, instead, fewer than half the pirates vote yes, the most ferocious pirate is thrown overboard. The second most ferocious pirate now becomes the most ferocious pirate and we return to Step One.

You should also know that each pirate enjoys seeing other pirates thrown overboard, but not as much as he enjoys receiving one additional coin. So, for example, any pirate, contemplating the following three outcomes, prefers A to B and B to C:

  A. I get six coins and another pirate is thrown overboard.
  B. I get six coins and no other pirate is thrown overboard.
  C. I get five coins and another pirate is thrown overboard.

Finally, you should know that each pirate’s last choice is being thrown overboard.

Each pirate devises a strategy that is designed to make himself as happy as possible, taking the other pirates’ strategies as given.

What happens?

I’ll reveal the correct answer a little later in the week.


116 Responses to “Are You Smarter Than Google?”


  1. 1 1 Jonatan

    Ok, so the first one will suggest:

    96 – 0 – 1 – 0 – 1 – 0 – 1 – 0 – 1 – 0

    His logic being that the 1’s will be happy to get at least 1, since the next most ferocious pirate will swap around and offer all the pirates who previously got 0 coins 1 instead, and all the pirates who previously got 1 coin 0 instead.

    But let’s say I’m one of the less ferocious pirates who is offered 1. I may just reject this meagre offer. Worst case scenario I lose 1 coin. Best case scenario, future most ferocious pirates will realize that the other pirates may reject low offers, fear for his life, and offer more. Maybe even much more. Seems like a gamble worth taking.

  2. 2 2 Mike

    I don’t understand what the division will be after the first pirate is thrown overboard.

    50 to the ferocious guy
    10 to the next.
    I now have 40 coins to divide between 7 pirates? Do we work in fractions?

  3. 3 3 PG

    90-0-1-0-2-0-3-0-4-0
    because the person offered 1 by the n-th Pirate will prefer the 1 by the n+2 pirate as two people will have been thrown overboard by that time

  4. 4 4 Tim Harford

    Let’s try backward induction. If only two pirates are left – the 9th and 10th most ferocious – then 9 proposes to keep all the coins and votes for his own division. His vote is 50% of the votes cast, which carries the motion.

    Pirate 10 is thus hoping for something better than zero coins and watching lots of other pirates thrown overboard.

    With pirates 8, 9 and 10, pirate 8 proposes 99 – 0 – 1. Pirate 10 prefers 1 coin to 0 coins and accepts.

    With 7, 8, 9, 10, pirate 7 proposes 98 – 0 – 0 – 2. Pirates 7 and 10 vote in favour and the motion passes.

    With 6, 7, 8, 9, 10, pirate 6 proposes 96 – 0 – 0 – 1 – 3. Pirates 6, 8, 9, 10 all vote in favour.

    Each iteration, as PG says, a pirate needs to be bribed with an extra coin, otherwise he will prefer to wait and see one more pirate thrown overboard.

    Working backwards along these lines I get to the following allocation.

    Pirate 1 proposes 79 – 0 – 0 – 0 – 0 – 2 – 5 – 6 – 8. Pirates 7, 8, 9, 10 all prefer this to their likely allocation if they wait.

    PG seems to have reached a similar conclusion. Normally I’d be pretty sure that the basic approach is right, but less sure that I’ve got the details right.

    However, given Steve Landsburg’s preamble, I’m very unsure about everything. Looking forward to reading other thoughts…

  5. 5 5 Ken

    The first pirate offers: 80, 0, 0, 0, 0, 0, 2, 4, 6, 8.

    Work backwards:

    If 2 pirates remain, pirate 9 offers 100, 0. One vote each, so pirate 10 gets nothing.

    If 3 pirates remain, pirate 8 offers 99, 0, 1. Pirate 10 takes 1 coin rather than throw 8 overboard and get nothing from 9.

    Similarly:

    4 pirates: 98, 0, 0, 2
    5 pirates: 96, 0, 0, 1, 3
    6 pirates: 94, 0, 0, 0, 2, 4
    7 pirates: 91, 0, 0, 0, 1, 3, 5
    8 pirates: 88, 0, 0, 0, 0, 2, 4, 6
    9 pirates: 84, 0, 0, 0, 0, 1, 3, 5, 7
    10 pirates: 80, 0, 0, 0, 0, 0, 2, 4, 6, 8

  6. 6 6 Joe

    The patterns with 0s appear ‘correct’ from bottom-up reasoning. But do they not break down as there are always offers that those lower down will prefer (as even 0 and seeing someone thrown in is preferable)? Every offer breaks down, and the pirates know this, so they agree to split it equally with 10 each?

  7. 7 7 Nathan

    I get the same as Jonatan (96 – 0 – 1 – 0 – 1 – 0 – 1 – 0 – 1 – 0)

    The reason why offering every other pirate just 1 is enough is because the next pirate will offer those pirates nothing and (in theory) that offer should be accepted by enough pirates that the n+2 offer will never happen.

    But is that the (already counterintuitive) answer that Steve says is wrong? In which case, what’s the twist?

    Is it adding in the uncertainty that J refers to?
    Is it something to do with making deals?

  8. 8 8 Nawaaz

    Agree with Jonatan on (96,0,1,0,1,0,1,0,1,0) – got there through backwards induction.

    Don’t agree with his last paragraph though: there is only one scenario; the second pirate will offer (0,96,0,1,0,1,0,1,0,1) and it will be voted through (2, 4, 6, 8 and 10 will vote for it), and this will happen because the third pirate … etc! Backwards induction is the best way to demonstrate this i.e. choose the best strategy for each pirate working back from the point where pirate 10 keeps 100 gold.

    Further, the preference relation outlined must mean it is this way (assuming enjoyment of watching pirates thrown overboard = epsilon, where n*epsilon < 1 i.e. it is just a tie breaking rule).
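
    To make that tie-breaking assumption concrete (my own notation; nothing in the post pins down a particular utility function): give each surviving pirate the payoff $g_i + \varepsilon d_i$, where $g_i$ is his gold, $d_i$ is the number of pirates he watches go overboard, and $0 < 10\varepsilon < 1$, with being thrown overboard worse than any of these outcomes. Then one extra coin always outweighs any number of drownings, exactly as in the post’s A-to-B-to-C ranking, and the drowning term only ever breaks ties.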

  9. 9 9 Jonatan

    @PG

    But if you reject an offer for getting less than 2/3/4, you risk that the second most ferocious pirate’s offer will be accepted, in which case you gain 0. You can’t be certain that the n+2 pirate’s offer will ever be offered to you.

    You can of course take your chances and reject a low offer anyway, accepting the uncertainty.

    An actual solution to the puzzle would have to include information about how much pirates value coins compared to their lives, and how risk averse they are, and whether they are aware of these preferences for each other.

    If we assume that pirates value their lives infinitely more than coins, then I believe the solution to the puzzle is as follows:

    The most ferocious pirate takes all 100 coins. The 7-10th most ferocious pirates reject the offer because their lives can never be in danger. The 2-6th most ferocious pirates accept the offer, because they will increase the risk of losing their lives by rejecting. And thus the offer is accepted.

  10. 10 10 Jonatan

    @Nawaaz

    But maybe the second pirate’s offer will be rejected by pirate 8 or 10. Because P8 and P10 know that they can’t die, and if they reject, maybe the next pirate(s) will realize that this sort of distribution doesn’t work and offer them much more than 1.

    That’s certainly what I would do. Take my chances and maybe get 20 coins instead of this lousy 1.

  11. 11 11 Joe

    Pirates 2,4,6,8 and 10 would prefer anything (other than being thrown out of the boat) to 96,0,1,0,1,0,1,0,1,0. How would it get through?

  12. 12 12 Steve Landsburg

    Commenters have asked whether pirates value their lives more than anything. The answer is yes. I’ve added a line to the post saying that each pirate’s last choice is being thrown overboard.

  13. 13 13 Nawaaz

    @Jonatan: not sure I see your logic. If P8 and P10 reject, P3 will offer (0, 0, 97, 0, 1, 0, 1, 0, 1, 0) which will be accepted by P3, P5, P7 and P9 (enough to pass the vote). Much easier to see via backwards induction and you’ll see no need to take into consideration risk aversion. If you go through this logic, I’m sure you’ll agree:

    If all offers are rejected and all but P10 is killed, P10 will definitely keep all 100 coins, call this (0,0,0,0,0,0,0,0,0,100) – there is only him left to vote for himself.

    Aware of this, if all offers are rejected and all but P9 and P10 are killed, P9 will keep all 100 coins, i.e. (0,0,0,0,0,0,0,0,100,0), as he can vote for himself and the vote will pass. This is his best response.

    Aware of this, if all offers are rejected and all but P8, P9 and P10 are killed, P8 needs to pay off one pirate to pass his vote. The cheapest pirate to pay off is P10, as P10 KNOWS that he will get no coins if P8’s allocation is rejected, since P9’s allocation then passes on his own single vote. Thus P8 can offer (0,0,0,0,0,0,0,99,0,1) and be sure that this will pass. Note the certainty of this passing is important.

    Aware of this, if all offers are rejected and all but P7, P8, P9 and P10 are killed, P7 needs to pay off only one pirate to pass his vote (2 are needed, himself and one other). The cheapest pirate to pay off is P9, as P9 KNOWS that he will get no coins if P7’s allocation is rejected. Thus P7 can offer (0,0,0,0,0,0,99,0,1,0) and be sure that this will pass. Note the certainty of this passing is important.

    … continue on, finding the minimum each pirate needs to pay off the other pirates to get at least 50% of the pirates to agree. There is a shortcut; notice that each pirate, working backwards, needs only to reverse the 0’s and 1’s, and take what’s left over from the next pirate for himself. Thus we end up with your solution (and mine (and the correct one I believe)): (96,0,1,0,1,0,1,0,1,0).

    No need for risk preferences, as each knows the next pirate will need to offer some passable allocation so as not to die, and given this restriction will maximise their payoffs. The beauty of these sequential, full-information games is that, given no player is choosing between equal payoffs, there is always a unique “subgame perfect Nash” equilibrium (sorry for the jargon!). But then again, Landsburg may have something up his sleeve … perhaps he’s considering all NE, which will give me a headache.
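
    For anyone who wants to check the induction mechanically, here is a rough sketch in Python (my own code, nothing from the post; it assumes indivisible coins, ties going to the ayes, and that a pirate votes aye only when offered strictly more than his fallback, so the epsilon-sized taste for drownings settles exact ties as a nay):

    def optimal_proposal(num_pirates, coins=100):
        # fallback = the split that would be accepted if the current proposal fails
        # (index 0 is the pirate who would propose next)
        fallback = [coins]                          # one pirate left: he keeps everything
        for n in range(2, num_pirates + 1):
            to_buy = (n + 1) // 2 - 1               # ayes needed besides the proposer's own vote
            cheapest = sorted(range(n - 1), key=lambda i: fallback[i])[:to_buy]
            tail = [fallback[i] + 1 if i in cheapest else 0 for i in range(n - 1)]
            fallback = [coins - sum(tail)] + tail   # the proposer keeps whatever is left
        return fallback

    for n in range(1, 11):
        print(n, optimal_proposal(n))
    # the last line printed is: 10 [96, 0, 1, 0, 1, 0, 1, 0, 1, 0]

    (Changing the acceptance test to require strictly more than half the votes gives the variant where ties go to the nays instead.)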

  14. 14 14 Jonatan

    @Nawaaz

    I don’t think P3 would offer that. Put yourself in P3’s shoes / wooden legs. He has just seen P2 making an offer using that same logic as he is about to apply, and P2 got thrown overboard. Then P3 would clearly be afraid that the same will happen to him. And therefore he would offer the remaining pirates more, possibly much more, in order to save his life.

  15. 15 15 David Pinto

    The MFP takes one, and everyone else gets 11. The MFP lives, still MF, and robs the rest of them in their sleep.

  16. 16 16 Joe

    My above point was numbskull as you only need 50% for acceptance, not rejection. But it feels like there might be mechanisms that always break down the offers. Particularly as they can collude.

  17. 17 17 Keshav Srinivasan

    @Tim Harford and Ken: With 7, 8, 9, 10, why wouldn’t 7 offer 99-0-1-0?

  18. 18 18 Steve Landsburg

    Joe (#16):

    Particularly as they can collude.

    It is specified in the problem that each pirate takes the others’ strategies as given. This is meant to rule out collusion.

  19. 19 19 Joe

    Understood, then I can’t get away from the 96-0-1… answer. Which I am assuming is the incorrect “correct” answer.

  20. 20 20 Ally

    At first I thought the solution was the 96 – 0 – 1 – 0 – 1 – 0 – 1 – 0 – 1 – 0 one proposed by Jonatan (#1), Nathan (#7) & Nawaaz (#8), but that doesn’t take into account the pirates’ enjoyment of seeing other pirates thrown overboard (i.e. this is the answer you would come up with if this statement didn’t appear in the question wording).

    With the preference for seeing pirates thrown overboard (albeit a less strong preference than gold coins) explicitly stated in the question, I am inclined to agree with Ken (#5) and with the logic behind Tim Harford’s (#4) solution (I think there’s a typo in Tim’s answer):

    80 – 0 – 0 – 0 – 0 – 0 – 2 – 4 – 6 – 8

    As the others, I arrived at this via reverse induction:

    The 9th and 10th pirate’s lives are never in danger, since if it came down to them they would each hold 50% of the voting power.

    The 9th pirate, if it came to him to make an offer, could keep all 100 coins for himself (since with only 2 pirates left he can’t lose a vote).

    The 10th pirate would therefore accept an offer of just 1 coin from the 8th pirate.

    The 8th pirate, recognising this, would therefore offer:
    0-0-0-0-0-0-0-99-0-1

    2 pirates vote in favour (8 & 10), 1 against (9), the motion carries.

    The only way the 7th pirate could keep himself from being thrown overboard would be by bargaining an extra coin to one of the other pirates, so he could offer:
    0-0-0-0-0-0-98-0-0-2

    2 pirates vote in favour (7 & 10), 2 against (8 & 9), the motion carries.

    Similarly, the only way the 6th pirate could keep himself from being thrown overboard would be by bargaining an extra coin to two of the other pirates, so he would offer:
    0-0-0-0-0-96-0-0-1-3

    3 pirates vote in favour (6, 9 & 10), 2 against (7 & 8), the motion carries.

    Following the same logic, the 5th pirate would offer:
    0-0-0-0-94-0-0-0-2-4

    3 pirates vote in favour (5, 9 & 10), 3 against (6, 7 & 8), the motion carries.

    The 4th would offer:
    0-0-0-91-0-0-0-1-3-5

    4 pirates vote in favour (4, 8, 9 & 10), 3 against (5, 6 & 7), the motion carries.

    The 3rd would offer:
    0-0-88-0-0-0-0-2-4-6

    4 pirates vote in favour (3, 8, 9 & 10), 4 against (4, 5, 6 & 7), the motion carries.

    The 2nd would offer:
    0-84-0-0-0-0-1-3-5-7

    5 pirates vote in favour (2, 7, 8, 9 & 10), 4 against (3, 4, 5 & 6), the motion carries.

    The 1st pirate will therefore offer:
    80-0-0-0-0-0-2-4-6-8

    5 pirates vote in favour (1, 7, 8, 9 & 10), 5 against (2, 3, 4, 5 & 6), the motion carries and no-one gets thrown overboard.

  21. 21 21 John Hall

    Based on Pete Leeson’s research, the actual outcome of pirates would have been something like
    19-9-9-9-9-9-9-9-9-9
    where 19 is the Captain (rounded up for his double share).

  22. 22 22 Ken

    Although the various 96,0,1,0,1,0,1,0,1,0 solutions look convincing, I still prefer my answer in comment 5, (which is almost identical to comment number 4).

    If pirate 1 makes this offer, pirate 2 can say “vote against and I’ll offer 84,0,0,0,0,1,3,5,7”.

    This would result in pirates 2, 8, 9, 10 increasing their coins, while pirates 4, 6 and 7 would receive the same coins but enjoy the spectacle of pirate 1’s demise. Pirate 1 is overboard 7 votes to 3.

    I’m sure I’ve missed something though!

    Ken Hendrickson.

  23. 23 23 Alan Wexelblat

    I’m amused to read others’ answers and look forward to your post on right and wrong answers. It’s also worth noting that Google has determined problems like these to be unhelpful in selecting employees and has discarded that portion of the application process. I’m not sure if anyone else has done data analysis of the usefulness of puzzle-solving skills, but I feel certain they’re good for something other than entertainment value.

  24. 24 24 Ken

    Damn, the last 2 lines in comment 21 should refer to pirate 1’s demise and pirate 1 being overboard.

    Sigh.

    Note: I’ve fixed this — SL

  25. 25 25 Sub Specie Æternitatis

    This is a nice puzzle. It reminds me of my favorite puzzle–strangely also concerning rational pirates and the division of gold–but, rather than asking about the outcome of a given mechanism, it asks you to design a mechanism that ensures the optimal outcome.

    @Nawaaz’s argument by backward induction appears correct. It will be amusing to see it either confirmed or an error pointed out.

    @Jonatan’s error is the assumption that $P_{k+1}$’s offer was rejected. It is true that $P_{k}$ only gets to make a choice if $P_{k+1}$’s offer was rejected, but $P_{k+1}$ of course chose his offer so as to ensure it is not rejected. By the same logic, $P_{k+1}$ never got to choose because $P_{k+2}$ would have done the same–if at all possible and in the ties-go-to-aye version (but not the more regular procedure, which at first I assumed)–that is always the case. So $P_{n}$ always prevails.

  26. 26 26 Gary

    When I’ve seen a variation of this puzzle before, offers were always made by the most senior pirate. It was assumed that there was a clear seniority ordering that all the pirates understood. Are the same assumptions true about most ferocious pirate in that there is a pre-determined ordering that all pirates understand, or is “most ferocious” determined by which pirate is ferocious enough to make the next offer, and the pirates themselves do not know who will go next?

  27. 27 27 Jim W K

    As soon as I started to read Steve’s post I thought the puzzle was going to be the Sailors & Gold Coins problem in Joseph Conrad’s novel Typhoon. It goes like this. A number of sailors store gold coins in private boxes kept in the ship’s safe. The ship hits stormy weather, the boxes break open, and the coins are hopelessly mixed. Each sailor knows how many coins he started with, but nobody knows what anybody else started with. The captain’s problem is to return the correct number of coins to each sailor. How should the captain solve the problem?

    The answer (I might as well tell you, because anyone can Google it) is this. Have each sailor write down the number of coins he is entitled to. Collect the papers and distribute the coins. But announce in advance that if the numbers on the papers don’t add up to the correct total, he will throw all the coins overboard.

  28. 28 28 Keshav Srinivasan

    @Ally Why wouldn’t the 7th pirate offer 0-0-0-0-0-0-99-0-1-0?

  29. 29 29 Sub Specie Æternitatis

    I tried this inefficient, inelegant code:

    -- helpers (not shown): takeFirst is presumably head . filter, and splitNtoK m k
    -- presumably lists every way to write m as k non-negative integers
    step s = takeFirst (\t -> 2 * (length $ filter id $ zipWith (>) (tail t) s) + 1 >= n)
                       [p0:ps | p0 <- [g, g-1 ..], ps <- splitNtoK (g - p0) n]
      where g = sum s; n = length s
    iterate step [100]  -- successive accepted proposals for 1, 2, 3, … pirates

    It generates a list of accepted solutions for every number of pirates. As @Nawaaz predicted, they are:
    1: [100]
    2: [100,0]
    3: [99,0,1]
    4: [99,0,1,0]
    5: [98,0,1,0,1]
    6: [98,0,1,0,1,0]
    7: [97,0,1,0,1,0,1]
    8: [97,0,1,0,1,0,1,0]
    9: [96,0,1,0,1,0,1,0,1]
    10: [96,0,1,0,1,0,1,0,1,0]
    11: [95,0,1,0,1,0,1,0,1,0,1]
    12: [95,0,1,0,1,0,1,0,1,0,1,0]
    13: [94,0,1,0,1,0,1,0,1,0,1,0,1]
    14: [94,0,1,0,1,0,1,0,1,0,1,0,1,0]
    15: [93,0,1,0,1,0,1,0,1,0,1,0,1,0,1]
    16: [93,0,1,0,1,0,1,0,1,0,1,0,1,0,1,0]

    Note:
    (1) Each of these proposals will obtain at least half the votes as it is an improvement over the alternative (that is, the previous solution) for at least half the pirates.
    (2) Each of these proposals maximizes the gold received by the most ferocious pirate (i.e., all other potential proposals would either fail, resulting in the proposer's death, or result in no more gold for the proposer).
    (3) I assume, but have not proven, that there is only a single optimal solution at each stage. Otherwise, we would need to know about the pirates' preferences between states where they are equally alive, they have an equal amount of gold, and an equal number of pirates have been tossed overboard.
    (4) Note that for sufficiently many pirates (about 200), the most ferocious pirate will have to accept a negative amount (i.e., throw some of his other gold into the pot) or die.

  30. 30 30 Steve Landsburg

    Gary (#26): You should assume that all pirates know in advance who is the most ferocious, the second most ferocious, etc.

  31. 31 31 Nawaaz

    @ally and others: I don’t believe I have assumed away the enjoyment of seeing others go overboard, just that it’s less than one coin. Pirate 7 in your working is making a sub optimal choice – he is offering 2 coins for pirate 10 when he only needs to offer 1 coin to pirate nine who will accept because it’s better than watching pirate 7 be thrown and getting 0 in the next offer which we know is accepted. Thus as long as the enjoyment is less than one, every line of mine works AND is optimal for each pirate (that’s why I don’t prefer @ken’s soln). *writing not intended to be aggressive, only intended to be a rigorous reply (I’m still looking for Landsburg’s trick)

    @Jonatan 14: ah, an age old problem! What do you do if you are required to make an allocation, when, if all players were playing rationally, you wouldn’t have had to make an allocation? My response is a bit of a cop out but the question does say everyone acts in their best interest and I’m also taking that to mean that they’re aware everyone is acting in their best interests. As such, this problem never arises as it is a situation where some pirates are not playing their best responses. Again, assuming we are only looking at subgame perfect Nash, which, for the game theorists out there, is a subset of Nash, so my (and other supporters’) answer will still hold.

  32. 32 32 Keshav Srinivasan

    @Ken “If pirate 1 makes this offer, pirate 2 can say vote against and I’ll offer 84,0,0,0,0,1,3,5,7.” No, we’re assuming no communication.

  33. 33 33 Jonatan

    I’m going to try and be a little more clear.

    I will call the pirates P1… P10. Where P1 is the most ferocious and P10 is the least ferocious.

    Now, P1 may think that offering
    (96, 0, 1, 0, 1, 0, 1, 0, 1, 0)

    Is his best option.

    But:

    Claim 1.
    P9 could reject this offer, hoping that some future most ferocious pirate will offer him more. For this to happen it is important that claim 2 is true.

    Claim 2.
    If P2 has seen P1 apply the (96, 0, 1, 0, 1, 0, 1, 0, 1, 0) scheme, and saw that he was thrown overboard, P2 will realize that he should not make the same mistake and offer (96, 0, 1, 0, 1, 0, 1, 0, 1). Instead maybe he should offer something like (10, 0, 10, 10, 10, 10, 10, 10, 10), so that he can survive.

    Claim 3. P9, when making the decision to reject or accept, realizes that claim 2 is true, and he may get 10 coins instead of 1. So by rejecting, in worst case doesn’t get the 1 coin and in the best case he can get 10 coins or even more. Therefore he rejects.

    Claim 4. P1 realizes all this, and therefore does something different than (96, 0, 1, 0, 1, 0, 1, 0, 1, 0)

  34. 34 34 Keshav Srinivasan

    I would be absolutely shocked if the correct solution was something other than 96-0-1-0-1-0-1-0-1-0.

  35. 35 35 roystgnr

    “Worst case scenario I lose 1 coin. Best case scenario, future most ferocious pirates will realize that the other pirates may reject low offers, fear for his life, and offer more. Maybe even much more. Seems like a gamble worth taking.”

    Indeed. Rather than “taking the other pirates’ strategies as given”, perhaps we need to specify “taking the other pirates’ strategies as Nash equilibria for a single-iteration game”?

    Otherwise, consider the Ultimatum game. From single-iteration game theory we see that player 1 should offer to give $1 and keep $99; from human psychology we see that player 2 will then say “fuck you” and forgo $1 in order to watch that greedy bastard lose $99. If you try to figure this out via game theory alone then you end up in an endless Schelling “who gets to pre-pre-etc-commit first?” recursion, to which the only applicable answer is contingent, not mathematical: “evolution did”.

  36. 36 36 Nawaaz

    @subspecie, love that last comment! Alternatively we could adjust the pot of gold to less than 10 coins.

    Another implicit assumption I’ve just realised we’ve been making is the indivisibility of a coin, which I think is a fair reading of the post (am I wrong Landsburg?)

  37. 37 37 Nawaaz

    On another point, the backwards induction method shows us there can be no sustainable coalitions, so we don’t have to be concerned about “join me and I’ll promise you this” because there can be no punishments.

  38. 38 38 Steve Landsburg

    Nawaaz (#36): Yes, you should assume that coins are indivisible.

  39. 39 39 Sub Specie Æternitatis

    By the way, here is a set of solutions if ties went to the nays, rather than the ayes. More interesting numerical pattern:

    [100]
    [-1,101]
    [100,0,0]
    [98,0,1,1]
    [97,0,1,0,2]
    [96,0,1,2,1,0]
    [96,0,1,0,0,2,1]
    [95,0,1,0,1,1,0,2]
    [95,0,1,0,1,0,2,1,0]
    [94,0,1,0,1,0,1,0,2,1]
    [94,0,1,0,1,0,1,0,1,0,2]
    [93,0,1,0,1,0,1,0,1,2,1,0]
    [93,0,1,0,1,0,1,0,1,0,0,2,1]
    [92,0,1,0,1,0,1,0,1,0,1,1,0,2]
    [92,0,1,0,1,0,1,0,1,0,1,0,2,1,0]
    [91,0,1,0,1,0,1,0,1,0,1,0,1,0,2,1]
    [91,0,1,0,1,0,1,0,1,0,1,0,1,0,1,0,2]
    [90,0,1,0,1,0,1,0,1,0,1,0,1,0,1,2,1,0]

  40. 40 40 Phil

    For any division proposed by the most ferocious pirate, every other pirate votes no.

    In the end, only one pirate remains and gets the whole treasure.

  41. 41 41 GregS

    Pirate 1 is the most ferocious, Pirate 10 the least.
    Here is what I get working backwards from 1 pirate:

    1 pirate scenario: Pirate 10 only – Pirate 10 gets all 100 coins.

    2 pirate scenario: Pirates 9 and 10 only – Pirate 10 will vote to throw 9 overboard, even if 9 offers (0,100), because 10 gets to see a pirate thrown overboard and gets to keep all the coins anyway.

    3 Pirate scenario: Pirates 8, 9, and 10 – knowing the above, 9 will support any offer. 8 should offer 100,0,0.

    4 Pirate scenario: Pirates 7, 8, 9, and 10 – knowing the above, Pirate 8 will vote to throw overboard. 9 and 10, seeing they will get 0 in the next round, will support any positive offer. If 9 and 10 are offered 0 they will vote to throw overboard, because they get the same outcome plus they get to see a pirate thrown overboard. 7 should offer 98,0,1,1 (he needs to buy 2 votes in addition to his own).

    5 Pirate scenario: Pirates 6, 7, 8, 9, and 10 – knowing the above, Pirate 7 will vote to throw overboard. 8, 9 and 10 will demand the above offers plus 1 or they will vote to throw overboard. 6 should offer 97,0,1,2,0 or 97,0,1,0,2 (he only needs to buy 2 votes in addition to his own).

    I won’t share scenarios 6-10; I’ll hold back and see if someone else got the same answer. I get an answer like “Pirate 1 should offer 91,0,1,2,2,2,2,0,0,0.” The last 7 are some permutation of four 2s and three 0s. It will be interesting to see Landsburg’s solution. In scenarios 6-10, the last few pirates are looking forward to an expected payoff of coins, not a guaranteed number, in the next round. In a couple of these scenarios, the expected value is not an integer, so I wasn’t sure how to treat this. I assumed that the pirates preferred any fractional number of coins to seeing a pirate thrown overboard, even though the problem only specifies that they prefer 1 coin to chucking a pirate. Also, in the 2 pirate scenario, does Pirate 10 manage to throw Pirate 9 overboard, even though we specified that 9 is fiercer? Not to lube up a good logic puzzle, but that bothered me.

  42. 42 42 Keshav Srinivasan

    @Jonatan I really want to understand your thinking. Can you please tell me the earliest statement that you disagree with?

    1. 9 will offer 100-0 and not be rejected.
    2. 8 will offer 99-0-1 and not be rejected.
    3. 7 will offer 99-0-1-0 and not be rejected
    4. 6 will offer 98-0-1-0-1 and not be rejected.
    5. 5 will offer 98-0-1-0-1-0 and not be rejected.
    6. 4 will offer 97-0-1-0-1-0-1 and not be rejected
    7. 3 will offer 97-0-1-0-1-0-1-0 and not be rejected.
    8. 2 will offer 96-0-1-0-1-0-1-0-1 and not be rejected.
    9. 1 will offer 96-0-1-0-1-0-1-0-1-0 and not be rejected.

    Now as far as your claim 2 goes, you’re assuming that there’s a chance that 1’s offer will be rejected. But if my reasoning is correct, then that’s impossible.
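
    As a mechanical check of the nine offers listed above, here is a small sketch (my own code; it encodes the rule that a pirate votes aye only when offered strictly more gold than he would get from the next proposer’s offer):

    offers = [[100, 0],
              [99, 0, 1],
              [99, 0, 1, 0],
              [98, 0, 1, 0, 1],
              [98, 0, 1, 0, 1, 0],
              [97, 0, 1, 0, 1, 0, 1],
              [97, 0, 1, 0, 1, 0, 1, 0],
              [96, 0, 1, 0, 1, 0, 1, 0, 1],
              [96, 0, 1, 0, 1, 0, 1, 0, 1, 0]]
    # the two-pirate offer passes on the proposer's own vote; check the rest against their fallbacks
    for prev, cur in zip(offers, offers[1:]):
        ayes = 1 + sum(c > p for c, p in zip(cur[1:], prev))   # proposer plus every strictly-improved pirate
        assert 2 * ayes >= len(cur), cur                       # "half or more" carries the motion
    print("none of the offers is rejected")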

  43. 43 43 Keshav Srinivasan

    @Phil No, your solution cannot possibly be correct, because if it’s down to two pirates then there’s nothing the 10th pirate can do to make the 9th pirate go overboard.

  44. 44 44 Keshav Srinivasan

    @GregS “Pirate 10 will vote to throw 9 overboard, even if 9 offers (0,100), because 10 gets to see a pirate thrown overboard and gets to keep all the coins anyway.” Everything in that sentence is correct, but what you’re neglecting is that 10’s vote doesn’t matter; as long as half or more of the pirates accept the offer, then the offer is approved. So as long as 9 supports his own offer, his offer is approved. 10 can’t do anything about it.

  45. 45 45 Sub Specie Æternitatis

    I think this is a formal proof of @Nawaaz’s solution:

    Let the number of pirates be n.

    Let a proposal be a sequence of the form ${p_n,p_{n-1},…,p_1}$ with the condition that all $p_k$ be non-negative integers and $\sum p_k=100$.

    Let an acceptable proposal be a proposal that at least half of the n pirates will vote for.

    Does an acceptable proposal always exist? Not clearly. But if we allow $p_n$ to be negative (representing an unlimited kitty of other coins that most ferocious pirate could dip into to ensure that the proposal is accepted), then there must be at least one acceptable proposal for the most ferocious pirate could always offer some quantity of gold larger than what any pirate would otherwise receive.

    Let an optimal acceptable proposal for every n be one which leaves the most ferocious pirate with at least as much gold as any other acceptable proposal.

    Is there a unique optimal acceptable proposal for every n? Not clear in the general case. I believe so, if ties go to the ayes, but not if ties go to the nays.

    But if there were multiple optimal acceptable proposals, the problem would be ill-defined. One would not be able to predict from the rules which proposal the most ferocious pirate would pick. Assuming that the problem is well-defined, there must be a unique optimal acceptable proposal for every n.

    Let $s_n$ be that unique optimal acceptable proposal for n. This will be the definite outcome if the number of pirates ever reaches n.

    Now the question of which proposals are acceptable becomes tractable. Any pirate will vote for any proposal if and only if that proposal leaves the pirate with more gold than the definite, defined alternative $s_{n-1}$. So to determine $s_n$ all we need is to find the proposal leaving the most ferocious pirate with the most gold that will also grant at least half the pirates at least one more coin than under $s_{n-1}$.

    From that @Nawaaz’s solution follows immediately.
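
    In symbols (my own restatement of the recursion above, keeping the convention that $p_n$ is the proposer’s share and allowing $p_n<0$ as before): write $s_{n-1}=(q_{n-1},\dots,q_1)$ for the definite outcome with $n-1$ pirates. Then pirate $k\le n-1$ votes aye on a proposal $(p_n,\dots,p_1)$ iff $p_k>q_k$, and $s_n$ is the proposal that maximizes $p_n$ subject to $1+\#\{k\le n-1: p_k>q_k\}\ge\lceil n/2\rceil$ and $\sum_k p_k=100$, with each $p_k$ ($k<n$) a non-negative integer.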

  46. 46 46 GregS

    @ Keshav Srinivasan
    D’oh! I’ve solved a puzzle other than the one posted. I misread the problem and thought that it only took 50% to *reject* the offer.

  47. 47 47 Jonatan

    @ Keshav

    What if you think about it like this:

    8b. P2 will offer 96-0-1-0-1-0-1-0-1 and not be rejected.
    Given that P1 offered 96-0-1-0-1-0-1-0-1-0 and was rejected.

    Do you agree that this is not necessarily true? (P2 will realize that he is applying the same logic as P1 did, and this did not work out for P1. Therefore P2 may offer something else.)

  48. 48 48 Keshav Srinivasan

    “P2 will realize that he is applying the same logic as P1 did, and this did not work out for P1. Therefore P2 may offer something else”. But that assumes that it’s possible for P1 to be rejected. If my logic is valid, then it’s impossible for P1 to be rejected. I agree that if it’s possible for P1 to be rejected, then my statement 8 is incorrect. But it’s not possible.

    In any case, I asked you what the earliest statement you disagree with is. Is 8 the earliest statement you disagree with, or is there an even earlier statement?

  49. 49 49 Keshav Srinivasan

    @Jonatan Even if you think my statement 8 is not necessarily true, do you at least agree that it follows logically from statements 1-7?

  50. 50 50 Jonatan

    Right.

    So now let’s look at (9), and at P9. He is offered 1 coin. If he accepts he gets 1 coin. If he rejects we go to (8b). And in (8b) it is not clear what he will be offered. The offer will not necessarily be (96-0-1-0-1-0-1-0-1). It could be way more. Maybe he gets 10 coins. Therefore P9 rejects this offer. And thus (9) is not true.

    I’m not entirely sure what the first one I disagree with is.

  51. 51 51 Jonatan

    Yes, I can see the logical argument for all the statements. This is the line of thought I used to arrive at (96 0 1 0 1 0 1 0 1 0) back in comment #1. I just don’t think it’s correct. I think a rational pirate #9 would also see this logical statement, but realize that he can invalidate this method of making offers, and thereby basically change the game. And this would be at low cost to himself, at a maximum of 1 coin.

  52. 52 52 Keshav Srinivasan

    @Jonatan “The offer will not necessarily be (96-0-1-0-1-0-1-0-1). It could be way more. Maybe he gets 10 coins.” But if 7 is true, then the offer will necessarily be 96-0-1-0-1-0-1-0-1. So by saying that the offer will not necessarily be 96-0-1-0-1-0-1-0-1, then you are either saying that 7 is not true, or you are saying that 7 doesn’t imply 8. Which of those two things are you saying?

  53. 53 53 Keshav Srinivasan

    “I think a rational pirate #9 would also see this logical statement, but realize that he can invalidate this method of making offers, and thereby basically change the game.” Well, if P9 votes no when P1 makes his offer, then assuming statement 7 is correct, then statement 8 is correct, so the offer that P2 will make will necessarily be 96-0-1-0-1-0-1-0-1 and that offer will not be rejected. So assuming statement 7 is true, P9’s rejection of P1’s offer will lose him a coin and cannot possibly do anything else.

    So by saying that there’s another possibility, you are asserting that statement 7 is false.

  54. 54 54 Jonatan

    I don’t think the offer P2 will make will necessarily be 96-0-1-0-1-0-1-0-1. Admittedly, the recursive logic seems to say that this makes sense. But on the other hand P2 just saw P1 making an offer using the exact same logic, and he is in the ocean now. It seems rather insensible to try the exact same thing as P1, and run a large risk of joining him.

    (I read it as that you agreed to this back in post #48.)

    Yes, I think statement 7 is false, and also that 7 doesn’t imply 8.

  55. 55 55 Keshav Srinivasan

    @Jonatan By the way won’t your logic work just as well (or as poorly) in statements 2 and 3 and in statements 8 and 9? Can’t you conceive of a scenario 2b, where P8 realizes that P7’s similar offer in 3 has been rejected, and thus P8 decides to offer 80-0-20 or something. And knowing that 2b is possible, P9 rejects P7’s offer of 99-0-1-0. And anticipating all that, P7 makes an offer different from 99-0-1-0.

  56. 56 56 Keshav Srinivasan

    Typo: In comment 55 I meant “as in statements 8 and 9”, not “and in statements 8 and 9”.

  57. 57 57 Sub Specie Æternitatis

    @Jonatan I think you err in assuming that P2 saw P1’s 96-0-1-0-1-0-1-0-1-0 be rejected. It is true that this is the assumption we make in calculating P2’s offer.

    But that is merely a hypothetical. In fact, we calculate P1’s offer so that it would not be rejected. So P2 never gets a chance to make his offer. We only calculate P2’s offer in order to calculate how high P1 would have to go to beat it. Once P1 does, P2’s offer becomes inoperative.

  58. 58 58 Keshav Srinivasan

    @Sub Specie Aeternitatis Let me summarize Jonatan’s reasoning for you, even though I think his reasoning is fallacious. Jonatan is saying that P9 will reason as follows: “If P1’s offer of 96-0-1-0-1-0-1-0-1-0 fails, then P2 will realize that offering 96-0-1-0-1-0-1-0-1 would be a bad idea, so he will offer something where I get more than one gold coin. Therefore, it is in my interest to make P1’s offer fail.”

  59. 59 59 Sub Specie Æternitatis

    If there is a subtlety we have all been missing, it must be in the issue of whether Pk can make an acceptable offer. If Pk can, he will, for the alternative is death. But perhaps there is some situation in which he cannot?

    One such would be for n>200, but that is not the case here and I have been assuming that possibility away by postulating an unlimited kitty from which the most ferocious pirate could reluctantly borrow. But perhaps that induces other errors, so that a lower-ranked pirate might have a strategy to get an infinite amount of gold by extorting the most ferocious one.

    Let’s see.

    P10 can make an acceptable offer: all to P10.

    So P9 too can make an acceptable offer: all to P9.

    The next step is P8. Is the offer [99,0,1] acceptable? P8 votes aye and P9 votes nay. So it depends on P10’s vote. If P10 votes no, P9’s offer goes through and P10 gets nothing. Under P8’s offer, he gets 1 gold. So [99,0,1] will be supported by P8 and P10 and is hence acceptable.

    Then, P7. Is the offer [99,0,1,0] acceptable? P7 is aye and P8 and P10 are nay. What about P9? He gets one more gold than under the certain alternative of P8’s offer, so he votes aye and [99,0,1,0] is acceptable.

    Then, P6. Is the offer [98,0,1,0,1] acceptable? P6 is an aye and P7 and P9 are nays. Are P8 and P10 ayes? Compared to the definite alternative of P7’s offer, they should be.

    Where does this chain break down?

  60. 60 60 Keshav Srinivasan

    @Sub Specie Aeternitatis “If P10 votes no, P9′s offer goes through and P10 gets nothing. Under P8′s offer, he gets 1 gold. So [99,0,1] will be supported by P8 and P10 and is hence acceptable.” I would just like to play Devil’s advocate for a moment. What if P10 were to precommit to rejecting any offer that P9 makes which offers him less than 20 gold coins? Then wouldn’t the rational response to that be for P9 to make an offer other than 99-0-1, like 80-0-20? It’s like a game of chicken.

  61. 61 61 Keshav Srinivasan

    @Sub Specie Aeternitatis And we don’t even need to assume any communication. Even if P9 thought there was a possibility that P10 might be pursuing some kind of precommitment strategy like that, wouldn’t he make a better offer than 99-1-0 just in case?

    The solution may be sort of like this:
    http://wiki.lesswrong.com/wiki/Acausal_trade

    Again, I’m just playing Devil’s Advocate, just in case Steve has something wacky like that in mind.

  62. 62 62 Ken B

    Consider three pirates.
    Do we believe 99 0 1? I don’t. Then we have a problem, don’t we, at 5 pirates? 98 0 1 0 1 does not work, because the tail end pirate, #10, is denied the joys of seeing #6 tossed overboard. This makes me sceptical of any scheme where the last assenting pirate gets only 1.
    But with 3 pirates 0 1 99 is stable.
    With 4 pirates I get
    0 1 2 97

  63. 63 63 Keshav Srinivasan

    @Ken B Why don’t you believe 99,0,1? If P10 votes no on it, then he’s guaranteed to get nothing. So what is his incentive to vote no?

  64. 64 64 Keshav Srinivasan

    @Sub Specie Aeternitas I’m so sorry, in comments 60 and 61, I kept saying P9’s offer, but I meant P8’s offer.

  65. 65 65 Keshav Srinivasan

    @Sub Specie Aeternitatis Just so that there’s no confusion, let me rewrite comments 60 and 61.

    Comment 60:

    @Sub Specie Aeternitatis “If P10 votes no, P9′s offer goes through and P10 gets nothing. Under P8′s offer, he gets 1 gold. So [99,0,1] will be supported by P8 and P10 and is hence acceptable.” I would just like to play Devil’s advocate for a moment. What if P10 were to precommit to rejecting any offer that P8 makes which offers him less than 20 gold coins? Then wouldn’t the rational response to that be for P8 to make an offer other than 99-0-1, like 80-0-20? It’s like a game of chicken.

    Comment 61:

    @Sub Specie Aeternitatis And we don’t even need to assume any communication. Even if P8 thought there was a possibility that P10 might be pursuing some kind of precommitment strategy like that, wouldn’t he make a better offer than 99-1-0 just in case?

    The solution may be sort of like this:
    http://wiki.lesswrong.com/wiki/Acausal_trade

    Again, I’m just playing Devil’s Advocate, just in case Steve has something wacky like that in mind.

  66. 66 66 Sub Specie Æternitatis

    @Keshav I think you are right to focus on the case of P8’s 99-0-1 offer. P9 and P10 seem pretty easy and the difficulty arises in its simplest case for P8’s offer.

    The pre-commitment idea might work among people or where there is some enforcement mechanism, but if we are talking about purely rational actors optimising on a very specific goal and without an enforcement mechanism, I don’t think it would.

    Let’s say, before P8 makes his offer, P10 screams that he’ll accept no less than 20 gold and that if he votes for anything less, he’ll jump in the sea himself.

    Now what if P8 proceeds to offer 99-0-1 nonetheless? What will P10 do? If he votes nay, he gets 0 gold, we all agree. If he votes aye, he gets 1 gold, we all agree. He lives either way and the prospect of seeing any number of pirates drown, however pleasant, is worth less than 1 gold. Nothing forces him to follow up on his commitment. He does not care about the other pirates’ respect for honesty. So P10 votes aye.

    And because P8 knows that he’ll get P10’s vote for 99-0-1, that is what he’ll offer and we are back to our solution.

  67. 67 67 Ken B

    Keshav 63.
    I was unclear. All strategies must be presented at once up front is my assumption. So pirate 9 will have already filed the proposal “99 1” by the time the voting occurs on pirate 8’s proposal, should that proposal be “99 0 1”. (ie his fully specified set of strategies includes the statement ‘if there are three left and the proposal of 8 is “99 0 1” then my proposal is …’) And in that case a wise pirate 9 will not have filed [“99 1” in case of “99 0 1”]. Hence pirate 10 gets 1 plus the pleasure of an ejection by refusing that proposal of pirate 8.

  68. 68 68 Ken B

    arg, in 97, will have filed instead of “will not have filed”

    I am assuming that before the voting starts each pirate proposes a complete tree visible to all players indicating his play in the current situation, for all situations that might occur.
    Thus player 9’s filing may look like this as an example:
    [
    99 0 1: 99 1
    98 0 2: 98 2
    97 0 3: 97 3
    97 1 2: 98 2
    etc
    ]

  69. 69 69 Sub Specie Æternitatis

    One issue, beyond our host’s strong hint that our solution is wrong, is the parallel to the paradox of the unexpected hanging. There, exactly the sort of iterated argument about rationality leads to a manifestly incorrect result.

    With respect to that paradox, I always thought that the problem was in the hidden, internal inconsistency of the premises which led, as any contradiction in axioms will do, to false conclusions.

    But I can see no internal inconsistency in the premises here and posing an ill-defined problem would be rather mean.

  70. 70 70 Ken B

    @Keshav
    My mechanism relies on the fact Steve does specify “taking the other pirates’ strategies as given.”

  71. 71 71 Sub Specie Æternitatis

    @Ken B. If every pirate at the beginning of the game (or every round?) filed a complete strategy tree listing how they would respond to every possible offer in every round (or just this round?), and this filing was binding, and the proposer could see all the trees before proposing, you might very well be right.

    But there is no hint in the premise that anything like that happens and, if it did, it would constitute rather a different game with different outcomes.

  72. 72 72 Steve Landsburg

    Ken B and Sub Specie Aeternitas (and others to whom it is relevant): A strategy consists of a full contingency plan for how to behave in any situation that could arise (i.e. how to vote on any possible plan that could be offered, and a plan to offer in any possible situation where one might be called on to do so). Each player takes each other player’s strategy as given.

  73. 73 73 Sub Specie Æternitatis

    @Ken B I now see what phrase gave rise to your interpretation of the game.

    Yet, I read that phrase as only meaning that each pirate in devising his strategy would not affect the strategy of other pirates. Your almost completely inverse reading–that each pirate actually knows the exact strategy of each other pirate and that these strategies are binding, regardless of events–is not unreasonable.

    But it would make the picking of strategies often impossible. For if A is allowed to pick his strategy based on B’s known strategy and B is allowed to pick his strategy based on A’s known strategy, we would likely have endless cycling. Just imagine such a scheme for a simple game of rock-paper-scissors.

    If I recall correctly we can reach equilibrium, under broad conditions which may be fulfilled here, by allowing mixed strategies (i.e., those that do not absolutely commit, but rather commit to follow a later random variable in a pre-defined way). Perhaps that is the answer.

  74. 74 74 Sub Specie Æternitatis

    Ken B and I disagree about the meaning of the premise that “Each player takes each other player’s strategy as given”

    Does it mean that each player is actually given all the other players’ strategies, permitted to alter his strategy accordingly, triggering possibly further revisions in others’ strategy? In other words, does each player announce a strategy at the beginning and is then permitted to alter it in response to others’ announcements? And each player’s final chosen strategy becomes binding before the first proposal?

    Or does it mean that each player merely assumes that all the other players have strategies which are not affected by their choice of strategy?

  75. 75 75 Ken B

    @Steve 72
    Yes, I think we are saying the same thing. I said “I am assuming that before the voting starts each pirate proposes a complete tree visible to all players indicating his play in the current situation, for all situations that might occur.” Have I got this wrong somehow?

    So I am assuming that everyone can look at everyone’s tree at any time. My example pinches off just the last twigs of the tree to save space.

    This assumption has a consequence: if pirate 9 “promises” to pay pirate 10 he must actually do so, because the vote is on the filed strategy tree.

  76. 76 76 Keshav Srinivasan

    Steve, when you say “Each player takes each other player’s strategy as given.”, do you just mean that each player assumes that the other players’ strategies are fixed, or are you assuming that all the players announce their strategies to everyone right at the outset?

  77. 77 77 Keshav Srinivasan

    I suggest that everyone read this paper by Eliezer Yudkowsky et al.:

    http://arxiv.org/abs/1401.5577

    It’s about how to optimally choose your strategy in game theory problems where all strategies are publicly announced at the outset.

  78. 78 78 Keshav Srinivasan

    I think the problem would be simpler to think about if Steve had just said “assume no communication”, rather than talking about strategies being taken as given and all that.

  79. 79 79 Sub Specie Æternitatis

    Interesting paper, Prof. Srinivasan.

    Indeed, thinking of our pirates knowing each others’ strategies can be given useful meaning by imagining that they were robot pirates and each was given the source code of all the others before the game begins.

    Yet, we do not quite control the problem. Suppose one of the robot pirates was programmed to vote aye iff a matched robot pirate (say, the robot pirate with the negative of its index mod n) was programmed to vote nay in a certain situation, and assume that a matched pair of robot pirates had that same source code. If the situation arose, neither of the matched pair would terminate and hence they would fail to render a decision.

    So we cannot allow the general case of all strategies being permissible and be sure that the game will reach an end.

  80. 80 80 Sub Specie Æternitatis

    “I think the problem would be simpler to think about if Steve had just said “assume no communication”, rather than talking about strategies being taken as given and all that.”

    I agree. Or one could just say that the pirates are allowed to talk all they want, but are allowed to lie with impunity (i.e., no commitment mechanism). That is what I had assumed the problem meant.

  81. 81 81 Steve Landsburg

    Ken B (#75): If I understand you correctly, then we are saying the same thing.

  82. 82 82 Keshav Srinivasan

    @Sub Specie Aeternitatis The paper actually addresses your concern: “A natural-seeming strategy involves simulating the other agent to see what they do when given one’s own source code. Unfortunately, this leads to an infinite regress when two such agents are pitted against one another.” That’s the whole reason why provability logic is used.

  83. 83 83 Keshav Srinivasan

    Steve, is each person’s strategy a function of everyone else’s strategies, which are in turn dependent on their strategies? Or are you proposing something simpler than that? I found this problem much simpler before all this strategies talk.

  84. 84 84 Jonatan

    @ Sub specie æternitatis

    “This is nice puzzle. Reminds me of my favorite puzzle–strangely also concerning rational pirates and the division of gold”

    I’m curious about what this puzzle is.

  85. 85 85 Steve Landsburg

    Jonatan: I believe Sub specie is referring to a puzzle you can find in a book called The Armchair Economist.

  86. 86 86 Jonatan

    They were described as sailors though. But they might easily be of the subtype pirates. It is a little suspicious that they all have so much gold.

  87. 87 87 Ken B

    I think pirate 10 gets it all. Pirates 1 through 5 should vote for this on the first round, whilst pirate 10 hopes for some drownings first. But I cannot prove it yet.

  88. 88 88 Ken B

    The Good Puzzle Theorem suggests that in the opening round all pirates vote to give it all to pirate 10, except pirate 10 who votes against it!

  89. 89 89 Keshav Srinivasan

    OK, I tried solving it in the case where everyone uses precommitments to reject any offer that doesn’t give them all 100 coins (unless such a rejection will lead to their death).

    P9 offers 100-0
    P8 offers 0-0-100
    P7 can offer either 0-100-0-0 or 0-0-100-0
    P6 can’t make any acceptable offer.
    P5 can offer 0-0-100-0-0, 0-0-0-100-0, or 0-0-0-0-100
    P4 can’t make any acceptable offer.
    P3 can’t make any acceptable offer.
    P2 can’t make any acceptable offer.
    P1 can make any offer where all 100 coins go to either P5, P6, P7, P8, P9, or P10

    The reason for that last statement is that since P2, P3, and P4 will die if P1’s offer isn’t accepted, they’re automatic yes votes. So P1 already has four votes locked up. So all he needs to do is buy a fifth vote for 100 coins.

    Can someone verify my logic?

  90. 90 90 Keshav Srinivasan

    Actually, on further reflection P1 can’t offer 100 coins to P8, P9, or P10, because P5 would precommit to making the exact same offer to the same pirate, so that the pirate would reject P1’s offer and thus P5 would get to see more killings.

    So assuming the precommitment to demand 100 coins assumption is the correct way to think about the problem, I think P1 has to make an offer of 100 coins to either P5, P6, or P7.

  91. 91 91 Keshav Srinivasan

    Typo: Comment 89 should say “P5 can offer 0-0-0-100-0-0, 0-0-0-0-100-0, or 0-0-0-0-0-100”. I put too few zeroes in front.

  92. 92 92 Sub Specie Æternitatis

    @Jonatan I meant the puzzle subsequently described in comment 27 by Jim K W. He calls them sailors too. But in my mind, they were always pirates.

  93. 93 93 Jonatan

    Yeah. That is also the one from Armchair Economist. I sadly never got to think about that one as I just immediately read the next sentences.

    As a sailor I could do a kind of similar thing as I want pirate 9 to do. If I didn’t have a lot of coins, I could yell out “no matter what happens, I’m gonna write down 10 more coins than I had.” I may lose my coins, but I may also gain 10.

  94. 94 94 Jan Mikkelsen

    The initial most ferocious pirate wants to be as happy as possible, so will want to make an offer that is accepted. That way he doesn’t end up with maximal unhappiness in the sea.

    A first guess is then an offer of an even split, ten coins each. The question is whether that offer can be optimised further for the most ferocious pirate to get a few more coins.

  95. 95 95 Nawaaz

    OK, I’m going to take Landsburg’s bait and outline the full NE i.e. the full strategy profile from which no one can unilaterally deviate and get a higher payoff.

    The strategy profile is as follows, where Oi is the offer vector by pirate i to all pirates voting in order of ferocity, Oij is the amount offered to j by pirate i, and aij is the acceptance action of pirate j on pirate i’s offer, each associated with a rule; if the rule isn’t satisfied they reject the offer:*
    – P1: (O1=[96,0,1,0,1,0,1,0,1,0], a11 always)
    – P2: (a12 if O12 > 96, O2=[96,0,1,0,1,0,1,0,1], a22 always)
    – P3: (a13 if O13 > 0, a23 if O23 > 97, O3=[97,0,1,0,1,0,1,0], a33 always)
    – P4: (a14 if O14 > 1, a24 if O24 > 0, a34 if O34 > 97, O4=[97,0,1,0,1,0,1], a44 always)
    – P5: (a15 if O15 > 0, a25 if O25 > 1, a35 if O35 > 0, a45 if O45 > 98, O5=[98,0,1,0,1,0], a55 always)
    – P6: (a16 if O16 > 1, a26 if O26 > 0, a36 if O36 > 1, a46 if O46 > 0, a56 if O56 > 98, O6=[98,0,1,0,1], a66 always)
    – P7: (a17 if O17 > 0, a27 if O27 > 1, a37 if O37 > 0, a47 if O47 > 1, a57 if O57 > 0, a67 if O67 > 99, O7=[99,0,1,0], a77 always)
    – P8: (a18 if O18 > 1, a28 if O28 > 0, a38 if O38 > 1, a48 if O48 > 0, a58 if O58 > 1, a68 if O68 > 0, a78 if O78 > 99, O8=[99,0,1], a88 always)
    – P9: (a19 if O19 > 0, a29 if O29 > 1, a39 if O39 > 0, a49 if O49 > 1, a59 if O59 > 0, a69 if O69 > 1, a79 if O79 > 0, a89 if O89 > 100, O9=[100,0], a99 always)
    – P10: (a1x if O1x > 1, a2x if O2x > 0, a3x if O3x > 1, a4x if O4x > 0, a5x if O5x > 1, a6x if O6x > 0, a7x if O7x > 1, a8x if O8x > 0, a9x if O9x > 100, Ox=[100], axx always)

    *instead of writing 10 in the subscripts i’ve used x to avoid confusion between subscripts.

    This is the usual solution, where the first offer is accepted, and it is a NE. HOWEVER there may be other NE that aren’t subgame perfect (unlike this one, which I believe is the unique SPNE), involving “non-credible” threats. I think SPNE is a reasonable extension of rationality in this game.
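
    For anyone who wants to sanity-check the first round of that profile, here is a small sketch (my own code, just transcribing the a1j thresholds listed above):

    # index j of each list refers to pirate j+1; the thresholds are the a1j rules in the profile
    offer = [96, 0, 1, 0, 1, 0, 1, 0, 1, 0]                    # P1's proposal O1
    accept_if_more_than = [None, 96, 0, 1, 0, 1, 0, 1, 0, 1]   # P2..P10 accept iff O1j > threshold
    ayes = 1 + sum(offer[j] > accept_if_more_than[j] for j in range(1, 10))  # P1 is an automatic aye
    print(ayes, "ayes out of 10:", "passes" if 2 * ayes >= 10 else "fails")
    # prints: 5 ayes out of 10: passes   (P1, P3, P5, P7 and P9 vote aye)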

  96. 96 96 Keshav Srinivasan

    @Nawaaz Do you think what I outlined in comment 89 would correspond to one of those potential Nash equilibria with non-credible threats?

  97. 97 97 Nawaaz

    @Keshav: I see your logic, and yes I believe it is a NE. All one has to do to prove that it is a NE is to show that no person, given the other strategies, can change their strategy to make themselves better off. To make it easier I’ll just test your first NE, where the first strategy in the list of “or’s” is chosen (it holds for all your other NEs as well):
    – As P6-P10 all reject and the bill is still passed, accepting or rejecting doesn’t alter their payoffs, so they have no incentive to deviate.
    – As P2-P4 cannot possibly make a deal given the others’ strategies, they all want to vote aye, as 0 is preferred to death.
    – As no one among P5-P10 accepts unless they get 100 (using my notation, aij iff Oij = 100 or j is threatened by death, for all i and j), P1 must offer 100 to P5 to avoid certain death.
    – We can now consider P5 changing ANY part (or collection of parts) of their strategy. As he gets 100 though, he has no incentive to change, as that’s his highest payoff.
    QED

    So it is a NE and I suspect there are plenty more (perhaps hundreds) of NE for all sorts of threats i.e. accept offers greater than 10 (this is the most obvious starting point though).

  98. 98 98 Mike H

    I worked it out before looking at any of the comments, and got the same answer as #1 (Jonatan).

    Number the pirates 1,…,10, from least ferocious to most.

    If there’s 2 pirates, it’s easy. P2 keeps all the booty, gets half the votes and survives. P1 gets nothing.

    If there’s 3 pirates, P2 will vote against whatever P3 proposes. P3 must secure the vote of P1. He can do this by offering a single gold coin, giving nothing to P2.

    If there’s 4 pirates, he can secure P2’s vote with a single gold coin. That’s enough for him to survive, so he keeps the other 99 coins for himself. P1 and P3 get nothing.

    If there’s 5 pirates, he can secure the votes of P1 and P3 with a single coin each. That’s enough, so he keeps the other 98 coins for himself. P2, P4 get nothing.

    Inductively,

    If there’s an even number of pirates, the most ferocious can secure the votes of the other even numbered pirates with one gold coin each. That’s enough for him to survive, so he keeps the rest of the booty for himself. The odd-numbered pirates get nothing.

    If there’s an odd number of pirates, the most ferocious can secure the votes of the other odd numbered pirates with one gold coin each. That’s enough for him to survive, so he keeps the rest of the booty for himself. The even-numbered pirates get nothing.

    With 10 pirates, this means 96-0-1-0-1-0-1-0-1-0.
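
    A quick closed-form check of this induction (my own sketch, assuming each cheap same-parity vote costs exactly one gold coin, as argued above): with n pirates the proposer needs ceil(n/2) votes, supplies one himself, and buys the rest.

    from math import ceil

    def proposer_keep(n, coins=100):
        # the proposer supplies one of the ceil(n/2) votes needed and buys each of
        # the remaining ceil(n/2) - 1 cheap same-parity votes for one coin
        return coins - (ceil(n / 2) - 1)

    print(proposer_keep(10))   # 96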

    One might think P2 would like to drive the other even-numbered pirates to extinction, and claim the lion’s share of the booty for himself. To do that, he’d have to secure P1’s cooperation. P1, however, knows he’ll be betrayed at the end, and will vote for the first ferocious odd pirate who gets to propose a division. That could change if there were some credible way for P2 to assure P1 of cooperation, but there doesn’t seem to be one in this puzzle.

  99. 99 99 Mike H

    Do the pirates discuss their strategies, finally committing in a credible way either to one particular strategy from the set of all strategies?

    Or, if there’s no stable set of strategies they can commit to, do they commit credibly to choose a strategy from a probability distribution whose sample space is the set of all strategies?

    That would change the answer.

    Eg, if there are three pirates, and if 2 precommits to a strategy that includes “I will offer 0 to P1 when there are only two”, this guarantees that 2 will get 0. If 2 can precommit to give the least ferocious pirate a bigger share, then 3 and 2 will bid for his vote, and in the end the least ferocious pirate gets everything, and the most ferocious of the three is dumped in the drink.

  100. 100 100 Mike H

    … and now I’m not even sure of that. I don’t think there’s an equilibrium set of strategies for even 3 pirates, if they must precommit to the strategies beforehand.

  101. 101 101 Mike H

    For the simpler version, where there are three pirates and one coin, I know what will happen. 3 dies, and 1 gets the coin.

    Here, I’m assuming pirates must choose *at the start* how they’re going to respond to any situation, then not change their minds halfway through.

    I wrote a computer program to search all possible combinations of strategies – at least, after excluding strategies where pirates don’t vote for their own proposals, which are obviously dominated by strategies where they do.

    A strategy for the strongest pirate (P3) is “I give the coin to…”

    A strategy for the weakest (P1) is “how will I vote, given each possible offer from P3”

    A strategy for the middle pirate (P2) is “how will I vote, given each possible offer from P3” AND “who gets the coin if I get to choose”

    For P2, no matter what the others do, he’s better off rejecting any offer from P3, and offering the coin to P1.

    Given that, no matter what P3 offers, P1 should reject it, so he gets to enjoy seeing P3 swim with the sharks.

    Given that, it doesn’t matter what P3 offers.

    So, P3 will make an offer, P1 and P2 will reject it, then P2 gives the coin to P1.
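
    To give a sense of the search space (a minimal sketch of the setup only, not the full program):

    from itertools import product

    # Strategy space for the 3-pirate, 1-coin version, with every proposer
    # assumed to vote for his own proposal.
    P3_OFFERS = ["P1", "P2", "P3"]   # whom P3 hands the single coin to

    # P1: a yes/no vote for each of P3's three possible offers.
    p1_strategies = list(product([True, False], repeat=3))            # 8

    # P2: a yes/no vote for each of P3's offers, plus a choice of recipient
    # (P1 or himself) if P3 goes overboard and P2 gets to propose.
    p2_strategies = list(product(product([True, False], repeat=3),
                                 ["P1", "P2"]))                        # 16

    profiles = list(product(p1_strategies, p2_strategies, P3_OFFERS))
    print(len(profiles))   # 8 * 16 * 3 = 384 profiles to check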

  102. 102 102 John W

    Combining some of Jonatan’s ideas with the analysis in comment #89, what if P9 rejects any small offers (like 1 coin) because it might get down to P5 and P5 might offer P9 100 coins, plus P9 might get to enjoy seeing P1,P2,P3,P4 thrown overboard.

    By the way, we know that a pirate values 1 coin more than seeing 1 pirate thrown overboard. But how does he value 1 coin versus seeing, say, 4 pirates thrown overboard?

  103. 103 103 Jonatan

    @Keshav

    I put some numbers into your summary of my opinion:

    (1) P9 will reason as follows: (2) If P1’s offer of 96-0-1-0-1-0-1-0-1-0 fails, then P2 will realize that offering 96-0-1-0-1-0-1-0-1 would be a bad idea, (3) so he will [NB I would put ‘might’ here] offer something where I get more than one gold coin. (4) Therefore, it is in my interest to make P1’s offer fail.

    I wonder which of these you think is fallacious?

    (I would also add: (5) P1 realizes all this and offers something other than 96-0-1-0-1-0-1-0-1-0.)

  104. 104 104 Nawaaz

    @Jonatan: those are some fair points to raise, especially if we’re considering what would happen in reality. They are, however, at odds with this assumption from the question: “Each pirate devises a strategy that is designed to make himself as happy as possible, taking the other pirates’ strategies as given.” Your suggestion is that P9 will change his strategy hoping that P2 will change his, which isn’t the same as taking the other strategies as given (i.e. only considering unilateral deviations). So while it is a good point to make when choosing which equilibrium we think is the most realistic, it’s not an argument against this particular equilibrium as an answer to this particular question.

  105. 105 105 Ally

    Keshav Srinivasan (#28) & Nawaaz (#31): Good point! The 7th pirate could offer 0-0-0-0-0-0-99-0-1-0

    I’m now doubting my logic on this whole thing, argh!

  106. 106 106 Jonatan

    I think it’s a little tricky to interpret exactly what ‘taking other strategies as given’ means. What if they mutually depend on each other, or are unknowable with the information they have?

    Let me stress that I don’t think it will actually happen that P9 changes his strategy hoping that P2 will change his. I think that P2 would hypothetically change his strategy (from 96-0-1-0-1-0-1-0-1) if it ever became his turn to choose, because he knows the other pirates are capable of rejecting it, since he is taking the other pirates’ strategies as given. But it will probably never be P2’s turn to choose, since P1 will also realize all this to begin with, and offer something other than 96-0-1-0-1-0-1-0-1-0.

    But yes, if you think this line of thought from P1 is not allowed when ‘taking the other pirates’ strategies as given’, that is a disagreement I think is fully sensible.

  107. 107 107 Sub Specie Æternitatis

    @Jonatan Yes, I do recall reading The Armchair Economist with great pleasure and on many subsequent occasions giving additional copies as presents to friends.

    Unfortunately, that is the only thing I remember about The Armchair Economist. Not that I do not remember things which I read in it; I just don’t remember that this is where I read them. Obviously, that is where I must have first read the other pirate puzzle (unless I read it first in another book of puzzles).

    The greatest peril of this sort of selective memory is that a sufficiently dimly remembered idea can arise from one’s memory in a manner that is subjectively indistinguishable from coming up with the idea oneself.

    On several occasions, I shamefacedly recall, I’ve shared some of “my” ideas, only to be reminded (fortunately before I made a fool of myself in front of too large an audience) that somebody else has published these same ideas somewhere I must surely have read them. Hence, I speak and write in terror of committing unintentional plagiarism.

  108. 108 108 Sub Specie Æternitatis

    @Keshav Srinivasan I understood how the paper addresses the situation. It is, as I said, a good paper.

    Arguably, when the problem speaks of “strategies” it implies that only provably terminating strategies are valid. Other strategies, even if they do in fact terminate, would not be considered valid. But that is a heavy burden to place by implication on the word “strategy.”

  109. 109 109 Sub Specie Æternitatis

    @Keshav Srinivasan

    Perhaps one of the rules of the game should be that any pirate who cannot, within one minute of the exchange of strategies, provide a full list of his responses to every possible situation is thrown overboard immediately, before the game even begins.

    Would that rule encourage strategies (i.e., source code) that are so complex that they induce other strategies to become non-terminating, while themselves always terminating within one minute?

  110. 110 110 John W

    Relying on the logic in post #89, how about this:

    If there is a possibility that one or more of P5,P7,P8,P9,P10 will not vote for any offer that gives them less than 100 coins, then P2,P3,P4,P6 will vote yes every time — regardless of the offer — since they do not want to risk being thrown overboard. So P1 says that he gets 100 coins for himself, and P1,P2,P3,P4,P6 vote for it, and it passes with 50%.

  111. 111 111 Johannes

    80-0-0-0-0-0-2-4-6-8
    #10 knows he will get nothing from #9 so he accepts 1 from #8, 2 from #7 … 8 from #1
    #9 knows he will get nothing from #7 so he accepts 1 from #6… 6 from #1

  112. 112 112 Johannes

    Or: 90-0-1-0-2-0-3-0-4-0

  113. 113 113 Keshav Srinivasan

    @Johannes I think your logic doesn’t work. 10 would be foolish to demand three coins from 6. If 6 offers 98-0-1-0-1 and 10 rejects it, then 7 will offer 99-0-1-0 and it will be successful, so 10 would get nothing.

    By the way, what makes you think 9 will get nothing from 7?

  114. 114 114 lovos

    Starting backwards, with 2 pirates left: the last will always reject.
    Therefore the 2nd-to-last must accept an earlier offer, even if it is just 1.
    This means the 3rd-to-last pirate can offer 99-1-0. Pirate 2 is forced to accept. This also means that pirate 3 will reject any offer from pirate 4, because he stands to gain 99 if he does so.
    Etc. etc.
    2=:0, 1=:100 (always no)
    3=:99, 2=:1, 1=:0
    4=:97, 3=:0, 2=:2, 1=:1
    5=:97, 4=:0, 3=:1, 2=:0, 1=:2
    6=:96, 5=:0, 4=:1, 3=:2, 2=:1, 1=:0
    7=:96, 6=:0, 5=:1, 4=:2, 3=:0, 2=:0, 1=:1
    8=:95, 7=:0, 6=:1, 5=:2, 4=:0, 3=:1, 2=:1, 1=:0
    9=:95, 8=:0, 7=:1, 6=:2, 5=:0, 4=:1, 3=:0, 2=:0, 1=:1
    10=:94, 9=:0, 8=:1, 7=:2, 6=:0, 5=:1, 4=:0, 3=:1, 2=:1, 1=:0

  115. 115 115 lovos

    Amendment. I assumed the pirate that made the proposal does not have a right to vote. If he has a right to vote then, as already mentioned,
    99-0-1-0-1-0-1-0-1-0

  116. 116 116 Keshav Srinivasan

    @lovos It should be 96-0-1-0-1-0-1-0-1-0, since there are only 100 coins.
