## A Rock Paper Scissors Brainteaser

New submitter arsheive (609065) writes with a link to this interesting RPS brainteaser: "How do you play against an opponent who _must_ throw Rock 50% of the time, and how much would you be willing to pay to play against them?"

## 100% paper (Score:1)

## Re: (Score:2)

The opponent is forced to pick rock 50% of the time, but he has no limitations other than that. If you went 100% paper, you'd not beat my 50% rock by more than the tiny sliver of a percent it takes me to realize you are going 100% paper.

And the problem has nothing to do with someone just picking rock a lot, but with doing so in a very predictable manner. Otherwise we'd not be talking about someone picking rock 50% of the time, but about playing against someone who plays randomly but tells you what he is picking half the time.

## Re:100% paper (Score:5, Informative)

From TFA: "At the start of each round an independent judge flips a fair coin and tells your opponent the result but does not tell you. If the coin came up heads your opponent must play rock."

The opponent isn't forced to get at least 50% rock after any number of plays.

## Re: (Score:2)

A 100% paper strategy will win 50% of the time. Of the remaining 50% of games played (assuming an even distribution of the remaining picks), 25% will be losses and 25% will be ties.

No, you are describing another game where the opponent is forced to play rock 50% of the time, paper 25% of the time and scissors 25% of the time.

This game is different.

(I'm repeating myself from another post, but many people are making the same mistake.)

## For money, you mean? (Score:1)

With no such restriction, random choices on both sides lead to 33% win, 33% draw, 33% loss, right? With the opponent throwing Rock 50% of the time, assuming the other 50% is evenly divided between Paper and Scissors, if I always throw paper I'll win 50% of the time, lose 25% of the time, and draw 25% of the time.

So depending how the betting works, I'd be pretty willing.

## Re: (Score:3)

With no such restriction, random choices on both sides lead to 33% win, 33% draw, 33% loss, right? With the opponent throwing Rock 50% of the time, assuming the other 50% is evenly divided between Paper and Scissors, if I always throw paper I'll win 50% of the time, lose 25% of the time, and draw 25% of the time.

So depending how the betting works, I'd be pretty willing.

No, you are describing another game where the opponent is forced to play rock 50% of the time, paper 25% of the time and scissors 25% of the time.

This game is different.

## Play paper 50% of the time. (Score:1, Interesting)

Expect them to play scissors a lot to beat your paper. Play rock as often as they play scissors.

## put a spin on it (Score:4, Interesting)

Rock, paper, scissors, lizard, Spock!

- Scissors cut paper

- Paper covers rock

- Rock crushes lizard

- Lizard poisons Spock

- Spock smashes scissors

- Scissors decapitate lizard

- Lizard eats paper

- Paper disproves Spock

- Spock vaporizes rock

- Rock crushes scissors

## Re: (Score:2)

I'd argue that this wouldn't change the fundamental outcome of the experiment, since by definition no one selection has a fundamental advantage or disadvantage over any other. In either game, you make a decision, then have a 50/50 chance of winning the round.

Scissors beat paper and lizard; lose to rock and Spock (50% win, given even distribution). Paper beats rock and Spock; loses to scissors and lizard (50% win, given even distribution). Rock beats scissors and lizard; loses to paper and Spock (50% win, given even distribution).

## 50% paper? (Score:2)

I think the winning strategy is to randomly throw 50% paper to cover his rock. I'm just guessing though. No idea how much to pay.

## Impossible (Score:1)

## Re: (Score:1)

## Re: (Score:2)

## Re: (Score:2)

## Edward Scissorhands would hate this opponent (Score:2)

## 50% paper and 50% rock ? (Score:1)

Assuming we randomly use 50% paper and 50% rock, we get:

- on rounds he is forced to play rock, we get half victory and half tie, so 25% win, 25% tie so far

- on other rounds he can:

- always use scissors, which will turn into our 25% win, 25% lose - we win overall, 50% win, 25% tie, 25% lose

- always use rock, which will turn into our 25% tie, 25% win - we win overall, 50% win, 50% tie

- always use paper, which will turn into 25% win, 25% lose - we win overall, 50% win, 25% tie, 25% lose

## Re: (Score:2)

I think you have an error in there. His best strategy in your example would be to always use paper, in which case he's doing the exact same thing as you are, and so the odds are even.

I think your best strategy would have to involve reacting to his. Start with 100% paper until you see what he's throwing when he's not throwing rock. This gives you at least even odds regardless of what he does.

He'll probably start throwing scissors, at which point you switch to 50% rock 50% paper. This will h

## Re: (Score:2)

Yes, you are right - paper will be 25% tie, 25% lose, ending up with 50% tie, 25% win, 25% lose, so still purely random.

I don't think the strategy we are looking for involves "reactions" - those can always be defeated by an opponent who out-guesses you by one step. I would hope to find a strategy that leads you to win more than 50%, even if the opponent knows it (the strategy itself, not the result of the random choice at a given stage).

## Re: (Score:2)

How would the opponent overguess you? You can delay your strategy switch for as little or as much time as you like, since their best strategy only breaks even.

If they don't know what you're doing, you could rack up more wins by exploiting flaws in their strategy, and rack up even more wins when they "wise up" and try to counter you. I guess the difference between our answers is that I'm treating this as an actual game against another person, whereas you're looking for more a game theoretical optimum.

Now,

## 38% rock and 62% paper ? (Score:2)

Ok, my current guess would be around 1/3 rock and 2/3 paper. The opponent cannot replicate this strategy, because of the requirement of having at least 50% rock. If he goes 50% scissors, he will have 1/6 tie, 2/6 lose, 1/6 lose and 2/6 win, so 3:2 in my favor. With 50% paper, he gets 1/6 tie, 2/6 lose, 1/6 win, 2/6 tie, so 2:1 in my favor. 100% rock is an obvious loss. I think that mixing scissors and paper will be clearly worse than pure scissors on his part (because with scissors, he has at least 2/3 chances of winning

## Programming Course Topic (Score:4, Interesting)

I've been having students in my introductory programming courses work on this class of problem for a few years. They all seem to really enjoy it. I code up bots to play RPS with certain biases, just like the OP, and they have to program a single player that identifies the bias in an opponent and adjusts its play to give it an advantage. They can all routinely generate solutions that perform far better than random against predictable, dumb bots, but things get very interesting when I throw the students' bots against each other in a throwdown tournament. :)

## recalculate every round (Score:1)

If I know the number of rounds in advance (at least my opponent has to, right, if he needs to calculate the 50%), I would recalculate the odds after every round, taking into account the number of rock moves remaining for him...

## Like most decisions, it's partly mental (Score:2)

Like drawing to an inside straight, an "optimized" strategy is not necessarily what the opponent plays. There is no reason inherent in the description to make assumptions about the opponent's other play. They may also be constrained to play paper the other fifty percent of the time, and to play paper, then rock, then paper, then rock. In the real world, don't assume that the minimal description of the problem gives all the important data.

## Re: (Score:2)

## BAD SUMMARY (Score:1)

There is a factual problem with the summary. TFA says it all, so better to read it. If not, read this (I hope not to mess it up too much).

It is not required of the opponent to play rock 50% of the time. The referee is using a fair coin to determine whether your opponent is to play rock or not. When s/he is not forced to play rock, s/he is free to choose, which also allows choosing rock 100% of the time if s/he so wishes.

## Re: (Score:2)

It totally is required to play rock 50% of the time. It's not required to play rock _exactly_ 50% of the time.

## Re: (Score:2)

No. Depending on the luck of the coin, they might be required to play rock anywhere from 0 to 100% of the time.

## Re: (Score:2)

Obviously it's a long-time average. The chance of deviating from 50% by any given amount goes to zero as the number of trials becomes arbitrarily large. The summary doesn't spell this out in iron-clad logic, but anyone who spends time thinking about math, stats, or especially game theory knows the score.

## Re: (Score:2)

there is a factual problem with the summary...

It _is_ a bad summary, but only because the wording is ambiguous, not because it's factually incorrect. The statement you're objecting to is perfectly correct in one interpretation, and dead wrong in another. Your own counter-statement, "it is not required of the opponent to play rock 50% of the time," is equally ambiguous. In fact, 50% of the time (assuming a fair coin), the opponent is required to play rock, so it's true that "it _is_ required of the opponent to play rock, 50% of the time". Leaving out the c

## Re: (Score:2)

## Strategy... (Score:1)

## Solution (Score:1)

Let's call the guy with the restriction player 1 and the other player 2.

If you think about it, player 1 has 3 "pure" strategies (as in: every other strategy he can play can be seen as a mixture of these 3):

(1) rock 100%,

(2) paper 50%/rock 50% and

(3) scissors 50%/rock 50%.

Against (1) rock gives 0, paper -1 and scissors 1.

Against (2) rock gives 1/2, paper -1/2 and scissors 0.

Against (3) rock gives -1/2, paper 0 and scissors 1/2.

In each case, the number is the probability of player 1 winning minus the probability of player 2 winning.

## Re: (Score:1)

Sorry :/ There are some mistakes in the last part. The strategy for player 1 wins 1/2-1/6=1/3 and not 1/6 as claimed. Also, the strategy for player 2 wins 2/3 against pure rock and not 1/3 as claimed. Still, it just makes it even clearer that you should not play rock with probability more than 50% as player 1 and not play scissors at all.

Also, to be more precise, the strategy for player 1 is to play rock with probability 1/2, paper with probability 1/6 and scissors with probability 1/3.

## Re: (Score:2)

I still think that 38.4% rock and the rest paper is a better solution for the 'free' player than 1/3 rock and 2/3 paper, if we assume that the 'limited' player is very smart and knows the 'free' player's tactic upfront.

Can you tell me what strategy the 'limited' player can take against 38.4% rock, rest paper, that would lead to a worse result for the 'free' player than what you propose? I would suggest simulating the results before jumping to conclusions. I was also originally thinking that 1/3 rock is optimal, but increasi

## Re: (Score:2)

I don't agree that x = 0. In that case, for his 50% paper strategy, the win is 1/3 vs 1/6, which is 2:1 in my favor. For the 50% scissors strategy it is 1/2 versus 1/3, which is 3:2 in my favor. This means he WILL pick 50% scissors in such a case. Increasing x will reduce his winning chance. It will also increase his winning chance with the paper strategy, but it will take longer to offset the 2:1 disadvantage compared to 3:2.

Equation to solve is

(1/3-x/2)/(1/6+x/2)=(1/2)/(1/6-x/2)

http://www.wolframalpha.com/in... [wolframalpha.com]

gives x =

## Re: (Score:2)

So you are saying that (some theoretical) game where I win 50% and draw 50% is equivalent to a game where I win 75% and lose 25%, because in both cases the difference between his wins and my wins is the same (50% = 50 - 0, or 50% = 75 - 25)? And you are suggesting the amount best to bet on both games is the same?

That's completely wrong. In the 50% win / 50% tie situation I cannot lose, so I can bet at any disadvantage against him. With 75% win / 25% lose I can bet at most triple his bet to come out equal. And replay on tie or lack

## Randomize 2/3 paper, 1/6 rock, 1/6 scissors (Score:1, Insightful)

Over the long term, the strategy must converge to something stable; therefore true randomness can be the only optimized strategy.

50% of time opponent must play R. The remaining 50% of the time they can equally choose R,P,S.

- He plays rock with probability 1/2 + (1/3)(1/2) = 2/3, which your paper beats.

- He plays scissors with probability (1/3)(1/2) = 1/6, which your rock beats.

- He plays paper with probability (1/3)(1/2) = 1/6, which your scissors beat.

For a guaranteed win, roll a die: 1-4 => P, 5 => R, 6 => S.

By how much? Consider you are random as above and opponent is fixed wlog at 100% R. You win 2/3 of the time and lose 1/6 of the time.

The expected payoff to play i

## Re: (Score:2)

I think this is not quite correct, because scissors are a losing proposition. Throwing scissors cannot possibly, in and of itself, net a profit in the long run. At least 50% of the adversary's throws are rocks, which causes scissors to lose. It can only break even if the opponent chooses a 50% paper 50% rock solution. So if the goal is to do better than chance, then every scissors throw is suboptimal, regardless of the adversary's strategy.

I would redistribute your 1/6 scissors to rock and make it 2/3 paper, 1/3 rock.

## Two Games (Score:4, Insightful)

The key (if you RTFA) is that whether or not your opponent plays rock is determined by a coin toss. So really you are playing a compound game. You are playing a coin toss and rock paper scissors (RPS). Since the coin toss determines your opponent's move, you can think of it as playing 50% coin toss and 50% RPS. The RPS is a subgame of the coin toss.

Since the coin toss is the dominant game, you play to win that first. But instead of heads/tails, it is paper/other. The answer to the coin toss is a 50/50 guess of heads/tails, so the answer to the paper/other is 50% paper, 50% other.

The "other" is the RPS game. And since the answer to the RPS game is 1/3 rock, 1/3 paper, 1/3 scissors, we know what the solution to the other 50% of the game is.

So the equations are: choice = (Coin Toss) + (RPS), so: paper = 1/2 + (1/2)(1/3), rock = 0 + (1/2)(1/3), scissors = 0 + (1/2)(1/3). Or paper = 4/6, rock = 1/6, scissors = 1/6.

## Re: (Score:3)

The opponent could respond to this by playing scissors on all non-forced-rock turns. If the opponent plays rock, you win 4/6 of the time and lose 1/6 of the time, but if the opponent plays scissors you lose 4/6 of the time and win 1/6 of the time, so overall you'd be even.

## Re: (Score:2)

1/2 scissors x 4/6 paper = 2/6 = 1/3 victory for the opponent. 1/2 scissors x 1/6 scissors is 1/12 tie. And 1/2 scissors x 1/6 rock is 1/12 lose. So the "all scissors" strategy only nets him 1/3 victory not 4/6.

## Re: (Score:2)

I listed the chances in the context of the opponent's move ("if the opponent plays rock"). The chance of playing rock or playing scissors is 1/2 each (the coin toss), so if you list it as overall chances you get 1/3 win and 1/12 loss (same as you wrote) due to the opponent playing scissors, and also 1/3 loss and 1/12 win due to the opponent playing rock; the expected result is still 0.

## Re: (Score:2)

How the fck would you manage to lose 4/6 of the time to an opponent who must play rock at least 50% of the time?

## Re: (Score:2)

Re-read the GP. The claim is that when the opponent responds by playing scissors 50% and rock 50%, you will win 4/6 of the time when they play rock and you will lose 4/6 of the time when they play scissors, which makes it 50/50. The stronger claim is that the opponent can adjust to any consistent strategy that you choose, ultimately making it a 50/50 game.

## Re: (Score:2)

Yes, that's what I meant. I originally thought the stronger claim might be true, but it is not: as Reaper9889 pointed out in another post, you should never play scissors. If you stick to that and are not so greedy as to play 100% paper (to be exact: 1/2 < paper < 1, optimum at 2/3), you make a profit no matter how the opponent responds.

## Re: (Score:2)

There is a flaw in your reasoning. You do not know your opponent's flip, so you cannot condition on it like you do here (you cannot play paper whenever he "flips" rock, because you do not know his coin flip). If you think about it, you should NEVER play scissors. In the best case for you he plays rock 50% and paper 50%, and you get 0 in expectation; clearly you have an advantage, so 0 is not good.

The optimal strategy is to play 1/3 rock, 2/3 paper. It gives at least 1/6 against anything he could

## Re: (Score:2)

If I play paper 4/6 of the time, then I should expect 1/2 of my paper to align with his rock. So 4/6 * 1/2 = 2/6 = 1/3. So I should expect to win 1/3 of the time, plus my winnings on the other combinations. That means 1/3 is the lower bound.

If you play 1/3 rock and 2/3 paper, his response will be 1/2 paper and 1/2 rock. So you are going to get 2/3 * 1/2 = 1/3 for your paper. But your 1/3 rock will never win because he will never play scissors either.

## Re: (Score:2)

This is where the two games key comes in. You and I both recognize that 2/3 paper is the right move because 1/2 of his moves will be rock. But by playing the other half as regular RPS with a win/tie/loss of 1/1/1 you can expect the win/loss to cancel out, leaving you with your 1/3 lower bound advantage

If you're playing 2/3 paper and 1/3 rock vs 1/2 rock and 1/2 paper, the regular RPS subgame is 2/3 paper and 1/3 rock vs paper, which has an expected result of 1/3 loss for the subgame, or a 1/6 loss contribution to the total game. It won't cancel out: you can't get a consistent 0 result from the regular RPS subgame since you play paper more than 1/3 of the time and the opponent can take that into account by not playing rock in the subgame at all.

Versus 2/3 paper and 1/3 rock, it actually doesn't matter in

## Re: (Score:2)

You've come up with a fair answer, but the flaw in your reasoning is that the RPS game is not independent of the coin toss game.

In particular, the coin toss game makes scissors an obvious loss. No solution that includes scissors can possibly be optimal, because scissors loses at minimum 50% of the time.

Redistribute your scissors over to rock and work it out on a calculator; you'll see an improvement in outcomes.

## No actual advantage? (Score:4, Insightful)

First, make sure you read TFA, since it explains what the summary doesn't: how the 50% is determined and how the opponent can play in the non-forced turns.

If you play using a deterministic algorithm, for example always play paper, the opponent can figure it out and beat you on all the non-forced turns. At best you'll get an even result.

If you play using a random algorithm, the opponent can figure out the frequencies you're using and compensate for that. For example, if you decide to play paper 50% of the time and rock and scissors 25% of the time, you'd win against an opponent playing rock 50% of the time and paper and scissors 25% of the time. However, if the opponent decides to play rock 50% of the time and scissors the other 50%, the result is even again. If the opponent would be forced to play rock more than 50% of the time, there is no room to compensate and you would win consistently with 100% paper. I think that with 50% rock, there is enough room to respond to any frequency distribution you can come up with, although I have no proof for that.

You could change your algorithms during play, but if there isn't any algorithm that results in an advantage when playing it consistently, gaining an advantage from changing your algorithm would depend on how well your opponent responds to your changes. In other words, you're playing mind games. I don't think the 50% rock restriction is going to be of any help here.

## Re: (Score:2)

You can get an advantage. The important point is to notice that you should not play scissors ever. You can only get 0 in expectation IF he plays paper 50% and rock 50% and he gets an advantage otherwise and 0 is not good for you :/ See my above post for further details (spoiler: The optimal choice for you is 1/3 rock, 2/3 paper).

## Re: (Score:2)

First, there is some uncertainty about what ratio of your opponent's throws will be paper. Let's call that ratio x. This means that we can define the likelihood of his throws this way:

HeDoesRock = .5, HeDoesPaper = x, HeDoesScissors = .5 - x

Now you need to figure out the optimal winning ratio of your throwing either rock or paper.

YouWinIfRock = HeDoesScissors = (.5 - x); YouLoseIfRock = HeDoesPaper = x

YouWinIfPaper = HeDoesRock = .5; YouLoseIfPaper = HeDoesScissors = .5 - x

W

## Re: (Score:2)

A very cool problem! Thank you for sharing it and helping us along with the discussion.

You're right that I ruled out the opponent voluntarily using rock, based on the informal idea that it could only make things worse, since the weakness of the opponent comes from what is already a strategy with too many rocks, which will be countered. I also ruled out ever using scissors against that rock-heavy strategy in a similarly informal way. And this makes the problem fairly easy to calculate.

So, here's something I

## Re: (Score:2)

You're right about never playing scissors. Since the perfect opponent will know you're never going to play scissors, he won't play rock any more than is required, so 50% of the time. This leads to an overall win frequency (profit) of (1 - 3 * Rp) * Po + Rp / 2, where Rp is how often you play rock and Po how often the opponent plays paper.

With 1/3 rock, the profit becomes 1/6 no matter what the opponent does. If you play less than 1/3 rock, Po is positive for your profit, so the opponent will opt to never pl

## Throw rock (Score:2)

## Simple. (Score:1)

If your opponent must throw rock 50% of the time, then you throw paper 100% of the time.

You will win AT LEAST 51% of the time, because you get the 50% gifted to you, and the other non-0% of the times that your opponent throws paper will cause a rematch.

## Progressive Betting (Score:1)

## What should opponent do on non-forced turns? (Score:2)

To figure out what you should do, first assume your opponent is rational, and will make good choices whenever he is able. Since he knows that you will play a paper-heavy strategy to counter his rock-heavy strategy, it would not be rational to _voluntarily_ choose more rocks. That could only make things worse. But if he tried to exploit your paper-heavy strategy by throwing scissors on turns when he gets a choice, you'd have a perfect strategy against this: all rock. On forced rock, you get a redo, and on non-f

## How many games are we playing? (Score:2)

## 50 percent of the time (Score:2)

doesn't include a consideration of my lifespan. If I only live 100 years my opponent can play scissors every single game and play rock continuously after I am dead and the restriction would still be satisfied. In other words, "having to play rock 50 percent of the time" doesn't give you any relevant information about your opponent's behaviour.

## Re: (Score:2)

Good point, the important specification would be "50% of what time?"

I've always had this problem with the whole idea of probability. If the odds of you dying in a car accident are 1/1000000, and you still die tomorrow, what good is that one-in-a-million figure? You either die or you don't. Probability is only a measure of a larger population, i.e. the fraction that gets the rock, death or whatever. The idea of a probability for a unique event is meaningless.

This is why I like the many-worlds interp

## I know this one... (Score:3)

You kick him in the testicles and let him keep the chicken.

## Full analytical solution (Score:2)

As some have said, the optimum is a mixed strategy of playing paper 2/3 of the time and rock the rest of the time.

Let's say we're player 1 and they're player 2. The way to calculate this exactly is to first observe that the general form for each player's mixed strategy is:

P(r1) + P(p1) + P(s1) = 1

1/2 + P(p2) + P(s2) = 1

where, for example, P(r1) is the probability of a rock being played by player 1.

This means that for a given P(r1), P(p1) and P(p2),

P(s1) = 1 - P(r1) - P(p1) [1]

P(s2) = 1/2 - P(p2)

## Re: (Score:2)

Another correction (this time to the second to last line): "Finally it's nice to observe that when player 1 sets x = 1/3, the second player's choice of z has no effect on the gains function whatsoever."

I really should have proofread this post.

## Re: (Score:2)

Oh my, another correction and this one is bigger! I assumed that player 2 would always play the rock half the time when in fact he is still able to choose rock even when the coin-toss doesn't force it. This adds another variable for which I didn't account.

Still, without expanding the probabilities, my intuition tells me that it doesn't improve his strategy to increase the rate of playing rock, as this would only make him more exploitable. Whether my intuition is correct is something I'll test at

## A solver and a simulation of the solution, in Pyth (Score:2)

## Over what time does he have to do this? (Score:2)

## Re: (Score:2)

Come again? It is specified that the player forced to play 50% rock (as mandated by a fair coin flip you don't get to see) plays intelligently and will adapt to your play. When he figures out you're always playing rock, he'll always play paper when he doesn't have to play rock. You tie 50% of the time and lose 50% of the time. That's lousy strategy.

You're guaranteed to break even by always playing paper. When the oppon

## Re:Always do rock. (Score:5, Funny)

Of course. You use the rock to smash him in the head while he tries to stab you with the scissors. Your friend then uses the paper to write a letter to your parents about how you died in a stupid fight about statistics.

## Re: (Score:2)

After that you need to run a probability set on all the possible combinations, with the unknowns for his paper and scissors, only knowing that they total to 50%.

## Re:Simple.... Odds are even (Score:5, Funny)

Actually now that I think about it more

... you realize that you're doing the submitter's homework?

## Re:Simple.... Odds are even (Score:4)

...but...but... but it's _interesting_ homework! :-)

## Re: (Score:2)

## Re: (Score:2)

## Re: (Score:1)

Is it harder than spelling seems?

## Re: (Score:2, Funny)

A hem! I'm a frayed knot.

## Paper and rock? (Score:2)

Since he can expect you to throw paper pretty often, he'd want to make scissors take up at least some of the non-rock throws he can make, so if you throw rock, he would lose, and if you were countering one of his rocks, you'd negate.

## Re: (Score:1)

actually, as each choice beats one and either ties or is beaten by the other two, the odds of winning any random round of RPS is 33%.

## Re: (Score:3)

actually, as each choice beats one and either ties or is beaten by the other two, the odds of winning any random round of RPS is 33%.

But the whole point is that it is _not random_, since your opponent's choices are constrained. He is forced to play rock 50% of the time. If he plays randomly the other 50%, then you can always play paper and win 2/3rds of the time, lose 1/6 of the time, and tie 1/6th. But then he can adapt, and play rock 50% and scissors the other 50%, resulting in a tie (1/3 win, 1/3 loss, 1/3 tie). But you can adapt to that by playing rock 50% and paper 50%. You will win 50%, lose 25%, and tie 25%.

## Re: (Score:2)

But then he adapts by always playing paper when he doesn't have to play rock. Now you win 25%, tie 50%, lose 25%, and you're back to even up.

## Re:Simple.... Odds are even (Score:5, Insightful)

The Nash Equilibrium [wikipedia.org] is for you to play paper 2/3rds of the time, and rock 1/3rd. His best counter strategy is to play rock 50% (he cannot go lower) and scissors 50%. He cannot do better. If you deviate from 2/3 paper and 1/3 rock, he can adjust his strategy to do better. With the optimal strategy, you will win 1/2, lose 1/3, and tie 1/6.

Here is my search for the Nash Equilibrium:

#include

struct rps {

double rock;

double paper;

double scissors;

};

static double

eval(struct rps *a, struct rps *b)

{

return

(a->rock * (b->scissors - b->paper)) +

(a->paper * (b->rock - b->scissors)) +

(a->scissors * (b->paper - b->rock));

}

int

main(void)

{

struct rps you;

struct rps him;

him.rock = 0.5;

double worst_best_eval_for_him = 1.0;

double best_rock_for_you = 0;

double best_paper_for_you = 0;

double worst_best_paper_for_him = 0;

double dx = 0.001;

for (you.rock = 0; you.rock best_eval_for_him) {

best_eval_for_him = p;

best_paper_for_him = him.paper;

}

}

if (worst_best_eval_for_him > best_eval_for_him) {

worst_best_eval_for_him = best_eval_for_him;

best_rock_for_you = you.rock;

best_paper_for_you = you.paper;

worst_best_paper_for_him = best_paper_for_him;

}

}

}

printf("worst_best_eval_for_him = %f\n", worst_best_eval_for_him);

printf("best_rock_for_you = %f\n", best_rock_for_you);

printf("best_paper_for_you = %f\n", best_paper_for_you);

printf("worst_best_paper_for_him = %f\n", worst_best_paper_for_him);

return 0;

}

## Re: (Score:2)

Sorry, but Slashdot mangled that code badly because of the angle brackets.

## Re:Simple.... Odds are even (Score:5, Interesting)

Sorry, but Slashdot mangled that code badly because of the angle brackets.

Let me try again:

```c
#include <stdio.h>

struct rps {
    double rock;
    double paper;
    double scissors;
};

static double
eval(struct rps *a, struct rps *b)
{
    return
        (a->rock * (b->scissors - b->paper)) +
        (a->paper * (b->rock - b->scissors)) +
        (a->scissors * (b->paper - b->rock));
}

int
main(void)
{
    struct rps you;
    struct rps him;
    him.rock = 0.5;
    double worst_best_eval_for_him = 1.0;
    double best_rock_for_you = 0;
    double best_paper_for_you = 0;
    double worst_best_paper_for_him = 0;
    double dx = 0.001;
    for (you.rock = 0; you.rock < 1.0; you.rock += dx) {
        for (you.paper = 0; (you.paper + you.rock) < 1.0; you.paper += dx) {
            you.scissors = 1.0 - you.rock - you.paper;
            double best_paper_for_him = 0.0;
            double best_eval_for_him = -1.0;
            for (him.paper = 0; him.paper < 0.5; him.paper += dx) {
                him.scissors = 1.0 - him.rock - him.paper;
                double p = eval(&him, &you);
                if (p > best_eval_for_him) {
                    best_eval_for_him = p;
                    best_paper_for_him = him.paper;
                }
            }
            if (worst_best_eval_for_him > best_eval_for_him) {
                worst_best_eval_for_him = best_eval_for_him;
                best_rock_for_you = you.rock;
                best_paper_for_you = you.paper;
                worst_best_paper_for_him = best_paper_for_him;
            }
        }
    }
    printf("worst_best_eval_for_him = %f\n", worst_best_eval_for_him);
    printf("best_rock_for_you = %f\n", best_rock_for_you);
    printf("best_paper_for_you = %f\n", best_paper_for_you);
    printf("worst_best_paper_for_him = %f\n", worst_best_paper_for_him);
    return 0;
}
```

## Re:Simple.... Odds are even - RTF rescue? (Score:2)

## Re: (Score:2)

Ideone. Codepad. Any number of pastebins.

## Re: (Score:2)

Wins for you:

Paper vs rock: 2/3 * 1/2 = 1/3 win

Rock vs scissors: 1/3 * 0 = 0 win

Scissors vs paper: 0 * 1/2 = 0 win

For him:

Paper vs rock: 1/2 * 1/3 = 1/6

Rock vs scissors: 1/2 * 0 = 0

Scissors vs paper: 0 * 2/3 = 0

Your optimal strategy (2/3 paper, 1/3 rock) vs his optimal strategy (1/2 paper, 1/2 rock) results in a 1/3 win, not a 1/2 win.

## Re: (Score:2)

I'm getting better test results from around 38.4% rock, rest paper. At that point his scissors and paper strategies become equal, and I'm winning (assuming ties are repeated) 61.7% of cases instead of 60% of cases with 1/3 rock, rest paper.

I think the trick is finding a point where any mixture of scissors or paper for him is equally bad. At that moment his choices don't matter anymore (unless, of course, he chooses rock even on a free choice). At 1/3 rock, 2/3 paper, scissors are clearly the better choice fo

## Re: (Score:2)

Ok, Nash equilibrium, finally the correct approach. But here's an interesting variation, seemingly very similar, but a bit harder: same probabilities, but introduce a third side with a predefined algorithm and infinite budget. The game still looks the same to you as the player: 50% rock, 50% human RPS player, but in fact suddenly the game depends on so many details...

The setting: you play RPS against a human opponent, but do not communicate with them directly - the interface is operated by the third side an

## Re: (Score:2)

Bullshit.

Any answer that eliminates one of the three options for any of the players has not been thought through.

If you eliminate one option, the opponent will have an optimal strategy guaranteeing no losses.

Any optimal playing strategy will need a percentage of all three.

Btw, the question is a teaser. There is no optimal solution, as there is no equilibrium; any chosen strategy will have an answer by the opponent that makes it suboptimal.

## Re: (Score:2)

Though of the rounds where there is a winner, the odds are 50/50.

## Re: (Score:1)

You seriously think that each player in RPS has a 33% chance of winning each round? Think a little bit about that. Oh, I forgot, this is /.

What do you think the odds are for each player? Keep in mind that there are three possible outcomes for each round: win, lose, or draw.

## Re: (Score:2)

## Re: (Score:2)

Never playing scissors and playing rock 50% means you play paper 50%.

That's not a hint, that's an answer.

## Re: (Score:2)

I didn't realize that the "less than" symbol wasn't allowed.

Let me introduce you to this newfangled thingamagick called H. T. M. L.

## Re: (Score:2)

Nobody seems to be asking what the definition of "50% of the time" is.

Over what interval/time frame?

If the opponent throws scissors or paper on his first move is he then required to throw rock on his second move?

Probably because they read the link which describes how "50% of the time" is determined.

## Re: (Score:2)

would you be willing to pay to play against them?

Here's another one - what favors was this GTORangeBuilder person willing to pay timothy to post such a trivial(*) and uninspired(+) problem to the /. front page in a transparent attempt to herd traffic to his equity calculator?

(*) Trivial: don't take my word for it, just look at all the correct answers posted by slashdotters.

(+) Uninspired: I wake up with a more original brain teaser in my head every other day on average. Yes, literally. Yeah, I know not everyone has a mind with a somewhat unpredictable tend

## Re: (Score:2)

## Re: (Score:2)

Another possible strategy for the opponent is to play the first round with each move at 1/3 chance. That leads to an expected win of 0 for the first round. For the second round, if he played rock in the first round he has no obligations and gets an expected win of 0 again, but if he didn't play rock (2/3 chance) he'll be forced to play rock and lose, so an expected win of -2/3 for the two rounds.

In fact, any opponent first round strategy with scissors 1/3 and rock between 1/3 and 2/3 will lead to an expecte

## Re: (Score:2)

The optimal strategy (as many posters have said) is R = 1/2, P = 1/6, S = 1/3 for the constrained player (A) and r = 1/3, p = 2/3 for the unconstrained player (B), and the value of the game is 1/6 in favour of B (pretty straightforward game theory). If a fee is charged on every round of the game, then $16.66 is the fair price. If a fee is only charged on rounds with a definite result, then the fair price is $23.07 since 5/18 of the time the result will be a draw. Fair odds for each player, if draws are coun

## Re: (Score:2)

Or you could just knock him out and take his wallet, amirite? Stupid eggheads with their book-learnin'.