
In a probability course, a game was introduced for which a logical approach won’t yield a winning strategy, but a probabilistic one will. My problem is that I don’t remember the details (the rules of the game)! I would be thankful if anyone could complete the description of the game. I give an outline of the game below.

Some person (A) hides a 100 or 200 dollar bill, and asks another person (B) to guess which one is hidden. If B’s guess is correct, something happens, and if not, something else (this is the part I don’t remember). The strange point is that B can devise a strategy that always ends with a positive amount; but then A can deduce that B will use this strategy, and finds a strategy to overcome it. Now B knows A’s strategy and will use another strategy, and so on. So, before the game is even played once, there is an infinite chain of strategies which A and B choose successively!

Can you complete the story? I mean, what happens when B’s guess is correct, and what happens when it is incorrect?


Thanks.


In the Envelope Paradox player 1 writes any two different numbers $a< b$ on two slips of paper. Then player 2 draws one of the two slips each with probability $\frac 12$, looks at its number $x$, and predicts whether $x$ is the larger or the smaller of the two numbers.

It appears at first that no strategy by player 2 can achieve a success rate greater than $\frac 12$. But there is in fact a strategy that will do this.

The strategy is as follows: Player 2 should first select some probability distribution $D$ which is positive everywhere on the real line. (A normal distribution will suffice.) She should then select a number $y$ at random according to distribution $D$. That is, her selection $y$ should lie in any interval $I$ with probability exactly $$\int_I D(x)\; dx.$$ General methods for doing this are straightforward; methods for doing this when $D$ is a normal distribution are well-studied.

Player 2 now draws a slip at random; let the number on it be $x$.

If $x>y$, player 2 should predict that $x$ is the larger number $b$; if $x<y$ she should predict that $x$ is the smaller number $a$. ($y=x$ occurs with probability 0 and can be disregarded, but if you insist, then player 2 can flip a coin in this case without affecting the expectation of the strategy.)
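The strategy just described is easy to check by simulation. Below is a minimal sketch (the function name and parameters are mine, not from the original answer), assuming $D$ is the standard normal distribution:

```python
import random

def simulate(a, b, trials=100_000, seed=0):
    """Monte Carlo estimate of player 2's success rate when player 1
    hides the pair (a, b) and player 2 draws y from a standard normal."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        x = a if rng.random() < 0.5 else b   # slip drawn uniformly at random
        y = rng.gauss(0.0, 1.0)              # threshold sampled from D = N(0, 1)
        guess_larger = x > y                 # predict "x = b" exactly when x > y
        if guess_larger == (x == b):
            wins += 1
    return wins / trials

print(simulate(-0.5, 0.5))   # well above 0.5: (a, b) straddles the bulk of D
print(simulate(10.0, 10.5))  # barely distinguishable from 0.5: D puts almost no mass there
```

The two sample calls already hint at the analysis below: the edge over $\frac12$ depends on how much probability mass $D$ assigns to the interval $(a, b)$.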

There are six possible situations, depending on whether the selected slip $x$ is actually the smaller number $a$ or the larger number $b$, and whether the random number $y$ selected by player 2 is less than both $a$ and $b$, greater than both $a$ and $b$, or in between $a$ and $b$.

The table below shows the prediction made by player 2 in each of the six cases; this prediction does not depend on whether $x=a$ or $x=b$, only on the result of her comparison of $x$ and $y$:

$$\begin{array}{r|cc}
& x=a & x=b \\ \hline
y < a & x=b & \color{blue}{x=b} \\
a<y<b & \color{blue}{x=a} & \color{blue}{x=b} \\
b<y & \color{blue}{x=a} & x=a
\end{array}$$

For example, the upper-left entry says that when player 2 draws the smaller of the two numbers, so that $x=a$, and selects a random number $y<a$, she compares $y$ with $x$, sees that $y$ is smaller than $x$, and so predicts that $x$ is the larger of the two numbers, that $x=b$. In this case she is mistaken. Items in blue text are *correct* predictions.

In the first and third rows, player 2 achieves a success with probability $\frac 12$. In the middle row, player 2’s prediction is always correct. Player 2’s total probability of a successful prediction is therefore

$$\begin{align}
\frac12 \Pr(y < a) + \Pr(a < y < b) + \frac12\Pr(b<y) &= \frac12(\color{maroon}{\Pr(y<a) + \Pr(a < y < b) + \Pr(b<y)}) + \frac12\Pr(a<y<b) \\
&= \frac12\cdot \color{maroon}{1}+ \frac12\Pr(a<y<b)
\end{align}$$

Since $D$ was chosen to be everywhere positive, player 2’s probability $$\Pr(a < y< b) = \int_a^b D(x)\;dx$$ of selecting $y$ between $a$ and $b$ is *strictly* greater than $0$ and her probability of making a correct prediction is *strictly* greater than $\frac12$ by half this strictly positive amount.

This analysis points toward player 1’s strategy, if he wants to minimize player 2’s chance of success. If player 2 uses a distribution $D$ which is identically zero on some interval $I$, and player 1 knows this, then player 1 can reduce player 2’s success rate to exactly $\frac12$ by always choosing $a$ and $b$ in this interval. If player 2’s distribution is everywhere positive, player 1 cannot do this, even if he knows $D$. But player 2’s distribution $D(x)$ must necessarily approach zero as $x$ becomes very large. Since Player 2’s edge over $\frac12$ is $\frac12\Pr(a<y<b)$ for $y$ chosen from distribution $D$, player 1 can bound player 2’s chance of success to less than $\frac12 + \epsilon$ for any given positive $\epsilon$, by choosing $a$ and $b$ sufficiently large and close together. And even if player 1 doesn’t know $D$, he should *still* choose $a$ and $b$ very large and close together.
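When $D$ is the standard normal, the success rate $\frac12 + \frac12\Pr(a<y<b)$ has a closed form via the normal CDF, which can be written with `math.erf`. The sketch below (function names are mine) makes player 1's counter-strategy concrete: pushing $a$ and $b$ out into the tail drives the edge toward zero.

```python
import math

def normal_cdf(t):
    """CDF of the standard normal, expressed via the error function."""
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

def success_probability(a, b):
    """Player 2's exact success rate 1/2 + Pr(a < y < b)/2
    when her threshold y is drawn from a standard normal."""
    return 0.5 + 0.5 * (normal_cdf(b) - normal_cdf(a))

print(success_probability(-0.5, 0.5))   # ~0.6915: interval sits in D's bulk
print(success_probability(10.0, 10.5))  # indistinguishable from 0.5 at double precision
```

Mathematically the second value is still strictly greater than $\frac12$, but it is smaller than floating-point precision can resolve, which is exactly why player 1 should choose $a$ and $b$ large and close together.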

I have heard this paradox attributed to Feller, but I’m afraid I don’t have a reference.

[ Addendum 2014-06-04: I asked here for a reference, and was answered: the source is Thomas M. Cover, “Pick the largest number”, *Open Problems in Communication and Computation*, Springer-Verlag, 1987, p. 152. ]

I know this is a late answer, but I’m pretty sure I know what game OP is thinking of (and none of the other answers have it right).

The way it works is person A chooses to hide either $100$ or $200$ dollars in an envelope, and person B has to guess the amount that person A hid. If person B guesses correctly they win the money in the envelope, but if they guess incorrectly they win nothing.

If person A uses the predictable strategy of putting $100$ dollars in the envelope every time, then person B can win $100$ dollars every time by guessing $100$ correctly.

If person A instead chooses randomly between putting in $100$ or $200$, then person B can guess $200$ every time; he’ll win half the time, so again he will win $100$ dollars per game on average.

But a third, better option for A is to randomly put $100$ in the envelope with probability $2/3$, and $200$ dollars in with probability $1/3$. If person B guesses $100$, he has a $2/3$ chance of being right so his expected winnings are $\$66.67$. If person B guesses $200$, he has a $1/3$ chance of being right so his expected winnings are again $\$66.67$. No matter what B does, this strategy guarantees that he will win only $\$66.67$ on average.

Looking at person B’s strategy, he can do something similar. If he guesses $100$ with probability $2/3$ and $200$ with probability $1/3$, then no matter what strategy person A uses he wins an average of $\$66.67$. These strategies for A and B are the Nash Equilibrium for this game, the set of strategies where neither person can improve their expected winnings by changing their strategy.
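The equilibrium claim is easy to verify numerically. The sketch below (the helper name is mine) computes B's expected winnings for given mixed strategies and shows that the $2/3$–$1/3$ mixtures pin the value at $200/3 \approx 66.67$ no matter what the other player does:

```python
def expected_winnings(p_hide_100, p_guess_100):
    """B's expected winnings when A hides $100 with probability p_hide_100
    and B guesses $100 with probability p_guess_100.
    B wins the hidden amount exactly when his guess matches it."""
    return (p_hide_100 * p_guess_100 * 100                 # both choose 100
            + (1 - p_hide_100) * (1 - p_guess_100) * 200)  # both choose 200

# A's equilibrium mixture (2/3, 1/3) pins B's winnings at 200/3,
# regardless of B's guessing probability:
for p_guess in (0.0, 0.5, 1.0):
    print(expected_winnings(2/3, p_guess))  # 66.66... each time

# Symmetrically, B's mixture (2/3, 1/3) earns 200/3 whatever A does:
for p_hide in (0.0, 0.5, 1.0):
    print(expected_winnings(p_hide, 2/3))   # 66.66... each time
```

Algebraically, $\frac23 p \cdot 100 + \frac13(1-p)\cdot 200 = \frac{200}{3}$ for every $p$, which is exactly the indifference condition that characterizes a mixed Nash equilibrium.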

The “infinite chain of strategies” you mention comes in if either A or B starts with a strategy that isn’t in the Nash equilibrium. Suppose A decides to put $100$ dollars in the envelope every time. Then B’s best strategy is of course to guess $100$ every time. But given that, A’s best strategy is clearly to put $200$ dollars in the envelope every time, at which point B should change to guessing $200$ every time, and so on. In a Nash equilibrium, though, neither player gains any advantage by modifying their strategy, so such an infinite progression doesn’t occur.

The interesting thing about this game is that although the game is entirely deterministic, the best strategies involve randomness. This actually turns out to be true in most deterministic games where the goal is to predict what your opponent will do. Another common example is rock-paper-scissors, where the equilibrium strategy is unsurprisingly to choose between rock, paper, and scissors with equal probability.

Are you perhaps thinking of the Two envelopes problem?

A similar paradox I remember:

$A$ picks a random positive number $x$ and writes $x$ and $2x$ on two papers, which are then hidden in envelopes. $B$ picks one of the envelopes, opens it, and sees a number $y$. Now $B$ does not know whether $y=x$ or $y=2x$, and is asked whether he wants to keep his $y$ or switch to the other envelope, which may contain either $2y$ or $\frac y2$. Since both variants are obviously equally likely, the expected value of switching is $\frac{2y+\frac y2}{2}>y$, so $B$ should always switch. But: This conclusion does not depend on $y$, so even without reading his number, he could have concluded that switching would be better! So why not switch immediately while picking the envelope in the first place? But with the other envelope, the same argument would apply! – Brain meltdown …
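A quick simulation dissolves the meltdown. This sketch assumes $A$ draws $x$ uniformly from $(0, 100)$ (an assumption on my part; the paradox specifies no prior, and the function name is mine): always switching and never switching yield the same average, so the pointwise argument $\frac{2y + y/2}{2} > y$ must be flawed once we average over which envelope was picked.

```python
import random

def play(switch, trials=200_000, seed=1):
    """Average winnings when A draws x uniformly from (0, 100)
    (an assumed prior) and B either always switches or never does."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        x = rng.uniform(0.0, 100.0)
        envelopes = [x, 2 * x]
        pick = rng.randrange(2)   # B's initial envelope, chosen uniformly
        if switch:
            pick = 1 - pick       # always take the other envelope instead
        total += envelopes[pick]
    return total / trials

print(play(switch=False))  # ~75: average of x and 2x, with E[x] = 50
print(play(switch=True))   # ~75: switching gains nothing on average
```

The flaw is that "both variants are equally likely" cannot hold for every observed $y$ under any proper prior on $x$, so the $\frac{2y+\frac y2}{2}$ computation is not a valid conditional expectation.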

I think you are conflating the two envelopes problem with your problem, and that has caused the confusion. I think what you are looking for is something like one of the following:

1) You are talking about game theory of a sort akin to rock paper scissors, as in this link: http://www.ams.org/bookstore/pspdf/mbk-65-prev.pdf

2) Alternatively, you are talking about a repeated game. As I understand it: if I pick envelope A, you get 1 buck; if I pick B, I win 2 bucks. I’ve no idea which is which, and we play an infinite number of games. I program a computer to pick for me. You have a better computer.

If I tell mine to pick with any set pattern, your computer will figure it out in a finite number of games and will then win all the infinitely many future games. You will win an infinite amount of money, and I will have won only finitely many games.

If I tell mine to pick randomly, your computer will never figure it out, so we will each win 50% of the games on average.

Probability theory is formalized in first-order logic, so anything you solve using probability theory is solved using logic. Thus such a problem cannot exist.
