
I roll a fair die and sequentially sum the numbers the die shows. What are the odds the summation will hit exactly 100?

More generally, what is the probability of hitting an exact target number *t* while summing numbers drawn sequentially and uniformly at random from a set *S*?

I have experimentally tested this and the result is (warning – spoilers ahead):


$1$ over the expectation of the drawn numbers. In the case of the fair die, $\frac{1}{(1+2+3+4+5+6)/6} = \frac{6}{21} = \frac{2}{7}$.

However, I do not have a strong intuition, let alone a formal proof, of why this is the case.

I’ll be happy to get your thoughts!


The generating function $$H(x)=\sum\limits_{n=0}^\infty h_nx^n,$$ where $h_0=1$ and, for every $n\geqslant1$, $h_n$ is the probability to hit exactly $n$, solves the identity $$H(x)=1+P(x)H(x),\qquad P(x)=\frac16(x+x^2+\cdots+x^6),$$ hence $$H(x)=\frac1{1-P(x)}.$$

The limit of $h_n$ when $n\to\infty$ is $$\ell=\lim_{x\to1}(1-x)H(x)=\frac1{P'(1)},$$ that is, $$\ell=\frac6{1+2+\cdots+6}=\frac27.$$ This extends to every “die” producing any collection of numbers in any biased or unbiased way. If the “die” produces a random positive integer number $\xi$, the limit of $h_n$ becomes $$\ell=\frac1{E(\xi)}.$$ Assuming the limit $\ell$ exists, this can be understood intuitively as follows: by the law of large numbers, the sum of $n$ draws is about $k=nE(\xi)$ hence, after $n$ draws, $n$ large, one has hit $n$ values from roughly $k$. If each value has a chance roughly $\ell$ to be hit, one can expect that $\ell\approx n/k$, QED. Obvious counterexamples are when $\xi$ is, say, always a multiple of $3$, and these are essentially the only cases since $\ell$ exists if and only if the greatest common divisor of the support of $\xi$ is $1$.

**Edit:** To estimate the difference $h_n-\ell$ in the usual case, note that $$1-P(x)=(1-x)(1+x/a)(1-x/u)(1-x/\bar u)(1-x/v)(1-x/\bar v),$$ for some $a$ real positive and some complex numbers $u$ and $v$ with nonzero imaginary parts, hence $$H(x)=\frac{\ell}{1-x}+\frac{b}{1+x/a}+\frac{r}{1-x/u}+\frac{\bar r}{1-x/\bar u}+\frac{s}{1-x/v}+\frac{\bar s}{1-x/\bar v},$$ for some real number $b$ and some complex numbers $r$ and $s$ defined as $$b=-\frac1{aP'(-a)},\qquad r=\frac1{uP'(u)},\qquad s=\frac1{vP'(v)}.$$ Thus, for every $n$, $$h_n=\ell+b(-1)^na^{-n}+2\Re\left(r u^{-n}+s v^{-n}\right).$$ Numerically, $a\approx1.49$, and $|u|$ and $|v|$ are approximately $1.46$ and $1.37$, hence all this yields finally $$|h_n-\ell|=O(\kappa^{-n}),\qquad\kappa\approx1.37.$$ For $n=100$, $\kappa^{-n}\approx2\cdot10^{-14}$ hence one expects that $h_n$ coincides with $\ell$ at up to $13$ or $14$ decimal places.

The probability of hitting $n$ on the nose is a function of $n$, call it $f(n)$. It satisfies the recurrence

$$f(n)=\frac{f(n-1)+f(n-2)+f(n-3)+f(n-4)+f(n-5)+f(n-6)}6$$

with boundary values $f(0)=1$ and $f(n)=0$ for $n\lt0$.

I didn’t try to solve the recurrence, but my numerical calculations agree with yours; it looks like $f(n)\to2/7$.

Some additions to the answer of @bof:

For $n\in\mathbb{Z}$, let $E_{n}$ denote the event that the number $n$ shows up in the sequence of partial sums, with $E_{n}=\emptyset$ if $n<0$ and $E_{0}=\Omega$ (we start the list with $0$). Let $X$ denote the first number that appears. Then $$P\left(E_{n}\right)=\sum_{i=1}^{6}P\left(E_{n}\mid X=i\right)P\left(X=i\right)=\frac{1}{6}\sum_{i=1}^{6}P\left(E_{n-i}\right).$$

…oh I see my error, it’s sides/(sum(1 to #sides)). That matches. I forgot to change the numerator. I’ll leave my comment as is here…

I was curious about the answer, and while I’m not very mathy, I did write a little simulation to explore the result for dice with different numbers of sides.

For a 6-sided die, the answer agreed with the above: a probability of 2/7 (~0.28571),

but for other dice, the answer wasn’t 6/(sum(1 to #sides)):

sides: probability (success at hitting 100 when summing sequential rolls)

2: 0.666666…
3: 0.5
4: 0.4
5: 0.333333…
6: ~0.28571
7: 0.25
8: 0.222222…
9: 0.2
10: 0.181818…

which suggests the answer is always 2/(#sides + 1).

The formula 6/(sum(1 to #sides)) was only correct for the sides = 6 case (comparing to the simulation).

Does the limit in the equations above always work out to 2/(sides + 1) for dice with other numbers of sides? (I think so? My simulation seems accurate.)

I was thinking that if the equation was “right”, it would be right for dice of all sizes?

The 2 sided case suggests an interesting bar-bet that would make money on average (flipping a coin and summing with either 1 or 2, trying to hit 100)

-kevin

Not a formal proof but an intuition –

let $k$ be the number showing on the face of the die when the sum reaches $100$ or above for the first time; considering

**(a)** this face has $\large{k \over \sum \text{Faces}}$ chances of appearing, and **(b)** when it appears there is a $1\over k$ chance that exactly $100$ was reached

… then there is a chance of $\large{1 \over \sum \text{Faces}}$ that the die stopped on face $k$ at exactly $100$, so a $\large{|\text{Faces}| \over \sum \text{Faces}}$ chance that exactly $100$ was reached.

Which for a fair die results in $\frac{6}{21} = \frac 27$.

(I think **(b)** is true, but I’m not sure about **(a)**)

Here I will first describe my approach.

The question asks for the probability of a win: reaching a perfect $100$ by rolling a fair die some number of times. My approach is to start counting the two outcomes, win and loss, from the point where they begin to matter and become countable, which is when the running sum reaches $94$.

When the sum is 94, the probability to win is 1/6, by getting a 6Φ; the probability to lose is 0. When the sum is 95, the probability to win is again 1/6, by getting a 5Φ (I am excluding redundant counts such as (‘1’,’1’,’1’,’1’,’1’), (‘2’,’1’,’2’), (‘4’,’1’), etcetera), and the probability to lose is 1/6, by getting a ‘6’. Similarly, for a sum of 96, the probability to win is 1/6, by getting a 4Φ, and the probability to lose is 2/6 (by getting a ‘5’ or ‘6’); for 97, the probability to win is again 1/6, by getting a 3Φ, and the probability to lose is 3/6 (by getting any one of ‘4’,’5’,’6’); for 98, the probability to win is 1/6, by getting a 2Φ, and the probability to lose is 4/6 (with any of ‘3’,’4’,’5’,’6’). When the sum is 99, just one value assures a win, a ‘1’ on the next roll, so the probability of a win is 1/6; any other result, ‘2’ through ‘6’, leads to a disappointing defeat, so the probability to lose is 5/6.

When accumulated and calculated, the probability for a win is as follows.

Total chances for a win = 1/6 + 1/6 + 1/6 + 1/6 + 1/6 + 1/6 = 1 = P(w)

Total chances for a loss = 0 + 1/6 + 2/6 + 3/6 + 4/6 + 5/6 = 15/6 = P(l)

Thus the probability is P(w)/(P(w) + P(l)) = 1/(1 + 15/6) = 1/(21/6) = 6/21 = 2/7 ≈ 0.2857142857142857

Legend:

Φ: individually or as a sum of smaller numbers in any permutation.

// Computing the sum

int sum=0, roll

int counter=0, win=0, lose=0

int p_win=0, p_lose=0, value

Inside main():

Ask the user to enter a number between 1 and 6. Reject any number <1 or >6.

sum = sum + roll

while (sum < 94)

continue accepting rolls.

Check sum.

if (sum > 94)

sum = 94; (restore it to 94 to be able to consider it for the probability calculation)

// Computing the probability

value = 100 - sum;

while (value >= 1)

chance of winning = 1 (a single winning face for each sum, from 94 to 99)

chance of losing = 6 - value (the remaining faces overshoot the winning value; the largest value on a die is 6)

Aggregate all winning chances for the sums 94 to 99.

Aggregate all losing chances for the sums 94 to 99.

Probability to win = (all winning chances) / (all winning + losing chances).

The efficiency class of this algorithm is Θ(n); there are two separate while loops.

In the best case, the efficiency is O(n/4). This happens when every roll but the last is a six: a winning value of 100 takes 17 rolls (16 + 1), and 10,000 takes 1,667 rolls (1,666 + 1), comparatively lower than n/4 of the winning value.

In the worst case, when every roll results in a ‘1’, it takes 100 rolls to decide the outcome; for a win at 10,000, it takes 10,000 rolls. Thus, the worst-case efficiency is Θ(n).

The average case is O(n/2); it lies close to the middle of the best and worst cases.

(Here, instead of using a random function such as roll = rand() % 6 + 1, I have asked the user to input the values dynamically, within the range 1 to 6, inclusive.)

```
#include <stdio.h>

#define WIN 100
#define MAX 94

int main(void)
{
    int sum = 0, roll;
    int counter = 0, win = 0, lose = 0;
    int p_win = 0, p_lose = 0, value;

    while ((WIN - sum) > 6)
    {
        printf("\nEnter roll values between 1 and 6: ");
        if (scanf("%d", &roll) != 1)
            return 1;
        if (0 < roll && roll < 7)
        {
            sum = sum + roll;
            counter++;
        }
        else
        {
            printf("Only numbers between 1 and 6, inclusive.");
            /* the loop condition is unchanged, so we simply re-prompt */
        }
    }

    if (sum > MAX)
        sum = MAX; /* bring back to the point from where the probability calculation can be done */

    value = WIN - sum; /* the distance left to the winning value */
    while (value >= 1)
    {
        p_win = 1;          /* exactly one winning face for the present sum */
        p_lose = 6 - value; /* the faces that overshoot WIN */
        win += p_win;
        lose += p_lose;
        printf("\nThe chances to win and to lose for the present sum of %d are %d and %d, respectively.\n",
               sum, p_win, p_lose);
        value--;
        ++sum;
    }

    printf("\nThe chance of winning is %d/%d as a fraction, or %f as a decimal.\n\n",
           win, win + lose, (float)win / (win + lose));
    return 0;
}
```

Any feedback is welcome. Thanks.

I would say since the possible outcomes are $1,2,3,4,5,6$ on the die that the chances of getting exactly $100$ are the same as that of getting any single desired digit on the die once you are in the $94$ to $99$ range. Eventually you will get into that range such that you need either a $1,2,3,4,5$ or $6$ to make a $100$ sum. Then you must get that number to achieve the $100$ sum. So I would say $1$ in $6$ chance at that point. For example, suppose the partial sum is $94$, then you would need a $6$. If you instead roll a $2$ and get $96$ then you would need a $4$. If instead you roll a $3$ then you need a $1$. So anytime you get into the $94$ to $99$ range, you have a $1$ in $6$ chance of hitting exactly $100$ on a single final roll. If you instead consider the probability from the beginning when the partial sum is $0$ or when the partial sum is in the range $94$ to $99$ and considering multiple “final” rolls, then the answer will be different.

So the answer depends on when you start counting the probability but in my case, I simplified and only care about the one final roll that would give you a $100$ sum.

One can also argue/claim that the probability is $0$ if the number of die rolls for the $100$ sum case is less than $17$ rolls. The original question didn’t specify how many rolls are allowed or will take place.

Another answer is to interpret the question to mean only one die roll, summing (as many times as needed) only the numbers showing. This could be interpreted many ways, such as only the top number (in which case $1, 2, 4$, and $5$ would “always” sum to exactly $100$), or all sides except the “hidden” bottom side (assuming it wasn’t rolled on a see-through table or equivalent), or even summing all $6$ sides if they are all visible.

Also the question doesn’t state only base $10$ of the sum $100$ so if I go past that, I could change to base $11$ and keep rolling (and go for $121$ in base $10$ which is $100$ in base $11$). If I miss that I can go for $144$ which is $100$ in base $12$….

How about if you interpret the question as strictly one die roll with all $6$ faces visible, adding up to $21$, and then expressing that sum in base $21^{1/2}$, in which it would be written as $100$? Then there is a $100\%$ chance of getting it.

I don’t know the answer to the more general question yet but I will think about it some.
