Proving Pascal's Rule : ${{n} \choose {r}}={{n-1} \choose {r-1}}+{{n-1} \choose r}$ when $1\leq r\leq n$

I’m trying to prove that ${n \choose r}$ is equal to ${{n-1} \choose {r-1}}+{{n-1} \choose r}$ when $1\leq r\leq n$.

I suppose I could use the counting rules in probability, perhaps the combination formula ${{n} \choose {r}}=\frac{n!}{r!(n-r)!}$.

I want to see an actual proof behind this equation. Does anyone have any ideas?


Consider $n$ balls in a basket: $1$ red ball and $n-1$ blue balls. Now count the number of ways of choosing $r$ balls in two different ways.

One way is simply to choose $r$ balls out of the $n$ balls, which can be done in $C(n,r)$ ways.

The other way is to split into cases according to whether or not the $r$ chosen balls include the red ball. There are only two options: the $r$ balls contain the one red ball, or they contain no red ball.

The number of ways of having $1$ red ball is to choose the one red ball, which can be done in $C(1,1)$ ways, and then choose the remaining $r-1$ balls from the $n-1$ blue balls, which can be done in $C(n-1,r-1)$ ways.

Similarly, the number of ways of having no red ball is to choose all $r$ balls from the $n-1$ blue balls, which can be done in $C(n-1,r)$ ways.

These are the only two cases, and they are mutually exclusive, so the total number of ways is $C(n-1,r-1)+C(n-1,r)$.

Hence, we get $$C(n,r) = C(n-1,r-1) + C(n-1,r)$$
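This double-counting argument is easy to sanity-check by brute force; here is a quick check in Python (the ranges of $n$ and $r$ are arbitrary choices of mine, just small enough to run instantly):

```python
from math import comb

# Check Pascal's rule C(n, r) = C(n-1, r-1) + C(n-1, r)
# for a range of small n and all valid 1 <= r <= n.
for n in range(1, 20):
    for r in range(1, n + 1):
        assert comb(n, r) == comb(n - 1, r - 1) + comb(n - 1, r)
```

Of course this only tests finitely many cases; the counting argument above is what proves it for all $n$ and $r$.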

The same idea could be extended to prove a generalization of the above $$C(m+n,r) = \displaystyle \sum_{k=\max(0,r-n)}^{\min(r,m)} C(m,k) C(n,r-k)$$

Consider a basket with $m$ red balls and $n$ blue balls and we want to count the number of ways in which $r$ balls can be drawn. Argue by two different ways to count (same as above) to prove this.
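The generalization (Vandermonde's identity) can be checked the same way; in this sketch the helper name `vandermonde_rhs` is mine, and the summation limits follow the formula above:

```python
from math import comb

def vandermonde_rhs(m, n, r):
    """Sum C(m,k) * C(n,r-k) over the valid range of k."""
    return sum(comb(m, k) * comb(n, r - k)
               for k in range(max(0, r - n), min(r, m) + 1))

# Check C(m+n, r) against the sum for small m, n and all valid r.
for m in range(8):
    for n in range(8):
        for r in range(m + n + 1):
            assert comb(m + n, r) == vandermonde_rhs(m, n, r)
```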

Since you mentioned that you were having a hard time visualising this, and I always seem to find myself visualising it whenever I need to write it down, here is what goes through my mind as the pen moves across the paper:

We are placing $r$ identical balls in $n$ boxes (at most one in each) arranged in a straight line, so there are ${n \choose r}$ ways to do this. Now either the last box is empty, which happens in ${n-1 \choose r}$ ways, or the last box is full, which happens in ${n-1 \choose r-1}$ ways. QED
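The boxes picture can be modelled directly: a placement is a binary string of length $n$ with exactly $r$ ones. A small enumeration (my choice of $n=6$, $r=3$) splits the strings by the last position, mirroring the argument above:

```python
from itertools import product
from math import comb

# "r identical balls in n boxes, at most one per box" = binary
# strings of length n with exactly r ones.
n, r = 6, 3
strings = [s for s in product((0, 1), repeat=n) if sum(s) == r]
last_empty = [s for s in strings if s[-1] == 0]
last_full = [s for s in strings if s[-1] == 1]

assert len(strings) == comb(n, r)            # C(6,3) = 20
assert len(last_empty) == comb(n - 1, r)     # C(5,3) = 10
assert len(last_full) == comb(n - 1, r - 1)  # C(5,2) = 10
```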

This is equivalent to Sivaram’s answer but does away with the colours, which for the purposes of visualising is probably slightly easier.

As Sivaram and Chandru1 suggested, a combinatorial argument is often a very good way to understand/prove this kind of identity.

The other way would be, as you said, to use the explicit formula for the binomial coefficient:

$${{n-1} \choose {r-1}}+{{n-1} \choose r}=\frac{(n-1)!}{(n-r)!(r-1)!}+\frac{(n-1)!}{(n-r-1)!r!}$$

Putting both fractions over the common denominator $(n-r)!\,r!$ gives

$$\frac{r\,(n-1)!}{(n-r)!\,r!}+\frac{(n-r)\,(n-1)!}{(n-r)!\,r!}=\frac{n\,(n-1)!}{(n-r)!\,r!},$$

which reduces to $\frac{n!}{(n-r)!r!}={n\choose r}$.

See this Wikipedia page: under the subsection Recursion formula, a hint for proving this formula is given. Hope you can do it from there.

The formula follows either from tracing the contributions to $X^{k}$ in $(1 + X)^{n-1}(1 + X)$, or by separately counting the $k$-combinations of $\{1, 2, \ldots, n\}$ that contain $n$ and those that do not.

Here’s a take on this formula from a different direction. Suppose you were given the recurrence relation $R(n,k) = R(n-1,k) + R(n-1,k-1)$, for $0 \leq k \leq n$, and boundary condition $R(0,k) = [k=0]$. How would you derive the solution $R(n,k)$ if you didn’t already know what it was? You could use generating functions.

(The following is borrowed from Wilf's Generatingfunctionology, 2nd edition, p. 14). Let $$G_n(x) = \sum_{k=0}^{\infty} R(n,k) x^k.$$ Multiplying the recurrence relation above by $x^k$ and summing over $k$ from $1$ to $\infty$ yields $$G_n(x) - 1 = G_{n-1}(x) - 1 + x G_{n-1}(x).$$
Thus $G_n(x) = (x+1)G_{n-1}(x)$, with $G_0(x) = 1$. Thus $G_n(x) = (x+1)^n$. Since $R(n,k)$ is the coefficient of $x^k$ in $(x+1)^n$, applying the binomial theorem tells us that $R(n,k) = \binom{n}{k}$.

$$\begin{align}
\binom{n}{k} + \binom{n}{k-1} & = \frac{n!}{k!(n-k)!} + \frac{n!}{(k-1)!(n-k+1)!} \\
& = \frac{n!}{k!(n-k)!} + \frac{n!}{k!(n-k)!}\cdot\frac{k}{n-k+1} \\
& = \frac{n!}{k!(n-k)!}\left(1+\frac{k}{n-k+1}\right) \\
& = \frac{n!}{k!(n-k)!}\cdot\frac{n+1}{n-k+1} \\
& = \frac{(n+1)!}{k!(n+1-k)!} = \binom{n+1}{k},
\end{align}$$

using $n!\,(n+1) = (n+1)!$ in the last step.


How about this:

Note that

$${n \choose r} = \frac {n(n-1)(n-2)\cdots(n-r+1)} {1\cdot2\cdot3\cdots(r-1)r}$$

This can be written as
$${n \choose r} = \frac nr \cdot \frac {(n-1)(n-2)\cdots(n-r+1)} {1\cdot2\cdot3\cdots(r-1)} = \frac nr {n-1 \choose r-1} $$
and also
$${n \choose r} = \frac n{n-r} \cdot \frac {(n-1)(n-2)\cdots(n-r+1)(n-r)} {1\cdot2\cdot3\cdots(r-1)r}= \frac n{n-r} {n-1 \choose r}$$

Therefore,
$$\begin{align}
{n-1 \choose r-1} + {n-1 \choose r} & = \frac rn {n\choose r} + \frac {n-r} n {n\choose r} \\
& = {n \choose r}
\end{align}$$
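The two absorption identities used above can be verified exactly with rational arithmetic (the `Fraction` type avoids any floating-point rounding; the ranges are my own small test values):

```python
from fractions import Fraction
from math import comb

# Check C(n,r) = (n/r) C(n-1,r-1) and C(n,r) = (n/(n-r)) C(n-1,r)
# exactly, for r >= 1 and n - r >= 1.
for n in range(2, 15):
    for r in range(1, n):
        assert comb(n, r) == Fraction(n, r) * comb(n - 1, r - 1)
        assert comb(n, r) == Fraction(n, n - r) * comb(n - 1, r)
```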

as required.

Start on the right-hand side of the equation and simplify it. It is often better to start with the more complicated expression when attempting to prove something in maths, because more tools are available to work with in demonstrations.

Here is another proof using the binomial theorem.

$$\begin{align}
(1+x)^n &= \sum_{k=0}^n \binom{n}{k} x^k \\
&= (1+x)^{n-1} (1+x) \\
&= \sum_{k=0}^{n-1} \binom{n-1}{k} x^k (1+x)\\
&= \sum_{k=0}^{n-1} \binom{n-1}{k} x^k + \sum_{k=0}^{n-1} \binom{n-1}{k} x^{k+1} \\
&= \sum_{k=0}^{n-1} \binom{n-1}{k} x^k + \sum_{k=1}^{n} \binom{n-1}{k-1} x^k
\end{align}$$
Now compare coefficients of $x^k$.
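Comparing coefficients can be done mechanically: the coefficient of $x^k$ in $(1+x)^{n-1}(1+x)$ is $\binom{n-1}{k}+\binom{n-1}{k-1}$, which should match $\binom{n}{k}$. A small check (with $n=8$, my arbitrary choice):

```python
from math import comb

n = 8
# Coefficients of (1+x)^(n-1), indexed by k = 0, ..., n-1.
a = [comb(n - 1, k) for k in range(n)]

# Coefficient of x^k in (1+x)^(n-1) * (1+x) is a[k] + a[k-1],
# with out-of-range terms treated as 0.
for k in range(n + 1):
    rhs = (a[k] if k < n else 0) + (a[k - 1] if k >= 1 else 0)
    assert comb(n, k) == rhs
```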

The identity
$$\frac{n!}{r!(n-r)!}=\frac{(n-1)!}{(r-1)!(n-r)!}+\frac{(n-1)!}{r!(n-r-1)!}$$
is equivalent to
$$n!=r\,(n-1)!+(n-r)\,(n-1)!$$
on multiplying all terms by $r!(n-r)!$. Dividing through by $(n-1)!$ gives
$$n=r+(n-r) \, ,$$
i.e., $n = n$.


I don’t care if your question has already been answered or that this was posted 4 years ago! The basic idea is $\Omega=A+A^c$. Fix one element of the $n$; the number of $k$-combinations that do not contain that element is $\binom{n-1}{k}$. To get the combinations that do contain this element, we have to “make room for this element”, i.e. choose the remaining members in $\binom{n-1}{k-1}$ ways.