Sum of two independent binomial variables

How can I formally prove that the sum of two independent binomial random variables $X$ and $Y$ with the same parameter $p$ is also binomial?


Let $(B_k)_{k\ge 1}$ be a sequence of iid Bernoulli random variables with $P(B_k=1)=p$ for $k=1,2,\dots$

Then $$X:=B_1+\cdots+B_n$$ is binomially distributed with parameters $n,p$ and $$Y:=B_{n+1}+\cdots+B_{n+m}$$ is binomially distributed with parameters $m,p$. Moreover $X$ and $Y$ are independent, since they are functions of disjoint sets of the independent $B_k$.

Now realize that $$X+Y=B_1+\cdots+B_{n+m}$$ is binomially distributed with parameters $n+m,p$.

This spares you any computation.
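The coupling argument above can be sanity-checked by simulation: draw $n+m$ Bernoulli trials, split them into $X$ and $Y$, and compare the empirical mean of $X+Y$ against the $\mathrm{Bin}(n+m,p)$ mean $(n+m)p$. The values of `n`, `m`, `p`, and the trial count below are illustrative choices, not from the original.

```python
import random

# Simulation sketch (illustrative parameters): X is the sum of the first n
# Bernoulli draws, Y the sum of the next m, so X + Y is the sum of all n + m.
random.seed(0)
n, m, p, trials = 7, 5, 0.3, 100_000

sums = []
for _ in range(trials):
    bern = [1 if random.random() < p else 0 for _ in range(n + m)]
    x = sum(bern[:n])    # X ~ Bin(n, p)
    y = sum(bern[n:])    # Y ~ Bin(m, p), independent of X
    sums.append(x + y)   # equals the sum of all n + m Bernoulli draws

mean = sum(sums) / trials
# Bin(n + m, p) has mean (n + m) * p = 3.6 for these parameters
print(mean)
```

With 100,000 trials the empirical mean should land within a few hundredths of $(n+m)p$.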

Just compute. Suppose $X \sim \def\Bin{\mathord{\rm Bin}}\Bin(n,p)$, $Y \sim \Bin(m,p)$. Now let $0 \le k \le n+m$; then
\begin{align*}
\def\P{\mathbb P}\P(X+Y = k) &= \sum_{i=0}^k \P(X = i, Y = k-i)\\
&= \sum_{i=0}^k \P(X=i)\P(Y=k-i) & \text{by independence}\\
&= \sum_{i=0}^k \binom ni p^i (1-p)^{n-i} \binom m{k-i} p^{k-i} (1-p)^{m-k+i}\\
&= p^k(1-p)^{n+m-k}\sum_{i=0}^k \binom ni \binom m{k-i} \\
&= \binom {n+m}k p^k (1-p)^{n+m-k} & \text{by Vandermonde's identity}
\end{align*}
Hence $X+Y \sim \Bin(n+m,p)$.
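The convolution identity derived above can be verified exactly in Python with `math.comb`: the convolution of the $\Bin(n,p)$ and $\Bin(m,p)$ PMFs should match the $\Bin(n+m,p)$ PMF at every $k$. The parameters below are illustrative.

```python
from math import comb

# Exact check of the convolution step: sum_i P(X=i) P(Y=k-i) equals the
# Bin(n+m, p) probability mass at k. Illustrative parameters.
n, m, p = 4, 6, 0.25

def binom_pmf(j, size, prob):
    return comb(size, j) * prob**j * (1 - prob)**(size - j)

for k in range(n + m + 1):
    # restrict i so that both binomial coefficients are well defined
    conv = sum(binom_pmf(i, n, p) * binom_pmf(k - i, m, p)
               for i in range(max(0, k - m), min(n, k) + 1))
    assert abs(conv - binom_pmf(k, n + m, p)) < 1e-12
print("convolution matches Bin(n+m, p)")
```

The range restriction on $i$ mirrors the convention in the proof that $\binom{n}{i}=0$ for $i>n$ and $\binom{m}{k-i}=0$ for $k-i>m$.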

Another way: use characteristic functions.
Suppose $X\sim$ Bin$(n, p)$ and $Y\sim$ Bin$(m, p)$.
The characteristic function of $X$ is then
$$\varphi_X(t) = E[e^{itX}]=\sum_{k=0}^ne^{itk}{n\choose k}p^k(1-p)^{n-k}=\sum_{k=0}^n{n\choose k} (pe^{it})^k(1-p)^{n-k}=(1-p+pe^{it})^n.$$

Since $X$ and $Y$ are independent,
$$\varphi_{X+Y}(t)=\varphi_{X}(t)\varphi_Y(t)=(1-p+pe^{it})^n(1-p+pe^{it})^m=(1-p+pe^{it})^{n+m}.$$

By uniqueness, we get $X+Y\sim$ Bin$(n+m, p)$.
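The characteristic-function argument can also be checked numerically: compute $\varphi_X(t)\varphi_Y(t)$ by summing $E[e^{itX}]$ directly from the PMFs, and compare with the closed form $(1-p+pe^{it})^{n+m}$ at a few sample values of $t$. The parameters and test points below are illustrative.

```python
import cmath
from math import comb

# Numeric check: phi_X(t) * phi_Y(t) == (1 - p + p e^{it})^(n+m)
# for binomials sharing the same p. Illustrative parameters.
n, m, p = 3, 5, 0.4

def phi_bin(t, size, prob):
    # E[e^{itX}] for X ~ Bin(size, prob), summed straight from the PMF
    return sum(comb(size, k) * prob**k * (1 - prob)**(size - k)
               * cmath.exp(1j * t * k) for k in range(size + 1))

for t in (0.0, 0.5, 1.3, 2.7):
    lhs = phi_bin(t, n, p) * phi_bin(t, m, p)
    closed = (1 - p + p * cmath.exp(1j * t)) ** (n + m)
    assert abs(lhs - closed) < 1e-12
print("characteristic functions agree")
```

Agreement at sample points is of course only a sanity check; the uniqueness theorem for characteristic functions is what makes the proof rigorous.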