Prove: $(a + b)^{n} \geq a^{n} + b^{n}$

Struggling with yet another proof:

Prove that, for any positive integer $n$, $(a + b)^n \geq a^n + b^n$ for all $a, b > 0$.

I wasted $3$ pages of notebook paper on this problem, and I’m getting nowhere slowly. So I need some hints.

$1.$ What technique would you use to prove this (e.g. induction, direct proof, counterexample)?

$2.$ Are there any tricks to the proof? I’ve seen some crazy stuff pulled out of nowhere when it comes to proofs…


Hint: Use the binomial theorem.

This states that
$(a + b)^n = \sum \limits_{k = 0}^n {n \choose k} a^{n-k} b^k = a^n + b^n + \sum \limits_{k=1}^{n-1} {n \choose k} a^{n-k} b^k$.

Now, note that every term in the second sum is positive; this is because $a$, $b$, and the binomial coefficients are all positive. Therefore,

$(a+b)^n = a^n + b^n + (\text{sum of positive terms}) \geq a^n + b^n.$
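For concreteness, the smallest nontrivial case $n = 2$ reads

$(a+b)^2 = a^2 + 2ab + b^2 \geq a^2 + b^2,$

where $2ab$ is the positive cross term being discarded.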

This follows directly from the binomial theorem.
Alternatively, you can prove it inductively (which is probably more fun): suppose the inequality is true for $n-1$. Then $(a+b)^n = (a+b)(a+b)^{n-1} \geq (a+b)(a^{n-1} + b^{n-1})$ by the inductive hypothesis. So $(a+b)^n \geq a(a^{n-1}+ b^{n-1}) + b(a^{n-1} + b^{n-1})$, and this is at least $a^n + b^n$.
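Writing the last step out in full:

$a(a^{n-1}+b^{n-1}) + b(a^{n-1}+b^{n-1}) = a^n + b^n + ab^{n-1} + a^{n-1}b \geq a^n + b^n,$

since the cross terms $ab^{n-1}$ and $a^{n-1}b$ are positive.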

It might also be helpful for you to think a little about the geometry of the inequality.

For $n=2$, find a way to put an $a \times a$ square and a $b \times b$ square into an $(a+b) \times (a+b)$ square without any overlaps. For $n=3$, see if you can fit an $a \times a \times a$ cube and a $b \times b \times b$ cube within an $(a+b) \times (a+b) \times (a+b)$ cube without overlaps.
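Algebraically, the $n=2$ picture is

$(a+b)^2 = a^2 + b^2 + 2ab:$

the two squares cover area $a^2 + b^2$, and the leftover area $2ab$ is exactly two $a \times b$ rectangles.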

Next, the notion of having more than three dimensions might seem a little weird, but
think of the box in $n$ dimensions whose sides have length $a+b$. Can you fit two boxes within it, one with side length $a$ and one with side length $b$?

You can write $n=m+1$ where $m \geq 0$, then

$(a+b)^n = (a+b)^{m+1} = (a+b) (a+b)^m = a(a+b)^m +b(a+b)^m \geq a^{m+1} + b^{m+1}$
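The final inequality holds because $a+b \geq a > 0$ and $a+b \geq b > 0$, so $(a+b)^m \geq a^m$ and $(a+b)^m \geq b^m$ for $m \geq 0$; hence

$a(a+b)^m + b(a+b)^m \geq a \cdot a^m + b \cdot b^m = a^{m+1} + b^{m+1}.$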

No induction is needed, and this works for any real $n \geq 1$.

Here is another way to look at the inequality. Pick the larger of $a^n$ and $b^n$ and divide through by that quantity. This reduces the problem to showing that $(1+r)^n \ge 1 + r^n$, where $r \le 1$ is the positive real ratio of the smaller of $a$, $b$ to the larger. If $r < 1$ and $n \ge 2$, we have $r^n < r$, so $1 + r^n < 1 + r < (1+r)^n$, the last step because $(1+r)^{n-1} > 1$; for $n = 1$ the statement holds with equality.
I leave the case $r = 1$ (and $n$ rational and $\ge 1$) to others.
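Explicitly, if $a \geq b$, divide both sides by $a^n$:

$\frac{(a+b)^n}{a^n} = \left(1 + \frac{b}{a}\right)^n \quad \text{and} \quad \frac{a^n + b^n}{a^n} = 1 + \left(\frac{b}{a}\right)^n,$

so the claim becomes $(1+r)^n \geq 1 + r^n$ with $r = b/a \in (0, 1]$.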

Let’s have a calculus answer: consider the function of $a$, depending on the parameter $b$, given by $f_b(a) = (a+b)^n - a^n - b^n$. Its derivative with respect to $a$ is $f'_b(a) = n\left((a+b)^{n-1} - a^{n-1}\right)$. Because $b > 0$, this is nonnegative, so $f_b$ is an increasing function of $a$, and you can conclude.
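To conclude from there: the boundary value is

$f_b(0) = (0+b)^n - 0^n - b^n = 0,$

so monotonicity gives $f_b(a) \geq f_b(0) = 0$ for all $a > 0$, which is exactly the claimed inequality.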

Induction.

For $n=1$ it is trivially true.

Assume it is true for $n=k$, i.e. $(a+b)^k \geq a^k + b^k$.

Consider the case $n=k+1$:

$(a+b)^{k+1} = (a+b)(a+b)^k \geq (a+b)(a^k + b^k) = a^{k+1} + b^{k+1} + ab^k + ba^k \geq a^{k+1} + b^{k+1},$

as required.
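As a quick sanity check, take $a = 1$, $b = 2$, $n = 3$: $(1+2)^3 = 27 \geq 1^3 + 2^3 = 9$.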