Why do $\log(n!)$ and $\log(n^n)$ have the same big-O complexity?

In an example I found, it is stated that $\log(n!)$ has the same big-O complexity as $\log(n^n)$. Could someone explain why this is the case?



$$\ln(n!) = \sum_{k=1}^{n} \ln(k) \leq n\ln(n) = \ln(n^n)$$

so $\ln(n!) = O(\ln(n^n))$

And for $n$ large enough (the final inequality holds, e.g., with $C = \frac14$ once $n \geq 16$),

$$\ln(n!) = \sum_{k=1}^{n} \ln(k) \geq \sum_{k=\lceil n/2 \rceil}^{n} \ln(k) \geq \frac{n}{2} \ln\left(\frac{n}{2}\right) = \frac{n}{2}\left( \ln(n) - \ln(2) \right) \geq Cn\ln(n)$$

so $\ln(n^n) = O(\ln(n!))$

You have then

$$\ln(n^n) = \Theta(\ln(n!))$$
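Both bounds can be sanity-checked numerically. The sketch below uses Python's `math.lgamma` (where $\ln(n!) = \texttt{lgamma}(n+1)$) and the assumed constant $C = \frac14$, which happens to work for every $n \geq 2$:

```python
from math import lgamma, log

# Check C*n*ln(n) <= ln(n!) <= n*ln(n), with the assumed constant C = 1/4.
# lgamma(n + 1) computes ln(n!) without overflow.
C = 0.25
for n in range(2, 2000):
    log_factorial = lgamma(n + 1)          # ln(n!)
    assert C * n * log(n) <= log_factorial <= n * log(n)
print("bounds hold for 2 <= n < 2000")
```

A numeric check like this is no proof, of course, but it is a quick way to catch a wrong constant before writing the inequality chain down.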

By writing

$$\begin{aligned}
\log(n^n) - \log(n!) &= \log\left(\frac{n \cdot n \cdots n}{n \cdot (n-1) \cdots 1}\right) \\
&= \log\left(\prod_{k=1}^{n} \frac{n}{k} \right) \\
&= \sum_{k=1}^{n} \log\left(\frac{n}{k}\right) \\
&= n \sum_{k=1}^{n} \log\left(\frac{1}{k/n}\right)\frac{1}{n}
\end{aligned}$$

and using the fact that the right-endpoint Riemann sum of a decreasing nonnegative function is a lower bound for the integral (here $\log$ denotes the natural logarithm; any other base only changes the constant), i.e.

$$0 \leq \sum_{k=1}^{n} \log\left(\frac{1}{k/n}\right)\frac{1}{n} \leq \int_0^1 \log\left(\frac{1}{x}\right)dx = 1,$$

we obtain the estimate

$$\log(n^n) - \log(n!) = O(n)$$

or, rearranging,

$$\log(n!) = \log(n^n) + O(n).$$

Dividing through by $\log(n^n)$ yields

$$\frac{\log(n!)}{\log(n^n)} = 1 + O\left(\frac{n}{\log(n^n)}\right) = 1 + O\left(\frac{1}{\log n}\right),$$

so we can conclude that

$$\lim_{n \to \infty} \frac{\log(n!)}{\log(n^n)} = 1.$$
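The $O(1/\log n)$ error term predicts quite slow convergence of this ratio, which a short numeric sketch (again using `math.lgamma` for $\ln(n!)$) makes visible:

```python
from math import lgamma, log

# The ratio ln(n!) / ln(n^n) should increase toward 1,
# with error on the order of 1/ln(n), so convergence is slow.
ratios = []
for n in (10, 10**3, 10**5, 10**7):
    ratio = lgamma(n + 1) / (n * log(n))   # ln(n!) / ln(n^n)
    ratios.append(ratio)
    print(f"n = {n:>10}: ratio = {ratio:.4f}")
assert ratios == sorted(ratios)            # monotonically approaching 1
assert ratios[-1] < 1.0
```

Even at $n = 10^7$ the ratio is still well below $1$, exactly as the $1/\log n$ error term suggests.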

Stirling’s Formula is given by

$$n!=\sqrt{2\pi n}\left(\frac{n}{e}\right)^n\left(1+O\left(\frac1n\right)\right)\tag 1$$

Therefore, taking the logarithm of $(1)$, we have for $\log n!$

$$\begin{aligned}
\log n! &= \frac12 \log(2\pi)+\left(n+\frac12\right) \log n - n + O\left(\frac1n\right)\\
&= O\left(n\,\log n\right)
\end{aligned}$$

Since $\log n^n=n\log n$ we see immediately that

$$\log n! =O\left(n\,\log n\right)=O\left(\log n^n\right)$$

as was to be shown.
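As a final sketch, the logarithmic form of Stirling's formula from $(1)$ can be compared against $\ln(n!)$ directly (via Python's `math.lgamma`); the gap shrinks like $O(1/n)$, consistent with the error term above:

```python
from math import lgamma, log, pi

# Stirling: ln(n!) ~ 0.5*ln(2*pi*n) + n*ln(n) - n, with error O(1/n).
for n in (10, 100, 1000):
    exact = lgamma(n + 1)                       # ln(n!)
    stirling = 0.5 * log(2 * pi * n) + n * log(n) - n
    error = exact - stirling                    # positive, roughly 1/(12n)
    print(f"n = {n:>5}: error = {error:.2e}")
    assert 0 < error < 1 / (10 * n)             # consistent with O(1/n)
```

The error is in fact asymptotic to $\frac{1}{12n}$, which is why the bound $1/(10n)$ in the assertion holds comfortably.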