
How can I get inequalities that bound the prime counting function if I have the following inequalities for some functions $f(x)$ and $g(x)$: $$ g(x)<\psi(x)<f(x), $$ where $\psi(x)$ is the second Chebyshev function (the summatory function of the von Mangoldt function)?


Chapter 5 of this book contains a pretty nice walkthrough of Chebyshev’s theorem (and the whole prime number theorem) that uses nothing but elementary number theory and combinatorics to achieve such bounds.

Define:

$$\psi(x) = \sum_{n \le x} \Lambda(n) = \sum_{p^k \le x} \ln p$$

$$\theta(x) = \sum_{p \le x} \ln p$$

$$\pi(x) = \sum_{p \le x} 1$$
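These three functions are easy to tabulate for small $x$. Here is a minimal Python sketch, computing each one directly from its definition (the function names are my own choices):

```python
import math

def primes_up_to(x):
    """Primes p <= x via a sieve of Eratosthenes."""
    sieve = [True] * (x + 1)
    sieve[0] = sieve[1] = False
    for p in range(2, math.isqrt(x) + 1):
        if sieve[p]:
            sieve[p * p :: p] = [False] * len(sieve[p * p :: p])
    return [p for p, is_prime in enumerate(sieve) if is_prime]

def psi(x):
    """psi(x) = sum of ln p over all prime powers p^k <= x."""
    total = 0.0
    for p in primes_up_to(x):
        pk = p
        while pk <= x:  # count p once for each power p, p^2, ... <= x
            total += math.log(p)
            pk *= p
    return total

def theta(x):
    """theta(x) = sum of ln p over primes p <= x."""
    return sum(math.log(p) for p in primes_up_to(x))

def pi(x):
    """pi(x) = number of primes <= x."""
    return len(primes_up_to(x))
```

For example, `pi(100)` is $25$, and `psi(10)` equals $3\ln 2 + 2\ln 3 + \ln 5 + \ln 7$, since the prime powers up to $10$ are $2, 4, 8, 3, 9, 5, 7$.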

We’ll compare $\psi$ to $\theta$ and $\theta$ to $\pi$.

- $\psi(x) - \theta(x) = \sum_{p^k \le x,\ k\ge 2} \ln p$. Since $k \ge 2$ forces $p^2 \le x$, only primes $p \le \sqrt{x}$ contribute to the sum. Each such prime $p$ appears $\lfloor \frac{\ln x}{\ln p} \rfloor -1$ times, so we can write it as follows:

$$\psi(x) - \theta(x) = \sum_{p \le \sqrt{x}} \ln p \left( \left\lfloor \frac{\ln x}{\ln p} \right\rfloor - 1 \right)$$

Now there are a few options:

i. We can bound each term from above by $\ln x$, so $$0 \le \psi(x) - \theta(x) \le \pi(\sqrt{x}) \ln x$$ Then the trivial bound $\pi(\sqrt{x}) \le \sqrt{x}$ gives $\psi(x) - \theta(x) \le \sqrt{x}\ln x$.

ii. Similar to i, but use the Chebyshev bound $\pi(\sqrt{x}) = O(\frac{\sqrt{x}}{\ln x})$ to prove $\psi(x) - \theta(x) = O(\sqrt{x})$ (with constant around $2$).

iii. A more involved argument shows that the true constant is around $1$: bound each term by $\ln x - \ln p$ (using $\lfloor \frac{\ln x}{\ln p} \rfloor \le \frac{\ln x}{\ln p}$), so the difference is at most $\pi(\sqrt{x}) \ln x - \theta(\sqrt{x})$, which behaves like $\frac{\sqrt{x}}{\ln \sqrt{x}} \ln x - \sqrt{x} = 2\sqrt{x} - \sqrt{x} = \sqrt{x}$.
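These estimates can be checked numerically. A small sketch (helper names are mine) that evaluates the exact sum for $\psi(x) - \theta(x)$ and compares it to $\sqrt{x}$, as option iii predicts:

```python
import math

def primes_up_to(x):
    """Primes p <= x via a sieve of Eratosthenes."""
    sieve = [True] * (x + 1)
    sieve[0] = sieve[1] = False
    for p in range(2, math.isqrt(x) + 1):
        if sieve[p]:
            sieve[p * p :: p] = [False] * len(sieve[p * p :: p])
    return [p for p, is_prime in enumerate(sieve) if is_prime]

def psi_minus_theta(x):
    """Exact psi(x) - theta(x) via sum over p <= sqrt(x) of
    ln p * (floor(ln x / ln p) - 1)."""
    # floor on floats is safe here as long as x is not an exact prime power
    return sum(
        math.log(p) * (math.floor(math.log(x) / math.log(p)) - 1)
        for p in primes_up_to(math.isqrt(x))
    )

for x in (10**4, 10**5, 10**6):
    # the ratio stays bounded and drifts toward 1 as x grows
    print(x, psi_minus_theta(x) / math.sqrt(x))
```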

- To understand the relation between $\pi$ and $\theta$ we need Abel summation, applied to $a_n = 1_{n \text{ is prime}}$, $b_n = \ln n$. As $\theta(x) = \sum_{n \le x} a_n b_n$, we find:

$$\theta(m) = \left(\sum_{n \le m} a_n\right) b_{m} - \sum_{n < m} (b_{n+1}-b_{n}) \left(\sum_{i \le n} a_i\right) =$$

$$\pi(m) \ln m – \sum_{n < m} \ln(1 + \frac{1}{n}) \pi(n)$$

(I worked with $m$ an integer, but the identity holds in general, up to a tiny error term.)
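At an integer $m$ the identity is exact, so it is easy to sanity-check numerically; a sketch with helper names of my own choosing:

```python
import math

def primes_up_to(x):
    """Primes p <= x via a sieve of Eratosthenes."""
    sieve = [True] * (x + 1)
    sieve[0] = sieve[1] = False
    for p in range(2, math.isqrt(x) + 1):
        if sieve[p]:
            sieve[p * p :: p] = [False] * len(sieve[p * p :: p])
    return [p for p, is_prime in enumerate(sieve) if is_prime]

m = 500
primes = set(primes_up_to(m))

# pi_table[n] = pi(n) for 0 <= n <= m, built by a running count
pi_table = [0] * (m + 1)
for n in range(1, m + 1):
    pi_table[n] = pi_table[n - 1] + (1 if n in primes else 0)

theta_m = sum(math.log(p) for p in primes)  # theta(m)

# right-hand side: pi(m) ln m - sum_{n<m} ln(1 + 1/n) pi(n)
rhs = pi_table[m] * math.log(m) - sum(
    math.log(1 + 1 / n) * pi_table[n] for n in range(1, m)
)

print(abs(theta_m - rhs))  # agrees up to floating-point rounding
```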

To bound the last sum, note that $0 \le \ln(1+\frac{1}{n}) \le \frac{1}{n}$ (it’s pretty tight for large $n$), so $0 \le \sum_{n < m} \ln(1 + \frac{1}{n}) \pi(n) \le \sum_{n < m} \frac{\pi(n)}{n}$.

i. The trivial bound is $m$, which follows from $\pi(x) \le x$.

ii. A sharper bound is $O(\sum_{n < m} \frac{1}{\ln n}) = O(\frac{m}{\ln m})$ (with constant around $1$), based on $\pi(x) = O(\frac{x}{\ln x})$, which in turn follows from Chebyshev’s estimates.

All in all, $\psi(x) = \pi(x) \ln x + O(\frac{x}{\ln x})$, so $g(x) < \psi(x) < f(x) \implies \frac{g(x) - O(\frac{x}{\ln x})}{\ln x} < \pi(x) < \frac{f(x) + O(\frac{x}{\ln x})}{\ln x}$. The precise error terms depend on what you’re assuming, but I believe the constant in the $O(\cdot)$ is around $1$.
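As a rough numeric illustration of this final relation (again a sketch, with my own helper names), one can compare $\pi(x)\ln x$ with $\psi(x)$ and check that the gap, measured in units of $x/\ln x$, stays bounded:

```python
import math

def primes_up_to(x):
    """Primes p <= x via a sieve of Eratosthenes."""
    sieve = [True] * (x + 1)
    sieve[0] = sieve[1] = False
    for p in range(2, math.isqrt(x) + 1):
        if sieve[p]:
            sieve[p * p :: p] = [False] * len(sieve[p * p :: p])
    return [p for p, is_prime in enumerate(sieve) if is_prime]

def gap_ratio(x):
    """(pi(x) ln x - psi(x)) / (x / ln x); bounded per the argument above."""
    primes = primes_up_to(x)
    pi_x = len(primes)
    psi_x = 0.0
    for p in primes:
        pk = p
        while pk <= x:  # one ln p per prime power p^k <= x
            psi_x += math.log(p)
            pk *= p
    return (pi_x * math.log(x) - psi_x) / (x / math.log(x))

for x in (10**4, 10**5, 10**6):
    # the ratio stays bounded (of order 1) as x grows
    print(x, gap_ratio(x))
```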
