Is there a slowest divergent function?

So I’ve been playing around with some functions for a while, and started wondering whether there is a slowest divergent function (one with $\lim_{x\to\infty} f(x) = \infty$), so I searched around for an answer.

I can see that there are ways to construct a new function that necessarily diverges more slowly than the original one. But then it struck me that this really feels like a property of an open set, where there is no smallest element in $(0,\infty)$.

So the question is this: is it possible to define, recursively, a function $f(x)$ such that $\lim_{x\to\infty} f(x) = \infty$ and
$$
\lim_{x\to\infty} \frac{g(x)}{f(x)} = \infty \quad \text{for every } g \neq f \text{ with } \lim_{x\to\infty} g(x) = \infty?
$$
For an example of a function defined recursively, consider
$$
f(x) = \frac{x^{1/f(x)}}{\ln(x)},
$$
though I have no idea how it behaves.
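One way to probe it numerically (a minimal sketch, not a claim about the actual asymptotics): for each fixed $x > 1$, read the relation as an equation $y = x^{1/y}/\ln(x)$ in the unknown $y = f(x)$. Since $y \mapsto y - x^{1/y}/\ln(x)$ is strictly increasing, the solution is unique and bisection finds it; the helper `solve_f` is hypothetical, just for illustration.

```python
from math import log

def solve_f(x, lo=0.5, hi=1e6, tol=1e-12):
    """Solve y = x**(1/y) / log(x) for y > 0 by bisection.

    h(y) = y - x**(1/y) / log(x) is strictly increasing in y
    (h'(y) = 1 + x**(1/y) / y**2 > 0), so the root is unique.
    """
    h = lambda y: y - x ** (1.0 / y) / log(x)
    while hi - lo > tol * hi:
        mid = 0.5 * (lo + hi)
        if h(mid) < 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

for x in [1e2, 1e4, 1e8, 1e16, 1e32]:
    print(f"x = {x:.0e}:  f(x) = {solve_f(x):.4f}")
```

For these inputs the solved values do grow, but very slowly – consistent with the spirit of the question.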

The reason I’m stressing the recursion is because, despite $(0,\infty)$ having no smallest element, its elements can get arbitrarily close to the endpoint $0$, and to do that with a function, I’m guessing recursion is the way to go.

Answers

No such $f$ exists. If $f$ is any function such that $\lim_{x \to \infty} f(x) = \infty$, then consider the function $\log(f)$. We have that $\lim_{x \to \infty} \log(f(x)) = \infty$. But the function $\log(f)$ diverges slower than $f$, because we have that $\lim_{x \to \infty} \frac{\log(f(x))}{f(x)}=0$.
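A quick numerical sketch of this (with an arbitrarily chosen $f$; any divergent $f$ would do):

```python
from math import log

# The ratio log(f(x)) / f(x) shrinks toward 0, illustrating that
# log(f) diverges slower than f itself.
f = lambda x: x ** 2
for x in (1e1, 1e3, 1e6, 1e12):
    fx = f(x)
    print(f"x = {x:.0e}:  log(f(x))/f(x) = {log(fx) / fx:.3e}")
```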

Now your idea of recursion brings up another interesting point. We can recursively define a sequence of functions $f_n$ as follows: $f_0(x)=x$, and $f_{n+1}(x)=\log f_n(x)$. Then $f_{n+1}$ always diverges slower than $f_n$. Then one may ask the question: for any function $f$ such that $\lim_{x \to \infty} f(x) = \infty$, does there exist $n \in \mathbb{N}$ such that $f_n$ diverges slower than $f$?
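A minimal sketch of these iterated logarithms (note that $f_n(x)$ is only defined once $x$ is large enough for all $n$ logarithms to stay positive):

```python
from math import exp, log

def f_n(n, x):
    """f_0(x) = x, f_{n+1}(x) = log(f_n(x))."""
    for _ in range(n):
        x = log(x)
    return x

x = exp(exp(5.0))   # about 3.9e64, large enough for f_0 .. f_3
for n in range(4):
    print(f"f_{n}(x) = {f_n(n, x):.6g}")
```

Each level collapses the previous value dramatically: from $\approx 3.9 \times 10^{64}$ down to $148.4$, then $5$, then $\approx 1.61$.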

The answer is still no. To prove this, we’ll construct a function $f$ which diverges slower than all of the $f_n$. Let $a_1=1$. Suppose $a_1 < \cdots < a_n$ have been constructed so that $f_{i+1}(a_{i+1})>\frac{1}{i}+f_i(a_i)$ for all $1 \leq i \leq n-1$. Then choose $a_{n+1}>a_n$ such that $f_{n+1}(a_{n+1})>\frac{1}{n}+f_n(a_n)$ [such $a_{n+1}$ exists since $f_{n+1}$ diverges]. Continuing inductively, we get a sequence $(a_n)$ such that $f_n(a_n) \to \infty$, since $f_n(a_n) > f_1(a_1) + \sum_{i=1}^{n-1} \frac{1}{i}$ and the harmonic series diverges. Define a function $f$ by interpolating (say, linearly) between the points $(a_n, f_n(a_n))$. Then you can check that $\lim_{x \to \infty} f(x)= \infty$, but $f$ diverges slower than all of the $f_n$.
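Here is a minimal numerical sketch of this construction (an illustration, not the answerer's code). Rather than computing the points $a_n$ themselves, it tracks the values $t_n = f_n(a_n)$:

```python
# The points a_n grow like towers of exponentials and overflow
# immediately, so we track t_n = f_n(a_n) instead: choosing
# a_{n+1} = exp^(n+1)(t_n + 1/n + 0.1) gives t_{n+1} = t_n + 1/n + 0.1
# exactly, since f_{n+1} is log applied (n+1) times (the 0.1 is
# arbitrary slack making the inequality strict).
t = 0.0                     # t_1 = f_1(a_1) = log(1) = 0
for n in range(1, 10):
    t += 1.0 / n + 0.1      # t_{n+1} = f_{n+1}(a_{n+1})
    print(f"f_{n+1}(a_{n+1}) = {t:.4f}")
# t_n -> infinity like the harmonic series: unbounded, but very slowly.
```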

We can do a similar construction to show that for any countable collection of divergent functions, we can find a function which diverges slower than all of these (which is stronger than your original question, pertaining to just one function).

What you call a “recursive definition” isn’t really a definition; it’s a formula which may be satisfied by one, several, or no functions at all.

Your question seems a little contradictory – you say in the second paragraph that you know no slowest function exists since you can always construct a slower one, and then immediately go on to ask for a slowest function regardless. The fact that you have a novel method of defining a function doesn’t change the fact that no such slowest function exists, as you said in your second paragraph. The fact that you can always construct a slower function means that you don’t even need to consider new methods of defining your function, because the operative word is always. It would be a bit like if you proved for me that no rational number $a$ satisfies $a^2=2$, and I said “okay, but what if I used a really big denominator?”. You would simply reply, “I said no rational number”.

On the other hand, as discussed in Shalop’s answer, if you’re willing to relax your definition of a “slowest” function, then the idea of defining a sequence of functions via some recurrence relation may yield some interesting results. A very simple such relation would be $f_{n+1} = \sqrt[3]{f_n}$. As long as $f_0$ diverges, so do all of the $f_n$, and each diverges more slowly than the last.
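A minimal sketch of this recurrence, evaluated at a fixed large $x$:

```python
# Iterating f_{n+1} = f_n ** (1/3) from f_0(x) = x gives
# f_n(x) = x ** (1 / 3**n): every f_n still diverges as x -> infinity,
# just ever more slowly.
x = 1e12
fn = x
for n in range(5):
    print(f"f_{n}(1e12) = {fn:.6g}")
    fn = fn ** (1.0 / 3.0)
```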

Let me try to connect two of your questions. As stated by Shalop, no such function exists. You can see this, for example, as follows. Let $f(x)$ be a (positively) divergent function – a function such that $\lim\limits_{x\to\infty} f(x) = \infty$. Since you can define $g(x) := \log f(x)$, which diverges more slowly than $f(x)$, there is always a function which is “less divergent, but still divergent”. Thus, there cannot exist a “least divergent” function.

This is closely analogous to the properties of an open set, where, for example, the interval $(0, \infty)$ has no smallest element. The derivative of a function measures its rate of change – a function whose derivative is large and positive at a point $x$ is increasing very quickly at that point. Now, if we think of functions $f(x)$ like $x, \log x, \log(\log x), \ldots$, then we can look at how their derivatives behave.
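Concretely, the first few derivatives in this chain are
$$
\frac{d}{dx}\,x = 1, \qquad \frac{d}{dx}\,\log x = \frac{1}{x}, \qquad \frac{d}{dx}\,\log(\log x) = \frac{1}{x\log x},
$$
each positive for large $x$, and each smaller than the one before – the rate of change decays toward $0$ faster at every stage.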

In order for such a function to stop being divergent, intuitively, we need it to level out at some point. That is, its derivative – its rate of change – must become less than or equal to zero. After all, as soon as the derivative crosses zero, the function is no longer headed toward $+\infty$. In particular, the interval of derivative values for which $f(x)$ is no longer headed toward $+\infty$ has an endpoint: it is $(-\infty, 0]$. Meanwhile, $(0, \infty)$ is the interval of derivative values for which $f(x)$ is still heading toward infinity – it’s an open interval!

Granted, the description I gave is very non-rigorous, and only deals with increasing functions. To deal with functions like $x\exp(\sin(x))$ or $|x \sin(x)|$, we would need a different approach. Each still tends to $+\infty$ as $x \to \infty$, but their derivatives fluctuate between positive and negative values! To use the same approach, we could instead look at the largest value of these functions so far: the envelope function

$E(x) := \sup\limits_{0\leq y\leq x} f(y).$

Notice that $E(x)$ is now a non-decreasing function: for all $x_2 \geq x_1 \geq 0$, $E(x_2)\geq E(x_1)$. Then in order for $f(x)$ not to be a divergent function, we require that $\lim\limits_{x\to\infty} E(x) < \infty$. Roughly speaking, this happens when there is some point $x_0$ past which $E(x)$ stops increasing – i.e.

There exists $x_0 \geq 0$ such that there does not exist $x > x_0$ with $E(x) > E(x_0)$. If this is true, then there is an interval $I = (a,\infty)$ on which $E(x)$ is constant. This means that the rate of change of $E(x)$ once again takes values in the closed interval $(-\infty, 0]$, so that $f(x)$ cannot tend to $+\infty$.
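As a minimal numerical sketch (assuming NumPy, with the example $f(x) = x\exp(\sin(x))$ from above), the envelope is just a running maximum:

```python
import numpy as np

# Running-maximum envelope E(x) = sup_{0 <= y <= x} f(y) for the
# fluctuating but divergent f(x) = x * exp(sin(x)).
x = np.linspace(0.0, 50.0, 5001)
f = x * np.exp(np.sin(x))
E = np.maximum.accumulate(f)     # monotone envelope of f

print(np.all(np.diff(E) >= 0))   # True: E is non-decreasing
print(np.all(E >= f))            # True: E dominates f pointwise
```

Here $E$ never stops increasing – it keeps picking up the ever-larger peaks of $f$ near $x = \pi/2 + 2\pi k$ – which matches the fact that $f$ diverges.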