So I’ve been playing around with some functions for a while, and started wondering about a slowest divergent function (one with $\lim_{x\to\infty} f(x)=\infty$), so I searched around for an answer.

I can see that there are ways to construct, from any divergent function, a new function that necessarily diverges more slowly. But then it struck me that this really feels like a property of an open set: there is no smallest element in $(0,\infty)$.

So the question is this: is it possible to define **recursively** a function $f(x)$ such that $\lim_{x\to\infty} f(x)=\infty$ and

$$
\lim_{x\to\infty} \frac{g(x)}{f(x)}=\infty \quad \text{for every } g \neq f \text{ with } \lim_{x\to\infty} g(x)=\infty?
$$

For an example of a function defined recursively, consider

$$

f(x)=\frac{x^{1/f(x)}}{\ln x}

$$

which I have no idea how it behaves.
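One can at least probe this implicit equation numerically. For each fixed $x > 1$, the map $y \mapsto x^{1/y}/\ln x$ is strictly decreasing in $y$, so $y = x^{1/y}/\ln x$ has a unique positive solution, which bisection finds reliably. This is only a numerical sketch (the function name `implicit_f` and the bracket endpoints are my own choices), not an asymptotic analysis:

```python
import math

def implicit_f(x, tol=1e-12):
    """Solve y = x**(1/y) / ln(x) for the unique y > 0 by bisection.

    g(y) = y - exp(ln(x)/y) / ln(x) is strictly increasing in y
    (the exp term shrinks as y grows), so it has exactly one root.
    """
    lnx = math.log(x)

    def g(y):
        e = lnx / y
        if e > 700.0:                 # exp(e) would overflow a float;
            return -math.inf          # g is certainly negative there
        return y - math.exp(e) / lnx

    lo, hi = 1e-6, 1e6                # bracket: g(lo) < 0 < g(hi) for x > e
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if g(mid) < 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

for x in (1e2, 1e6, 1e12, 1e24):
    print(x, implicit_f(x))           # the solution grows, but very slowly
```

Numerically the solution does diverge, but extremely slowly (roughly 2, 3.5, 5.5, 8.8 at the sample points above), which at least suggests the equation defines a slowly divergent function.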


The reason I’m stressing the recursion is because, despite $(0,\infty)$ having no smallest element, its elements can get arbitrarily close to the endpoint, and to do the same with a function, I’m guessing recursion is the way to go.


No such $f$ exists. If $f$ is any function such that $\lim_{x \to \infty} f(x) = \infty$, then consider the function $\log(f)$. We have that $\lim_{x \to \infty} \log(f(x)) = \infty$. But the function $\log(f)$ diverges slower than $f$, because we have that $\lim_{x \to \infty} \frac{\log(f(x))}{f(x)}=0$.
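A quick numerical illustration of this ratio (a sketch using $f(x) = x$, so that $\log(f(x))/f(x) = \log(x)/x$):

```python
import math

# For the divergent f(x) = x, the ratio log(f(x)) / f(x) = log(x) / x
# shrinks toward 0 even though log(x) itself still diverges.
ratios = [math.log(x) / x for x in (1e2, 1e4, 1e8, 1e16)]
print(ratios)   # each entry is smaller than the one before
```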

Now your idea of recursion brings up another interesting point. We can recursively define a **sequence** of functions $f_n$ as follows: $f_0(x)=x$, and $f_{n+1}(x)=\log f_n(x)$. Then $f_{n+1}$ always diverges slower than $f_n$. One may then ask: for any function $f$ such that $\lim_{x \to \infty} f(x) = \infty$, does there exist $n \in \mathbb{N}$ such that $f_n$ diverges slower than $f$?

The answer is still no. To prove this, we’ll construct a function $f$ which diverges slower than all of the $f_n$. Let $a_1=1$. Suppose $a_1 < \cdots < a_n$ have been constructed so that $f_{i+1}(a_{i+1})>\frac{1}{i}+f_i(a_i)$ for all $1 \leq i \leq n-1$. Then choose $a_{n+1}>a_n$ such that $f_{n+1}(a_{n+1})>\frac{1}{n}+f_n(a_n)$ [such $a_{n+1}$ exists since $f_{n+1}$ diverges]. Continuing inductively, we get a sequence $(a_n)$ such that $f_n(a_n) \to \infty$. Define a function $f$ by interpolating all of the points $(a_n, f_n(a_n))$. Then you can check that $\lim_{x \to \infty} f(x)= \infty$, but $f$ diverges slower than all of the $f_n$.
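The first few steps of this diagonal construction can even be carried out numerically. Below I track the values $v_n = f_n(a_n)$ directly and recover $a_n$ by applying $\exp$ $n$ times; the tower growth pushes $a_4$ beyond floating-point range, so this sketch (the helper names and the $+0.1$ margin are my own choices) stops at $n = 3$:

```python
import math

def iter_log(x, n):
    """f_n in the answer's notation: log applied n times (f_0 = identity)."""
    for _ in range(n):
        x = math.log(x)
    return x

def iter_exp(x, n):
    for _ in range(n):
        x = math.exp(x)
    return x

# Pick values v_n = f_n(a_n) satisfying v_{n+1} > 1/n + v_n, then invert:
# a_n = exp^(n)(v_n).  Here a_1 = 1, a_2 ~ 20, a_3 ~ 3e103, while
# a_4 = exp^(4)(v_4) ~ exp(exp(4600)) is far beyond float range.
v = {1: 0.0}
for n in (1, 2):
    v[n + 1] = v[n] + 1.0 / n + 0.1       # strictly exceeds the 1/n gap
a = {n: iter_exp(v[n], n) for n in v}

def slow_f(x):
    """Piecewise-linear interpolation through the knots (a_n, v_n)."""
    for n in (1, 2):
        if a[n] <= x <= a[n + 1]:
            t = (x - a[n]) / (a[n + 1] - a[n])
            return v[n] + t * (v[n + 1] - v[n])
    raise ValueError("x lies outside the constructed knots")
```

The interpolant `slow_f` matches $f_n$ at each knot and increases along them, exactly as the construction requires; only the finite cut-off is an artifact of floating point.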

We can do a similar construction to show that for any countable collection of divergent functions, we can find a function which diverges slower than all of these (which is stronger than your original question, pertaining to just one function).

What you call a “recursive definition” isn’t really a definition; it’s a formula which may be satisfied by one function, by several, or by none at all.

Your question seems a little contradictory – you say in the second paragraph that you know no slowest function exists since you can always construct a slower one, and then immediately go on to ask for a slowest function regardless. The fact that you have a novel method of defining a function doesn’t change the fact that *no* such slowest function exists, as you said yourself. The fact that you can *always* construct a slower function means that you don’t even need to consider new methods of defining your function, because the operative word is *always*. It would be a bit like if you proved for me that no rational number $a$ satisfies $a^2=2$, and I said “okay, but what if I used a *really big* denominator?”. You would simply reply, “I said *no* rational number”.

On the other hand, as discussed in Shalop’s answer, if you’re willing to relax your definition of a “slowest” function, then the idea of defining a sequence of functions via some recurrence relation may yield some interesting results. A very simple such relation would be $f_{n+1} = \sqrt[3]{f_n}$. As long as $f_0$ diverges then so do all $f_n$, and each diverges more slowly than the last.
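A quick numerical look at this recurrence, taking $f_0(x) = x$ so that $f_n(x) = x^{1/3^n}$ (the sample points are arbitrary choices):

```python
# With f_0(x) = x, the recurrence f_{n+1} = f_n ** (1/3) gives
# f_n(x) = x ** (3 ** -n): still divergent for every n, but each level
# grows more slowly -- the ratio f_1(x)/f_0(x) = x ** (-2/3) tends to 0.
def f_n(x, n):
    return x ** (3.0 ** -n)

for x in (1e6, 1e18, 1e30):
    print([round(f_n(x, n), 3) for n in range(4)])
```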

Let me try to connect one of your questions with one of your other questions. As stated by Shalop, no such function exists. You can see this, for example as follows. Let $f(x)$ be a (positively) divergent function – a function such that $\lim\limits_{x\to\infty} f(x) = \infty$. Since you can define $g(x) := \log f(x)$, which diverges more slowly than $f(x)$, there is always a function which is “less divergent, but still divergent”. Thus, there cannot exist a “least divergent” function.

This is exactly analogous to the properties of an open set: the interval $(0, \infty)$, for example, has no smallest element. The derivative of a function measures its rate of change – a function with a large positive derivative at a point $x$ is increasing quickly there. Now, if we think of functions $f(x)$ like $x, \log x, \log(\log x), \ldots$, then we can look at how their derivatives behave.

In order for such a function to *stop* being divergent, intuitively, we need it to level out at some point. That is, its derivative – its rate of change – must become less than or equal to zero. After all, as soon as the derivative crosses zero, the function is no longer headed toward $+\infty$. In particular, the set of derivative values for which $f(x)$ no longer heads toward $+\infty$ has an endpoint: it is $(-\infty, 0]$. Meanwhile, $(0, \infty)$ is the set of derivative values for which $f(x)$ is still heading toward infinity – an open interval!

Granted, the description I gave is very non-rigorous, and only deals with increasing functions. To deal with functions like $x\exp(\sin(x))$ or $|x \sin(x)|$, we would need a different approach. Each still tends to $+\infty$ as $x \to \infty$, but their derivatives fluctuate between positive and negative values! To use the same approach, we could instead look at the largest value of these functions *so far*: the *envelope function*

$E(x) := \sup\limits_{0\leq y\leq x} f(y).$

Notice that $E(x)$ is now an increasing function: for all $x_2 \geq x_1 \geq 0$, $E(x_2)\geq E(x_1)$. Then in order for $f(x)$ not to be a divergent function, we require that $\lim\limits_{x\to\infty} E(x) < \infty$. But this can only happen if there is some point $x_0$ past which $E(x)$ stops increasing – *i.e.*

There exists $x_0 \geq 0$ such that there does not exist $x > x_0$ with $E(x) > E(x_0)$. If this is true, then there is an interval $I = (a,\infty)$ on which $E$ is constant, so that $E'(x) \equiv 0$ on $I$. This means the rate of change of $E$ once again takes values in the closed interval $(-\infty, 0]$, so $f(x)$ cannot tend to $+\infty$.
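The envelope idea is easy to sample numerically: a running maximum over a grid. Here is a sketch for the oscillating example $f(x) = |x \sin x|$ from above (the grid spacing and range are arbitrary choices):

```python
import math

def envelope(f, xs):
    """Running maximum: E(x_k) = max of f over the grid points up to x_k,
    a discrete stand-in for E(x) = sup over 0 <= y <= x of f(y)."""
    best, out = -math.inf, []
    for x in xs:
        best = max(best, f(x))
        out.append(best)
    return out

xs = [k * 0.01 for k in range(1, 200001)]          # grid on (0, 2000]
E = envelope(lambda x: abs(x * math.sin(x)), xs)
# E is nondecreasing by construction, and here it keeps growing,
# reflecting the fact that |x sin x| diverges in the envelope sense
# even though its derivative keeps changing sign.
```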
