# Find $f$ where $f'(x) = f(1+x)$

Let $f \colon \mathbb{R} \rightarrow \mathbb{R}$ be a smooth function such that
$$f'(x) = f(1+x)$$
How can we find the general form of $f$? I thought of some differential-equations techniques, but I'm not sure how to apply them here. Thanks.

#### Answers

By repeated application of the equation you have there, it’s clear that $f^{(n)}(x) = f(n + x)$; in particular, any merely differentiable solution is automatically smooth (i.e. has continuous derivatives of all orders), since each derivative is again a shift of $f$. Furthermore, the equation you give is linear. This means both that $f = 0$ is a solution, and that any linear combination of solutions is a solution.

Suppose we know what $f$ is on $[0,1]$. We can then work out what $f$ is everywhere else: on $[1,2]$ we must have $f(x) = f^\prime(x-1)$, and more generally on $[n,n+1]$ we must have $f(x) = f^{(n)}(x-n)$, where the derivatives are taken of the data on $[0,1]$. So the values on the positive reals are already determined. On the negatives, we analogously put antiderivatives of $f$. You might think this isn’t unique, because of course antiderivatives are only defined up to a constant, but continuity at the endpoints of the intervals fixes the constants of integration, so in fact there are no choices.

However, that hints at a problem we might encounter on the positive reals: if the prescribed data on $[0,1]$ has $f^\prime(0)\not=f(1)$, the stitched-together function will not be continuous at $1$. Likewise, if $f^{\prime\prime}(0)\not=f^\prime(1)$, the solution will run into the same difficulty at $2$.

Hence, if my analysis is correct, the solutions are in one-to-one correspondence with smooth functions $g:[0,1]\to \mathbb R$ satisfying the additional constraint $g^{(n+1)}(0)=g^{(n)}(1)$ for all $n \geq 0$. In particular, the latter constraint is trivially satisfied if $g$ is a smooth bump function with support contained in $[\epsilon,1-\epsilon]$ for some $\epsilon > 0$, since then every derivative of $g$ vanishes at both endpoints. All such functions (excluding the zero function) are non-analytic, so they wouldn’t show up in a power-series analysis.
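As a quick numerical illustration (a sketch, not part of the answer above: the particular bump $g(x) = \exp\!\big(-1/((x-\epsilon)(1-\epsilon-x))\big)$ and the choice $\epsilon = 0.1$ are mine), such a bump is identically zero near $0$ and $1$, so the gluing constraint $g^{(n+1)}(0)=g^{(n)}(1)=0$ holds trivially:

```python
import math

EPS = 0.1  # bump supported in [EPS, 1 - EPS]; illustrative choice


def bump(x):
    """Smooth bump: positive on (EPS, 1-EPS), identically zero outside.

    Since it is exactly zero on neighborhoods of 0 and 1, all of its
    derivatives vanish there, so g^{(n+1)}(0) = g^{(n)}(1) = 0 for every n.
    """
    if x <= EPS or x >= 1 - EPS:
        return 0.0
    return math.exp(-1.0 / ((x - EPS) * (1 - EPS - x)))


# Zero (with all derivatives) outside the support, nonzero inside:
print(bump(0.0), bump(0.05), bump(0.95), bump(1.0))  # all 0.0
print(bump(0.5) > 0)  # True
```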

I still haven’t got a general solution, but I have found an infinite family of linearly independent bump-function solutions, and a criterion characterizing general solutions.

If you assume a solution of the form $f(x)= {\rm e}^{\lambda x }$ and substitute it back into the equation, you get

$$\lambda {\rm e}^{\lambda x } = {\rm e}^{\lambda (x+1) } \Rightarrow \lambda = {\rm e}^{\lambda } \Rightarrow -\lambda \,{\rm e}^{-\lambda } = -1 \,.$$

To solve for $\lambda$, we appeal to the Lambert W function, defined by $W(z)\,{\rm e}^{W(z)} = z$, which gives
$$\lambda = -W(-1)\,.$$

Then, we can write our solution as

$$f(x) = c \,{\rm e}^{-W(-1) x } \,.$$

But note that this is not a real-valued function, since $-W(-1)=0.3181315052- 1.337235701\,i$ is complex. If we take into account the values contributed by the other branches of the Lambert W function, we can write the general solution, as pointed out in the comments, as

$$f(x) = \sum_{n= -\infty}^{\infty} c_n {\rm e}^{-W_n(-1) x}\,,$$
where the $c_n$ are constants; choosing conjugate coefficients for conjugate branches yields real-valued solutions.
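To check the quoted branch value numerically (a sketch; the Newton iteration and its starting point are my own, not part of the answer), one can solve $w\,{\rm e}^w = -1$ near the principal branch and verify that $\lambda = -W(-1)$ really satisfies $\lambda = {\rm e}^{\lambda}$:

```python
import cmath


def lambert_w_newton(z, w0, iters=50):
    """Solve w * exp(w) = z by Newton's method from the starting guess w0."""
    w = w0
    for _ in range(iters):
        ew = cmath.exp(w)
        w -= (w * ew - z) / (ew * (1 + w))  # Newton step for f(w) = w e^w - z
    return w


# Principal-branch value W_0(-1); start near the known root -0.318 + 1.337i.
w = lambert_w_newton(-1, -0.3 + 1.3j)
lam = -w  # lambda = -W(-1)

print(lam)                        # approx 0.3181315052 - 1.337235701j
print(abs(lam - cmath.exp(lam)))  # approx 0: lambda = e^lambda holds
```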

First of all, let’s turn the equation into a delayed form. Let $g(x) = f(-x)$; then $g'(x) = -f'(-x) = -f(1-x) = -g(x-1)$, i.e.
$$g'(x) = -g(x-1)$$
Therefore consider the more general equation $y'(t) = -y(t-\tau)$, with initial condition $y(t) = \upsilon(t)$ for $0<t<\tau$. Then, for $\tau < t < 2 \tau$,
$$y(t) = \upsilon(\tau) - \int_\tau^t y(s-\tau) \,\mathrm{d}s = \upsilon(\tau) - \int_\tau^t \upsilon(s-\tau) \,\mathrm{d}s = \upsilon(\tau) - \int_0^{t-\tau} \upsilon(s) \,\mathrm{d}s$$
and for $2 \tau < t <3 \tau$:
$$y(t) = y(2 \tau) - \int_{\tau}^{t-\tau} y(s) \,\mathrm{d} s = \upsilon(\tau)-\int_0^\tau \upsilon(s) \,\mathrm{d}s - (t-2 \tau)\, \upsilon(\tau) + \int_\tau^{t-\tau} \int_{0}^{s-\tau} \upsilon(u) \,\mathrm{d} u \,\mathrm{d} s$$
and so on.
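The stepwise scheme can also be sketched numerically in plain Python (my own toy setup, not the answer's code: $\tau = 1$, constant history $\upsilon \equiv 1$, trapezoidal quadrature on a uniform grid). With this history, $y(t) = 1 - (t-\tau)$ exactly on $[\tau, 2\tau]$, which gives a check on the scheme:

```python
# Method of steps for y'(t) = -y(t - tau), with y(t) = v(t) on [0, tau].
# Toy setup (my choice): tau = 1, history v(t) = 1, trapezoid rule.
TAU = 1.0
H = 1e-3                   # grid step
M = round(TAU / H)         # grid points per delay interval

y = [1.0] * (M + 1)        # history: y(t) = 1 on [0, tau]
for i in range(M, 3 * M):  # integrate forward to t = 3*tau
    # Trapezoidal update of y' = -y(t - tau); the delayed values
    # y[i - M], y[i + 1 - M] are already on the grid.
    y.append(y[i] - 0.5 * H * (y[i - M] + y[i + 1 - M]))


def y_at(t):
    """Grid value closest to time t."""
    return y[round(t / H)]


# Exact values for this history: y(1.5) = 0.5 and
# y(2.5) = -(0.5) + 0.5**2 / 2 = -0.375.
print(y_at(1.5), y_at(2.5))
```

Because the integrand is piecewise linear here with kinks only at grid points, the trapezoid rule reproduces the exact piecewise formulas up to rounding.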

Here is an implementation of this idea in Mathematica, together with a comparison against the built-in delayed-ODE solver: