Little Bézout theorem for smooth functions

As a special case of the little Bézout theorem, if $f(x)$ is a polynomial with $f(0)=0$, then there exists another polynomial $g(x)$ such that $f(x)=xg(x)$. It is easy to see that this fact generalizes to analytic functions, because we have the Taylor expansion. Now my question is whether the little Bézout theorem holds for smooth functions. More precisely:
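(For instance, in the analytic case one can read the factor off the Taylor series: writing $f(x)=\sum_{k\ge 1}a_k x^k$, with no constant term since $f(0)=0$, one has
$$f(x)=x\sum_{k=1}^{\infty}a_k x^{k-1}=x\,g(x),\qquad g(x):=\sum_{k=0}^{\infty}a_{k+1}x^k,$$
and $g$ has the same radius of convergence as $f$, hence is analytic.)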

If $f(x)\in C^{\infty}(\mathbb{R})$ with $f(0)=0$, does there exist $g(x)\in C^{\infty}(\mathbb{R})$ such that $f(x)=xg(x)$?

EDIT:

As pointed out by Jason, the above question has a positive answer, and it actually holds in arbitrary dimensions. That is,

If $f(x)\in C^{\infty}(\mathbb{R^n})$ with $f(0)=0$, then there exist $g_i(x)\in C^{\infty}(\mathbb{R^n})$ such that $f(x)=\sum_{i=1}^{n} x_ig_i(x)$.
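In fact, the same integral formula as in the one-dimensional proof below works here: since $f(0)=0$, the fundamental theorem of calculus and the chain rule give
$$f(x)=\int_0^1 \frac{d}{dt}\big[f(tx)\big]\,dt=\sum_{i=1}^{n} x_i\int_0^1 \frac{\partial f}{\partial x_i}(tx)\,dt,$$
so one can take $g_i(x)=\int_0^1 \frac{\partial f}{\partial x_i}(tx)\,dt$.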

Multiplying both sides by a cutoff function (sketched below), one obtains:

If $f(x)\in C^{\infty}_c(\mathbb{R^n})$ with $f(0)=0$, then there exist $g_i(x)\in C^{\infty}_c(\mathbb{R^n})$ such that $f(x)=\sum_{i=1}^{n} x_ig_i(x)$.
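Here is one way the cutoff argument can go (a sketch): take $\chi\in C^{\infty}_c(\mathbb{R^n})$ with $\chi\equiv 1$ on the support of $f$. If $f=\sum_{i=1}^{n} x_i g_i$ with $g_i\in C^{\infty}(\mathbb{R^n})$, then
$$f=\chi f=\sum_{i=1}^{n} x_i\,(\chi g_i),$$
and each $\chi g_i\in C^{\infty}_c(\mathbb{R^n})$.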

With a cleverer use of the cutoff function (also sketched below), one can further obtain:

If $f(x)\in \mathscr{S}(\mathbb{R^n})$ with $f(0)=0$, then there exist $g_i(x)\in \mathscr{S}(\mathbb{R^n})$ such that $f(x)=\sum_{i=1}^{n} x_ig_i(x)$.

Here $\mathscr{S}(\mathbb{R^n})$ denotes the Schwartz space.
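One way the "cleverer" cutoff argument can go (a sketch): write $f=\chi f+(1-\chi)f$, where $\chi\in C^{\infty}_c(\mathbb{R^n})$ equals $1$ near $0$. The piece $\chi f$ is compactly supported and vanishes at $0$, so by the previous statement $\chi f=\sum_{i=1}^{n} x_i h_i$ with $h_i\in C^{\infty}_c(\mathbb{R^n})$. The piece $(1-\chi)f$ vanishes near the origin, so
$$(1-\chi(x))f(x)=\sum_{i=1}^{n} x_i\,\frac{x_i\,(1-\chi(x))f(x)}{|x|^2},$$
and each function $x_i(1-\chi)f/|x|^2$ is Schwartz, since it vanishes near $0$ and $f$ decays rapidly. Taking $g_i=h_i+x_i(1-\chi)f/|x|^2$ gives the claim.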

Answer:

This is actually a standard fact proved at the beginning of a manifolds course; it is a step on the way to proving that all derivations are given by linear combinations of partial derivatives.

Theorem: Suppose $f:\mathbb{R}\rightarrow\mathbb{R}$ is smooth and $f(0) = 0$. Then there is a smooth function $g:\mathbb{R}\rightarrow\mathbb{R}$ such that $f(x) = xg(x)$.

Proof:

By the fundamental theorem of calculus, $\int_0^1 \frac{d}{dt}[f(tx)]\, dt = f(tx)\big|_{t=0}^{t=1} = f(x) - f(0) = f(x)$.

So, $f(x) = \int_0^1 \frac{d}{dt}[f(tx)] dt = \int_0^1 f'(tx)x dt = x\int_0^1 f'(tx) dt$.

The second equality is the chain rule ($f'$ means $\frac{d}{dx} f(x)$), and the third follows because $x$ is constant with respect to $t$, so it can be pulled out of the integral.

Then, setting $g(x) = \int_0^1 f'(tx) dt$ gives the desired function. $\square$

To see this in action, suppose $f(x) = x^2 + x$. Then $f'(x) = 2x + 1$, so $f'(tx) = 2tx + 1$. Thus, $g(x) = \int_0^1 (2tx+1)\, dt = \big(t^2x + t\big)\big|_{t=0}^{t=1} = x+1$, as it should be, since $f(x) = x(x+1)$.
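For a non-polynomial example, here is a quick numerical sanity check of the formula $g(x)=\int_0^1 f'(tx)\,dt$ (a sketch, not part of the original answer; it uses $f=\sin$ and scipy's quadrature to compare $x\,g(x)$ with $f(x)$):

```python
# Numerical check of f(x) = x * g(x) with g(x) = \int_0^1 f'(t x) dt.
# Here f = sin (so f(0) = 0) and f' = cos.
import numpy as np
from scipy.integrate import quad

def g(x, f_prime=np.cos):
    # g(x) = \int_0^1 f'(t x) dt, computed by numerical quadrature
    value, _ = quad(lambda t: f_prime(t * x), 0.0, 1.0)
    return value

for x in [0.5, 1.0, 2.0, -3.0]:
    lhs = np.sin(x)   # f(x)
    rhs = x * g(x)    # x * \int_0^1 cos(t x) dt, which should equal sin(x)
    print(f"x = {x:5.2f}   f(x) = {lhs: .6f}   x*g(x) = {rhs: .6f}")
```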