Prove that a nonnegative polynomial function can be written as the sum of the squares of two polynomial functions

Let $f(x)$ be a polynomial function with real coefficients such that $f(x)\geq 0 \;\forall x\in\Bbb R$. Prove that there exist polynomials $A(x),B(x)$ with real coefficients such that $f(x)=A^2(x)+B^2(x)\;\forall x\in\Bbb R$.

I don’t know how to approach this, apart from working through some specific polynomials, which turned out really ugly. Any hints to point me in the right direction?


Consider the roots of $f(x)$. Since $f(x)\geq0$ for all $x\in\mathbb{R}$, every real root of $f$ has even multiplicity and the non-real roots come in conjugate pairs, so $f(x)$ can be rewritten as follows:
$$f(x)=a^2(x-a_1)^2\cdots(x-a_k)^2[(x-\alpha_1)(x-\bar{\alpha_1})]\cdots[(x-\alpha_l)(x-\bar{\alpha_l})]$$
where $a,a_1,\cdots,a_k\in\mathbb{R}$ and $\alpha_1,\cdots,\alpha_l\in\mathbb{C}\setminus\mathbb{R}$.

Denote $g(x)=a(x-a_1)\cdots(x-a_k)$ and $h(x)=(x-\alpha_1)\cdots(x-\alpha_l)=h_1(x)+ih_2(x)$, where $h_1(x),h_2(x)$ are the real and imaginary parts of $h$ and have real coefficients. Then
\begin{align*}
f(x)&=g^2(x) \, h(x) \, \bar{h}(x)\\
&=g^2(x) \, [h_1(x)+ih_2(x)] \, [h_1(x)-ih_2(x)]\\
&=(g(x)h_1(x))^2+(g(x)h_2(x))^2
\end{align*}
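As an illustration (the choice $f(x)=x^4+1$ is mine, not part of the original answer): the roots of $x^4+1$ in the upper half-plane are $\alpha_1=e^{i\pi/4}$ and $\alpha_2=e^{3i\pi/4}$, so
$$h(x)=(x-e^{i\pi/4})(x-e^{3i\pi/4})=x^2-\sqrt{2}\,i\,x-1=(x^2-1)+i(-\sqrt{2}\,x),$$
giving $h_1(x)=x^2-1$ and $h_2(x)=-\sqrt{2}\,x$, and indeed
$$x^4+1=(x^2-1)^2+(\sqrt{2}\,x)^2.$$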

Assume that the leading coefficient of $f$ is $1$. If $f(x)=x^k$ for some $k$, then $k$ is even and the result follows. By completing the square we see that the result holds if the degree of $f$ is at most $2$.

If $f$ has a real root $a$, we may assume this root is at $0$ by replacing $x$ with $x-a$. Since $f(x)\geq 0$ for all $x$, this must be a root of even multiplicity, so by factoring out $x^2$ we reduce the problem to a polynomial of lower degree. We may thus assume that $\deg(f)>2$, that the result holds for polynomials of smaller degree, and that $f$ has no real roots.

The roots of $f$ are then $z_1,z_1',z_2,z_2',\ldots,z_n,z_n'$, where $z_i'$ is the complex conjugate of $z_i$ for all $i$. Thus there is a quadratic polynomial $x^2+bx+c=(x-z_1)(x-z_1')$ with real coefficients and a polynomial $p(x)$ of degree two less than the degree of $f$ such that
$$f(x)=(x^2+bx+c)p(x)$$
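For instance (taking $f(x)=x^4+1$, an example of my own for illustration), the roots are $e^{\pm i\pi/4}$ and $e^{\pm 3i\pi/4}$; pairing $z_1=e^{3i\pi/4}$ with its conjugate gives the quadratic $x^2+\sqrt{2}\,x+1$, and
$$x^4+1=(x^2+\sqrt{2}\,x+1)(x^2-\sqrt{2}\,x+1),$$
so here $b=\sqrt{2}$, $c=1$, and $p(x)=x^2-\sqrt{2}\,x+1$.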

Since $x^2+bx+c$ has no real roots, it is positive for all real $x$, hence $p(x)=f(x)/(x^2+bx+c)\geq 0$ for all real $x$. By induction, $p(x)=A(x)^2+B(x)^2$ for some polynomials $A(x)$ and $B(x)$ with real coefficients. Thus
$$f(x)=(x^2+bx+c)(A(x)^2+B(x)^2)$$
Completing the square in the quadratic factor,
$$f(x)=\left(\left(x+\frac{b}{2}\right)^2+c-\frac{b^2}{4}\right)(A(x)^2+B(x)^2)$$
so
$$f(x)=\left((x+\frac{b}{2})A(x)+\sqrt{c-\frac{b^2}{4}}B(x)\right)^2+\left((x+\frac{b}{2})B(x)-\sqrt{c-\frac{b^2}{4}}A(x)\right)^2$$
by the Brahmagupta–Fibonacci identity $(u^2+v^2)(A^2+B^2)=(uA+vB)^2+(uB-vA)^2$, applied with $u=x+\frac{b}{2}$ and $v=\sqrt{c-\frac{b^2}{4}}$. The result then follows by induction, provided $c\geq b^2/4$. But if $c<b^2/4$, then $x^2+bx+c$ would have a real root, contradicting our assumption that $f$ has no real roots, so we are done.
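Continuing the illustrative example $f(x)=x^4+1$ from above: $p(x)=x^2-\sqrt{2}\,x+1=\left(x-\frac{\sqrt{2}}{2}\right)^2+\left(\frac{1}{\sqrt{2}}\right)^2$, so we may take $A(x)=x-\frac{\sqrt{2}}{2}$ and $B(x)=\frac{1}{\sqrt{2}}$. With $b=\sqrt{2}$, $c=1$, and $\sqrt{c-b^2/4}=\frac{1}{\sqrt{2}}$, the displayed formula gives
$$x^4+1=\left(\left(x+\tfrac{\sqrt{2}}{2}\right)\left(x-\tfrac{\sqrt{2}}{2}\right)+\tfrac{1}{\sqrt{2}}\cdot\tfrac{1}{\sqrt{2}}\right)^2+\left(\left(x+\tfrac{\sqrt{2}}{2}\right)\tfrac{1}{\sqrt{2}}-\tfrac{1}{\sqrt{2}}\left(x-\tfrac{\sqrt{2}}{2}\right)\right)^2=(x^2)^2+1^2,$$
a different decomposition from the one obtained by the factorization argument in the first answer.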

A survey article by Bruce Reznick, Some Concrete Aspects of Hilbert’s 17th Problem, includes your case in the paragraph on "Before 1900".
