Energy for the 1D Heat Equation

Consider the heat equation on a rod of length $L$,

$u_t(x,t) = c^2 u_{xx}(x,t)$ for all $(x,t) \in [0,L] \times \mathbb{R}^+$,

and the energy at time $t$ defined as,

$$E(t)=\frac{1}{2}\int_{0}^{L} u(x,t)^2 dx.$$

How would I show that $E(t) \geq 0$ for every $t \in \mathbb{R}^+$, and that

$$E'(t) = -c^2 \int_{0}^{L} \big(u_x(x,t)\big)^2\, dx + c^2 \big(u(L,t)u_x(L,t) - u(0,t)u_x(0,t)\big)?$$

Here’s my attempt:

$E'(t) = \frac{d}{dt} \int_{0}^{L} \frac{u^2}{2}\, dx = \int_{0}^{L} \frac{1}{2} \frac{\partial}{\partial t}(u^2)\, dx = \int_{0}^{L} u u_t\, dx$

and since $u_t(x,t) = c^2 u_{xx}(x,t)$,

$E'(t) = \int_{0}^{L} u u_t\, dx = c^2 \int_{0}^{L} u u_{xx}\, dx.$

But I don’t really know where to go from here.
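For reference, one standard continuation (not from the original post; it assumes $u$ is smooth enough in $x$ to integrate by parts) is:

$$E'(t) = c^2 \int_{0}^{L} u\,u_{xx}\, dx = c^2 \big[u\,u_x\big]_{0}^{L} - c^2 \int_{0}^{L} (u_x)^2\, dx = c^2 \big(u(L,t)u_x(L,t) - u(0,t)u_x(0,t)\big) - c^2 \int_{0}^{L} \big(u_x(x,t)\big)^2\, dx,$$

which is exactly the claimed expression for $E'(t)$. The inequality $E(t) \geq 0$ is immediate, since the integrand $u(x,t)^2/2$ is pointwise nonnegative.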

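The integration-by-parts identity $\int_0^L u\,u_{xx}\, dx = [u\,u_x]_0^L - \int_0^L (u_x)^2\, dx$ behind the target formula for $E'(t)$ can also be sanity-checked numerically. This is a sketch of mine (not from any posted solution): it picks one fixed smooth test function on $[0,1]$ and compares both sides via trapezoidal quadrature with NumPy.

```python
import numpy as np

def trapezoid(f, x):
    """Composite trapezoidal rule for samples f on the grid x."""
    return float(np.sum((f[:-1] + f[1:]) * np.diff(x)) / 2.0)

# Test function u(x) = sin(pi x) + x^2 on [0, L] with L = 1,
# together with its exact first and second derivatives.
L = 1.0
x = np.linspace(0.0, L, 200_001)
u    = np.sin(np.pi * x) + x**2
u_x  = np.pi * np.cos(np.pi * x) + 2 * x
u_xx = -np.pi**2 * np.sin(np.pi * x) + 2

# Left side: integral of u * u_xx over [0, L].
lhs = trapezoid(u * u_xx, x)

# Right side: boundary term [u u_x] from 0 to L minus integral of (u_x)^2.
boundary = u[-1] * u_x[-1] - u[0] * u_x[0]
rhs = boundary - trapezoid(u_x**2, x)

print(lhs, rhs)  # the two printed values should agree closely
```

Any other smooth choice of $u$ works the same way; the two sides differ only by quadrature error.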