Prove that a straight line is the shortest distance between two points?

Prove that a straight line is the shortest distance between two points
in $E^3$. Use the following scheme: let $\alpha: [a,b]\to E^3$ be an arbitrary curve segment from $p = \alpha(a)$ to $q = \alpha(b)$. Let $u = (q - p)/||q - p||$.

(a) If $\sigma$ is a straight-line segment from $ p$ to $ q$ , say
$$\sigma (t) = (1 - t)p + tq ,\quad 0\leq t\leq1$$
show that $L(\sigma ) = d(p,q)$.

What I have done
$$
L(\sigma)=\int_{0}^{1}||\sigma'(t)||dt=\int_{0}^{1}(p^2+q^2)^{1/2}dt=\sqrt{(p^2+q^2)}(1),
$$
$d(p,q)=\sqrt{(b-a)^2+(q-p)^2}$. Where am I going wrong? It's a problem from O'Neill's Elementary Differential Geometry.

Answer

It is correct that $\sigma'(t)=q-p$; what is wrong is the norm of this vector: it is not $\sqrt{p^2+q^2}$ but $||q-p||=d(p,q)$.
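Spelled out, the corrected computation of the integral reads:
$$
L(\sigma)=\int_{0}^{1}||\sigma'(t)||\,dt=\int_{0}^{1}||q-p||\,dt=||q-p||=d(p,q).
$$
The integrand is the constant $||q-p||$, so the integral over $[0,1]$ is just that constant.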

In coordinates, if $p=(x_1,y_1,z_1)$ and $q=(x_2,y_2,z_2)$, then $q-p$ is the vector of components $(x_2-x_1,y_2-y_1,z_2-z_1)$ whose norm is
$$
\sqrt{(x_2-x_1)^2+(y_2-y_1)^2+(z_2-z_1)^2}=d(p,q).
$$
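As a numerical sanity check (a sketch, not part of O'Neill's exercise), one can approximate $L(\sigma)$ by a Riemann sum and compare it with $d(p,q)$; the point names `p` and `q` below are arbitrary sample values:

```python
import math

def norm(v):
    # Euclidean norm of a 3-vector
    return math.sqrt(sum(x * x for x in v))

def sigma_prime(p, q):
    # sigma(t) = (1 - t) p + t q, so sigma'(t) = q - p, constant in t
    return tuple(qi - pi for pi, qi in zip(p, q))

def arc_length(p, q, n=100_000):
    # L(sigma) = integral_0^1 ||sigma'(t)|| dt via a Riemann sum;
    # since the integrand is constant, the sum is exact up to rounding
    dt = 1.0 / n
    v = sigma_prime(p, q)
    return sum(norm(v) * dt for _ in range(n))

p = (1.0, 2.0, 3.0)
q = (4.0, 6.0, 3.0)
L = arc_length(p, q)
d = norm(sigma_prime(p, q))
print(L, d)  # both approximately 5.0, since q - p = (3, 4, 0)
```

The agreement of the two printed numbers illustrates the answer above: the speed $||\sigma'(t)||$ is constant, equal to $||q-p|| = d(p,q)$.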