How to prove that the sum and product of two algebraic numbers is algebraic?

Suppose $E/F$ is a field extension and $\alpha, \beta \in E$ are algebraic over $F$. Then it is not too hard to see that when $\alpha$ is nonzero, $1/\alpha$ is also algebraic. If $a_0 + a_1\alpha + \cdots + a_n \alpha^n = 0$, then dividing by $\alpha^{n}$ gives $$a_0\frac{1}{\alpha^n} + a_1\frac{1}{\alpha^{n-1}} + \cdots + a_n = 0.$$
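For a quick numerical sanity check of this coefficient-reversal trick (a sketch in Python; the example $\alpha = \sqrt{2}$ with polynomial $-2 + \alpha^2 = 0$ is my own choice, not from the question):

```python
import math

def eval_poly(coeffs, x):
    """Evaluate a_0 + a_1*x + ... + a_n*x^n (coefficients in ascending order)."""
    return sum(a * x**i for i, a in enumerate(coeffs))

# alpha = sqrt(2) is a root of -2 + 0*x + 1*x^2
p = [-2, 0, 1]
alpha = math.sqrt(2)
assert abs(eval_poly(p, alpha)) < 1e-9

# Dividing by alpha^n reverses the coefficient list, so 1/alpha is a root of
# a_n + a_{n-1}*x + ... + a_0*x^n, i.e. of 1 + 0*x - 2*x^2 here.
p_rev = p[::-1]
assert abs(eval_poly(p_rev, 1 / alpha)) < 1e-9
```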

Is there a similar elementary way to show that $\alpha + \beta$ and $\alpha \beta$ are also algebraic (i.e. finding an explicit formula for a polynomial that has $\alpha + \beta$ or $\alpha\beta$ as its root)?

The only proof I know for this fact is the one where you show that $F(\alpha, \beta) / F$ is a finite field extension and thus an algebraic extension.

Answers

The relevant construction is the resultant of two polynomials. Suppose $x$ and $y$ are algebraic with $P(x) = Q(y) = 0$, and let $\deg Q = n$. Then $z = x + y$ is a root of the resultant of $P(x)$ and $Q(z - x)$, taken with respect to $x$ (that is, both are regarded as polynomials in $x$ alone, with $z$ treated as a constant), and $t = xy$ is a root of the resultant of $P(x)$ and $x^n Q(t/x)$, again taken with respect to $x$.
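Here is a small pure-Python sanity check of the sum case (a sketch; the choice $P = x^2 - 2$, $Q = y^2 - 3$, so the claimed root is $z = \sqrt{2} + \sqrt{3}$, is my own example). For a fixed numeric $z$, the resultant $\operatorname{Res}_x\bigl(P(x),\, Q(z-x)\bigr)$ is the determinant of a Sylvester matrix, and it vanishes exactly when the two polynomials share a root:

```python
import math

def sylvester(p, q):
    """Sylvester matrix of p, q (coefficient lists, highest degree first)."""
    m, n = len(p) - 1, len(q) - 1
    M = [[0.0] * (m + n) for _ in range(m + n)]
    for i in range(n):                 # n shifted copies of p
        for j, c in enumerate(p):
            M[i][i + j] = c
    for i in range(m):                 # m shifted copies of q
        for j, c in enumerate(q):
            M[n + i][i + j] = c
    return M

def det(M):
    """Determinant by Gaussian elimination with partial pivoting."""
    M = [row[:] for row in M]
    n, d = len(M), 1.0
    for k in range(n):
        piv = max(range(k, n), key=lambda r: abs(M[r][k]))
        if abs(M[piv][k]) < 1e-12:
            return 0.0
        if piv != k:
            M[k], M[piv] = M[piv], M[k]
            d = -d
        d *= M[k][k]
        for r in range(k + 1, n):
            f = M[r][k] / M[k][k]
            for c in range(k, n):
                M[r][c] -= f * M[k][c]
    return d

def res_sum(z):
    # P(x) = x^2 - 2;  Q(z - x) = (z - x)^2 - 3 = x^2 - 2*z*x + (z^2 - 3)
    P = [1, 0, -2]
    Qzx = [1, -2 * z, z * z - 3]
    return det(sylvester(P, Qzx))

# The resultant vanishes at z = sqrt(2) + sqrt(3) ...
assert abs(res_sum(math.sqrt(2) + math.sqrt(3))) < 1e-9
# ... but not at a generic point; as a polynomial in z it is z^4 - 10z^2 + 1,
# which evaluates to -8 at z = 1.
assert abs(res_sum(1.0) + 8.0) < 1e-6
```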

Let $\alpha$ have minimal polynomial $p(x)$ and let $\beta$ have minimal polynomial $q(x)$. Then $V = F[x, y]/(p(x), q(y))$ is a finite-dimensional vector space over $F$ of dimension $\deg p \deg q$ (it is not necessarily the same dimension as $F(\alpha, \beta)$, for example when $\alpha = \beta$); moreover, it has an explicit basis
$$x^i y^j : 0 \le i < \deg p, 0 \le j < \deg q.$$

$xy$ and $x + y$ act by left multiplication on $V$ and one can write down explicit matrices for this action in the basis above in terms of the coefficients of $p$ and $q$. Now apply the Cayley-Hamilton theorem.
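To make this concrete (a sketch, using my own example $p = x^2 - 2$, $q = y^2 - 3$, so $V$ has the basis $\{1, x, y, xy\}$): write down the matrix of multiplication by $x + y$ on $V$ and compute its characteristic polynomial exactly, here via the Faddeev-LeVerrier recursion over the rationals.

```python
from fractions import Fraction

def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def charpoly(A):
    """Faddeev-LeVerrier: coefficients of det(zI - A), highest degree first."""
    n = len(A)
    I = [[Fraction(i == j) for j in range(n)] for i in range(n)]
    coeffs = [Fraction(1)]
    M = [[Fraction(0)] * n for _ in range(n)]
    c = Fraction(1)
    for k in range(1, n + 1):
        M = mat_mul(A, [[M[i][j] + c * I[i][j] for j in range(n)]
                        for i in range(n)])
        c = -sum(M[i][i] for i in range(n)) / k
        coeffs.append(c)
    return coeffs

# Basis of V = F[x,y]/(x^2 - 2, y^2 - 3): e0 = 1, e1 = x, e2 = y, e3 = xy.
# Column j of Mx records x * e_j in this basis, e.g. x*x = 2, x*(xy) = 2y;
# similarly for My with y*y = 3, y*(xy) = 3x.
F = Fraction
Mx = [[F(0), F(2), F(0), F(0)],
      [F(1), F(0), F(0), F(0)],
      [F(0), F(0), F(0), F(2)],
      [F(0), F(0), F(1), F(0)]]
My = [[F(0), F(0), F(3), F(0)],
      [F(0), F(0), F(0), F(3)],
      [F(1), F(0), F(0), F(0)],
      [F(0), F(1), F(0), F(0)]]

Msum = [[Mx[i][j] + My[i][j] for j in range(4)] for i in range(4)]
# Characteristic polynomial of multiplication by x + y: z^4 - 10z^2 + 1,
# which indeed has sqrt(2) + sqrt(3) as a root.
assert charpoly(Msum) == [1, 0, -10, 0, 1]
# Multiplication by xy gives (z^2 - 6)^2 = z^4 - 12z^2 + 36, with root sqrt(6).
assert charpoly(mat_mul(Mx, My)) == [1, 0, -12, 0, 36]
```

By Cayley-Hamilton each operator satisfies its own characteristic polynomial, and applying that polynomial identity to the vector $1 \in V$ shows $\alpha + \beta$ (resp. $\alpha\beta$) is a root.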

This argument proves the stronger result that if $F$ is the fraction field of some domain $D$ and $\alpha, \beta$ are integral over $D$ (hence $p, q$ are monic with coefficients in $D$) then so are $\alpha \beta, \alpha + \beta$.

Okay, I’m giving a second answer because this one is clearly distinct from the first one. Recall that finding a polynomial $p(x) \in F[x]$ having $\alpha + \beta$ or $\alpha\beta$ as a root is equivalent to exhibiting $\alpha + \beta$ or $\alpha\beta$ as an eigenvalue of a square matrix over $F$ (the eigenvalue itself lives in some algebraic extension of $F$): given $p(x)$, its companion matrix $C(p(x))$ has characteristic polynomial precisely $p(x)$, so the eigenvalues of the companion matrix are exactly the roots of $p(x)$.

If $\alpha$ is an eigenvalue of $A$ with eigenvector $x \in V$ and $\beta$ is an eigenvalue of $B$ with eigenvector $y \in W$, then using the tensor product of $V$ and $W$, namely $V \otimes W$, we can compute
$$(A \otimes I + I \otimes B)(x \otimes y) = (Ax \otimes y) + (x \otimes By) = (\alpha x \otimes y) + (x \otimes \beta y) = (\alpha + \beta)(x \otimes y),$$
so that $\alpha + \beta$ is an eigenvalue of $A \otimes I + I \otimes B$. Also,
$$(A \otimes B)(x \otimes y) = (Ax \otimes By) = (\alpha x \otimes \beta y) = \alpha \beta (x \otimes y),$$
hence $\alpha \beta$ is an eigenvalue of the matrix $A \otimes B$. If you want explicit expressions for the polynomials you are looking for, you can just compute the characteristic polynomials of $A \otimes I + I \otimes B$ and $A \otimes B$.
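A quick sketch of this with NumPy (assuming NumPy is available; the example polynomials $x^2 - 2$ and $y^2 - 3$ are mine, and `np.poly` computes a characteristic polynomial from the matrix's eigenvalues, so the coefficients come out as floats):

```python
import numpy as np

def companion(p):
    """Companion matrix of a monic polynomial given by its coefficients
    [1, c_{n-1}, ..., c_0] (highest degree first)."""
    n = len(p) - 1
    C = np.zeros((n, n))
    C[1:, :-1] = np.eye(n - 1)
    C[:, -1] = -np.array(p[1:][::-1])
    return C

A = companion([1, 0, -2])   # eigenvalues +-sqrt(2)
B = companion([1, 0, -3])   # eigenvalues +-sqrt(3)
I2 = np.eye(2)

# Kronecker sum: its eigenvalues are all sums alpha + beta.
S = np.kron(A, I2) + np.kron(I2, B)
# Kronecker product: its eigenvalues are all products alpha * beta.
P = np.kron(A, B)

# Characteristic polynomials (highest degree first):
assert np.allclose(np.poly(S), [1, 0, -10, 0, 1], atol=1e-6)   # z^4 - 10z^2 + 1
assert np.allclose(np.poly(P), [1, 0, -12, 0, 36], atol=1e-6)  # z^4 - 12z^2 + 36
```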

Hope that helps,

Technically, you could find the automorphisms of the Galois closure $L$ of $F(\alpha,\beta)$ over $F$ (assuming this extension is separable) and compute the polynomial
$$\prod_{\sigma \in \mathrm{Gal}(L/F)} \bigl(x - \sigma(\alpha+\beta)\bigr),$$
or the same with $\alpha \beta$; the product is fixed by every $\sigma$, so its coefficients lie in $F$. But I don’t believe this is what you are looking for. Since you can define Galois closures without knowing that $\alpha + \beta$ and $\alpha \beta$ are also algebraic, it is a legitimate way of proving it, but it is neither a practical nor a pedagogical one.
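To illustrate numerically (a sketch with my own example $\alpha = \sqrt{2}$, $\beta = \sqrt{3}$, where the conjugates of $\alpha + \beta$ are the four sign choices $\pm\sqrt{2} \pm \sqrt{3}$):

```python
import math
from itertools import product

# The Galois group permutes the four conjugates +-sqrt(2) +- sqrt(3).
roots = [s * math.sqrt(2) + t * math.sqrt(3)
         for s, t in product((1, -1), repeat=2)]

def mul_linear(poly, r):
    """Multiply a polynomial (ascending coefficients) by (x - r)."""
    return [s - r * c for s, c in zip([0.0] + poly, poly + [0.0])]

# Multiply out prod_sigma (x - sigma(alpha + beta)).
poly = [1.0]
for r in roots:
    poly = mul_linear(poly, r)

# The coefficients are (numerically) integers: x^4 - 10x^2 + 1.
coeffs = [round(c) for c in poly]
assert coeffs == [1, 0, -10, 0, 1]
```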

Hope that helps,