Suppose $E/F$ is a field extension and $\alpha, \beta \in E$ are algebraic over $F$. Then it is not too hard to see that when $\alpha$ is nonzero, $1/\alpha$ is also algebraic. If $a_0 + a_1\alpha + \cdots + a_n \alpha^n = 0$, then dividing by $\alpha^{n}$ gives $$a_0\frac{1}{\alpha^n} + a_1\frac{1}{\alpha^{n-1}} + \cdots + a_n = 0.$$
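For a concrete instance of this trick, take $\alpha = \sqrt{2}$, which satisfies $-2 + \alpha^2 = 0$; dividing by $\alpha^2$ gives $$-2\,\frac{1}{\alpha^2} + 1 = 0,$$ so $1/\alpha = 1/\sqrt{2}$ is a root of $2x^2 - 1$.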

Is there a similar elementary way to show that $\alpha + \beta$ and $\alpha \beta$ are also algebraic (i.e. finding an explicit formula for a polynomial that has $\alpha + \beta$ or $\alpha\beta$ as its root)?

The only proof I know for this fact is the one where you show that $F(\alpha, \beta) / F$ is a finite field extension and thus an algebraic extension.

The relevant construction is the resultant of two polynomials. If $x$ and $y$ are algebraic with $P(x) = Q(y) = 0$ and $\deg Q = n$, then $z = x + y$ is a root of the resultant of $P(x)$ and $Q(z-x)$, taken with respect to $x$ (so $z$ is treated as a constant), and $t = xy$ is a root of the resultant of $P(x)$ and $x^n Q(t/x)$, again taken with respect to $x$.
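For example, here is a small sketch of this computation using SymPy's `resultant` (the choice $\alpha = \sqrt 2$, $\beta = \sqrt 3$ is purely for illustration):

```python
from sympy import symbols, resultant, expand, factor

x, z, t = symbols('x z t')
P = x**2 - 2   # minimal polynomial of alpha = sqrt(2)
Q = x**2 - 3   # minimal polynomial of beta  = sqrt(3), so n = deg Q = 2

# alpha + beta: eliminate x between P(x) and Q(z - x)
print(expand(resultant(P, Q.subs(x, z - x), x)))                 # z**4 - 10*z**2 + 1

# alpha * beta: eliminate x between P(x) and x**n * Q(t/x)
print(factor(resultant(P, expand(x**2 * Q.subs(x, t / x)), x)))  # (t**2 - 6)**2
```

In both cases the output is a polynomial with coefficients in the base field having $\alpha + \beta$ (resp. $\alpha\beta$) as a root; it need not be the minimal polynomial (note the repeated factor in the second output).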

Let $\alpha$ have minimal polynomial $p(x)$ and let $\beta$ have minimal polynomial $q(x)$. Then $V = F[x, y]/(p(x), q(y))$ is a finite-dimensional vector space over $F$ of dimension $\deg p \deg q$ (it does not necessarily have the same dimension as $F(\alpha, \beta)$, for example when $\alpha = \beta$); moreover, it has an explicit basis

$$x^i y^j : 0 \le i < \deg p, 0 \le j < \deg q.$$

$xy$ and $x + y$ act on $V$ by multiplication, and one can write down explicit matrices for these actions in the basis above in terms of the coefficients of $p$ and $q$. By the Cayley-Hamilton theorem, each of these operators satisfies its own characteristic polynomial, which is monic with coefficients in $F$; applying the $F$-algebra map $V \to E$ sending $x \mapsto \alpha$, $y \mapsto \beta$ shows that $\alpha\beta$ and $\alpha + \beta$ satisfy those same polynomials.
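To make this concrete, here is a rough sketch for $p = x^2 - 2$, $q = y^2 - 3$: the multiplication matrices on the basis $1, x, y, xy$ are written out by hand below, and SymPy takes the characteristic polynomials.

```python
from sympy import Matrix, symbols

t = symbols('t')

# V = F[x, y]/(x**2 - 2, y**2 - 3), basis (1, x, y, xy).
# Multiplication by x: 1 -> x, x -> 2, y -> xy, xy -> 2y.
X = Matrix([[0, 2, 0, 0],
            [1, 0, 0, 0],
            [0, 0, 0, 2],
            [0, 0, 1, 0]])
# Multiplication by y: 1 -> y, x -> xy, y -> 3, xy -> 3x.
Y = Matrix([[0, 0, 3, 0],
            [0, 0, 0, 3],
            [1, 0, 0, 0],
            [0, 1, 0, 0]])

print((X + Y).charpoly(t).as_expr())  # t**4 - 10*t**2 + 1
print((X * Y).charpoly(t).as_expr())  # t**4 - 12*t**2 + 36
```

These agree with the resultant computation above (the second polynomial is $(t^2 - 6)^2$ expanded).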

This argument proves the stronger result that if $F$ is the fraction field of some domain $D$ and $\alpha, \beta$ are integral over $D$ (hence $p, q$ are monic with coefficients in $D$) then so are $\alpha \beta, \alpha + \beta$.

Okay, I’m giving a second answer because it is clearly distinct from the first one. Recall that finding a polynomial $p(x) \in F[x]$ that has $\alpha+\beta$ or $\alpha \beta$ as a root is equivalent to exhibiting $\alpha+\beta$ or $\alpha\beta$ as an eigenvalue of a square matrix over $F$ (the eigenvalue itself lives in some algebraic extension of $F$): to a monic polynomial $p(x)$ one can associate its companion matrix $C(p(x))$, whose characteristic polynomial is precisely $p(x)$, so the eigenvalues of the companion matrix are exactly the roots of $p(x)$.

If $\alpha$ is an eigenvalue of $A$ with eigenvector $x \in V$ and $\beta$ is an eigenvalue of $B$ with eigenvector $y \in W$, then using the tensor product of $V$ and $W$, namely $V \otimes W$, we can compute

$$(A \otimes I + I \otimes B)(x \otimes y) = (Ax \otimes y) + (x \otimes By) = (\alpha x \otimes y) + (x \otimes \beta y) = (\alpha + \beta) (x \otimes y)$$

so that $\alpha + \beta$ is an eigenvalue of $A \otimes I + I \otimes B$. Similarly,

$$(A \otimes B)(x \otimes y) = (Ax \otimes By) = (\alpha x \otimes \beta y) = \alpha \beta (x \otimes y)$$

hence $\alpha \beta$ is an eigenvalue of the matrix $A \otimes B$. If you want explicit expressions for the polynomials you are looking for, you can compute the characteristic polynomials of $A \otimes I + I \otimes B$ and $A \otimes B$, taking $A$ and $B$ to be the companion matrices of the minimal polynomials of $\alpha$ and $\beta$.
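If you want to see this carried out, here is a sketch in SymPy (the `companion` and `kron` helpers are written out by hand so the snippet is self-contained; $\alpha = \sqrt 2$, $\beta = \sqrt 3$ again serve only as an example):

```python
from sympy import Matrix, symbols, zeros, eye

t = symbols('t')

def companion(coeffs):
    # Companion matrix of the monic polynomial x^n + c_{n-1} x^{n-1} + ... + c_0,
    # given its lower-order coefficients [c_0, ..., c_{n-1}].
    n = len(coeffs)
    C = zeros(n, n)
    for i in range(1, n):
        C[i, i - 1] = 1
    for i in range(n):
        C[i, n - 1] = -coeffs[i]
    return C

def kron(A, B):
    # Kronecker (tensor) product of two matrices.
    p, q = B.shape
    return Matrix(A.rows * p, A.cols * q,
                  lambda r, c: A[r // p, c // q] * B[r % p, c % q])

A = companion([-2, 0])   # companion matrix of x^2 - 2 (alpha = sqrt(2))
B = companion([-3, 0])   # companion matrix of x^2 - 3 (beta  = sqrt(3))
I2 = eye(2)

print((kron(A, I2) + kron(I2, B)).charpoly(t).as_expr())  # t**4 - 10*t**2 + 1
print(kron(A, B).charpoly(t).as_expr())                   # t**4 - 12*t**2 + 36
```

These are the same polynomials as before; in general $A \otimes I + I \otimes B$ and $A \otimes B$ are $(\deg p \cdot \deg q) \times (\deg p \cdot \deg q)$ matrices, so the polynomials produced this way have degree $\deg p \cdot \deg q$.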

Hope that helps,

Technically, you could take the Galois closure $L$ of $F(\alpha,\beta)$ over $F$ (assuming this extension is separable), find its automorphisms over $F$, and compute the polynomial

$$\prod_{\sigma \in \mathrm{Gal}(L/F)}\bigl(x- \sigma(\alpha+\beta)\bigr)$$

or the same with $\alpha \beta$, but I don’t believe this is what you are looking for. Since you can define Galois closures without knowing that $\alpha + \beta$ and $\alpha \beta$ are algebraic, this is a legitimate way of proving it (the product above is fixed by every $\sigma \in \mathrm{Gal}(L/F)$, so its coefficients lie in $F$), but it is neither a practical nor a pedagogical one.
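For instance, with $F = \mathbb{Q}$, $\alpha = \sqrt 2$, $\beta = \sqrt 3$, the Galois closure is $L = \mathbb{Q}(\sqrt 2, \sqrt 3)$, the four automorphisms send $\sqrt 2 \mapsto \pm\sqrt 2$ and $\sqrt 3 \mapsto \pm\sqrt 3$ independently, and $$\prod_{\sigma \in \mathrm{Gal}(L/\mathbb{Q})}\bigl(x - \sigma(\sqrt 2 + \sqrt 3)\bigr) = \prod_{\pm,\pm}\bigl(x - (\pm\sqrt 2 \pm \sqrt 3)\bigr) = x^4 - 10x^2 + 1.$$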

Hope that helps,
