Show that $\det{(A + sxy^*)}$ is linear in $s$

Suppose that $A \in \mathbb{R}^{n\times n }$, and $x,y\in \mathbb{R}^n$.

  • Show that there exist numbers $a,b$ so that $\det{(A + sxy^*)}=a+bs$
  • Show that if $\det{(A)}\neq 0$ then $a = \det{(A)}$ and $b = \det{(A)} \cdot y^*A^{-1}x$.
  • Is it true that if $\det{(A)}=0$ then $a=0$?
  • My answer so far: The third question seems trivial and silly, and I don’t understand why it’s on there. I have solved the first one as follows: take $\beta$ to be any basis whose first element is $x$. Then with respect to basis $\beta$, $A+ sxy^*$ has the form

    $$\left( \begin{matrix} \alpha_{11} + sy_1 & \alpha_{12} + sy_2 & \dotsb & \alpha_{1n}+sy_n \\ \alpha_{21}&\dotsb&&\alpha_{2n} \\ \vdots & \ddots &&\vdots\\ \alpha_{n1} & \dotsb &&\alpha_{nn} \end{matrix} \right),$$

    from which it’s clear that the determinant is a polynomial in $s$ of degree $\leq 1$.

    I am less clear on the second question.

    For a limited case, note that if $A$ is diagonalizable and $x$ is an eigenvector of $A$ with eigenvalue $\lambda_x \neq 0$, then adding $sxy^*$ to $A$ keeps $x$ as an eigenvector while bumping $\lambda_x$ up to $\lambda_x + sy^*x$, since $(A+sxy^*)x = \lambda_x x + sx(y^*x)$. If the remaining eigenvalues are unchanged, then, because the determinant is the product of the eigenvalues, the product $\lambda_1 \lambda_2 \dotsb (\lambda_x + sy^*x)\dotsb \lambda_n $ is

    $$\det{(A)} + \det{(A)}\frac{sy^*x}{\lambda_x}=\det{(A)} + \det{(A)}\cdot s\, y^*\!\left(\frac{x}{\lambda_x}\right) = \det{(A)} + \det{(A)}\cdot s\, y^*(A^{-1}x).$$

    Note that $y^*A^{-1}x$ is $\alpha_1 y_1 + \dotsb + \alpha_n y_n$, for $\alpha_i$ such that $x = \alpha_1 A_1 + \dotsb + \alpha_n A_n$, where $A_i$ are the columns of $A$.

    But this is all I’ve got. Can someone give me a hint? I’m not looking for you to feed me the answer. Thanks as always
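Not part of the proof, but the target identity is easy to sanity-check numerically before trying to prove it; here is a small NumPy sketch (the random $A$, $x$, $y$, the seed, and the tolerances are my own choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))   # a generic real A, so det(A) != 0 almost surely
x = rng.standard_normal(n)
y = rng.standard_normal(n)

def f(s):
    """det(A + s * x y^T) as a function of the real scalar s."""
    return np.linalg.det(A + s * np.outer(x, y))

# Affine in s: the second difference of samples at s = 0, 1, 2 vanishes.
assert abs(f(0.0) - 2.0 * f(1.0) + f(2.0)) < 1e-8

# Part 2: a = det(A) and b = det(A) * y^T A^{-1} x.
a = np.linalg.det(A)
b = a * (y @ np.linalg.solve(A, x))
for s in (0.5, -2.0, 3.7):
    assert abs(f(s) - (a + b * s)) < 1e-8
```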

    Answers

    First note that it is not a linear function in general, but it is affine. In $\mathbb{R}$ (the case $n=1$), if we take $x=y=0$ and $A=1$, we have $\det(A+ s xy^*) = 1$ for all $s$, which is affine but not linear.

    If either $x$ or $y$ is zero, we have $\det(A+ s xy^*) = \det A$ for all $s$, which is a constant (hence affine). It would be linear iff $A$ is singular.

    Note that the determinant can be viewed as a function $d:\mathbb{C}^n \times \cdots \times \mathbb{C}^n \to \mathbb{C}$, where $\det B = d(Be_1,\ldots,Be_n)$. The function $d$ is multilinear.
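As a quick numerical illustration of that multilinearity in the first column (the random vectors, scalar, and tolerance are my own choices):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 3
cols = [rng.standard_normal(n) for _ in range(n)]
u, v = rng.standard_normal(n), rng.standard_normal(n)
c = 2.5

def d(*vs):
    """det viewed as a function of the n column vectors."""
    return np.linalg.det(np.column_stack(vs))

# Linearity in the first argument, the other columns held fixed.
lhs = d(c * u + v, cols[1], cols[2])
rhs = c * d(u, cols[1], cols[2]) + d(v, cols[1], cols[2])
assert abs(lhs - rhs) < 1e-10
```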

    We can suppose without loss of generality that $x,y$ are unit vectors.

    Choose any $u_2,\ldots,u_n$ such that $x,u_2,\ldots,u_n$ form an orthonormal basis, and let $U=\begin{bmatrix} x & u_2 & \cdots & u_n \end{bmatrix}$. Then $Ue_1 = x$. Similarly, find a $V$ such that $V e_1 = y$.
    Then $A+ s xy^* = A + sUe_1e_1^TV^* = U(U^*AV + se_1e_1^T)V^*$, so $\det(A+ s xy^*) = \det(U)\det(V^*)\det(U^*A V+ s e_1 e_1^T)$, where $\det(U)\det(V^*)$ is a unimodular constant independent of $s$. Note that $e_1 e_1^T$ is zero except for a one in the $(1,1)$ place.

    Letting $B = U^*A V+ s e_1 e_1^T$, we have $Be_1 = U^*AVe_1 + se_1$ and $Be_j = U^*AVe_j$ for $j\ge 2$. Since $d$ is multilinear,
    $\det(U^*A V+ s e_1 e_1^T) = d(U^*AVe_1,\ldots,U^*AVe_n) + s\, d(e_1,U^*AVe_2,\ldots,U^*AVe_n) = \det(U^*AV) + s\, d(e_1,U^*AVe_2,\ldots,U^*AVe_n)$. Since $U,V$ are unitary, $\det(U^*AV)=0$ iff $\det A = 0$. It follows that the function is affine, and linear iff $\det A = 0$.
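If it helps to see the reduction concretely, here is a numerical version. QR factorization is one way to complete a unit vector to an orthonormal basis; the helper name, sign fix, and tolerances are my own choices. Note that in general one picks up the constant factor $\det(U)\det(V^*)$, which over $\mathbb{R}$ is $\pm 1$:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
A = rng.standard_normal((n, n))
x = rng.standard_normal(n); x /= np.linalg.norm(x)   # unit vectors, as in the argument
y = rng.standard_normal(n); y /= np.linalg.norm(y)

def basis_with_first_column(v):
    """Orthonormal matrix Q with Q e_1 = v (v a unit vector), via QR."""
    M = np.column_stack([v, rng.standard_normal((n, n - 1))])
    Q, R = np.linalg.qr(M)
    return Q * np.sign(R[0, 0])  # fix the sign so the first column is v itself

U = basis_with_first_column(x)
V = basis_with_first_column(y)
assert np.allclose(U[:, 0], x) and np.allclose(V[:, 0], y)

e1 = np.zeros(n); e1[0] = 1.0
for s in (0.0, 1.0, -2.5):
    lhs = np.linalg.det(A + s * np.outer(x, y))
    # det(U) * det(V^T) = det(U) * det(V) = +-1 over the reals.
    rhs = np.linalg.det(U) * np.linalg.det(V) * np.linalg.det(U.T @ A @ V + s * np.outer(e1, e1))
    assert abs(lhs - rhs) < 1e-8
```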

    If $x=0$, we have nothing to prove. If $x\ne0$, then as you said, by a suitable change of basis, we may assume that $x=e_1=(1,0,\ldots,0)^\top$. Let $M_{ij}$ denote the $(i,j)$-th minor of $A$.

    1. By Laplace expansion along the first row,
      \begin{align*}
      \det{(A + se_1y^*)}
      &=\sum_j (-1)^{j+1}(a_{1j}+s\bar{y_j})M_{1j}\\
      &=\underbrace{\sum_j (-1)^{j+1}a_{1j}M_{1j}}_{a}
      +\color{red}{s}\ \underbrace{\sum_j (-1)^{j+1}\bar{y_j}M_{1j}}_{b}\\
      &=a+sb.
      \end{align*}
    2. Again, by Laplace expansion, we see that $a=\det(A)$ and that $b$ is the determinant of the matrix obtained by replacing the first row of $A$ by $y^\ast$. By Cramer’s rule, $\frac{b}{\det(A)}$ is the first entry of the row vector $y^*A^{-1}$, i.e. $\frac{b}{\det(A)} = y^*A^{-1}e_1 = y^*A^{-1}x$. So, …
    3. As you said, the answer is obvious: putting $s=0$ in part 1 shows that $a=\det(A)$ in every case, so $\det(A)=0$ forces $a=0$.
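The Laplace/Cramer computation in parts 1–2 can also be sanity-checked numerically for the reduced case $x=e_1$ (the random $A$, $y$, seed, and tolerances are my own choices):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4
A = rng.standard_normal((n, n))
y = rng.standard_normal(n)
e1 = np.zeros(n); e1[0] = 1.0

# b from the expansion: the determinant of A with its first row replaced by y^T.
B = A.copy()
B[0, :] = y
b = np.linalg.det(B)

# Cramer's-rule claim: b / det(A) is the first entry of y^T A^{-1}.
assert abs(b / np.linalg.det(A) - (y @ np.linalg.inv(A))[0]) < 1e-8

# Putting it together: det(A + s * e1 y^T) = det(A) + s * b.
for s in (0.0, 1.5, -3.0):
    assert abs(np.linalg.det(A + s * np.outer(e1, y)) - (np.linalg.det(A) + s * b)) < 1e-8
```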