Determinant of rank-one perturbations of (invertible) matrices

I read something that suggests that if $I$ is the $n$-by-$n$ identity matrix, $v$ is an $n$-dimensional real column vector with $\|v\| = 1$ (standard Euclidean norm), and $t > 0$, then $\det(I+tvv^T)=1+t$.

Can anyone prove this or provide a reference?

More generally, is there also an (easy) formula for $\det(A + wv^T)$ for $v,w \in \mathbb{K}^{d \times 1}$ and an (invertible) matrix $A \in \mathbb{K}^{d \times d}$?

Answers

Rank-one update; reference: Matrix Analysis and Applied Linear Algebra, Carl D. Meyer, page 475:

If $A_{n \times n} $ is nonsingular, and if $\mathbf{c}$ and $\mathbf{d} $ are $n \times 1$ columns, then
$$\det(\mathbf{I} + \mathbf{c}\mathbf{d}^T) = 1 + \mathbf{d}^T\mathbf{c} \tag{6.2.2}$$
$$\det(A + \mathbf{c}\mathbf{d}^T) = \det(A)\,(1 + \mathbf{d}^T A^{-1}\mathbf{c}) \tag{6.2.3}$$

So in your case $A=\mathbf{I}$, and the determinant is $\det(\mathbf{I})(1+ t\mathbf{v}^T\mathbf{v})=1+t$, since $\mathbf{v}^T\mathbf{v}=\|\mathbf{v}\|^2=1$.
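As a quick numerical sanity check (a sketch assuming NumPy; the matrix sizes and random seed are arbitrary choices), both (6.2.2) and (6.2.3) hold for random data:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n)) + n * np.eye(n)  # diagonally shifted, so nonsingular
c = rng.standard_normal((n, 1))
d = rng.standard_normal((n, 1))

# (6.2.2): det(I + c d^T) = 1 + d^T c
lhs1 = np.linalg.det(np.eye(n) + c @ d.T)
rhs1 = 1 + (d.T @ c).item()
assert np.isclose(lhs1, rhs1)

# (6.2.3): det(A + c d^T) = det(A) (1 + d^T A^{-1} c)
lhs2 = np.linalg.det(A + c @ d.T)
rhs2 = np.linalg.det(A) * (1 + (d.T @ np.linalg.solve(A, c)).item())
assert np.isclose(lhs2, rhs2)
```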

Further from the text:

Proof. The proof of (6.2.2) [the previous] follows by applying the product rules (p. 467) to
$$\pmatrix{\mathbf{I} & \mathbf{0} \\ \mathbf{d}^T & 1}\pmatrix{\mathbf{I} + \mathbf{c}\mathbf{d}^T& \mathbf{c} \\ \mathbf{0} & 1}\pmatrix{\mathbf{I} & \mathbf{0} \\ -\mathbf{d}^T & 1}=\pmatrix{\mathbf{I} & \mathbf{c} \\ \mathbf{0} & 1 + \mathbf{d}^T\mathbf{c}}$$

To prove (6.2.3), write $A + \mathbf{c}\mathbf{d}^T = A(\mathbf{I} + A^{-1}\mathbf{c}\mathbf{d}^T)$, and apply the product rule (6.1.15) along with (6.2.2).

Sylvester’s determinant theorem states that, more generally,
$$\det(I_k + AB) = \det(I_l + BA)$$
for any $k\times l$ matrix $A$ and $l\times k$ matrix $B$. You can apply this with $(k,l)=(n,1)$, $A=tv$ and $B=v^T$: then $\det(I_n + tvv^T) = 1 + tv^Tv = 1+t$. A straightforward proof uses only row and column operations.

In fact the proof is almost a one-liner, so here it is: in $(k+l)\times(k+l)$ block matrices one has
$$\det\begin{pmatrix}I_k+AB&A\\\mathbf{0}&I_l \end{pmatrix}
=\det\begin{pmatrix}I_k&A\\-B&I_l \end{pmatrix}
=\det\begin{pmatrix}I_k&A\\\mathbf{0}&I_l+BA \end{pmatrix},$$
where the first equality is a compound column operation (subtract $B$ times the second block-column from the first, multiplication being on the right for column operations), and the second is a compound row operation (add $B$ times the first block-row to the second, multiplication being on the left for row operations), and the desired identity follows from computation of block-triangular determinants.
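The identity $\det(I_k + AB) = \det(I_l + BA)$ is easy to spot-check numerically for a genuinely rectangular pair (a sketch assuming NumPy; the dimensions and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
k, l = 4, 2
A = rng.standard_normal((k, l))  # k x l
B = rng.standard_normal((l, k))  # l x k

# Sylvester: a k x k determinant equals an l x l determinant
left = np.linalg.det(np.eye(k) + A @ B)
right = np.linalg.det(np.eye(l) + B @ A)
assert np.isclose(left, right)
```

Note that the right-hand side only requires an $l\times l$ determinant, which is why this identity is useful for low-rank updates ($l=1$ gives the rank-one formula above).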

I solved it. The determinant of $I+tvv^T$ is the product of its eigenvalues. $v$ is an eigenvector with eigenvalue $1+t$, since $(I+tvv^T)v = v + tv(v^Tv) = (1+t)v$. $I+tvv^T$ is real and symmetric, so it has a basis of mutually orthogonal real eigenvectors, one of which is $v$. If $w$ is any of the others, then $v^Tw = 0$, so $(I+tvv^T)w = w + tv(v^Tw) = w$; hence all the other eigenvalues are $1$, and the determinant is $(1+t)\cdot 1^{n-1} = 1+t$.
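The eigenvalue argument can be verified directly (a sketch assuming NumPy; $n$, $t$, and the seed are arbitrary): the spectrum of $I + tvv^T$ should be $1+t$ together with $n-1$ copies of $1$.

```python
import numpy as np

rng = np.random.default_rng(2)
n, t = 6, 3.5
v = rng.standard_normal((n, 1))
v /= np.linalg.norm(v)  # normalize so ||v|| = 1

M = np.eye(n) + t * (v @ v.T)
eigvals = np.sort(np.linalg.eigvalsh(M))  # M is symmetric, so eigvalsh applies

# n-1 eigenvalues equal 1, the largest equals 1 + t
assert np.allclose(eigvals[:-1], 1.0)
assert np.isclose(eigvals[-1], 1.0 + t)
assert np.isclose(np.linalg.det(M), 1.0 + t)
```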

I feel like I should have known this already. Can anyone provide a reference for this and similar facts?

Here’s another proof (cf. Sherman–Morrison formula):

The non-zero eigenvalues of $AB$ and $BA$ are the same. This is straightforward to prove: if $ABx = \lambda x$ with $\lambda \neq 0$, then $Bx \neq 0$ and $BA(Bx) = \lambda(Bx)$, and symmetrically in the other direction.
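A small numerical illustration of this fact for a rectangular pair (a sketch assuming NumPy; dimensions and seed are arbitrary): $AB$ is $m\times m$ and $BA$ is $n\times n$, and for generic random data $BA$ has exactly $n-m$ extra zero eigenvalues.

```python
import numpy as np

rng = np.random.default_rng(3)
m, n = 3, 5
A = rng.standard_normal((m, n))  # m x n
B = rng.standard_normal((n, m))  # n x m

ev_AB = np.sort_complex(np.linalg.eigvals(A @ B))          # m eigenvalues
ev_BA = np.linalg.eigvals(B @ A)                           # n eigenvalues
nonzero_BA = np.sort_complex(ev_BA[np.abs(ev_BA) > 1e-9])  # drop the n-m zeros

assert nonzero_BA.shape == ev_AB.shape
assert np.allclose(ev_AB, nonzero_BA)
```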

Hence the non-zero eigenvalues of $ab^T$ and $b^Ta$ are the same; that is, $ab^T$ has at most one non-zero eigenvalue, namely $b^Ta$.

Hence the eigenvalues of $I+ ab^T$ are $1+b^Ta, 1,…,1$, and since the determinant is the product of eigenvalues, we have $\det(I+ab^T) = 1+b^Ta$.

In this particular example, $a=tv$, $b=v$, and $\|v\| = 1$, hence $b^Ta = t$, and so $\det(I+t v v^T) = 1+t$.