When is the matrix $\text{diag}(\mathbf{x}) + \mathbf{1}$ invertible?

Given a vector $\mathbf{x} \in \mathbb{R}^N$, let’s define:

$$\text{diag}(\mathbf{x}) = \begin{pmatrix}
x_1 & 0 & \ldots & 0 \\
0 & x_2 & \ldots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \ldots & x_N
\end{pmatrix}.$$

Moreover, let

$$\mathbf{1}= \begin{pmatrix}
1 & 1 & \ldots & 1 \\
1 & 1 & \ldots & 1 \\
\vdots & \vdots & \ddots & \vdots \\
1 & 1 & \ldots & 1
\end{pmatrix}.$$

Here is my question:

When is the matrix $\mathbf{M} = \text{diag}(\mathbf{x}) + \mathbf{1}$ invertible?

I was able to find some results when $x_1 = x_2 = \ldots = x_N = x$. Indeed, the matrix $\mathbf{M}$ is singular when:

  1. $x=0$. This is trivial since $\mathbf{M} = \mathbf{1}$…
  2. $x=-N$. In this case, if you sum up all the rows (or columns) of $\mathbf{M}$, you get the zero vector (see the quick check after this list).
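
Here is a quick numerical check of these two cases (a minimal numpy sketch; $N$ and the test values of $x$ are arbitrary choices of mine): since the all-ones matrix $\mathbf{1}$ has eigenvalues $N$ (once) and $0$ ($N-1$ times), the determinant of $x\mathbf{I} + \mathbf{1}$ is $x^{N-1}(x+N)$, which vanishes exactly at $x=0$ and $x=-N$.

```python
import numpy as np

# Constant case x_1 = ... = x_N = x (N and the test values of x are
# arbitrary choices): det(x*I + 1) = x^(N-1) * (x + N), which is zero
# exactly at x = 0 and x = -N.
N = 5
ones = np.ones((N, N))
for x in [-float(N), -1.0, 0.0, 2.5]:
    M = x * np.eye(N) + ones
    print(f"x = {x:5.1f}: det(M) = {np.linalg.det(M):10.4f}, "
          f"x^(N-1)*(x+N) = {x**(N - 1) * (x + N):10.4f}")
```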

What can I say in the general case, when $\mathbf{x}$ is a generic vector?


We can easily compute the determinant of the sum $\operatorname{diag}(\mathbf{x}) + \mathbf{1}$ and check invertibility that way.

If all the diagonal entries $x_i$ are nonzero, we can apply the matrix determinant lemma for a rank one update to an invertible matrix:

$$ \det(A+uv^T) = (1 + v^T A^{-1} u) \det(A) $$

When $A$ is the matrix $\operatorname{diag}(\mathbf{x})$ and $u = v = (1, \ldots, 1)^T$, this gives $\det\bigl(\operatorname{diag}(\mathbf{x}) + \mathbf{1}\bigr) = \left(1 + \sum_i x_i^{-1}\right) \prod_i x_i$, so the matrix sum is invertible unless the sum of the reciprocals $x_i^{-1}$ equals $-1$.

If one of the diagonal entries is zero, say $x_1$ without loss of generality, then elementary row operations (subtract the first row from each of the others) quickly show that the determinant of $\operatorname{diag}(\mathbf{x}) + \mathbf{1}$ is $\prod_{k=2}^n x_k$; in particular, if a second diagonal entry is also zero, this product vanishes and the matrix is singular.
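
Both formulas are easy to confirm numerically; here is a minimal numpy sketch (the random entries and the seed are arbitrary choices, not part of the answer):

```python
import numpy as np

# Check the two determinant formulas above on arbitrarily chosen data.
rng = np.random.default_rng(0)
n = 6
x = rng.uniform(0.5, 2.0, size=n)        # all entries nonzero

# All x_i nonzero: det(diag(x) + 1) = (1 + sum(1/x_i)) * prod(x_i).
M = np.diag(x) + np.ones((n, n))
print(np.linalg.det(M), (1 + np.sum(1.0 / x)) * np.prod(x))

# One entry zero (say x_1 = 0): det(diag(x) + 1) = prod(x_2, ..., x_n).
x[0] = 0.0
M = np.diag(x) + np.ones((n, n))
print(np.linalg.det(M), np.prod(x[1:]))
```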

I think your matrix is nonsingular iff (assuming all $x_i \ne 0$)
$$1 + \frac{1}{x_1} + \frac{1}{x_2} + \ldots + \frac{1}{x_n} \ne 0.$$
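
Combining this with the zero-entry case from the previous answer, the criterion can be packaged as a small predicate (a sketch; the function name, tolerance, and the comparison against numpy's rank are my own choices):

```python
import numpy as np

def is_invertible(x, tol=1e-12):
    """Criterion for diag(x) + 1 (all-ones matrix) to be invertible.

    Sketch based on the answers above: with two or more zero entries the
    matrix is singular; with exactly one zero entry it is invertible; with
    no zero entries it is invertible iff 1 + sum(1/x_i) != 0.
    """
    x = np.asarray(x, dtype=float)
    zeros = np.isclose(x, 0.0, atol=tol)
    if zeros.sum() >= 2:
        return False
    if zeros.sum() == 1:
        return True
    return not np.isclose(1 + np.sum(1.0 / x), 0.0, atol=tol)

# Compare with numpy on a few examples (vectors chosen arbitrarily).
for x in [[1.0, 2.0, 3.0], [-3.0, -3.0, -3.0], [0.0, 2.0, 3.0], [0.0, 0.0, 1.0]]:
    M = np.diag(x) + np.ones((len(x), len(x)))
    print(x, is_invertible(x), np.linalg.matrix_rank(M) == len(x))
```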

I will look at the case $n = 4$ and consider the matrix $A = D + jj^\top$, where $D$ is the diagonal matrix with entries $d_1, d_2, d_3, d_4$ and $j = (1,1,1,1)^\top$.

Suppose $\lambda, x$ is an eigenvalue-eigenvector pair. Then we have
$$d_1x_1 + x_1 + x_2 + x_3 + x_4 = \lambda x_1, \quad \ldots, \quad x_1 + x_2 + x_3 + x_4 + d_4x_4 = \lambda x_4.$$ Solving for $x_1, \ldots, x_4$ (for $\lambda$ not equal to any $d_i$), we find that
$$x_1 = \frac{1}{\lambda - d_1}(x_1 + x_2 + x_3 + x_4), \quad \ldots, \quad
x_4 = \frac{1}{\lambda - d_4}(x_1 + x_2 + x_3 + x_4). \tag 1$$ Adding the four equations in (1) and dividing by $x_1 + x_2 + x_3 + x_4$, you find that the characteristic equation of $A$ is
$$1 = \frac{1}{\lambda - d_1} + \frac{1}{\lambda - d_2} + \frac{1}{\lambda - d_3} + \frac{1}{\lambda - d_4}.$$

Therefore (taking $\lambda = 0$) the matrix $D + jj^\top$ is singular iff $$\frac{1}{d_1} + \frac{1}{d_2} + \frac{1}{d_3} + \frac{1}{d_4} + 1 = 0,$$ i.e. it is nonsingular iff this sum is nonzero.
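
For concreteness, here is a small numerical check (a sketch; the $d_i$ below are made-up values chosen so that $1 + \sum_i 1/d_i = 0$, so $A$ should come out singular):

```python
import numpy as np

# n = 4 example: the eigenvalues of A = D + j j^T satisfy
# 1 = sum_i 1/(lambda - d_i), and lambda = 0 is an eigenvalue exactly
# when 1 + sum_i 1/d_i = 0.  The d_i are made-up values chosen so that
# this sum vanishes.
d = np.array([1.0, 2.0, 3.0, -6.0 / 17.0])   # 1 + 1/1 + 1/2 + 1/3 - 17/6 = 0
j = np.ones(4)
A = np.diag(d) + np.outer(j, j)

for lam in np.linalg.eigvalsh(A):
    print(f"lambda = {lam:8.4f}, sum 1/(lambda - d_i) = {np.sum(1.0 / (lam - d)):.4f}")

print("1 + sum(1/d_i) =", 1 + np.sum(1.0 / d))   # ~ 0
print("det(A)         =", np.linalg.det(A))      # ~ 0, so A is singular
```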