# How to find a perturbation of coefficients of a linear system to guarantee the error of solutions is small?

Assume $A$ is an $n \times n$ matrix with $\operatorname{rank}(A)<n$. For $b \in \mathbb{R}^n$, assume $AX=b$ has a solution $X=(x_1, \cdots, x_n)$; then clearly there exist infinitely many solutions. By the structure of the solution set, we may assume $\sum_{i=1}^n x_i=1$. Now my question is: for any $\epsilon>0$, does there exist an invertible $n \times n$ matrix $\tilde{A}$ and a vector $\tilde{X}=(\tilde{x}_1, \cdots, \tilde{x}_n) \in \mathbb{R}^n$ such that $\|A-\tilde{A}\|<\epsilon$, $\|X-\tilde{X}\|<\epsilon$ and $\tilde{A} \tilde{X}=b$?

My attempt is the following:

It’s easy to see that for all but finitely many $t>0$, the matrix $A+tI$ is invertible, so we can solve the linear system $(A+tI)Y=b$. The difference $W=Y-X$ satisfies $(A+tI)W=-tX$, hence $W=-t(A+tI)^{-1}X$. It would be nice if $\|W\|$ were small whenever $t$ is small; then we could choose $\tilde{A}=A+tI$ and $\tilde{X}=Y$, and the claim would be proved. However, $\|W\|$ is not necessarily small when $t$ is small (intuitively, $(A+tI)^{-1}$ can blow up like $1/t$ on the kernel of $A$, cancelling the factor $t$), so the perturbation $A+tI$ does not quite work. I cannot find a better perturbation of $A$. I got stuck here.
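To see concretely why $\|W\|$ can fail to shrink, here is a small numerical sketch (the matrix $A$, vector $b$, and solution $X$ below are my own illustrative choices, not part of the question). With $A=\begin{pmatrix}1&1\\0&0\end{pmatrix}$, $b=(1,0)^T$, and $X=(1/2,1/2)^T$ (which satisfies $x_1+x_2=1$ but has a component along $\ker A$), the error $W=-t(A+tI)^{-1}X$ tends to $(1/2,-1/2)^T$ rather than to $0$ as $t\to 0$:

```python
import numpy as np

# Illustrative singular system: rank(A) = 1 < 2, and the chosen
# solution X has a nonzero component along ker(A) = span{(1, -1)}.
A = np.array([[1.0, 1.0],
              [0.0, 0.0]])
b = np.array([1.0, 0.0])
X = np.array([0.5, 0.5])            # A @ X = b, and x1 + x2 = 1

for t in [1e-2, 1e-4, 1e-6]:
    At = A + t * np.eye(2)          # invertible for these t > 0
    Y = np.linalg.solve(At, b)      # solution of the perturbed system
    W = Y - X                       # equals -t * inv(A + t I) @ X
    print(f"t = {t:.0e},  ||W|| = {np.linalg.norm(W):.6f}")
```

Running this shows $\|W\|$ converging to $1/\sqrt{2}\approx 0.7071$ instead of $0$: the $1/t$ growth of $(A+tI)^{-1}$ on $\ker A$ exactly cancels the factor $t$, which is the obstruction described above.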

Any comments or ideas would be really appreciated.