Articles on inverse problems

Range conditions on a linear operator

While reading through some engineering literature, I came across some logic that I found a bit strange. Mathematically, the statement might look something like this: I have a linear operator $A:L^2(\Bbb{R}^3)\rightarrow L^2(\Bbb{R}^4)$, that is, a mapping which takes functions of three variables to functions of four variables. Then, “because the range function depends on 4 […]
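The dimension-count intuition can fail already in finite dimensions. A minimal numerical sketch (all sizes hypothetical): discretize such an operator as a tall matrix whose output space is larger than its input space; the range is then a proper subspace, so a generic target violates the "range conditions" regardless of how many variables the output depends on.

```python
import numpy as np

# Hypothetical discretization: an operator taking "3-variable" data
# (m samples) to "4-variable" data (n > m samples) becomes a tall
# n x m matrix. The operator here is random, purely for illustration.
rng = np.random.default_rng(0)
m, n = 50, 200
A = rng.standard_normal((n, m))

# The range of A is at most m-dimensional inside R^n, so dimension
# counting alone says nothing about surjectivity.
print(np.linalg.matrix_rank(A))  # at most m = 50

# A generic target g is not in the range: least squares leaves a
# nonzero residual, i.e. g fails the range conditions.
g = rng.standard_normal(n)
f, res, *_ = np.linalg.lstsq(A, g, rcond=None)
print(res)  # nonzero residual
```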

When is $R \, A^{-1} \, R^t$ invertible?

In the context of a Gaussian model, I came across the matrix product $R \, A^{-1} \, R^t$, where $R$ is an $m \times n$ rectangular matrix and, as the notation implies, $A$ is $n \times n$ and invertible. On which properties of $R$ does the existence of $(R \, A^{-1} \, R^t)^{-1}$ depend?
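If, as is common in Gaussian models, $A$ is additionally symmetric positive definite, then $R \, A^{-1} \, R^t$ is invertible exactly when $R$ has full row rank $m$ (which requires $m \le n$). A minimal numerical sketch of this, with all matrices synthetic:

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 3, 5

# In the Gaussian setting A is typically symmetric positive definite.
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)  # SPD by construction

R_full = rng.standard_normal((m, n))              # full row rank (a.s.)
R_deficient = np.vstack([R_full[:2], R_full[1]])  # row 3 duplicates row 2

# rank(R A^{-1} R^t) equals rank(R), since A^{-1} is SPD.
for R in (R_full, R_deficient):
    S = R @ np.linalg.inv(A) @ R.T
    print(np.linalg.matrix_rank(R), np.linalg.matrix_rank(S))
```

So $(R \, A^{-1} \, R^t)^{-1}$ exists precisely for the full-row-rank $R$; the rank-deficient $R$ produces a singular product.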

Why are additional constraint and penalty term equivalent in ridge regression?

Tikhonov regularization (or ridge regression) adds the constraint that $\|\beta\|^2$, the squared $L^2$-norm of the parameter vector, is not greater than a given value (say $c$). Equivalently, one may solve an unconstrained minimization of the least-squares objective with a penalty term $\alpha\|\beta\|^2$ added, where $\alpha$ is a constant (this is the Lagrangian form of the constrained problem). The above […]
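The equivalence can be checked numerically: for the penalized solution $\beta_\alpha$, the least-squares gradient satisfies the stationarity condition of the constrained problem with $c = \|\beta_\alpha\|^2$, namely $X^t(X\beta - y) = -\alpha\beta$. A sketch with synthetic data (all names hypothetical):

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.standard_normal((40, 6))
y = rng.standard_normal(40)
alpha = 0.7

# Penalized (unconstrained) ridge solution in closed form:
# beta = (X^t X + alpha I)^{-1} X^t y
beta = np.linalg.solve(X.T @ X + alpha * np.eye(6), X.T @ y)

# KKT stationarity for the constrained problem with c = ||beta||^2:
# the least-squares gradient at beta equals -alpha * beta, so beta
# also solves  min ||y - X b||^2  subject to  ||b||^2 <= c.
grad = X.T @ (X @ beta - y)
print(np.allclose(grad, -alpha * beta))  # True
```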

Low-rank Approximation with SVD on a Kernel Matrix

I have very little experience in linear algebra, so please bear with me. Here’s a little background on my issue. I’m working on a problem that involves a large kernel matrix $K$. This matrix, when multiplied with a $500 \times 1$ column vector $A$, results in a $500 \times 1$ column vector $B$, as shown below: […]
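A common approach here is a truncated SVD: keep only the top $r$ singular triples of $K$, which gives the best rank-$r$ approximation in the spectral and Frobenius norms, and multiply with the small factors instead. A sketch using a synthetic Gaussian kernel matrix as a stand-in for the actual $K$:

```python
import numpy as np

rng = np.random.default_rng(3)
# Stand-in for the 500 x 500 kernel matrix K: a Gaussian kernel
# on random 1-D points (the real K from the question is unknown).
x = rng.standard_normal(500)
K = np.exp(-(x[:, None] - x[None, :]) ** 2)
a = rng.standard_normal(500)  # the 500 x 1 vector A

# Rank-r truncated SVD: K ≈ U_r diag(s_r) V_r^t.
U, s, Vt = np.linalg.svd(K)
r = 20
K_r = U[:, :r] * s[:r] @ Vt[:r]

# Compare K @ A against the low-rank product.
b_exact = K @ a
b_approx = K_r @ a
rel_err = np.linalg.norm(b_exact - b_approx) / np.linalg.norm(b_exact)
print(rel_err)
```

In practice one stores the three small factors rather than forming `K_r`, so each matrix-vector product costs $O(nr)$ instead of $O(n^2)$; smooth kernels like the Gaussian have rapidly decaying singular values, so small $r$ already gives a tiny error.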