Is inverse matrix convex?

I wonder about a generalization of Jensen's inequality:
let $\mathbf{X,Y}$ be two positive definite matrices. Can we obtain the following Jensen-like inequality
$$(1-\lambda)\mathbf{X}^{-1}+\lambda\mathbf{Y}^{-1} \succeq((1-\lambda)\mathbf{X}+\lambda\mathbf{Y})^{-1}$$
for any $\lambda \in (0,1)$, where the notation $\mathbf{A}\succeq \mathbf{B}$ means that the matrix $\mathbf{A-B}$ is positive semi-definite?
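Before looking at proofs, the inequality is easy to probe numerically. The following sketch (my addition, not part of the original question; it assumes NumPy is available) tests it on random positive definite matrices by checking that the difference of the two sides has no negative eigenvalues:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_spd(n):
    # M M^T + I is symmetric positive definite
    M = rng.standard_normal((n, n))
    return M @ M.T + np.eye(n)

worst = np.inf
for _ in range(100):
    X, Y = random_spd(4), random_spd(4)
    lam = rng.uniform(0, 1)
    diff = ((1 - lam) * np.linalg.inv(X) + lam * np.linalg.inv(Y)
            - np.linalg.inv((1 - lam) * X + lam * Y))
    # the difference is symmetric; it should be positive semi-definite
    worst = min(worst, np.linalg.eigvalsh(diff).min())

print("smallest eigenvalue seen:", worst)  # should be >= 0 up to rounding
```

Of course this is only evidence, not a proof; the answers below supply the arguments.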

Solutions collected from the web for "Is inverse matrix convex?"

Edit: The proof is wrong as pointed out by @mewmew. I am trying to fix it.

Let $\lambda\in [0,1]$.
Let $$A=(1-\lambda){X}^{-1}+\lambda {Y}^{-1},\qquad
B=(1-\lambda){X}+\lambda Y.$$

Then, we note that $A$ and $B$ are positive definite since $X,\ Y$ are positive definite. Hence $X,Y,A,B$ are invertible.

Then \begin{align}
AB-I=&\left((1-\lambda){X}^{-1}+\lambda {Y}^{-1}\right)\left((1-\lambda){X}+\lambda Y\right)-I\\
=& \left(1-\lambda\right)^2I+\lambda(1-\lambda)\left(X^{-1}Y+Y^{-1}X\right)+\lambda^2I-I\\
=& 2\lambda(\lambda-1)I+\lambda(1-\lambda)\left(X^{-1}Y+Y^{-1}X\right)\\
=& \lambda(1-\lambda)\left(-X^{-1}X-Y^{-1}Y+X^{-1}Y+Y^{-1}X\right)\\
=& \lambda(1-\lambda)(Y^{-1}-X^{-1})(X-Y).
\end{align}
So, for any vector $u\ne 0$,
\begin{align}
u^T(B^TAB-B^T)u=& u^TB^T(AB-I)u=\lambda(1-\lambda)u^TB^T(Y^{-1}-X^{-1})(X-Y)u\\
=&\lambda(1-\lambda)u^T((1-\lambda)X^T+\lambda Y^T)(Y^{-1}-X^{-1})(X-Y)u.
\end{align}
Now, since $B$ is invertible, for any vector $v\ne 0$ there exists a vector $u\ne 0$ such that $Bu=v$.
Hence, for any vector $v\ne 0$,
\begin{align}
v^T(A-B^{-1})v=& (Bu)^T(A-B^{-1})Bu\\
=& u^T(B^TAB-B^T)u\ge 0.
\end{align}

Hence $$A\succeq B^{-1}\\ \Rightarrow (1-\lambda)X^{-1}+\lambda Y^{-1}\succeq \left((1-\lambda){X}+\lambda Y\right)^{-1}\quad \forall \lambda\in [0,1].\hspace{0.6cm}\Box$$
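As a sanity check on the algebra above (a numerical sketch I am adding, not part of the original answer; it assumes NumPy), the identity $AB-I=\lambda(1-\lambda)(Y^{-1}-X^{-1})(X-Y)$ does hold, but the factor $(Y^{-1}-X^{-1})(X-Y)$ is a product of two symmetric matrices and is generally not symmetric, which is where the final "$\ge 0$" step needs more care, consistent with the edit note:

```python
import numpy as np

rng = np.random.default_rng(1)

def random_spd(n):
    # M M^T + I is symmetric positive definite
    M = rng.standard_normal((n, n))
    return M @ M.T + np.eye(n)

X, Y = random_spd(3), random_spd(3)
lam = 0.3

A = (1 - lam) * np.linalg.inv(X) + lam * np.linalg.inv(Y)
B = (1 - lam) * X + lam * Y

# the algebraic identity AB - I = lam(1-lam)(Y^-1 - X^-1)(X - Y) checks out
M = (np.linalg.inv(Y) - np.linalg.inv(X)) @ (X - Y)
print(np.allclose(A @ B - np.eye(3), lam * (1 - lam) * M))  # True

# but M, a product of two symmetric matrices, is generally NOT symmetric,
# so nonnegativity of its quadratic form does not follow from this factorisation
print(np.allclose(M, M.T))  # typically False
```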

Assume $X, Y$ are positive definite and $\lambda \in [0,1]$.
We know $Z(\lambda) = (1-\lambda) X + \lambda Y$ is positive definite, and so is
$Z^{-1}(\lambda)$. Let us use $(\cdot)'$ to denote the derivative with respect to $\lambda$. We have:

$$Z Z^{-1} = I_n
\implies Z' Z^{-1} + Z (Z^{-1})' = 0_n
\implies (Z^{-1})' = -Z^{-1} Z' Z^{-1}$$
Differentiating once more and noting $Z'' = 0_n$, we get:
$$(Z^{-1})'' = -(Z^{-1})' Z' Z^{-1} - Z^{-1}Z' (Z^{-1})' = 2 Z^{-1}Z' Z^{-1} Z' Z^{-1}\tag{*1}$$

Pick any non-zero vector $u$ and consider the following pair of vector- and scalar-valued functions:

$$v(\lambda) = Z'(\lambda) Z^{-1}(\lambda) u\quad\quad\text{ and }\quad\quad
\varphi(\lambda) = u^T Z^{-1}(\lambda) u$$

$(*1)$ tells us

$$\varphi''(\lambda) = u^T (Z^{-1})''(\lambda) u = 2 v^T(\lambda) Z^{-1}(\lambda) v(\lambda) \ge 0\tag{*2}$$

because $Z^{-1}(\lambda)$ is positive definite (here we also use that $Z'$ and $Z^{-1}$ are symmetric, so $v^T = u^T Z^{-1} Z'$). From this we can conclude that $\varphi(\lambda)$ is a convex function of $\lambda$ over $[0,1]$. As a result, for any $\lambda \in (0,1)$, we have:

$$\begin{align}&(1-\lambda)\varphi(0) + \lambda\varphi(1) - \varphi(\lambda) \ge 0\\
\iff& u^T \left[ (1-\lambda) X^{-1} + \lambda Y^{-1} - ((1-\lambda) X + \lambda Y)^{-1}\right] u \ge 0\tag{*3}
\end{align}$$
Since $u$ is arbitrary, this implies the matrix within the square bracket in $(*3)$
is positive semi-definite and hence:

$$(1-\lambda) X^{-1} + \lambda Y^{-1} \succeq ((1-\lambda) X + \lambda Y)^{-1}$$

Please note that when $Z' = Y - X$ is invertible, $v(\lambda)$ is non-zero for non-zero $u$. The inequalities in $(*2)$ and $(*3)$ become strict and the matrix within
the square bracket in $(*3)$ is positive definite instead of positive semi-definite.
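The convexity of $\varphi$ established in $(*2)$ can also be seen empirically (a numerical sketch I am adding, assuming NumPy): sample $\varphi$ on a grid and check that its discrete second differences are nonnegative.

```python
import numpy as np

rng = np.random.default_rng(3)
M1 = rng.standard_normal((3, 3)); X = M1 @ M1.T + np.eye(3)
M2 = rng.standard_normal((3, 3)); Y = M2 @ M2.T + np.eye(3)
u = rng.standard_normal(3)

# phi(lambda) = u^T Z^{-1}(lambda) u, sampled on a grid over [0, 1]
phi = lambda t: u @ np.linalg.inv((1 - t) * X + t * Y) @ u
vals = np.array([phi(t) for t in np.linspace(0, 1, 101)])

# a convex function has nonnegative discrete second differences
second_diffs = np.diff(vals, 2)
print("min second difference:", second_diffs.min())  # >= 0 up to rounding
```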

Let $\mathbf{P}=\mathbf{X}^{-1/2}\mathbf{Y}\mathbf{X}^{-1/2}$. By left- and right-multiplying both sides by $\mathbf{X}^{1/2}$ (a congruence, which preserves the ordering $\succeq$), the inequality is equivalent to
$(1-\lambda)I+\lambda\mathbf{P}^{-1} \succeq((1-\lambda)I+\lambda\mathbf{P})^{-1}$. Since $\mathbf P$ is positive definite, it can be unitarily diagonalised, and hence we may assume that it is a diagonal matrix with positive diagonal entries $p_{ii}$. The inequality then reduces to the scalar case $(1-\lambda)+\lambda p_{ii}^{-1} \ge ((1-\lambda)+\lambda p_{ii})^{-1}$, which is true because the function $f(t)=\frac1t$ is convex for $t>0$.
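The two steps of this reduction, the congruence by $\mathbf{X}^{1/2}$ and the eigenvalue-wise scalar inequality, can be checked numerically (a sketch I am adding, assuming NumPy):

```python
import numpy as np

rng = np.random.default_rng(4)
M1 = rng.standard_normal((3, 3)); X = M1 @ M1.T + np.eye(3)
M2 = rng.standard_normal((3, 3)); Y = M2 @ M2.T + np.eye(3)
lam = 0.3

# symmetric square root of X from its eigendecomposition
w, V = np.linalg.eigh(X)
Xh = V @ np.diag(np.sqrt(w)) @ V.T          # X^{1/2}
Xmh = V @ np.diag(1 / np.sqrt(w)) @ V.T     # X^{-1/2}

P = Xmh @ Y @ Xmh

# congruence by X^{1/2} maps the original difference to the reduced one
orig = ((1 - lam) * np.linalg.inv(X) + lam * np.linalg.inv(Y)
        - np.linalg.inv((1 - lam) * X + lam * Y))
reduced = ((1 - lam) * np.eye(3) + lam * np.linalg.inv(P)
           - np.linalg.inv((1 - lam) * np.eye(3) + lam * P))
print(np.allclose(Xh @ orig @ Xh, reduced))  # True

# after diagonalising P, everything rests on the scalar convexity of 1/t
p = np.linalg.eigvalsh(P)
print(np.all((1 - lam) + lam / p >= 1 / ((1 - lam) + lam * p) - 1e-12))  # True
```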