Articles on positive semidefinite matrices

Constrained quadratic programming with positive semidefinite matrix

I am facing the following problem: $$\begin{aligned} & \underset{\textbf{z}}{\text{minimize}} & & \textbf{z}^T \textbf{Q} \textbf{z} + \textbf{b}^T \textbf{z}\\ & \text{subject to} && \textbf{A}\textbf{z} \leq \textbf{d} \end{aligned}$$ where $\textbf{Q}\geq 0$, that is, positive semidefinite. The dimension of the vector $\textbf{z}$ is $3$ and the constraint matrix $\textbf{A}$ has $8$ rows. The Lagrangian would look […]
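Not part of the original question, but a minimal numerical sketch of this kind of problem, using SciPy's SLSQP solver and made-up data for $\textbf{Q}$, $\textbf{b}$, $\textbf{A}$, $\textbf{d}$ (all values below are hypothetical placeholders):

```python
import numpy as np
from scipy.optimize import minimize

# Made-up problem data: Q is positive semidefinite by construction (M^T M),
# z has dimension 3, and A has 8 rows, matching the shapes in the question.
rng = np.random.default_rng(0)
M = rng.standard_normal((3, 3))
Q = M.T @ M
b = rng.standard_normal(3)
A = rng.standard_normal((8, 3))
d = np.ones(8)

def objective(z):
    return z @ Q @ z + b @ z

# SLSQP takes inequality constraints in the form g(z) >= 0,
# so A z <= d is rewritten as d - A z >= 0.
constraints = {"type": "ineq", "fun": lambda z: d - A @ z}
result = minimize(objective, x0=np.zeros(3), method="SLSQP",
                  constraints=constraints)
print(result.x, result.fun)
```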

Hessian matrix for convexity of multidimensional function

To prove that a one-dimensional differentiable function $f(x)$ is convex, it is quite easy to see why we would check whether its second derivative is $>0$ or $<0$. What is the intuition behind the claim that, if the Hessian $H$ of a multidimensional differentiable function $f(x_1,\dots,x_n)$ is positive semi-definite, it must be […]
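One standard way to make this intuition precise (a sketch added here, not part of the excerpt) is to restrict $f$ to a line: for a point $x$ and a direction $v$, set $g(t)=f(x+tv)$, so that $$ g''(t) = v^T H(x+tv)\, v. $$ If $H$ is positive semidefinite at every point, then $g''(t)\ge 0$ for every choice of $x$ and $v$, i.e. every one-dimensional slice of $f$ passes the familiar single-variable second-derivative test.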

Definiteness of a general partitioned matrix $\mathbf M=\left[\begin{matrix}\bf A & \bf b\\\bf b^\top & \bf d \\\end{matrix}\right]$

If $\mathbf M=\left[\begin{matrix}\bf A & \bf b\\\bf b^\top & \bf d \\\end{matrix}\right]$ such that $\bf A$ is positive definite, under what conditions is $\bf M$ positive definite, positive semidefinite and indefinite? It is readily seen that $\det(\mathbf M)=\alpha\det(\mathbf A)$, where $\alpha=\mathbf d-\bf b^\top A^{-1}b$. Now, $\alpha>0\Rightarrow \det(\mathbf M)>0$ $\quad(\det(\mathbf A)>0$ by hypothesis$)$. This is not […]
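For reference (a standard fact, not part of the excerpt): the scalar $\alpha=\mathbf d-\mathbf b^\top \mathbf A^{-1}\mathbf b$ is the Schur complement of $\bf A$ in $\bf M$, and when $\bf A$ is positive definite one has the congruence $$\mathbf M=\begin{bmatrix}\mathbf I & \mathbf 0\\ \mathbf b^\top\mathbf A^{-1} & 1\end{bmatrix}\begin{bmatrix}\mathbf A & \mathbf 0\\ \mathbf 0 & \alpha\end{bmatrix}\begin{bmatrix}\mathbf I & \mathbf A^{-1}\mathbf b\\ \mathbf 0 & 1\end{bmatrix},$$ which preserves definiteness, so $\mathbf M\succ 0$ iff $\alpha>0$, $\mathbf M\succeq 0$ iff $\alpha\ge 0$, and $\mathbf M$ is indefinite iff $\alpha<0$.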

How to prove that $A$ is positive semi-definite if all principal minors are non-negative?

Let $A\in\mathbb C^{n\times n}$ be a Hermitian matrix such that all its principal minors are non-negative (i.e. for $B=\left(a_{l_il_j}\right)_{1\le i,j\le k}$ with $1\le l_1<\dots<l_k\le n$ we have $\det(B)\ge 0$). Then how to show that $A$ is positive semi-definite? I thought maybe we could use induction, since the condition is also satisfied for every submatrix, but I couldn't find an easy […]
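Not part of the question, but a quick numerical comparison of the two conditions on a concrete Hermitian matrix, using NumPy (the matrix below is made up purely for illustration):

```python
import itertools
import numpy as np

# A small Hermitian matrix chosen only for illustration.
A = np.array([[2, 1 + 1j, 0],
              [1 - 1j, 3, 1],
              [0, 1, 1]], dtype=complex)

n = A.shape[0]
# Determinants of all principal submatrices A[S, S], for every nonempty S.
minors = [np.linalg.det(A[np.ix_(s, s)]).real
          for k in range(1, n + 1)
          for s in itertools.combinations(range(n), k)]

print("all principal minors >= 0:", all(m >= -1e-12 for m in minors))
print("eigenvalues:", np.linalg.eigvalsh(A))  # nonnegative iff A is PSD
```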

Prove that every positive semidefinite matrix has nonnegative eigenvalues

There is a theorem which states that every positive semidefinite matrix only has eigenvalues $\ge 0$. How can I prove this theorem?
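A sketch of the usual one-line argument (added here, not from the excerpt): if $A\succeq 0$ and $Av=\lambda v$ with $v\ne 0$, then $$0 \le v^* A v = v^*(\lambda v) = \lambda \|v\|^2,$$ and since $\|v\|^2>0$ this forces $\lambda\ge 0$.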

Is the product of symmetric positive semidefinite matrices positive definite?

I see on Wikipedia that the product of two commuting symmetric positive definite matrices is also positive definite. Does the same result hold for the product of two positive semidefinite matrices? My proof of the positive definite case falls apart for the semidefinite case because of the possibility of division by zero…
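A tiny numerical experiment one might run (the matrices below are made up, not from the question):

```python
import numpy as np

# Two diagonal, hence commuting, positive semidefinite matrices.
A = np.diag([1.0, 0.0])
B = np.diag([0.0, 1.0])

print(np.allclose(A @ B, B @ A))                      # they commute
print(np.linalg.eigvalsh(A), np.linalg.eigvalsh(B))   # each is PSD
print(np.linalg.eigvalsh(A @ B))                      # eigenvalues of the product
```

In this particular example the product is singular, so it can be at most positive semidefinite.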

$f$ is a convex function iff its Hessian matrix is nonnegative-definite.

Let $f: \mathbb{R}^2 \rightarrow \mathbb{R}$, $f \in C^2$. Show that $f$ is a convex function iff its Hessian matrix is nonnegative-definite. $f$ is convex if $f( \lambda x + (1-\lambda )y) \le \lambda f(x) + (1- \lambda)f(y)$ for any $x,y \in \mathbb{R}^2$ and $\lambda \in [0,1]$. The Hessian matrix is nonnegative-definite if $f''_{xx}\,x^2 + 2 f''_{xy}\,xy + f''_{yy}\,y^2 \ge 0$. I know […]
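In the $2\times 2$ case the nonnegative-definiteness condition can also be written out explicitly (a standard fact, added for reference): $$H=\begin{pmatrix} f''_{xx} & f''_{xy}\\ f''_{xy} & f''_{yy}\end{pmatrix}\succeq 0 \iff f''_{xx}\ge 0,\quad f''_{yy}\ge 0,\quad f''_{xx}f''_{yy}-(f''_{xy})^2\ge 0,$$ which is the form in which the quadratic inequality above is usually checked.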