What are the restrictions on the covariance matrix of a nonnegative multivariate distribution?

This question is a step toward answering a related question on stats.SE.

Given a distribution $F(X_1,\ldots,X_n)$ on the nonnegative orthant $\mathbb{R}_+^n$ (i.e., each marginal is supported on the nonnegative reals), where each marginal has mean 1 (i.e., $E(X_i)=1$ for all $i$), what are the restrictions on the covariance matrix (assuming that it exists), other than positive semi-definiteness?

The idea is to be able to recognize a covariance matrix as coming from a nonnegative multivariate distribution. For example, $\pmatrix{4&-3\\-3& 4}$ is a perfectly fine covariance matrix — it is symmetric and positive definite — but it cannot come from a nonnegative multivariate distribution with mean $\mathbf 1$, because $\text{Cov}(X_1,X_2)=E(X_1X_2)-1\ge-1$ since $E(X_1X_2)$ is nonnegative. I am certain that this is not the only such restriction.
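As a quick sanity check, the necessary conditions mentioned so far (symmetry, positive semi-definiteness, and every entry greater than $-1$) are easy to test numerically. A minimal sketch in Python/NumPy (the function name is my own):

```python
import numpy as np

def could_be_nonneg_mean1_cov(C, tol=1e-10):
    """Necessary conditions for C to be the covariance matrix of a
    nonnegative random vector with mean 1: symmetry, positive
    semi-definiteness, and every entry > -1 (since
    Cov(X_i, X_j) = E[X_i X_j] - 1 and E[X_i X_j] >= 0)."""
    C = np.asarray(C, dtype=float)
    return bool(np.allclose(C, C.T)
                and np.all(np.linalg.eigvalsh(C) >= -tol)
                and np.all(C > -1))

# The matrix from the text: positive definite, but Cov(X_1, X_2) = -3 < -1.
print(could_be_nonneg_mean1_cov([[4, -3], [-3, 4]]))    # False
print(could_be_nonneg_mean1_cov([[4, -0.5], [-0.5, 4]]))  # True
```

Passing this check does not certify that such a distribution exists; it only rules out matrices that violate the conditions stated above.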


I hate answering my own questions, but no one else is doing so.

It turns out that the only restrictions on the covariance matrix are that it is positive definite and that $\text{Cov}(X_i,X_j)>-1$ for all $i,j$. As demonstrated in the answer to the related question, given a covariance matrix satisfying these restrictions, a lognormal distribution with mean $\mathbf{1}$ can be constructed having the specified covariance.
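A sketch of that lognormal construction, assuming NumPy: if $Y\sim N(\mu,\Sigma)$ and $X=\exp(Y)$ componentwise, then $E[X_i]=\exp(\mu_i+\Sigma_{ii}/2)$, and when $E[X]=\mathbf 1$ one has $\text{Cov}(X_i,X_j)=\exp(\Sigma_{ij})-1$. So to hit a target covariance $C$, take $\Sigma_{ij}=\log(1+C_{ij})$ and $\mu_i=-\Sigma_{ii}/2$; the code checks numerically that the resulting $\Sigma$ is itself positive semi-definite, as the construction requires.

```python
import numpy as np

def lognormal_params(C):
    """Parameters (mu, Sigma) of a normal Y such that X = exp(Y) has
    E[X_i] = 1 and Cov(X) = C.  Requires every entry of C to exceed -1
    and the elementwise log(1 + C) to be positive semi-definite."""
    C = np.asarray(C, dtype=float)
    Sigma = np.log1p(C)             # Sigma_ij = log(1 + C_ij)
    mu = -0.5 * np.diag(Sigma)      # forces E[exp(Y_i)] = 1
    if np.any(np.linalg.eigvalsh(Sigma) < -1e-10):
        raise ValueError("log(1 + C) is not positive semi-definite")
    return mu, Sigma

# A target covariance with a negative entry in (-1, 0):
C = np.array([[1.0, -0.4],
              [-0.4, 1.0]])
mu, Sigma = lognormal_params(C)

# Monte Carlo check of the construction.
rng = np.random.default_rng(0)
X = np.exp(rng.multivariate_normal(mu, Sigma, size=200_000))
print(X.mean(axis=0))   # ≈ [1, 1]
print(np.cov(X.T))      # ≈ C
```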

Are you assuming that the covariance matrix exists? $E[X_i]=1$ is not enough to ensure this.

Otherwise there are cases where the covariance matrix is not well-defined. For example, let $X_i\sim G$, where $G$ is the CDF of $X/c$, $X$ has a Student's $t$ distribution with $1.5$ degrees of freedom truncated below at $0$, and $c=E[X]\approx 2.04$. Then $E[X_i]=1$, but the variances do not exist, and consequently neither does the covariance matrix.
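A sketch of this example. By symmetry, the $t$ distribution truncated below at $0$ is the distribution of $|T|$, and the constant $c$ comes from the closed form $E|T|=\frac{2\nu\,f_\nu(0)}{\nu-1}$ for $\nu>1$, where $f_\nu$ is the $t$ density:

```python
import math
import numpy as np

nu = 1.5  # degrees of freedom: mean exists (nu > 1), variance does not (nu <= 2)

# E|T| = 2 * f(0) * nu / (nu - 1), with f(0) the t density at zero.
f0 = math.gamma((nu + 1) / 2) / (math.sqrt(nu * math.pi) * math.gamma(nu / 2))
c = 2 * f0 * nu / (nu - 1)
print(c)  # ≈ 2.044

# X_i = |T| / c has mean 1 by construction, but infinite variance.
rng = np.random.default_rng(0)
x = np.abs(rng.standard_t(nu, size=1_000_000)) / c
print(x.mean())  # ≈ 1 (converges slowly: the tails are heavy)
print(x.var())   # unstable: the sample variance never settles as n grows
```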