
A colleague popped into my office this afternoon and asked me the following question. He told me there is a clever proof when $n=2$. I couldn’t do anything with it, so I thought I’d post it here and see what happens.

*Prove or find a counterexample*

For positive, i.i.d. random variables $Z_1,\dots, Z_n$ with finite mean, and positive constants $a_1,\dots, a_n$, we have

$$\mathbb{E}\left({\sum_{i=1}^n a_i^2 Z_i\over\sum_{i=1}^n a_i Z_i}\right)\leq {\sum_{i=1}^n a_i^2\over\sum_{i=1}^n a_i}.$$
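Before hunting for a proof, the claim can be sanity-checked numerically. Below is a minimal Monte Carlo sketch; the choice of Exp(1) variables and the particular weights are arbitrary assumptions for illustration:

```python
import random

def check_inequality(a, num_samples=100_000, seed=0):
    """Estimate E[sum a_i^2 Z_i / sum a_i Z_i] by Monte Carlo and return it
    together with the claimed bound sum a_i^2 / sum a_i."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(num_samples):
        # i.i.d. positive variables with finite mean; Exp(1) is an arbitrary choice
        z = [rng.expovariate(1.0) for _ in a]
        total += sum(ai * ai * zi for ai, zi in zip(a, z)) / \
                 sum(ai * zi for ai, zi in zip(a, z))
    return total / num_samples, sum(ai * ai for ai in a) / sum(a)

lhs, rhs = check_inequality([1.0, 2.0, 5.0])
assert lhs <= rhs  # the gap here comfortably exceeds the Monte Carlo error
```

This is only evidence, of course, not a proof.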


**Added:** This problem originates from the thesis of a student in Computer and Electrical Engineering at the University of Alberta. Here is the response from his supervisor: “Many thanks for this! It is a nice result in addition to being useful in a practical problem of antenna placement.”


Yes, the inequality always holds for i.i.d. random variables $Z_1,\ldots,Z_n$. In fact, as suggested by Yuval and joriki, it is enough to suppose that the joint distribution is invariant under permuting the $Z_i$. Rearranging the inequality slightly, we just need to show that the following is nonnegative (here, I am using $\bar a\equiv\sum_ia_i^2/\sum_ia_i$)

$$
\bar a-\mathbb{E}\left[\frac{\sum_ia_i^2Z_i}{\sum_ia_iZ_i}\right]=\sum_ia_i(\bar a-a_i)\mathbb{E}\left[\frac{Z_i}{\sum_ja_jZ_j}\right].
$$

I’ll write $c_i\equiv\mathbb{E}[Z_i/\sum_ja_jZ_j]$ for brevity. Then, since $\sum_ia_i(\bar a-a_i)=0$, for any constant $\bar c$ we like,

$$
\bar a-\mathbb{E}\left[\frac{\sum_ia_i^2Z_i}{\sum_ia_iZ_i}\right]=\sum_ia_i(\bar a-a_i)(c_i-\bar c).
$$

To show that this is nonnegative, it is enough to show that $c_i$ is a decreasing function of $a_i$ (that is, $c_i\le c_j$ whenever $a_i\ge a_j$). In that case, we can choose $\bar c$ so that $\bar c\ge c_i$ whenever $a_i\ge\bar a$ and $\bar c\le c_i$ whenever $a_i\le\bar a$. This makes each term in the final summation above nonnegative and completes the proof.

Choosing $i\not=j$ such that $a_i \ge a_j$,

$$
c_i-c_j=\mathbb{E}\left[\frac{Z_i-Z_j}{\sum_ka_kZ_k}\right].
$$

Let $\pi$ be the permutation of $\{1,\ldots,n\}$ which exchanges $i,j$ and leaves everything else fixed. Using invariance under permuting $Z_i,Z_j$,

$$
\begin{align}
2(c_i-c_j)&=\mathbb{E}\left[\frac{Z_i-Z_j}{\sum_ka_kZ_k}\right]-\mathbb{E}\left[\frac{Z_i-Z_j}{\sum_ka_kZ_{\pi(k)}}\right]\cr
&=\mathbb{E}\left[\frac{(a_j-a_i)(Z_i-Z_j)^2}{\sum_ka_kZ_k\sum_ka_kZ_{\pi(k)}}\right]\cr
&\le0.
\end{align}
$$

So $c_i$ is decreasing in $a_i$ as claimed.
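The monotonicity of $c_i$ in $a_i$ can also be observed numerically. A rough Monte Carlo sketch (the Exp(1) variables and the particular weights are illustrative assumptions):

```python
import random

def c(i, a, num_samples=200_000, seed=1):
    """Monte Carlo estimate of c_i = E[Z_i / sum_j a_j Z_j] for i.i.d. Exp(1) Z_j.
    Reusing the same seed for every i compares the c_i on identical samples."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(num_samples):
        z = [rng.expovariate(1.0) for _ in a]
        total += z[i] / sum(aj * zj for aj, zj in zip(a, z))
    return total / num_samples

a = [1.0, 2.0, 5.0]           # strictly increasing weights
cs = [c(i, a) for i in range(len(a))]
assert cs[0] > cs[1] > cs[2]  # c_i decreases as a_i increases
```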

**Note:** In the special case of $n=2$, we can always make the choice $\bar c=(c_1+c_2)/2$. Then, both terms of the summation on the right hand side of the second displayed equation above are the same, giving

$$
\bar a-\mathbb{E}\left[\frac{\sum_ia_i^2Z_i}{\sum_ia_iZ_i}\right]=\frac{a_1a_2(a_2-a_1)(c_1-c_2)}{a_1+a_2}.
$$

Plugging in my expression above for $2(c_1-c_2)$ gives the identity

$$
\bar a-\mathbb{E}\left[\frac{\sum_ia_i^2Z_i}{\sum_ia_iZ_i}\right]=\frac{a_1a_2(a_1-a_2)^2}{2(a_1+a_2)}\mathbb{E}\left[\frac{(Z_1-Z_2)^2}{(a_1Z_1+a_2Z_2)(a_1Z_2+a_2Z_1)}\right],
$$

which is manifestly nonnegative. I’m guessing this could be the “clever” proof mentioned in the question.
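Since taking the expectation over an exchangeable pair amounts to averaging over the two orderings of $(z_1,z_2)$, this identity can be checked pointwise. A quick sketch (the sample values are arbitrary):

```python
def check_identity(a1, a2, z1, z2, tol=1e-9):
    """Pointwise form of the n=2 identity: averaging the ratio over both
    orderings of (z1, z2) plays the role of the expectation."""
    abar = (a1**2 + a2**2) / (a1 + a2)
    f = lambda x, y: (a1**2 * x + a2**2 * y) / (a1 * x + a2 * y)
    lhs = abar - 0.5 * (f(z1, z2) + f(z2, z1))
    rhs = (a1 * a2 * (a1 - a2)**2 / (2 * (a1 + a2))) * \
          (z1 - z2)**2 / ((a1 * z1 + a2 * z2) * (a1 * z2 + a2 * z1))
    return abs(lhs - rhs) < tol

assert check_identity(1.0, 3.0, 0.5, 2.0)
assert check_identity(2.0, 7.0, 4.0, 0.1)
```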

**Note 2:** The proof above relies on $\mathbb{E}[Z_i/\sum_ja_jZ_j]$ being a decreasing function of $a_i$. More generally, for any decreasing $f\colon\mathbb{R}^+\to\mathbb{R}^+$, $\mathbb{E}[Z_if(\sum_ja_jZ_j)]$ is a decreasing function of $a_i$. Choosing positive $b_i$ and setting $\bar a=\sum_ia_ib_i/\sum_ib_i$, we have $\sum_ib_i(\bar a-a_i)=0$. Applying the argument above gives the inequality

$$
\mathbb{E}\left[f\left(\sum_ia_iZ_i\right)\sum_ia_ib_iZ_i\right]
\le
\mathbb{E}\left[f\left(\sum_ia_iZ_i\right)\sum_ib_iZ_i\right]\frac{\sum_ia_ib_i}{\sum_ib_i}.
$$

The inequality in the question is the special case with $b_i=a_i$ and $f(x)=1/x$.
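As a spot check: averaging over all permutations of a fixed vector $z$ realizes a particular exchangeable distribution, so the generalized inequality can be verified deterministically for small $n$. A sketch with $f(x)=e^{-x}$ (the weights and the test vector are arbitrary assumptions):

```python
import math
from itertools import permutations

def check_general(a, b, f, z):
    """Averaging over all permutations of a fixed z realizes an exchangeable
    distribution, so the generalized inequality must hold deterministically."""
    n_perms = math.factorial(len(a))
    lhs = rhs = 0.0
    for zp in permutations(z):
        s = sum(ai * zi for ai, zi in zip(a, zp))
        lhs += f(s) * sum(ai * bi * zi for ai, bi, zi in zip(a, b, zp))
        rhs += f(s) * sum(bi * zi for bi, zi in zip(b, zp))
    abar = sum(ai * bi for ai, bi in zip(a, b)) / sum(b)
    return lhs / n_perms, abar * rhs / n_perms

# f(x) = e^{-x} is decreasing on R^+
lhs, rhs = check_general([1.0, 2.0, 4.0], [1.0, 1.0, 2.0],
                         lambda x: math.exp(-x), [0.3, 1.7, 2.5])
assert lhs <= rhs
```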

Since the $Z_i$ are i.i.d., the expectation is the same if we rename the variables. Taking all permutations, your inequality is equivalent to

$$ \mathbb{E} \left[ \frac{1}{n!} \sum_{\pi \in S_n} \frac{\sum_{i=1}^n a_i^2 Z_{\pi(i)}}{\sum_{i=1}^n a_i Z_{\pi(i)}} \right] \leq \frac{\sum_{i=1}^n a_i^2}{\sum_{i=1}^n a_i}. $$

Going over all possible values of $Z_1,\ldots,Z_n$, this is the same as the following inequality for positive real numbers:

$$ \frac{1}{n!} \sum_{\pi \in S_n} \frac{\sum_{i=1}^n a_i^2 z_{\pi(i)}}{\sum_{i=1}^n a_i z_{\pi(i)}} \leq \frac{\sum_{i=1}^n a_i^2}{\sum_{i=1}^n a_i}. $$

Intuitively, the maximum is attained at $z_i = \text{const}$, which is why we get the right-hand side. This is indeed true for $n = 2$, which is easy to check directly.
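Since the symmetrized form is a purely deterministic inequality, it can at least be spot-checked for small $n$. A minimal sketch for $n=3$ (the weights and test vectors are arbitrary):

```python
from itertools import permutations
from math import factorial

def symmetrized_ratio(a, z):
    """Average of (sum a_i^2 z_{pi(i)}) / (sum a_i z_{pi(i)}) over all permutations pi."""
    total = sum(
        sum(ai * ai * zi for ai, zi in zip(a, zp)) /
        sum(ai * zi for ai, zi in zip(a, zp))
        for zp in permutations(z)
    )
    return total / factorial(len(a))

a = [1.0, 2.0, 5.0]
bound = sum(ai * ai for ai in a) / sum(a)
for z in ([0.1, 1.0, 3.0], [2.0, 2.0, 0.5], [1.0, 1.0, 1.0]):
    # equality holds when all z_i are equal, hence the small tolerance
    assert symmetrized_ratio(a, z) <= bound + 1e-12
```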

I’ve been toying with a concavity argument and Jensen’s inequality. For $n=2$ I think this works, as shown below. Unfortunately it falls flat for $n > 2$: the restriction of the function to the simplex $x_1 + \dotsb + x_n = \lambda$, $x_i \ge 0$, can be shown to be *not* concave. So, here goes the partial result.

Let

$$
f(x_1, x_2) = \frac{a_1^2x_1 + a_2^2x_2}{a_1x_1+a_2x_2}.
$$

Then $f$ restricted to the convex segment $I=\{(x_1,x_2) \mid x_1,x_2 \geq 0 \textrm{ and } x_1+x_2=1\}$ is a concave function. This can be checked in multiple ways, for example by explicitly computing the second derivative

$$
\frac{\partial^2}{\partial x^2} f(x, 1-x) = \frac{-2a_1a_2(a_1-a_2)^2}{(a_1x+a_2(1-x))^3}
$$

and noting that it is non-positive for $x \in [0,1]$. Assume first that the joint distribution of $(Z_1,Z_2)$ is supported on $I$; since the $Z_i$ are exchangeable, it is symmetric under $(x_1,x_2) \mapsto (x_2, x_1)$. In particular the expected values of the two coordinates are equal, and therefore $\mathbb{E}(x_1) = \mathbb{E}(x_2) = \tfrac{1}{2}$. Then by Jensen’s inequality for concave functions on $I$:

$$
\mathbb{E}(f(x_1,x_2)) \leq f(\mathbb{E}(x_1),\mathbb{E}(x_2)) = f(\tfrac{1}{2},\tfrac{1}{2}) = \frac{a_1^2+a_2^2}{a_1+a_2}.
$$

Now $f$ is homogeneous of degree zero (constant along rays through the origin), so the same reasoning applies to every segment $x_1+x_2 = \lambda$ with $\lambda > 0$. Even though the induced distribution depends on the parameter $\lambda$, the inequality above holds on each such segment, and hence also for the joint distribution of $(Z_1,Z_2)$.
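As a quick sanity check on the displayed second derivative, it can be compared against a central finite-difference approximation. A minimal sketch (the parameter values $a_1=1.5$, $a_2=4$, $x=0.3$ are arbitrary):

```python
def f(x1, x2, a1, a2):
    """The ratio function from the concavity argument."""
    return (a1**2 * x1 + a2**2 * x2) / (a1 * x1 + a2 * x2)

def second_derivative(x, a1, a2):
    """The closed-form second derivative of f(x, 1-x)."""
    return -2 * a1 * a2 * (a1 - a2)**2 / (a1 * x + a2 * (1 - x))**3

a1, a2, x, h = 1.5, 4.0, 0.3, 1e-4  # arbitrary test parameters
numeric = (f(x + h, 1 - x - h, a1, a2)
           - 2 * f(x, 1 - x, a1, a2)
           + f(x - h, 1 - x + h, a1, a2)) / h**2
exact = second_derivative(x, a1, a2)
assert abs(numeric - exact) < 1e-4
assert exact < 0  # concave on the interior of the segment
```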
