
Let $U_1, \ldots, U_n$ be a random sample of uniform random variables $U_i \sim \mathrm{Uniform}(0,1)$, and let $U_{(1)}, \ldots, U_{(n)}$ be the order statistics of the sample. The goal is to prove that:

$$

W = U_{(s)}-U_{(r)} \sim \textrm{Beta}(s-r, \, n - s + r + 1) \qquad 1 \leq r < s \leq n

$$
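Before any proof, the claim can be sanity-checked by simulation. The sketch below (with arbitrarily chosen $n=10$, $r=3$, $s=7$; not part of any proof) compares the empirical mean and variance of $W$ with those of $\mathrm{Beta}(s-r,\, n-s+r+1)$:

```python
import numpy as np

# Monte Carlo sanity check of the claim (not a proof).
# n, r, s and the sample size are arbitrary choices.
rng = np.random.default_rng(0)
n, r, s = 10, 3, 7
num_samples = 200_000

u = np.sort(rng.uniform(size=(num_samples, n)), axis=1)  # each row: order statistics
w = u[:, s - 1] - u[:, r - 1]                            # W = U_(s) - U_(r), 1-indexed

a, b = s - r, n - s + r + 1                              # conjectured Beta parameters
mean_theory = a / (a + b)
var_theory = a * b / ((a + b) ** 2 * (a + b + 1))

print(w.mean(), mean_theory)   # should agree to about 3 decimal places
print(w.var(), var_theory)
```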

**My “proof”**


The joint pdf of $U_{(r)}, U_{(s)}$ is:

$$

f_{U_{(r)}, U_{(s)}} (u, v) = \frac{n!}{(r-1)!(s-1-r)!(n-s)!} u^{r-1} (v-u)^{s-1-r} (1 -v)^{n-s} \cdot \textbf{1}_{\{u < v\}}

$$

Consider the transformation:

$$

\begin{Bmatrix}

W = U_{(s)} - U_{(r)} & U_{(r)} = Z \\

Z = U_{(r)} & U_{(s)} = W + Z \\

\end{Bmatrix}

$$

The absolute value of the Jacobian determinant is 1. Therefore, the joint pdf of the transformation is:

$$

f_{W,Z}(w,z) = \frac{n!}{(r-1)!(s-1-r)!(n-s)!} z^{r-1} w^{s-1-r} (1 - w - z)^{n-s} \cdot \textbf{1}_{\{w > 0,\, z > 0,\, w + z < 1\}}

$$

$$

f_W(w) = \int_{S} f_{W,Z}(w,z) \, \textrm{d}z

$$

For fixed $w$, we have $S = \{z \, : \, 0 \leq z \leq 1 – w\}$. Thus:

$$

f_W(w) = \frac{n!}{(r-1)!(s-1-r)!(n-s)!} w^{s-1-r} \int_{0}^{1-w} z^{r-1}(1-w-z)^{n-s} \, \textrm{d} z

$$

The problem is that I do not know how to evaluate the integral, but Wolfram Alpha does:

$$

\int_{0}^{1-w} z^{r-1}(1-w-z)^{n-s} \, \textrm{d} z= \frac{\Gamma(r)\Gamma(n-s+1)}{\Gamma(n+r-s+1)} (1- w)^{n+r-s}

$$
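For what it is worth, this identity can also be spot-checked symbolically, e.g. with SymPy, for concrete values of $n$, $r$, $s$ (here $n=10$, $r=3$, $s=7$, an arbitrary choice), since the integrand is then just a polynomial in $z$:

```python
import sympy as sp

# Spot-check of the integral identity for concrete n, r, s.
z, w = sp.symbols('z w', positive=True)
n, r, s = 10, 3, 7

lhs = sp.integrate(z**(r - 1) * (1 - w - z)**(n - s), (z, 0, 1 - w))
rhs = (sp.gamma(r) * sp.gamma(n - s + 1) / sp.gamma(n + r - s + 1)
       * (1 - w)**(n + r - s))

print(sp.simplify(lhs - rhs))  # 0
```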

From here it is easy to identify the distribution of $W$: rewriting the factorials as gamma functions shows that $f_W$ is exactly the $\mathrm{Beta}(s-r, \, n-s+r+1)$ density.

Initially I was going to ask how to evaluate the integral without the help of Wolfram Alpha, but then I realized that maybe there is a “better” proof that avoids it altogether. I tried to find one by using other transformations involving $W$ and an auxiliary variable, but got nowhere.

**Edit**

I have corrected some mistakes I think I made:

1) The joint pdf I wrote was incorrect.

2) I wrote that the pdf of $W$ could be found as a convolution, but $U_{(r)}$ and $U_{(s)}$ are clearly dependent. What I actually did was an implicit change of variables.


One does not need to evaluate the integral. The change of variables $z = (1-w)v$ shows that $f_W(w)$ is a constant factor $C$ times $w^{s-1-r}$ times $(1-w)^{n+r-s}$. This also yields the value of $C$, since $f_W$ must integrate to $1$.

(Or, after the change of variables, one can recognize the integral over $v$ from $0$ to $1$ as a Beta.)

Consider this alternative proof:

One can generate $n$ uniform random variables on $[0,1]$ in the following equivalent way:

1. Generate $n+1$ uniform random variables on a circle of unit circumference.

2. Choose one of the points as the reference point (the ‘$0$’ point) and unroll the circle into $[0,1]$ starting from it.

By rotational symmetry the spacings between consecutive points are exchangeable, so $U_{(s)} - U_{(r)}$ has the same distribution as $U_{(s-1)} - U_{(r-1)}$, and, iterating, as $U_{(s-r)} - U_{(0)} = U_{(s-r)} - 0 = U_{(s-r)}$, where $U_{(0)} := 0$ denotes the reference point.

We already know that $U_{(s-r)} \sim \text{Beta}(s-r, \, n-(s-r)+1) = \text{Beta}(s-r, \, n-s+r+1)$.
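The conclusion of this rotation argument, namely that $U_{(s)}-U_{(r)}$ and $U_{(s-r)}$ agree in distribution, can be checked numerically (again with arbitrary $n=10$, $r=3$, $s=7$) by comparing empirical quantiles:

```python
import numpy as np

# Check that U_(s) - U_(r) and U_(s-r) agree in distribution (n, r, s arbitrary).
rng = np.random.default_rng(1)
n, r, s = 10, 3, 7
m = 200_000

u = np.sort(rng.uniform(size=(m, n)), axis=1)
diff = u[:, s - 1] - u[:, r - 1]   # U_(s) - U_(r)
single = u[:, s - r - 1]           # U_(s-r)

q = np.linspace(0.1, 0.9, 9)
print(np.quantile(diff, q))
print(np.quantile(single, q))      # the two rows should nearly coincide
```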
