Difference of order statistics in a sample of uniform random variables

Let $U_{1}, \, … \, ,U_{n}$ be a random sample of uniform random variables $U_i \sim \mathrm{Uniform}(0,1)$. Let $U_{(1)}, \, … \, , U_{(n)}$ be the order statistics of the sample. The goal is to prove that:

$$W = U_{(s)} - U_{(r)} \sim \mathrm{Beta}(s-r, \, n - s + r + 1), \qquad 1 \leq r < s \leq n$$
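As a quick sanity check on the claim, here is a stdlib-only Monte Carlo sketch (the parameters $n=10$, $r=3$, $s=7$ are hypothetical choices for illustration). The mean of $\mathrm{Beta}(s-r, \, n-s+r+1)$ is $(s-r)/(n+1)$, so the empirical mean of $W$ should be close to it.

```python
import random

# Hypothetical illustrative parameters: n = 10, r = 3, s = 7.
# If W = U_(s) - U_(r) ~ Beta(s - r, n - s + r + 1), then E[W] = (s - r) / (n + 1).
random.seed(0)
n, r, s = 10, 3, 7
trials = 200_000

total = 0.0
for _ in range(trials):
    u = sorted(random.random() for _ in range(n))
    total += u[s - 1] - u[r - 1]   # order statistics are 1-indexed

empirical_mean = total / trials
theoretical_mean = (s - r) / (n + 1)
print(empirical_mean, theoretical_mean)  # both should be close to 4/11
```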

My “proof”

The joint pdf of $U_{(r)}, U_{(s)}$ is:

$$f_{U_{(r)}, U_{(s)}} (u, v) = \frac{n!}{(r-1)!\,(s-1-r)!\,(n-s)!} \, u^{r-1} (v-u)^{s-1-r} (1-v)^{n-s} \cdot \mathbf{1}_{\{u < v\}}$$

Consider the transformation:

$$\begin{cases} W = U_{(s)} - U_{(r)} \\ Z = U_{(r)} \end{cases} \qquad \Longleftrightarrow \qquad \begin{cases} U_{(r)} = Z \\ U_{(s)} = W + Z \end{cases}$$
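Spelling out the Jacobian of the inverse map $(u, v) = (z, \, w + z)$:

```latex
J = \frac{\partial(u, v)}{\partial(w, z)}
  = \begin{pmatrix}
      \partial u / \partial w & \partial u / \partial z \\
      \partial v / \partial w & \partial v / \partial z
    \end{pmatrix}
  = \begin{pmatrix} 0 & 1 \\ 1 & 1 \end{pmatrix},
\qquad |\det J| = |0 \cdot 1 - 1 \cdot 1| = 1
```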

The absolute value of the Jacobian determinant is 1. Therefore, the joint pdf of the transformation is:

$$f_{W,Z}(w,z) = \frac{n!}{(r-1)!\,(s-1-r)!\,(n-s)!} \, z^{r-1} w^{s-1-r} (1 - w - z)^{n-s}$$

The marginal pdf of $W$ is obtained by integrating out $z$:

$$f_W(w) = \int_{S} f_{W,Z}(w,z) \, \textrm{d}z$$

For fixed $w$, we have $S = \{z \, : \, 0 \leq z \leq 1 – w\}$. Thus:

$$f_W(w) = \frac{n!}{(r-1)!\,(s-1-r)!\,(n-s)!} \, w^{s-1-r} \int_{0}^{1-w} z^{r-1}(1-w-z)^{n-s} \, \textrm{d}z$$

The problem is that I do not know how to evaluate the integral, but Wolfram Alpha does:

$$\int_{0}^{1-w} z^{r-1}(1-w-z)^{n-s} \, \textrm{d}z = \frac{\Gamma(r)\,\Gamma(n-s+1)}{\Gamma(n+r-s+1)} \, (1-w)^{n+r-s}$$
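The Wolfram Alpha closed form can be checked numerically with a stdlib-only midpoint Riemann sum (the values $n=10$, $r=3$, $s=7$, $w=0.4$ are hypothetical choices for illustration):

```python
import math

# Hypothetical illustrative values.
n, r, s, w = 10, 3, 7, 0.4

# Midpoint Riemann sum of  ∫_0^{1-w} z^(r-1) (1-w-z)^(n-s) dz
steps = 100_000
h = (1 - w) / steps
numeric = sum(
    ((k + 0.5) * h) ** (r - 1) * (1 - w - (k + 0.5) * h) ** (n - s) * h
    for k in range(steps)
)

# Claimed closed form:  Γ(r) Γ(n-s+1) / Γ(n+r-s+1) · (1-w)^(n+r-s)
closed = (
    math.gamma(r) * math.gamma(n - s + 1) / math.gamma(n + r - s + 1)
    * (1 - w) ** (n + r - s)
)
print(numeric, closed)  # the two should agree to high precision
```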

Now it is really easy to figure out the distribution of $W$. Indeed, the only thing that has to be done is to rewrite the factorials using gamma functions.
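Carrying out that rewriting, using $n! = \Gamma(n+1)$, $(r-1)! = \Gamma(r)$, $(s-1-r)! = \Gamma(s-r)$ and $(n-s)! = \Gamma(n-s+1)$, the factors $\Gamma(r)$ and $\Gamma(n-s+1)$ cancel:

```latex
f_W(w)
  = \frac{n!}{(r-1)!\,(s-1-r)!\,(n-s)!} \, w^{s-1-r}
    \cdot \frac{\Gamma(r)\,\Gamma(n-s+1)}{\Gamma(n+r-s+1)} \, (1-w)^{n+r-s}
  = \frac{\Gamma(n+1)}{\Gamma(s-r)\,\Gamma(n-s+r+1)} \, w^{s-r-1} (1-w)^{n-s+r}
```

which is exactly the $\mathrm{Beta}(s-r, \, n-s+r+1)$ density.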

Initially I was going to ask how to evaluate the integral without the help of Wolfram Alpha, but then I realized that maybe there is a “better” proof that avoids it altogether. I tried to find an alternative proof using other transformations involving $W$, but it led nowhere.


I have corrected some mistakes I think I made:

1) The joint pdf I wrote was incorrect.

2) I wrote that the pdf of $W$ could be found as a convolution, but $U_{(r)}$ and $U_{(s)}$ are clearly dependent. What I actually did was an implicit change of variables.

Answers

One does not need to evaluate the integral. The change of variables $z = (1-w)v$ shows that $f_W(w)$ is a constant factor $C$ times $w^{s-1-r}$ times $(1-w)^{n+r-s}$. This also yields the value of the constant $C$, since $f_W$ must integrate to $1$.

(Or, after the change of variables, one can recognize the integral over $v$ from $0$ to $1$ as a Beta.)
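Spelled out, substituting $z = (1-w)v$ (so $\textrm{d}z = (1-w)\,\textrm{d}v$) into the integral over the integration variable $z$ gives:

```latex
\int_{0}^{1-w} z^{r-1}(1-w-z)^{n-s} \, \textrm{d}z
  = (1-w)^{n+r-s} \int_{0}^{1} v^{r-1}(1-v)^{n-s} \, \textrm{d}v
  = (1-w)^{n+r-s} \, B(r, \, n-s+1)
```

collecting the exponent as $(r-1) + (n-s) + 1 = n+r-s$; the remaining integral is the Beta function $B(r, \, n-s+1) = \Gamma(r)\Gamma(n-s+1)/\Gamma(n+r-s+1)$.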

Consider this alternative proof.
One can generate $n$ uniform random variables on $(0,1)$ in the following equivalent way:
1. Generate $n+1$ uniform random variables on a circle of unit circumference.
2. Choose one of the points at random as the reference point (the ‘$0$’ point); the remaining $n$ points, measured as arc lengths from the reference, are the $n$ uniforms.

By the rotational symmetry of this construction, $U_{(s)} - U_{(r)}$ has the same distribution as $U_{(s-1)} - U_{(r-1)}$, and hence as $U_{(s-r)} - U_{(0)} = U_{(s-r)} - 0 = U_{(s-r)}$.
We already know $U_{(s-r)} \sim \mathrm{Beta}(s-r, \, n-(s-r)+1) = \mathrm{Beta}(s-r, \, n-s+r+1)$.
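The rotation argument can also be checked empirically with a stdlib-only sketch (hypothetical parameters $n=10$, $r=3$, $s=7$): if $U_{(s)} - U_{(r)}$ and $U_{(s-r)}$ have the same distribution, their empirical means should agree.

```python
import random

# Hypothetical illustrative parameters.
random.seed(1)
n, r, s, trials = 10, 3, 7, 200_000

diff_mean = 0.0   # empirical mean of U_(s) - U_(r)
stat_mean = 0.0   # empirical mean of U_(s-r)
for _ in range(trials):
    u = sorted(random.random() for _ in range(n))
    diff_mean += (u[s - 1] - u[r - 1]) / trials
    stat_mean += u[s - r - 1] / trials

print(diff_mean, stat_mean)  # both should be close to (s - r) / (n + 1)
```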