# Finding $E(N)$ in this question

Suppose $X_1,X_2,\ldots$ is a sequence of independent $U(0,1)$ random variables, and let
$$N=\min\{n>0 : X_{(n:n)}-X_{(1:n)}>\alpha\},\qquad 0<\alpha<1,$$
where $X_{(1:n)}$ is the smallest and $X_{(n:n)}$ the largest order statistic of $X_1,\ldots,X_n$. How can we find $E(N)$?

#### Solution

Let $m_n=\min\{X_k\,;\,1\leqslant k\leqslant n\}=X_{(1:n)}$ and $M_n=\max\{X_k\,;\,1\leqslant k\leqslant n\}=X_{(n:n)}$. As explained in the edit below, $(m_n,M_n)$ has density $n(n-1)(y-x)^{n-2}\cdot[0\lt x\lt y\lt1]$, hence the range $M_n-m_n$ has density $n(n-1)z^{n-2}(1-z)\cdot[0\lt z\lt1]$.
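As a sanity check on this density, one can simulate the range of $n$ uniforms and compare the empirical probability $P(M_n-m_n<z_0)$ with the CDF $F(z)=nz^{n-1}-(n-1)z^n$ obtained by integrating it. A minimal sketch (the names `z0` and `trials` are mine, not from the answer):

```python
import random

# Monte Carlo check of the range distribution: for n = 5 uniforms,
# P(M_n - m_n < z0) should match F(z0) = n z0^(n-1) - (n-1) z0^n.
random.seed(0)
n, z0, trials = 5, 0.5, 200_000
hits = 0
for _ in range(trials):
    xs = [random.random() for _ in range(n)]
    if max(xs) - min(xs) < z0:
        hits += 1
empirical = hits / trials
exact = n * z0 ** (n - 1) - (n - 1) * z0 ** n  # = 0.1875 for n=5, z0=0.5
```

With 200,000 trials the empirical frequency agrees with the closed form to within a few parts in a thousand.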

For every $n\geqslant2$, $[N\gt n]=[M_n-m_n\lt\alpha]$ (the range $M_n-m_n$ is nondecreasing in $n$, so $N\gt n$ exactly when the range has not yet exceeded $\alpha$ by time $n$), hence
$$\mathrm P(N\gt n)=\int_0^\alpha n(n-1)z^{n-2}(1-z)\mathrm dz=\alpha^{n}+n(1-\alpha)\alpha^{n-1}.$$
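The integral above can be verified numerically; here is a small sketch comparing a midpoint-rule quadrature of $\int_0^\alpha n(n-1)z^{n-2}(1-z)\,\mathrm dz$ with the closed form, for illustrative values $n=6$, $\alpha=0.3$ (both helper names are mine):

```python
from math import isclose

def closed_form(n, a):
    # alpha^n + n (1 - alpha) alpha^(n-1)
    return a**n + n * (1 - a) * a**(n - 1)

def quadrature(n, a, steps=100_000):
    # midpoint rule for the integral of n(n-1) z^(n-2) (1-z) over [0, a]
    h = a / steps
    return sum(n * (n - 1) * ((k + 0.5) * h)**(n - 2) * (1 - (k + 0.5) * h)
               for k in range(steps)) * h
```

For these values both sides agree to many decimal places, as expected.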
The same formula holds for $n=0$ and $n=1$ hence
$$\mathrm E(N)=\sum_{n=0}^{+\infty}\mathrm P(N\gt n)=\sum_{n=0}^{+\infty}\alpha^n+(1-\alpha)\sum_{n=0}^{+\infty}n\alpha^{n-1}=\frac2{1-\alpha}.$$
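The conclusion $\mathrm E(N)=2/(1-\alpha)$ is easy to check by simulation: draw uniforms until the running range first exceeds $\alpha$ and average the stopping times. A minimal sketch (variable names are mine; for $\alpha=0.5$ the formula predicts $\mathrm E(N)=4$):

```python
import random

# Estimate E(N) for alpha = 0.5: draw uniforms, tracking the running
# min/max, until max - min exceeds alpha; N is the number of draws.
random.seed(1)
alpha, trials = 0.5, 100_000
total = 0
for _ in range(trials):
    x = random.random()
    lo, hi, n = x, x, 1
    while hi - lo <= alpha:
        x = random.random()
        lo, hi = min(lo, x), max(hi, x)
        n += 1
    total += n
estimate = total / trials  # should be close to 2 / (1 - alpha) = 4
```

With 100,000 trials the standard error is under 0.01, so the estimate lands well within 0.1 of 4.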
Edit: To compute the density of $(m_n,M_n)$, start from the fact that
$$\mathrm P(x\lt m_n,M_n\lt y)=\mathrm P(x\lt X_1\lt y)^n=(y-x)^n,$$
for every $0\lt x\lt y\lt 1$. Differentiating this identity twice, once with respect to $x$ and once with respect to $y$, yields the opposite of the density of $(m_n,M_n)$.
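A quick numerical illustration of this last step (a sketch, assuming plain central finite differences suffice): the mixed partial $\partial^2/\partial x\,\partial y$ of $(y-x)^n$ should equal minus the joint density $n(n-1)(y-x)^{n-2}$.

```python
def cdf_like(x, y, n):
    # P(x < m_n, M_n < y) = (y - x)^n for 0 < x < y < 1
    return (y - x) ** n

# central finite-difference approximation of d^2/dxdy at (x, y) = (0.2, 0.7)
n, x, y, h = 5, 0.2, 0.7, 1e-4
mixed = (cdf_like(x + h, y + h, n) - cdf_like(x + h, y - h, n)
         - cdf_like(x - h, y + h, n) + cdf_like(x - h, y - h, n)) / (4 * h * h)
density = n * (n - 1) * (y - x) ** (n - 2)  # 5 * 4 * 0.5^3 = 2.5
```

Here `mixed` comes out close to $-2.5$, i.e. the opposite of the density, matching the statement above.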