Mean and Variance of the Method of Moments Estimate and the Maximum Likelihood Estimate for the Uniform Distribution.

Let $X_1, X_2,\ldots, X_n$ be i.i.d. uniform on $[0, \theta ]$.

a. Find the method of moments estimate of $\theta$ and its mean and variance.

b. Find the MLE of $\theta$ and its mean and variance.

Thank you for answering; I really appreciate it.

My answers were:

a. $\hat{\theta} = 2 \bar{X}$

b. $\hat{\theta} = X_n$

I’m not only unsure about my solution; I also don’t know how to start solving for the mean and variance of the MME and the MLE.


OK, so I’ll drop a few hints.

First of all: check that your MLE of $\theta$ is indeed the maximizer of the likelihood function. Note that this maximum is not found by setting the derivative to zero!

Now, the mean and variance of $\hat\theta = 2\overline{X}$ can be deduced from those of $\overline{X}$. The distribution of $\overline{X}$ is difficult to write down, but you don’t need the whole pdf; you only need $E(\overline{X})$ and $\operatorname{Var}(\overline{X})$. If you have been given this problem, you probably already know the mean and variance of the sample mean, but just in case: $E(\overline{X}) = E(X)$ and $\operatorname{Var}(\overline{X}) = \operatorname{Var}(X)/n$.
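Spelling that out (a short worked check, using the standard moments $E(X) = \theta/2$ and $\operatorname{Var}(X) = \theta^2/12$ of a uniform variable on $[0,\theta]$):
$$
E(\hat\theta) = 2E(\overline{X}) = 2\cdot\frac{\theta}{2} = \theta,
\qquad
\operatorname{Var}(\hat\theta) = 4\operatorname{Var}(\overline{X}) = \frac{4}{n}\cdot\frac{\theta^2}{12} = \frac{\theta^2}{3n}.
$$
In particular, the method of moments estimate is unbiased.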

Concerning the MLE, you will probably have to work out first the pdf of $\hat\theta = \max\{X_1,\ldots,X_n\}$. Fix $x\in [0,\theta]$. From the definition of the maximum,
$$P[\hat\theta \le x] = P[X_1\le x,\; X_2\le x,\;\ldots,\; X_n\le x];$$
now use that the $X_i$ are independent copies of the uniform distribution on $[0,\theta]$, hence $P[\hat\theta \le x] = P[X\le x]^n$, where $X$ is uniform on $[0,\theta]$. Since $P[X\le x] = x/\theta$ (the cdf of a uniform variable on $[0,\theta]$), we deduce that the cdf of $\hat\theta$ is $x^n/\theta^n$ on $[0,\theta]$, and thus the pdf is $nx^{n-1}/\theta^n$, also on $[0,\theta]$.

Now use this pdf to compute $E(\hat\theta)$ and $\operatorname{Var}(\hat\theta)$ in the usual way. That is,
$$E(\hat\theta) = \int_0^{\theta} x\,\frac{nx^{n-1}}{\theta^n}\,dx
\qquad\text{and}\qquad
\operatorname{Var}(\hat\theta) = \int_0^{\theta} x^2\,\frac{nx^{n-1}}{\theta^n}\,dx - E(\hat\theta)^2.$$
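Carrying those integrals out, as a worked check of where they lead:
$$
E(\hat\theta) = \int_0^{\theta}\frac{n x^{n}}{\theta^n}\,dx = \frac{n}{n+1}\,\theta,
\qquad
E(\hat\theta^{\,2}) = \int_0^{\theta}\frac{n x^{n+1}}{\theta^n}\,dx = \frac{n}{n+2}\,\theta^2,
$$
so
$$
\operatorname{Var}(\hat\theta) = \frac{n}{n+2}\,\theta^2 - \left(\frac{n}{n+1}\right)^2\theta^2 = \frac{n\,\theta^2}{(n+1)^2(n+2)}.
$$
Note that, unlike the method of moments estimate, the MLE is biased: $E(\hat\theta) = \frac{n}{n+1}\theta < \theta$.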

Your MLE is wrong. You said $X_1,\ldots,X_n$ are i.i.d. That implies $X_1$ or $X_2$, etc., is just as likely to be the maximum observed value as is $X_n$ or any other. The MLE is actually $\max\{X_1,\ldots,X_n\}$.

If you use the conventional notation for the order statistics, with parentheses enclosing the subscripts, so that $X_{(1)}\le X_{(2)} \le \cdots\le X_{(n)}$, then the MLE is $X_{(n)}$.

The density of the uniform distribution on $[0,\theta]$ is $\dfrac 1 \theta$ for $0<x<\theta$, so the joint density is $\dfrac{1}{\theta^n}$ for $0< x_1,\ldots,x_n<\theta$. Look at this as a function of $\theta$: it’s $\dfrac{1}{\theta^n}$ whenever $\theta$ exceeds every $x_i$. Thus the likelihood function is
$$
L(\theta) = \frac{1}{\theta^n}\quad\text{for }\theta \ge \max\{x_1,\ldots,x_n\}.
$$
This is a decreasing function on the whole interval $[\max\{x_1,\ldots,x_n\},\infty)$. Thus it attains its maximum value at the left endpoint of the interval, which is $\max\{x_1,\ldots,x_n\}$.
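If you want to sanity-check all of this numerically, a quick Monte Carlo simulation works well. Below is a minimal sketch in Python with NumPy; the values $\theta = 5$ and $n = 10$ are arbitrary illustrative choices, not part of the original problem:

```python
import numpy as np

# Monte Carlo sanity check for the two estimators of theta.
# theta and n are arbitrary illustrative values, not from the problem.
theta, n, trials = 5.0, 10, 200_000
rng = np.random.default_rng(0)

# Each row is one i.i.d. sample of size n from Uniform[0, theta].
samples = rng.uniform(0.0, theta, size=(trials, n))

mme = 2 * samples.mean(axis=1)   # method of moments: 2 * sample mean
mle = samples.max(axis=1)        # MLE: sample maximum

# Compare empirical moments against the theoretical values derived above.
print("MME mean:", mme.mean(), " theory:", theta)
print("MME var: ", mme.var(),  " theory:", theta**2 / (3 * n))
print("MLE mean:", mle.mean(), " theory:", n * theta / (n + 1))
print("MLE var: ", mle.var(),  " theory:", n * theta**2 / ((n + 1)**2 * (n + 2)))
```

With enough trials, the empirical means and variances should land close to the theoretical values, including the downward bias of the MLE.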