Articles on statistical inference

Statistical Inference and Manifolds

I have just begun approaching the connection between statistical inference and differential geometry. If I understand it correctly, one of the most fundamental concepts concerns the connection between a statistical distribution $P(x; \xi)$ with parameter vector $\xi$ and a certain manifold $V$ for which $\xi$ can be […]
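For concreteness, here is the standard construction this is likely heading toward (the excerpt is cut off, so this is my sketch, not the original post's): the parameter space is made into a Riemannian manifold by equipping it with the Fisher information metric,
$$g_{ij}(\xi) = \mathbb{E}_\xi\!\left[\frac{\partial \log P(x;\xi)}{\partial \xi^i}\,\frac{\partial \log P(x;\xi)}{\partial \xi^j}\right],$$
so that each value of $\xi$ is a point of $V$ and $g_{ij}$ measures how distinguishable nearby distributions are.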

Big Balloon Game

The problem: In this game, you are given empty balloons one by one, and you inflate each balloon with air until you are satisfied. If it does not burst, you gain happiness points proportional to the volume of air in the balloon (say 1 point per ml). If it bursts, you […]
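Since the excerpt breaks off before the bursting rule is specified, here is a minimal Monte Carlo sketch under an assumed model: the burst capacity $T$ is hidden and exponentially distributed, and you commit to a target volume in advance. The distribution, the 1000 ml mean, and the fixed-target strategy are all my assumptions, not part of the stated problem.

```python
import random

# ASSUMED model -- the original problem statement is truncated:
# each balloon bursts at a hidden capacity T drawn from an
# Exponential distribution with mean MEAN_CAPACITY ml; you inflate
# to a chosen target v and score v points if v < T, else 0.
MEAN_CAPACITY = 1000.0

def expected_score(target_ml: float, trials: int = 100_000) -> float:
    """Monte Carlo estimate of the expected score for one balloon."""
    total = 0.0
    for _ in range(trials):
        burst_at = random.expovariate(1.0 / MEAN_CAPACITY)
        if target_ml < burst_at:
            total += target_ml
    return total / trials

if __name__ == "__main__":
    # Under this assumed model the true optimum is v = MEAN_CAPACITY.
    for v in (250.0, 500.0, 1000.0, 2000.0):
        print(f"target {v:6.0f} ml -> expected score {expected_score(v):7.1f}")
```

Under this assumed model the expected score is $v e^{-v/\mu}$, which is maximized at the mean capacity $v = \mu$.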

Show $\psi$ and $\Delta$ are identifiable

Let $X_1,\ldots,X_m$ be i.i.d. $F$ and $Y_1,\ldots,Y_n$ be i.i.d. $G$, where the model $\{(F,G)\}$ is described by $\psi(X_1) = Z_1$, $\psi(Y_1) = Z_1' + \Delta$, where $\psi$ is an unknown strictly increasing differentiable map from $\mathbb{R}$ to $\mathbb{R}$, $\psi' > 0$, $\psi(\pm\infty) = \pm\infty$, and $Z_1$ and $Z_1'$ are independent r.v.'s. (a) Suppose $Z_1$, $Z_1'$ have a $\mathcal{N}(0,1)$ […]
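For part (a), a sketch of the standard identifiability argument (mine, not the poster's): since $\psi(X_1) = Z_1 \sim \mathcal{N}(0,1)$ and $\psi$ is strictly increasing, $F(x) = P(\psi(X_1) \le \psi(x)) = \Phi(\psi(x))$, so $\psi = \Phi^{-1} \circ F$ is determined by $F$. Likewise $G(y) = \Phi(\psi(y) - \Delta)$ gives $\Delta = \psi(y) - \Phi^{-1}(G(y))$ for any $y$. Hence $(F,G)$ pins down both $\psi$ and $\Delta$.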

Likelihood Function

$n$ random variables, i.e. a random sample of size $n$, $X_1,X_2,\ldots,X_n$, assume a particular value $x_1,x_2,\ldots,x_n$. What does this mean? Does the set $x_1,x_2,\ldots,x_n$ constitute only a single value, or are $x_1,x_2,\ldots,x_n$ $n$ values, that is, $X_1$ assumes the value $x_1$, $X_2$ assumes the value $x_2$, and so on? Why Likelihood […]
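The likelihood definition settles the question in favor of the second reading: $x_1,\ldots,x_n$ are $n$ observed values, one per variable, and for a density $f(\cdot;\theta)$ the likelihood of the sample is
$$L(\theta; x_1,\ldots,x_n) = \prod_{i=1}^n f(x_i;\theta),$$
regarded as a function of $\theta$ with the $n$ observations held fixed.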

Likelihood Function for the Uniform Density $(\theta, \theta+1)$

Let the random variable $X$ have a uniform density given by $f(x;\theta) \sim R(\theta,\theta+1)$. What is the maximum likelihood estimate based on the samples $X_1,\ldots,X_n$? The question is much like Likelihood Function for the Uniform Density, but there seems to be no satisfying answer there. So far I get $X_{\max}-1\le\theta\le X_{\min}$, and my guess is $\hat\theta=\frac{X_{\max}+X_{\min}-1}{2}$. Am I right? I […]
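A sketch of where that interval comes from (a standard argument; the post is truncated before its own): each factor of the likelihood is 1 exactly when $\theta \le x_i \le \theta + 1$, so
$$L(\theta) = \prod_{i=1}^n \mathbf{1}\{\theta \le x_i \le \theta+1\} = \mathbf{1}\{x_{\max}-1 \le \theta \le x_{\min}\}.$$
Every $\theta$ in $[X_{\max}-1,\,X_{\min}]$ attains the maximum value 1, so the MLE is not unique; the midpoint $\hat\theta=\frac{X_{\max}+X_{\min}-1}{2}$ is one admissible choice among infinitely many.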

Probability vs Confidence

My notes on confidence give this question: An investigator is interested in the amount of time internet users spend watching TV per week. He assumes $\sigma = 3.5$ hours, samples $n=50$ users, and takes the sample mean to estimate the population mean $\mu$. Since $n=50$ is large, we know that $\frac{\bar{X}-\mu}{\sigma/\sqrt{n}}$ approximates the Standard […]
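The calculation the notes are presumably building to (the excerpt stops mid-sentence) is the large-sample confidence interval
$$\bar X \pm z_{\alpha/2}\,\frac{\sigma}{\sqrt n},$$
e.g. a 95% interval of $\bar X \pm 1.96 \cdot 3.5/\sqrt{50} \approx \bar X \pm 0.97$ hours. The usual resolution of "probability vs confidence" is that the probability statement attaches to the random interval before sampling, not to the fixed parameter $\mu$ afterwards.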

Maximum Likelihood Estimator of the parameters of a multinomial distribution

Suppose that 50 measuring scales made by a machine are selected at random from the production of the machine and their lengths and widths are measured. It was found that 45 had both measurements within the tolerance limits, 2 had satisfactory length but unsatisfactory width, 2 had satisfactory width but unsatisfactory length, 1 had both […]
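For reference, the standard result this exercise leads to (assuming the truncated fourth count is the unsatisfactory-in-both category, so the four counts sum to 50): maximizing the multinomial log-likelihood $\sum_i n_i \log p_i$ subject to $\sum_i p_i = 1$, e.g. with a Lagrange multiplier, gives
$$\hat p_i = \frac{n_i}{n},$$
which here yields $\hat p = (45/50,\ 2/50,\ 2/50,\ 1/50)$.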

Why does the sum of residuals equal 0 when we do a sample regression by OLS?

That's my question. I have been looking around online, and people post a formula but they don't explain it. Could anyone please give me a hand with that? Cheers
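A short derivation of the formula in question: with an intercept in the model, OLS minimizes $\sum_{i=1}^n (y_i - \hat\beta_0 - \hat\beta_1 x_i)^2$, and setting the derivative with respect to $\hat\beta_0$ to zero yields the first normal equation
$$-2\sum_{i=1}^n \big(y_i - \hat\beta_0 - \hat\beta_1 x_i\big) = -2\sum_{i=1}^n e_i = 0,$$
so $\sum_i e_i = 0$ whenever the regression includes an intercept; without an intercept this need not hold.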

Identifying joint distribution

Let $Y_1$ and $Y_2$ be independent random variables with $Y_1\sim N(1,3)$ and $Y_2 \sim N(2,5).$ If $W_1=Y_1+2Y_2$ and $W_2=4Y_1-Y_2$, what is the joint distribution of $W_1$ and $W_2$? Is my procedure correct? $E(W_1)=E(Y_1+2Y_2)=E(Y_1)+2E(Y_2)=1+2\cdot2=5$, $Var(W_1)=Var(Y_1+2Y_2)=Var(Y_1)+4 Var(Y_2)=3+4\cdot 5=23$, $f_{W_1}(w_1)=\frac{1}{\sqrt{2\pi}\cdot 23}\,e^{-\frac{1}{2}\left(\frac{w_1-5}{23}\right)^2}$. I can't conclude.
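To conclude, one can use the fact that linear combinations of independent normals are jointly normal (not shown in the truncated post): with $E(W_2)=4\cdot 1-2=2$, $Var(W_2)=16\cdot 3+5=53$, and $Cov(W_1,W_2)=4\,Var(Y_1)-2\,Var(Y_2)=12-10=2$,
$$\begin{pmatrix} W_1 \\ W_2 \end{pmatrix} \sim N\!\left(\begin{pmatrix} 5 \\ 2 \end{pmatrix},\ \begin{pmatrix} 23 & 2 \\ 2 & 53 \end{pmatrix}\right).$$
Note also that the marginal density above should use the standard deviation $\sqrt{23}$ where it currently uses the variance 23.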

How do I show that the sum of residuals of OLS is always zero using matrices

I am trying to show that $$\sum_{i=1}^ne_i = 0$$ using matrices (or vectors). I have two hints, so to speak: $$HX = X,$$ where $H$ is the hat matrix, and $$\sum_{i=1}^ne_i = e'\mathbf{1}.$$ My previous solution, in "In OLS is the vector of residuals always 0?", is wrong since I expanded $Y =X\beta$, leaving […]
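A sketch along the lines of the two hints: with $e = (I-H)Y$, $H$ symmetric, and $\mathbf{1}$ a column of $X$ (an intercept is included), the hint $HX = X$ implies $H\mathbf{1} = \mathbf{1}$, so
$$e'\mathbf{1} = Y'(I-H)'\mathbf{1} = Y'(I-H)\mathbf{1} = Y'(\mathbf{1} - H\mathbf{1}) = 0.$$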