Articles on estimation theory

Variance of sample mean (problems with proof)

Assume I have $\{x_1,\ldots, x_N\}$, an iid (independent, identically distributed) sample of size $N$ of observations of a random variable $\xi$ with unknown mean $m_1$, variance (second central moment) $m_{c_2}$, and second raw moment $m_2$. I try to use the sample mean $\overline{x}=\frac{1}{N}\sum_{i=1}^Nx_i$ as an estimator of the true mean, so I want to find its […]
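For reference, a sketch of the standard computation under the stated iid assumptions, using only the linearity of variance over independent terms:
$$
\operatorname{Var}(\overline{x})=\operatorname{Var}\!\left(\frac{1}{N}\sum_{i=1}^{N}x_i\right)=\frac{1}{N^2}\sum_{i=1}^{N}\operatorname{Var}(x_i)=\frac{m_{c_2}}{N},
$$
where independence removes all cross-covariance terms, and $m_{c_2}=m_2-m_1^2$.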

Estimate function f(x) in high-dimensional space

I’m working on the problem of estimating a function $y=f(x): \mathbb{R}^d \rightarrow \mathbb{R}$. Namely, I have an unknown function $f(x)$ (like a black box); what I can do is input $x^{(i)}$ to it and obtain $y^{(i)}$ ($i=1,2,\cdots, N$). I then get a dataset $(x^{(i)}, y^{(i)})$ and am able to fit a function to it. […]
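As a concrete illustration of this sample-then-fit workflow, here is a minimal sketch assuming scikit-learn is available; the black box `f` and all parameter values below are placeholders, not part of the original question:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def f(x):
    # Placeholder black box f: R^d -> R (the real f is unknown to the asker).
    return np.sin(x).sum(axis=1)

d, N = 10, 2000
rng = np.random.default_rng(0)

# Query the black box at N sampled inputs x^(i) to build the dataset.
X = rng.uniform(-1.0, 1.0, size=(N, d))
y = f(X)

# Fit a flexible regressor to the (x^(i), y^(i)) pairs.
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X, y)

# The fitted model now serves as a cheap surrogate for f.
X_new = rng.uniform(-1.0, 1.0, size=(5, d))
print(model.predict(X_new))
```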

Prove that $\int k(w)o(h^2w^2)dw=o(h^2)$ for $\int k(w)dw=1$

Suppose that $k$ is a nonnegative real-valued function satisfying $$ \int k(w)dw=1,\quad\int wk(w)dw=0,\quad\int w^2k(w)dw=\kappa_2<\infty.\tag{$\star$} $$ (The limits of the integrals are all from $-\infty$ to $+\infty$.) Can you please teach me a rigorous argument to justify that $$ \int k(w)[\underbrace{o(w^2h^2)}_{|wh|\to 0}]dw=o(h^2)\text{ as }|h|\to 0. $$ Context: I’m starting to study kernel density estimation. The bias for […]
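A sketch of the usual split argument, stated under the common extra assumption that the remainder also satisfies a global bound $|r(t)|\le Ct^2$ (as the Taylor remainder does when the relevant second derivative is bounded): write the integrand as $k(w)\,r(wh)$ with $r(t)=o(t^2)$ as $t\to0$. Given $\varepsilon>0$, pick $\delta>0$ with $|r(t)|\le\varepsilon t^2$ for $|t|\le\delta$; then
$$
\left|\int k(w)\,r(wh)\,dw\right|
\le \varepsilon h^2\!\int_{|w|\le\delta/|h|} w^2k(w)\,dw
 + C h^2\!\int_{|w|>\delta/|h|} w^2k(w)\,dw
\le \varepsilon h^2\kappa_2 + o(h^2),
$$
since $\kappa_2<\infty$ forces the tail integral to vanish as $|h|\to0$. As $\varepsilon$ was arbitrary, the left-hand side is $o(h^2)$.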

Minimum variance unbiased estimator for scale parameter of a certain gamma distribution

Let $X_1, X_2, \ldots, X_n$ be a random sample from a distribution with p.d.f. $$f(x;\theta)=\theta^2xe^{-x\theta}, \quad 0<x<\infty,\ \theta>0.$$ Obtain the minimum variance unbiased estimator of $\theta$ and examine whether the variance bound is attained. MY WORK: Using the MLE I have found the estimator $\hat\theta=\frac{2}{\bar{x}}$. Or, as $$X\sim \operatorname{Gamma}(2, \theta)$$ (rate $\theta$), $E(X)=\frac{2}{\theta}$ and $E\left(\frac{\bar{X}}{2}\right)=\frac{1}{\theta}$, so can I take $\frac{\bar{x}}{2}$ […]
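For comparison with the MLE above, a sketch of the standard complete-sufficient-statistic route: $T=\sum_{i=1}^n X_i$ is complete and sufficient, and $T\sim\operatorname{Gamma}(2n,\theta)$ (rate $\theta$), so $E(1/T)=\theta/(2n-1)$. Hence, for $n\ge2$,
$$
\hat\theta_{\mathrm{MVUE}}=\frac{2n-1}{\sum_{i=1}^n X_i},\qquad
\operatorname{Var}\!\left(\hat\theta_{\mathrm{MVUE}}\right)=\frac{\theta^2}{2n-2},
$$
while the Cramér–Rao bound is $\theta^2/(2n)$ (the per-observation Fisher information is $2/\theta^2$), so the bound is not attained.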

Maximum Likelihood Estimator of parameters of multinomial distribution

Suppose that 50 measuring scales made by a machine are selected at random from its production, and their lengths and widths are measured. It was found that 45 had both measurements within the tolerance limits, 2 had satisfactory length but unsatisfactory width, 2 had satisfactory width but unsatisfactory length, 1 had both […]
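A sketch of the general result behind this exercise: with counts $n_1,\dots,n_r$ out of $n$ trials, maximizing the multinomial log-likelihood $\sum_i n_i\ln p_i$ subject to $\sum_i p_i=1$ (e.g. via a Lagrange multiplier) gives
$$
\hat p_i=\frac{n_i}{n},
$$
so here, assuming the truncated fourth category is "both measurements unsatisfactory" (making the counts sum to 50), the estimates would be $45/50$, $2/50$, $2/50$, and $1/50$.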

Difference between logarithm of an expectation value and expectation value of a logarithm

Assume I have an always-positive random variable $X$, i.e. $X \in \mathbb{R}$, $X > 0$. I am interested in the difference between the following two expectation values: $E \left[ \ln X \right]$ and $\ln E \left[ X \right]$. Is one perhaps always a lower/upper bound of the other? Many thanks in advance…
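For orientation: since $\ln$ is strictly concave, Jensen's inequality settles the ordering whenever both quantities exist,
$$
E\left[\ln X\right]\le \ln E\left[X\right],
$$
with equality if and only if $X$ is almost surely constant.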

Intuitive explanation of a definition of the Fisher information

I’m studying statistics. Reading my textbook’s section on Fisher information, I couldn’t understand why the Fisher information is defined like this: $$I(\theta)=E_\theta\left[-\frac{\partial^2 }{\partial \theta^2}\ln P(\theta;X)\right].$$ Could anyone please give an intuitive explanation of the definition?
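One standard step that helps unpack the definition: under the usual regularity conditions, differentiating $\int P(\theta;x)\,dx=1$ twice in $\theta$ shows that the score has mean zero and that
$$
E_\theta\!\left[-\frac{\partial^2}{\partial\theta^2}\ln P(\theta;X)\right]
= E_\theta\!\left[\left(\frac{\partial}{\partial\theta}\ln P(\theta;X)\right)^{\!2}\right],
$$
so $I(\theta)$ is simultaneously the expected curvature of the log-likelihood at $\theta$ and the variance of the score: the sharper the expected curvature, the more strongly the data discriminate between nearby values of $\theta$.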

Maximum a Posteriori (MAP) Estimator of Exponential Random Variable with Uniform Prior

What would be the Maximum a Posteriori (MAP) estimator of $\lambda$ for IID $\{x_i\}_{i=1}^N$, where $x_i \sim \operatorname{Exp}(\lambda)$ and $\lambda \sim U[u_0, u_1]$? One could assume that $u_0 > 0$. The exponential distribution is given by: $$ f(x; […]
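A sketch of the usual reasoning: on $[u_0,u_1]$ the prior is flat, so the posterior is proportional to the likelihood $\lambda^N e^{-\lambda\sum_i x_i}$, which increases up to $\lambda_{\mathrm{ML}}=N/\sum_i x_i$ and decreases afterwards. The MAP estimate is therefore the likelihood maximizer clipped to the prior’s support:
$$
\hat\lambda_{\mathrm{MAP}}=\min\!\left\{\max\!\left\{\frac{N}{\sum_{i=1}^N x_i},\,u_0\right\},\,u_1\right\}.
$$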

How can I compare two Markov processes?

There is a discrete-time irreducible Markov process with $r$ possible states. $k$ observations were performed; at each observation the state of the process was recorded. Let $T_0 = \lbrace 0,1,\dots ,k-1\rbrace$ and $T_1 = \lbrace 1,2,\dots ,k-1\rbrace$; $n_i(t) = 1$ means that the process was in state $i$ at time $t$ ($t \in T_0$), $n_i(t) = 0$ otherwise, $v_{ij}(t) […]
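The truncated definition of $v_{ij}(t)$ presumably indicates one-step transitions from state $i$ to state $j$. Under that assumption, a minimal sketch of the empirical transition-matrix estimate that such comparisons are usually built on; the sequences and helper name below are illustrative only:

```python
import numpy as np

def transition_matrix(states, r):
    """Empirical one-step transition matrix from an observed state sequence.

    states: length-k sequence of observed states in {0, ..., r-1}.
    Counts v_ij = number of t with states[t] = i and states[t+1] = j,
    then normalizes each row (the maximum-likelihood estimate).
    """
    counts = np.zeros((r, r))
    for a, b in zip(states[:-1], states[1:]):
        counts[a, b] += 1
    rows = counts.sum(axis=1, keepdims=True)
    # Avoid dividing by zero for states that were never visited.
    return np.divide(counts, rows, out=np.zeros_like(counts), where=rows > 0)

# Toy usage: two observed sequences whose estimated matrices can then be
# compared, e.g. entrywise or via a likelihood-ratio statistic on the counts.
seq1 = [0, 1, 1, 2, 0, 1, 2, 2, 0]
seq2 = [0, 0, 1, 2, 1, 1, 2, 0, 0]
print(transition_matrix(seq1, r=3))
print(transition_matrix(seq2, r=3))
```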