
Suppose I have two tables, each of unknown size, and I’d like to estimate the average of their true sizes. I hire two contractors: one guarantees good precision (i.e., her measurement is normally distributed about the true value, with a standard deviation of 5 mm), while the other is a dimwit (i.e., his measurement is also unbiased, but with a standard deviation of 100 mm). What’s the optimal way to combine the two measurements into a final estimate of the true average size? The formal way to ask this question is: given a sample from a normally distributed random variable with known variance, and another sample from a second normally distributed random variable with known (but potentially different) variance, what is the best guess as to the expected value of the average of the two random variables? Is the answer simply “average them”? Ideally, I want to prove the answer, whatever it is.

EDIT: I’m not sure if it was clear, but one contractor measures table 1, while the other contractor measures table 2.


Just averaging them does not account for the different variances, and therefore does not yield an optimal result.

The best (as in minimum-variance, unbiased) estimator of the common expected value $\mu$ of the two distributions $X_1, X_2$ with variances $\sigma^2_1, \sigma^2_2$ is $$\hat\mu(x_1, x_2) = \frac{\sigma_1^2 x_2 + \sigma_2^2 x_1}{\sigma_1^2 + \sigma_2^2}.$$
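To make this concrete with the numbers from the question ($\sigma_1 = 5$ mm, $\sigma_2 = 100$ mm), here is a quick Monte Carlo sketch; the true size and trial count are made-up illustrative choices, and it assumes both measurements are unbiased for the *same* true value:

```python
import numpy as np

# Illustrative setup: common true value mu, contractor std devs from the question.
rng = np.random.default_rng(0)
mu, s1, s2 = 750.0, 5.0, 100.0
n = 100_000

x1 = rng.normal(mu, s1, n)  # precise contractor's measurements
x2 = rng.normal(mu, s2, n)  # imprecise contractor's measurements

plain = (x1 + x2) / 2
weighted = (s1**2 * x2 + s2**2 * x1) / (s1**2 + s2**2)

# Theoretical std of the plain average: sqrt((25 + 10000)/4) ~ 50.06
# Theoretical std of the weighted estimate: sqrt(25*10000/10025) ~ 4.99
print(np.std(plain), np.std(weighted))
```

The weighted estimator is barely worse than using the precise contractor alone (std 4.99 mm vs. 5 mm), while the plain average is dragged to roughly 50 mm by the noisy measurement.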

Please correct my notation if needed; I learned this material under its German names. =)

The joint distribution of $X_1, X_2$ forms an exponential family in $T(x_1, x_2) = \sigma_1^2x_2+\sigma_2^2x_1$, since its probability density function has the form

$$f(x_1, x_2) = \frac{1}{\sqrt{2\pi\sigma_1^2}\sqrt{2\pi\sigma_2^2}}\exp\left(-\frac{(x_1-\mu)^2}{2\sigma_1^2}-\frac{(x_2-\mu)^2}{2\sigma_2^2}\right)$$ $$= h(x)A(\mu)\exp\left(\frac{\mu(\sigma_1^2x_2+\sigma_2^2x_1)}{\sigma_1^2\sigma_2^2}\right),$$ since expanding the squares produces the cross terms $\frac{\mu x_1}{\sigma_1^2}+\frac{\mu x_2}{\sigma_2^2} = \frac{\mu(\sigma_2^2 x_1 + \sigma_1^2 x_2)}{\sigma_1^2\sigma_2^2}$, with $h(x)A(\mu)$ collecting all the uninteresting stuff (the $x_i^2$ and $\mu^2$ terms). =)

Therefore $T$ is sufficient; $\hat\mu$ is unbiased and depends only on $T$, so the Lehmann–Scheffé theorem, together with the completeness of $T$ for this exponential family, concludes the proof.

More generally, the minimal-variance unbiased linear combination of estimators is their reciprocal-variance-weighted average.
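A minimal sketch of that general rule (the function name and the numbers are mine, chosen for illustration): weight each unbiased estimate by the reciprocal of its variance, and the combined variance $1/\sum_i 1/\sigma_i^2$ is never larger than any individual one.

```python
import numpy as np

def combine(estimates, variances):
    """Inverse-variance-weighted average of unbiased estimates of a
    common quantity. Returns (combined estimate, combined variance)."""
    w = 1.0 / np.asarray(variances, dtype=float)  # reciprocal-variance weights
    est = np.dot(w, estimates) / w.sum()
    return est, 1.0 / w.sum()

# Three hypothetical measurements of the same quantity:
est, var = combine([749.2, 760.0, 751.5], [25.0, 10000.0, 100.0])
# Combined variance 1/(1/25 + 1/10000 + 1/100) ~ 19.96, below the best single 25.
print(est, var)
```

Adding even a very noisy third measurement still reduces the variance slightly; a bad measurement is down-weighted, never harmful, as long as its variance is known.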

You can do away with normality if you restrict yourself to linear estimators. Given $\mu, \sigma_1^2, \sigma_2^2, \sigma_{12}$, you have random variables $X_1, X_2$ with $E(X_1) = E(X_2) = \mu$, $\text{Var}(X_1) = \sigma_1^2$, $\text{Var}(X_2) = \sigma_2^2$, and $\text{Cov}(X_1, X_2) = \sigma_{12}$. You want $l_0, l_1, l_2$ such that $E(l_0 + l_1 X_1 + l_2 X_2) = \mu$ (unbiasedness) and $\text{Var}(l_0 + l_1 X_1 + l_2 X_2)$ is as small as possible (minimum-variance unbiased estimator). From the first condition you get $l_0 + l_1 \mu + l_2 \mu = \mu$ for all $\mu$, i.e., $l_0 = \mu(1 - l_1 - l_2)$ for all $\mu$, which is possible iff $l_0 = 0$ and $l_1 + l_2 = 1$. So the second condition reduces to minimizing $\text{Var}(l_1 X_1 + l_2 X_2)$ over all $l_1, l_2$ subject to $l_1 + l_2 = 1$. Simple calculus with a Lagrange multiplier applied to $\text{Var}(l_1 X_1 + l_2 X_2) - \lambda(l_1 + l_2 - 1)$ leads to

$$l_1 \sigma_1^2 + l_2 \sigma_{12} - \lambda = 0$$
$$l_1 \sigma_{12} + l_2 \sigma_{2}^2 - \lambda = 0$$
$$l_1 + l_2 = 1$$

from which we get our estimator as

$\frac{\alpha_1}{\alpha_1 + \alpha_2} X_1 + \frac{\alpha_2}{\alpha_1 + \alpha_2} X_2$ where $\alpha_1 = \frac{\sigma_2^2 - \sigma_{12}}{\sigma_1^2\sigma_2^2 - \sigma_{12}^2}$ and $\alpha_2 = \frac{\sigma_1^2 - \sigma_{12}}{\sigma_1^2\sigma_2^2 - \sigma_{12}^2}$. Note that if $\sigma_{12} = 0$, this reduces to $\frac{\sigma_2^2 X_1 + \sigma_1^2 X_2}{\sigma_1^2 + \sigma_2^2}$ as noted above.
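One way to sanity-check the correlated-case weights (the moments $v_1, v_2, c_{12}$ below are illustrative choices) is to compare the variance of the $\alpha$-weighted combination against a grid of other unbiased combinations $l X_1 + (1-l) X_2$:

```python
import numpy as np

# Illustrative moments: Var(X1), Var(X2), and Cov(X1, X2).
v1, v2, c12 = 25.0, 10000.0, 20.0

# Weights from the answer's formulas (common denominator cancels in the ratio).
a1 = (v2 - c12) / (v1 * v2 - c12**2)
a2 = (v1 - c12) / (v1 * v2 - c12**2)
l1 = a1 / (a1 + a2)  # weight on X1; weight on X2 is 1 - l1

def var_mix(l):
    """Variance of l*X1 + (1-l)*X2 with the given covariance."""
    return l**2 * v1 + (1 - l)**2 * v2 + 2 * l * (1 - l) * c12

# The quadratic is minimized exactly at l1, so no grid point can beat it.
grid = np.linspace(0.0, 1.0, 10_001)
assert var_mix(l1) <= var_mix(grid).min() + 1e-9
```

Setting the derivative of `var_mix` to zero gives the closed-form minimizer $l^* = \frac{\sigma_2^2 - \sigma_{12}}{\sigma_1^2 + \sigma_2^2 - 2\sigma_{12}}$, which agrees with $\frac{\alpha_1}{\alpha_1 + \alpha_2}$ after cancelling the common denominator.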
