
Suppose that 50 measuring scales made by a machine are selected at random from the machine's production, and their lengths and widths are measured. It was found that 45 had both measurements within the tolerance limits, 2 had satisfactory length but unsatisfactory width, 2 had satisfactory width but unsatisfactory length, and 1 had both length and width unsatisfactory. Each scale may be regarded as a drawing from a multinomial population with density

$$\pi_{11}^{x_{11}}\,\pi_{12}^{x_{12}}\,\pi_{21}^{x_{21}}\,(1-\pi_{11}-\pi_{12}-\pi_{21})^{x_{22}}$$

*Obtain the maximum likelihood estimates of the parameters.*


I have tried this in the following way:

the likelihood function is

$$L(\pi_{11},\pi_{12},\pi_{21})=\prod_{i=1}^{50}\pi_{11}^{x_{11}^{(i)}}\,\pi_{12}^{x_{12}^{(i)}}\,\pi_{21}^{x_{21}^{(i)}}\,(1-\pi_{11}-\pi_{12}-\pi_{21})^{x_{22}^{(i)}}$$

where $x_{jk}^{(i)}$ equals $1$ if the $i$th scale falls in cell $(j,k)$ and $0$ otherwise. The exponents add up over the $50$ scales to the observed cell counts $45,2,2,1$, so

$$L=\pi_{11}^{45}\,\pi_{12}^{2}\,\pi_{21}^{2}\,(1-\pi_{11}-\pi_{12}-\pi_{21})^{1}$$

Taking the logarithm of the likelihood function yields

$$L^*=\log L=45\log \pi_{11}+2\log \pi_{12}+2\log \pi_{21}+\log (1-\pi_{11}-\pi_{12}-\pi_{21})$$

Now taking the first derivative of $L^*$ with respect to $\pi_{11}$,

$$\frac{\partial L^*}{\partial \pi_{11}}=\frac{45}{\pi_{11}}-\frac{1}{1-\pi_{11}-\pi_{12}-\pi_{21}}$$

and setting $\frac{\partial L^*}{\partial \pi_{11}}$ equal to $0$,

$$\frac{45}{\hat\pi_{11}}-\frac{1}{1-\hat\pi_{11}-\hat\pi_{12}-\hat\pi_{21}}=0
\quad\Rightarrow\quad \hat\pi_{11}=\frac{45(1-\hat\pi_{12}-\hat\pi_{21})}{46}$$

*$\bullet$ Are the procedure and the estimate of $\pi_{11}$ correct?*

*$\bullet$ A second question: if the population is multinomial, where does the multinomial coefficient $\binom{n}{x_{11},x_{12},x_{21},x_{22}}=\binom{50}{45,2,2,1}$ enter the likelihood?*
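As a quick numerical sanity check (not part of the original post), one can verify that the relative-frequency candidate $\hat\pi=(45,2,2,1)/50$ satisfies the first-order condition and locally maximizes the likelihood. The Python sketch below is illustrative only; the helper name `log_lik` and the step size are my own choices:

```python
import math

# Observed cell counts for the 50 scales (from the problem statement).
counts = [45, 2, 2, 1]
n = sum(counts)  # 50

def log_lik(p11, p12, p21):
    """Log-likelihood of the data; the fourth cell gets 1 - p11 - p12 - p21."""
    p22 = 1.0 - p11 - p12 - p21
    return sum(c * math.log(p) for c, p in zip(counts, (p11, p12, p21, p22)))

# Candidate MLE: the observed relative frequencies.
p11, p12, p21 = 45 / 50, 2 / 50, 2 / 50
best = log_lik(p11, p12, p21)

# Nudging a coordinate away from the candidate lowers the log-likelihood.
worse = all(
    log_lik(p11 + d, p12, p21) < best and log_lik(p11, p12 + d, p21) < best
    for d in (-0.005, 0.005)
)
print(worse)  # True
```

The check only probes nearby points, but since the log-likelihood is concave in the parameters, a stationary point is the global maximum.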


Consider a positive integer $n$ and a set of positive real numbers $\mathbf p=(p_x)$ such that $\sum\limits_xp_x=1$. The multinomial distribution with parameters $n$ and $\mathbf p$ is the distribution $f_\mathbf p$ on the set of nonnegative integers $\mathbf n=(n_x)$ such that $\sum\limits_xn_x=n$ defined by

$$f_\mathbf p(\mathbf n)=n!\cdot\prod_x\frac{p_x^{n_x}}{n_x!}.$$

For some fixed observation $\mathbf n$, the likelihood is

$L(\mathbf p)=f_\mathbf p(\mathbf n)$ with the constraint $C(\mathbf p)=1$, where $C(\mathbf p)=\sum\limits_xp_x$. To maximize $L$, one asks that the gradient of $L$ and the gradient of $C$ are collinear, that is, that there exists $\lambda$ such that, for every $x$,

$$\frac{\partial}{\partial p_x}L(\mathbf p)=\lambda\frac{\partial}{\partial p_x}C(\mathbf p).$$

In the present case, this reads

$$\frac{n_x}{p_x}L(\mathbf p)=\lambda,$$

that is, $p_x$ should be proportional to $n_x$. Since $\sum\limits_xp_x=1$, one gets finally $\hat p_x=\dfrac{n_x}n$ for every $x$.
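Applied to the scales data from the question ($n=50$ with cell counts $45$, $2$, $2$, $1$), this formula gives
$$
\hat p_{11}=\frac{45}{50}=0.9,\qquad \hat p_{12}=\hat p_{21}=\frac{2}{50}=0.04,\qquad \hat p_{22}=\frac{1}{50}=0.02.
$$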

Let $\mathbf{X}$ be a random vector following a multinomial distribution. Then, $$\begin{align}P(\mathbf{X} = \mathbf{x};n,\mathbf{p}) &= n!\,\prod_{k=1}^K \frac{p_k^{x_k}}{x_k!} \end{align}$$

$x_k$ is the number of successes of the $k^{th}$ category in $n$ random draws, where $p_k$ is the probability of success of the $k^{th}$ category. Note that $$\begin{align}\sum_{k=1}^K x_k &= n\\ \sum_{k=1}^{K} p_k &=1 \end{align}$$

For the estimation problem, we have $N$ samples $\mathbf{X_1},\ldots,\mathbf{X_N}$ drawn independently from the above multinomial distribution. The log-likelihood is given by $$\mathcal{L}(\mathbf{p},n) = \sum_{i=1}^N \log P(\mathbf{x_i};n,\mathbf{p})$$

where $$\begin{align}\log P(\mathbf{x_i};n,\mathbf{p}) &= \log \frac{n!}{\prod_k x_{ik}!} + \sum_{k=1}^{K} x_{ik} \log p_k \\ \sum_{i=1}^N \log P(\mathbf{x_i};n,\mathbf{p}) &= C + \sum_{k=1}^{K} N_k \log p_k \end{align}$$

where $N_k = \sum_{i=1}^{N} x_{ik}$ is the total number of successes of the $k^{th}$ category across the $N$ samples.

For MLE estimate of $\mathbf{p}$, assuming $n$ is known, we solve the following optimization problem:

$$\begin{align} \max_{\mathbf{p}} &\,\, \mathcal{L}(\mathbf{p},n) \\ \text{s.t.} & \,\, \sum_{k=1}^{K} p_k \,\,=1\end{align}$$ Using the equality constraint to eliminate one variable, $$p_K\,=\, 1 - \sum_{k=1}^{K-1} p_k,$$ we obtain an unconstrained problem in $K-1$ variables. Setting the gradient to zero at a stationary point,

$$\begin{align}\frac{\partial\mathcal{L}(\mathbf{p},n)}{\partial p_k} &= \frac{N_k}{p_k} - \frac{N_K}{p_K}\,\,=\,\, 0 \\ p_k &= \frac{N_k\,p_K}{N_K}\end{align}$$ Solving together with $\sum_{k=1}^{K} p_k\,=\, 1$ gives the MLE for $p_k$: $$\hat p_k = \frac{N_k}{nN}$$
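The pooled-count estimator $\hat p_k = N_k/(nN)$ can be sanity-checked by simulation. The Python sketch below is illustrative only (the sample sizes, probabilities, and seed are arbitrary choices of mine): it draws $N$ independent multinomial samples of size $n$ and compares the pooled relative frequencies with the true $p_k$.

```python
import random
from collections import Counter

# Simulation check of the estimator p_k = N_k / (n N): draw N multinomial
# samples of size n, pool the per-category totals N_k, and compare the
# pooled relative frequencies with the true probabilities.
random.seed(0)
n, N = 20, 5000
p_true = [0.5, 0.3, 0.15, 0.05]
K = len(p_true)

totals = Counter()
for _ in range(N):
    # One multinomial sample of size n, recorded as n category draws.
    draws = random.choices(range(K), weights=p_true, k=n)
    totals.update(draws)

p_hat = [totals[k] / (n * N) for k in range(K)]
print([round(p, 3) for p in p_hat])
print(all(abs(a - b) < 0.01 for a, b in zip(p_hat, p_true)))
```

With $nN=100{,}000$ pooled draws, the standard error per category is of order $10^{-3}$, so the estimates should agree with `p_true` to roughly two decimal places.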
