# Show $\psi$ and $\Delta$ are identifiable

Let $X_1,\dots,X_m$ be i.i.d. $F$ and $Y_1,\dots,Y_n$ be i.i.d. $G$, where the model $\{(F,G)\}$ is described by

$\hspace{20mm}$ $\psi(X_1) = Z_1, \qquad \psi(Y_1) = Z'_1 + \Delta,$

where $\psi$ is an unknown strictly increasing differentiable map from $\mathbb{R}$ to $\mathbb{R}$ with $\psi' > 0$ and $\psi(\pm\infty) = \pm\infty$, and $Z_1$ and $Z'_1$ are independent random variables.

(a) Suppose $Z_1$ and $Z'_1$ have a $\mathcal{N}(0,1)$ distribution. Show that both $\psi$ and $\Delta$ are identifiable.

(b) Suppose $Z_1$ and $Z'_1$ have a $\mathcal{N}(0, \sigma^2)$ distribution with $\sigma^2$ unknown. Are $\psi$ and $\Delta$ still identifiable?

#### Solution

I think I've understood the question, though I'm not sure I've got the right answer yet.

What you want to do is transform the statements

$$\psi(X_1) = Z_1, \qquad \psi(Y_1) = Z'_1 + \Delta$$

into statements about probability.

Note that

$$\mathbb{P}(X_1<x)=\mathbb{P}(\psi(X_1)<\psi(x))$$

because $\psi$ is a strictly increasing map.

Then

$$\mathbb{P}(X_1<x)=\mathbb{P}(\psi(X_1)<\psi(x)) = \mathbb{P}(Z_1<\psi(x)) = \Phi(\psi(x))$$

where $\Phi$ is the standard normal cumulative distribution function (not the error function, though the two are related by $\Phi(x) = \tfrac12\bigl(1+\operatorname{erf}(x/\sqrt{2})\bigr)$).
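As a sanity check on this step, here is a minimal simulation sketch. The choice $\psi(x)=x^3$ is an arbitrary illustration, not part of the problem: with $Z_1\sim\mathcal{N}(0,1)$ and $X_1=\psi^{-1}(Z_1)$, the empirical value of $\mathbb{P}(X_1<x)$ should match $\Phi(\psi(x))$.

```python
import math
import random
from statistics import NormalDist

std = NormalDist()  # standard normal: Phi = std.cdf

# Hypothetical psi for illustration only: psi(x) = x**3 is strictly
# increasing and maps R onto R, so X_1 = psi^{-1}(Z_1) = Z_1^{1/3}.
def psi(x):
    return x ** 3

def psi_inv(z):
    # real cube root (handles negative arguments)
    return math.copysign(abs(z) ** (1 / 3), z)

random.seed(0)
xs = [psi_inv(random.gauss(0, 1)) for _ in range(200_000)]

# Compare the empirical P(X_1 < x) with Phi(psi(x)) at a few points
for x in (-1.0, 0.5, 1.2):
    empirical = sum(xi < x for xi in xs) / len(xs)
    print(round(empirical, 3), round(std.cdf(psi(x)), 3))
```

The two printed columns agree up to Monte Carlo error, which is the content of $F(x) = \Phi(\psi(x))$.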

Likewise for $Y_1$

$$\mathbb{P}(Y_1<y)=\mathbb{P}(\psi(Y_1)<\psi(y)) = \mathbb{P}(Z'_1<\psi(y)-\Delta) = \Phi(\psi(y)-\Delta)$$

Now the distribution laws $F$ and $G$ of $X_1$ and $Y_1$ determine everything, and the equations can be solved explicitly: $F(x)=\Phi(\psi(x))$ gives $\psi(x)=\Phi^{-1}(F(x))$ for every $x$, and then $G(y)=\Phi(\psi(y)-\Delta)$ gives $\Delta=\psi(y)-\Phi^{-1}(G(y))$, the same value for every $y$. So $\psi$ and $\Delta$ are both determined by $(F,G)$, which settles (a). For (b) identifiability fails: replacing $(\psi,\Delta,\sigma)$ by $(c\psi,c\Delta,c\sigma)$ for any $c>0$ leaves $F(x)=\Phi(\psi(x)/\sigma)$ and $G(y)=\Phi((\psi(y)-\Delta)/\sigma)$ unchanged, so only the ratios $\psi/\sigma$ and $\Delta/\sigma$ are identifiable.
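The inversion $\psi = \Phi^{-1}\circ F$ and $\Delta = \psi(y)-\Phi^{-1}(G(y))$ can be sketched numerically. The setup below is hypothetical (ground truth $\psi(x)=x^3$, $\Delta = 1.5$, chosen only to test the recovery): we approximate $F$ and $G$ by empirical CDFs of simulated samples and recover both unknowns from those CDFs alone.

```python
import bisect
import math
import random
from statistics import NormalDist

std = NormalDist()  # Phi = std.cdf, Phi^{-1} = std.inv_cdf

# Hypothetical ground truth, used only to generate data
def psi(x):
    return x ** 3

def psi_inv(z):
    return math.copysign(abs(z) ** (1 / 3), z)  # real cube root

true_delta = 1.5
random.seed(1)
n = 200_000
xs = sorted(psi_inv(random.gauss(0, 1)) for _ in range(n))
ys = sorted(psi_inv(random.gauss(0, 1) + true_delta) for _ in range(n))

def ecdf(sample, t):
    # empirical CDF: fraction of the sorted sample strictly below t
    return bisect.bisect_left(sample, t) / len(sample)

# Recover psi(x) = Phi^{-1}(F(x)) and Delta = psi(y) - Phi^{-1}(G(y))
x0, y0 = 0.8, 1.1
psi_hat = std.inv_cdf(ecdf(xs, x0))
delta_hat = psi(y0) - std.inv_cdf(ecdf(ys, y0))
print(psi_hat, psi(x0))        # psi_hat ≈ psi(x0)
print(delta_hat, true_delta)   # delta_hat ≈ true Delta
```

Note that `delta_hat` uses the true $\psi(y_0)$ for brevity; in the pure recovery one would plug in `std.inv_cdf(ecdf(xs_for_psi, y0))`, i.e. the $\psi$ already recovered from $F$.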