We have a random vector $X=(X_j)_{j=1,…,n}$, whose components $X_j$ are mutually independent. We build a new random vector $Y=A X+b$, with $Y=(Y_k)_{k=1,…,m}$, $A=(a_{k,j})_{k=1,…,m;j=1,…,n}$, $b=(b_k)_{k=1,…,m}$.

When are the components of $Y$ still mutually independent random variables, and how does one prove the result? I suppose this is related to the rank of $A$, but I'm not sure, and I cannot find a way to prove it without making special assumptions about the law of $X$.

I know $U$ and $V$ are independent iff $f(U)$ and $g(V)$ are independent for every pair of measurable functions $f$ and $g$. This can help, I think, when, for example, $Y_1=X_1+X_2$ and $Y_2=X_3+X_4$.

But what about $Y_1 = X_1 + X_2$ and $Y_2 = X_1 - X_2$? Or $Y_1 = X_1 + X_2$ and $Y_2 = X_5$?

None of the texts I have searched discusses this topic seriously; in a certain way, they leave it to "intuition".

Thanks

If $A = a_1 \oplus a_2 \oplus \cdots \oplus a_m$, for $m \leq n$, where $a_i$ are *row* vectors of dimension $n_i$ such that $\sum_{i=1}^m n_i = n$ and $\oplus$ denotes the direct sum, then the random vector $Y$ has independent coordinates.

This is not hard to see since $Y_1$ is measurable with respect to $\sigma(X_1, \ldots, X_{n_1})$, $Y_2$ is measurable with respect to $\sigma(X_{n_1+1}, \ldots, X_{n_1+n_2})$, etc., and these $\sigma$-algebras are independent since the $X_i$ are independent (essentially, *by definition*).
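As a minimal numpy sketch of this block structure (the particular vectors $a_1$, $a_2$ and all variable names are mine, chosen for illustration): each $Y_k$ is a function of a disjoint block of the $X_j$, so the sample correlation between the coordinates of $Y$ vanishes up to noise.

```python
import numpy as np

# Direct sum of the row vectors a_1 = (1, 2) and a_2 = (3, -1, 4):
# A = a_1 ⊕ a_2 is block-diagonal, so each Y_k depends on a
# disjoint block of X coordinates, hence the Y_k are independent.
a1 = np.array([1.0, 2.0])
a2 = np.array([3.0, -1.0, 4.0])

A = np.zeros((2, 5))
A[0, :2] = a1          # Y_1 uses only X_1, X_2
A[1, 2:] = a2          # Y_2 uses only X_3, X_4, X_5

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 100_000))   # independent coordinates
Y = A @ X

# Functions of disjoint independent blocks: sample correlation ~ 0
corr = np.corrcoef(Y[0], Y[1])[0, 1]
print(corr)
```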

Obviously, this result still holds if we consider matrices that are column permutations of the matrix $A$ described above. Indeed, as we see below, in the case where the distribution of each $X_i$ is non-normal (though perhaps depending on the index $i$), this is essentially the *only* form that $A$ can take for the desired result to hold.

In the normal-distribution case, as long as $A A^T = D$ for some diagonal matrix $D$, then the coordinates of $Y$ are independent. This is easily checked with the moment-generating function.
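A quick empirical check of the normal-case criterion (matrix and sample sizes are my own choices): the rows of $A$ below are orthogonal, so $A A^T$ is diagonal even though $A$ is not a direct sum, and the sample covariance of $Y$ comes out (approximately) diagonal.

```python
import numpy as np

# Rows of A are orthogonal: A @ A.T = 2 * I is diagonal,
# although A is not a column permutation of a direct sum.
A = np.array([[1.0,  1.0],
              [1.0, -1.0]])
D = A @ A.T

rng = np.random.default_rng(1)
X = rng.standard_normal((2, 200_000))   # iid standard normal coordinates
Y = A @ X

C_Y = np.cov(Y)          # should be close to D = 2 * I
off_diag = C_Y[0, 1]
print(D)
print(C_Y)
```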

Suppose $X_1$ and $X_2$ are iid with finite variance. If $X_1 + X_2$ is independent of $X_1 - X_2$, then $X_1$ and $X_2$ are normally distributed random variables. See here. This result is known as **Bernstein's theorem** and can be generalized (see below). A proof can be found in Feller or here (Chapter 5).
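A simulation illustrating the contrapositive of Bernstein's theorem (the probe via correlation of squares is my own crude choice, not a general independence test): for non-normal iid $X_1, X_2$, the sum and difference are uncorrelated yet dependent, and the dependence shows up in the correlation of their squares.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500_000

def sq_corr(x1, x2):
    """Correlation of (X1+X2)^2 and (X1-X2)^2: a crude dependence probe."""
    s, d = x1 + x2, x1 - x2
    return np.corrcoef(s**2, d**2)[0, 1]

# Normal case: X1 + X2 and X1 - X2 are genuinely independent
g = rng.standard_normal((2, n))
corr_normal = sq_corr(g[0], g[1])

# Centered exponential: sum and difference are uncorrelated but dependent
e = rng.exponential(1.0, (2, n)) - 1.0
corr_exp = sq_corr(e[0], e[1])

print(corr_normal)   # ~ 0
print(corr_exp)      # clearly nonzero (analytically 0.6 for exponential)
```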

In the case where $A$ cannot be written as a direct sum of row vectors, you can always cook up a distribution for $X$ such that $Y$ does not have independent coordinates. Indeed, we have

**Theorem** (Lukacs and King, 1954): Let $X_1, X_2, \ldots, X_n$ be $n$ independently (but not necessarily identically) distributed random variables with variances $\sigma_i^2$, and assume that the $n$th moment of each $X_i$ ($i = 1, 2, \ldots, n$) exists. The necessary and sufficient conditions for the existence of two statistically independent linear forms $Y_1 = \sum^n_{i=1} a_i X_i$ and $Y_2 = \sum^n_{i=1} b_i X_i$ are

- Each random variable which has a nonzero coefficient in both forms is normally distributed, and
- $\sum^n_{i=1} a_i b_i \sigma^2_i = 0$.
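The two conditions can be checked mechanically. A small sketch (the helper name and its interface are mine, purely illustrative):

```python
import numpy as np

def lukacs_king_conditions(a, b, sigma2, is_normal):
    """Check the two Lukacs-King conditions for Y1 = a.X, Y2 = b.X.

    a, b      : coefficient vectors of the two linear forms
    sigma2    : variances of the X_i
    is_normal : flags, True where X_i is normally distributed
    (Hypothetical helper, not from any library.)
    """
    a, b = np.asarray(a, float), np.asarray(b, float)
    shared = (a != 0) & (b != 0)                 # X_i with nonzero coefficient in both forms
    cond1 = np.all(np.asarray(is_normal)[shared])
    cond2 = np.isclose(np.dot(a * b, sigma2), 0.0)
    return bool(cond1 and cond2)

# Y1 = X1 + X2, Y2 = X1 - X2, equal variances: both X_i must be normal
ok  = lukacs_king_conditions([1, 1], [1, -1], [1.0, 1.0], [True, True])
bad = lukacs_king_conditions([1, 1], [1, -1], [1.0, 1.0], [False, True])
print(ok, bad)   # True False
```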

I think little can be said in general, apart from the rather trivial case in which each of the resulting $Y_i$ depends on non-intersecting subsets of $X$ (which would correspond to each column of $A$ having no more than one non-null value).

Some weaker results can be obtained by considering non-correlation instead of independence (or, equivalently, restricting to Gaussian variables), because the covariance matrices are simply related:

$C_Y = A \, C_X A^T$

By assumption, $C_X$ is diagonal, and we want to find out for which matrices $A$ the matrix $C_Y$ is also diagonal. Again, little can be said in general beyond that. If we restrict the assumptions further and assume that the $X_i$ are iid (or just have equal variances), then we get that orthogonality of $A$ is sufficient (but not necessary).

In particular, for the case $Y_1 = X_1 + X_2$, $Y_2 = X_1 - X_2$, the components of $Y$ are not guaranteed to be uncorrelated unless we assume the $X_i$ are iid (or have the same variance).
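The relation $C_Y = A \, C_X A^T$ is easy to verify numerically for this example (a minimal sketch; the variance values are arbitrary choices of mine): with unequal variances the off-diagonal entry is $\sigma_1^2 - \sigma_2^2 \neq 0$, with equal variances it vanishes.

```python
import numpy as np

# Y1 = X1 + X2, Y2 = X1 - X2
A = np.array([[1.0,  1.0],
              [1.0, -1.0]])

def cov_Y(var1, var2):
    """Propagate a diagonal C_X through Y = A X via C_Y = A C_X A^T."""
    C_X = np.diag([var1, var2])
    return A @ C_X @ A.T

unequal = cov_Y(1.0, 4.0)   # off-diagonal = 1 - 4 = -3: correlated
equal   = cov_Y(2.0, 2.0)   # off-diagonal = 0: uncorrelated
print(unequal)
print(equal)
```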

**The Skitovich-Darmois theorem** (Skitovich (1953), Darmois (1953); see also A. Kagan, Yu. Linnik, and C. R. Rao (1973, Ch. 3)). Let $\xi_j$, where $j = 1, 2, \dots, n$ and $n \geq 2$, be independent random variables, and let $\alpha_j, \beta_j$ be nonzero constants. If the linear statistics $L_1 = \alpha_1 \xi_1 + \cdots + \alpha_n \xi_n$ and $L_2 = \beta_1 \xi_1 + \cdots + \beta_n \xi_n$ are independent, then all the random variables $\xi_j$ are Gaussian.
