
I’m working through a math stats book on my own (I’ve always wanted to learn it), but I’m getting confused about the definition of a random variable. The book says that a random variable is a function from the state space $\Omega$ into some space $T$. I understand this in terms of some simple examples: take a finite state space where each event has a probability. Then, given some $X$, we can easily compute $E(X)$ by mapping each event in $\Omega$ to $X(\omega)$ and so on.

But here’s my problem: we also talk about “Normal random variables” or “Cauchy random variables” and so on, and I’m having a hard time connecting those random variables to the functional definition. What is the state space $\Omega$? My first guess would be $\Omega=\mathbb{R}$, but that doesn’t seem right: we need $P(\Omega)=1$, and intervals of equal length should have equal probability, right? That can’t both hold if $\Omega=\mathbb{R}$…


First of all, a random variable is usually defined as a function $X: \Omega \to \mathbb{R}$. So for each point $\omega \in \Omega$ of the state space, the random variable assigns a real number $X(\omega)$ to that outcome.
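To make the finite case from your example concrete, here is a minimal sketch in Python. The die, the payout function, and the names `omega`, `prob`, and `X` are all illustrative assumptions, not anything from the book:

```python
# A fair die as a finite state space, with a random variable X on it.
omega = [1, 2, 3, 4, 5, 6]          # the state space: six possible outcomes
prob = {w: 1 / 6 for w in omega}    # P assigns probability 1/6 to each outcome

def X(w):
    """Random variable: pays 1 if the roll is even, 0 otherwise."""
    return 1.0 if w % 2 == 0 else 0.0

# E(X) = sum over omega of X(w) * P({w})
expectation = sum(X(w) * prob[w] for w in omega)
print(expectation)  # 0.5
```

The point is that $X$ itself carries no probabilities; the expectation is computed entirely from the measure on $\Omega$.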

Strictly speaking, probabilities are defined only for certain subsets of $\Omega$ (the events). They are not defined on the target space $\mathbb{R}$. So if we’re being precise, it doesn’t make sense to ask “what is the probability that $X = 3$?” Instead, we should ask “what is the probability of the *set of events corresponding to* $X=3$?”

**But there’s a catch.** Random variables are not just any old functions: they are **measurable** functions from $\Omega$ to $\mathbb{R}$. This means that every (measurable) set of values in the target space $\mathbb{R}$ pulls back to a set of events in $\Omega$ for which a probability has been defined. Because of this, we can cut corners and refer to “the probability that $X = 3$,” even though it doesn’t *exactly* make sense.
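A small sketch of that corner-cutting (the two-coin-flip space and the function `X` are hypothetical choices for illustration): “$P(X = 3)$” is shorthand for the probability of the preimage $\{\omega : X(\omega) = 3\}$ back in $\Omega$:

```python
# "P(X = 3)" really means P of the preimage {w in omega : X(w) = 3}.
omega = ["HH", "HT", "TH", "TT"]    # two fair coin flips
prob = {w: 1 / 4 for w in omega}

def X(w):
    """Number of heads, plus 2 -- so X(w) = 3 exactly when there is one head."""
    return w.count("H") + 2

preimage = {w for w in omega if X(w) == 3}      # the event {"HT", "TH"}
p = sum(prob[w] for w in preimage)              # its probability
print(sorted(preimage), p)  # ['HT', 'TH'] 0.5
```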

Which brings us to your question. When we speak of “Normal” or “Cauchy” random variables, we are describing the **distribution** of the random variable, that is, **how** it assigns probabilities to sets of values in $\mathbb{R}$. We are **not** describing the state space itself. When we say, for a Normal r.v. $X$, that $P(X \leq 1) \approx 0.84$, we are really saying “the set of events $\omega$ that $X$ maps to a real number $\leq 1$ has probability $0.84$.” But those events themselves can be anything.
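One concrete way to see that $\Omega$ can be anything is inverse-transform sampling (a sketch; the choice $\Omega = (0,1)$ with the uniform measure is just one of infinitely many state spaces that work). Here $X = \Phi^{-1}$, the inverse standard-Normal CDF from Python’s `statistics.NormalDist`, pushes the uniform measure forward to $N(0,1)$:

```python
from statistics import NormalDist

# Take Omega = (0, 1) with the uniform (length) measure, and let X be the
# inverse CDF of the standard Normal. The pushforward of the uniform
# measure through X is exactly N(0, 1).
std = NormalDist()  # standard Normal: mean 0, standard deviation 1

def X(w):
    """Maps a point w of Omega = (0, 1) to a real number."""
    return std.inv_cdf(w)

# Since X is increasing, {w : X(w) <= 1} is the interval (0, Phi(1)],
# whose uniform measure is its length Phi(1):
print(round(std.cdf(1), 2))  # 0.84
```

The same distribution also arises from entirely different state spaces, which is why “Normal” tells you nothing about what $\Omega$ is.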

So in short, the answer is: **it depends.**
