Articles on probability

Distribution of a random variable bounded by another random variable

Suppose that $X$ and $Y$ are independent exponential r.v.'s with parameters $\lambda$ and $\mu$. If $X$ is bounded by $Y$ (that is, conditioning on the event $X<Y$), how do I find an expression for $P(X<x)$? Regarding this question, I have some doubts about the form of the probability. I can take it as a conditional probability $P(X<x\mid X<Y)=\frac{P(X<x\,\cap\, X<Y)}{P(X<Y)}$. In this case, how […]
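
For what it's worth, here is one way the conditional-probability route can play out, a sketch assuming only the stated independence of $X$ and $Y$: $$P(X<x\,\cap\,X<Y)=\int_0^x \lambda e^{-\lambda t}\,P(Y>t)\,dt=\int_0^x \lambda e^{-(\lambda+\mu)t}\,dt=\frac{\lambda}{\lambda+\mu}\left(1-e^{-(\lambda+\mu)x}\right),$$ and since $P(X<Y)=\frac{\lambda}{\lambda+\mu}$, the ratio gives $P(X<x\mid X<Y)=1-e^{-(\lambda+\mu)x}$, i.e. conditioned on $X<Y$, $X$ is exponential with rate $\lambda+\mu$.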

Expectation of 1/X

In general, can one say that for a random variable $X$, $E[\frac{1}{X}] = \frac{1}{E[X]}$? I've worked out a few examples where this works, but I'm not sure how widely this is useful…
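
A quick numerical sketch shows the identity fails in general; the choice of $X$ uniform on $\{1,2\}$ below is purely illustrative, not from the question:

```python
from fractions import Fraction

# Illustrative counterexample: X uniform on {1, 2}.
values = [Fraction(1), Fraction(2)]
e_x = sum(values) / len(values)                   # E[X] = 3/2
e_inv = sum(1 / v for v in values) / len(values)  # E[1/X] = 3/4
print(1 / e_x, e_inv)                             # 2/3 vs 3/4 -- not equal
```

By Jensen's inequality ($1/x$ is convex for $x>0$), $E[1/X]\ge 1/E[X]$ for positive $X$, with equality only when $X$ is almost surely constant.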

Relation between a multivariate probability generating function and univariate ones

Suppose I have two independent integer-valued random variables $X_1$, $X_2$ (with the constraints $X_1+X_2\le N$, $0\le X_1\le N$, $0\le X_2\le N$), with probability generating functions $g_1(z)$, $g_2(z)$. Now I have the joint distribution of $(X_1-X_2,\,2X_1+X_2-N)$, whose probability generating function is $G(z_1,z_2)$; $N$ is a constant. What is the relation between $G(z_1,z_2)$ and $g_1(z)$, $g_2(z)$? The following question is related […]
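
If the joint PGF is taken to be $G(z_1,z_2)=E\big[z_1^{X_1-X_2}\,z_2^{2X_1+X_2-N}\big]$ (an assumption about the intended definition), then independence suggests the sketch $$G(z_1,z_2)=z_2^{-N}\,E\big[(z_1z_2^2)^{X_1}\big]\,E\big[(z_2/z_1)^{X_2}\big]=z_2^{-N}\,g_1(z_1z_2^2)\,g_2(z_2/z_1),$$ treated as a formal identity in $z_1$, $z_2$.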

Frog on a 3 by 3 matrix

Imagine a frog ($f$) on a 3 by 3 matrix, able to jump from where it sits to any neighbouring cell. So, for example, if the frog is at $s_{0,1}$, it can subsequently jump to $s_{0,0}, s_{0,2}, s_{1,0}, s_{1,1}$ and $s_{1,2}$. $$ \left( \begin{array}{ccc} s_{0,0} & f & s_{0,2} \\ s_{1,0} & s_{1,1} & […]
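
As a concrete starting point, here is a minimal Python sketch of the move structure, assuming the frog jumps to any adjacent cell, diagonals included (consistent with the $s_{0,1}$ example), uniformly at random:

```python
import random

# Neighbours of (r, c) on the 3x3 grid: all adjacent cells,
# including diagonals, matching the s_{0,1} example above.
def neighbours(r, c):
    return [(r + dr, c + dc)
            for dr in (-1, 0, 1) for dc in (-1, 0, 1)
            if (dr, dc) != (0, 0) and 0 <= r + dr < 3 and 0 <= c + dc < 3]

# From s_{0,1} the frog can reach exactly the five cells listed above.
assert sorted(neighbours(0, 1)) == [(0, 0), (0, 2), (1, 0), (1, 1), (1, 2)]

def hop(r, c):
    """One jump, chosen uniformly at random (an assumption)."""
    return random.choice(neighbours(r, c))
```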

What is the chance a team will have at least 10 more wins than losses at any point in a 100-game season? They have a 50% chance of winning each game.

More generally: each game, $n = 1,2,\dots,N$, a team has probability $p = 0.5$ of winning. Their standing $x$ is given by $x(n) = x(n-1)\pm1$ depending on whether they win ($+1$) or lose ($-1$). Their standing starts at $x(n=0)=0$. What is the chance the team will have at least $x_w$ more wins than losses at […]
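
A Monte Carlo sketch for the headline case ($N=100$, $p=0.5$, $x_w=10$); the trial count is arbitrary and the result is only an estimate, with the reflection principle or a dynamic program over standings giving exact values:

```python
import random

def reaches(n_games=100, p=0.5, x_w=10):
    """True if the standing x(n) ever reaches +x_w within n_games."""
    x = 0
    for _ in range(n_games):
        x += 1 if random.random() < p else -1
        if x >= x_w:
            return True
    return False

trials = 100_000
print(sum(reaches() for _ in range(trials)) / trials)  # rough estimate
```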

Problem with counting method – full house

I've never been particularly good at counting, and I was pleasantly surprised to learn today that many typical probability-counting problems can be solved using the equation $$P(A\cap B) = P(A)P(B|A)$$ and its generalisations, rather than by more traditional counting methods. However, my first attempt to apply this equation to a more difficult counting problem has […]
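
For reference, a counting-based sanity check of the full-house probability (not the conditional-probability route the question is about), useful for validating whichever chain of $P(A)P(B|A)$ factors one writes down:

```python
from math import comb

# Full house: three cards of one rank, two of another, in a 5-card hand.
full_houses = 13 * comb(4, 3) * 12 * comb(4, 2)  # 3744
hands = comb(52, 5)                              # 2,598,960
print(full_houses / hands)                       # about 0.00144
```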

Probability of a certain ball drawn from one box given that other balls were drawn

Box 1 contains 2 green and 3 red balls, Box 2 has 4 green and 2 red, and Box 3 has 3 green and 3 red. Only one ball is drawn from each of the 3 boxes. What is the probability that a green ball was drawn from Box 1, given that two green balls were drawn? […]
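
A small enumeration sketch, assuming "two green balls were drawn" means exactly two of the three draws were green:

```python
from itertools import product
from fractions import Fraction

# P(green) for boxes 1, 2, 3.
boxes = [Fraction(2, 5), Fraction(4, 6), Fraction(3, 6)]

num = den = Fraction(0)
for draw in product([True, False], repeat=3):  # True = green
    p = Fraction(1)
    for green, pg in zip(draw, boxes):
        p *= pg if green else 1 - pg
    if sum(draw) == 2:                         # exactly two greens
        den += p
        if draw[0]:                            # green from box 1
            num += p
print(num / den)  # 1/2 under this reading
```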

Expectation regarding Brownian Motion

This is a formula for computing an expectation in the topic of Brownian motion (here $s \le t$). \begin{align} E[W(s)W(t)] &= E[W(s)(W(t) - W(s)) + W(s)^2] \\ &= E[W(s)]\,E[W(t) - W(s)] + E[W(s)^2] \\ &= 0 + s \\ &= \min(s,t) \end{align} How does $E[W(s)]\,E[W(t) - W(s)]$ turn into $0$? Thanks a lot!! Please let me know if you […]
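
The cross term vanishes because standard Brownian motion has $E[W(s)]=0$, and for $s\le t$ the increment $W(t)-W(s)$ is independent of $W(s)$, so the expectation factors as $E[W(s)]\,E[W(t)-W(s)]=0\cdot E[W(t)-W(s)]=0$. A simulation sketch (the values $s=0.4$, $t=1$ are arbitrary) to check:

```python
import numpy as np

# Estimate E[W(s) W(t)] for standard Brownian motion; compare with min(s, t).
rng = np.random.default_rng(0)
s, t, n = 0.4, 1.0, 200_000
w_s = rng.normal(0.0, np.sqrt(s), n)       # W(s) ~ N(0, s)
incr = rng.normal(0.0, np.sqrt(t - s), n)  # W(t) - W(s), independent of W(s)
w_t = w_s + incr
print(np.mean(w_s * w_t))                  # ~0.4 = min(s, t)
print(np.mean(w_s) * np.mean(incr))        # ~0: the vanishing cross term
```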

pdf is defined as $f_X(x)=P(X=x)$, but isn't $P(X=x)=0$?

When we define a probability density function, we say $f_X(x)=P(X=x)$, and that's equal to some function such as a Gaussian. But isn't $P(X=x)=0$ for a continuous random variable $X$? Is it correct that the height of the pdf at a specific $x$ represents the likelihood of this $x$?
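
One standard way to reconcile the two statements: the pdf is the derivative of the CDF rather than a point probability, $$f_X(x)=\frac{d}{dx}F_X(x)=\lim_{h\downarrow 0}\frac{P(x<X\le x+h)}{h},\qquad P(x<X\le x+h)\approx f_X(x)\,h,$$ so $P(X=x)=0$ is consistent with a positive density: $f_X(x)$ measures probability per unit length near $x$, which is why its height is read as a relative likelihood rather than a probability.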

A proof about random variables

How can I prove that, given a random variable $X$ defined on a sample space $W$, $Y = X^2$ is also a random variable defined on the sample space $W$? I tried to use some definitions of random variables, but I am not familiar with mappings. Can someone explain how to construct a formal proof?
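
A common proof sketch, assuming "random variable" means a measurable map into $\mathbb{R}$ with its Borel $\sigma$-algebra: for $y<0$, $\{Y\le y\}=\emptyset$, while for $y\ge 0$ $$\{Y\le y\}=\{X^2\le y\}=\{-\sqrt{y}\le X\le\sqrt{y}\}=X^{-1}\big([-\sqrt{y},\sqrt{y}]\big),$$ which is an event because intervals are Borel sets and $X$ is measurable; hence $Y=X^2$ is a random variable on the same sample space $W$.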