Articles on probability distributions

Find the distribution of a linear combination of independent random variables

Given independent and identically distributed random variables $X_1, X_2, \dots, X_n$, each with the same p.d.f. $f(x)$ on support $(a, b)$, how do I find the p.d.f. or c.d.f. of $Y = \sum_{i = 1}^n a_i X_i$, where the $a_i$'s are constants?
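Whatever closed form one derives, it can be sanity-checked by Monte Carlo simulation. A minimal sketch, assuming (purely as an illustrative example) Uniform(0, 1) variables and arbitrary coefficients $a_i$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3
a = np.array([1.0, 2.0, 0.5])  # example coefficients (assumed)

# X_i ~ Uniform(0, 1), chosen only as an illustrative distribution
X = rng.uniform(0.0, 1.0, size=(100_000, n))
Y = X @ a  # Y = sum_i a_i * X_i for each simulated sample

# The empirical mean should approximate E(Y) = sum_i a_i * E(X_i) = 1.75 here
print(round(Y.mean(), 2))

# The empirical c.d.f. of Y at any point t is np.mean(Y <= t)
```

The empirical c.d.f. (or a histogram, for the p.d.f.) can then be compared against the candidate analytic answer.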

Mean of the gamma distribution

So I was trying to prove that the mean of the gamma distribution is $\frac{\alpha}{\lambda}$. My attempt: $$E(X)=\int_{0}^{\infty} x f(x)\,dx = \int_{0}^{\infty} \frac{\lambda^{\alpha}}{\Gamma(\alpha)} x^{\alpha} e^{-\lambda x}\,dx.$$ After integrating by parts, I got the result $$\frac{\lambda^{\alpha}}{\Gamma(\alpha)} \cdot \frac{\alpha}{\lambda} \int_{0}^{\infty} x^{\alpha-1} e^{-\lambda x}\,dx.$$ I'm stuck here. Could anyone continue it for me and explain? Thanks a lot.
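For reference, the remaining integral is the unnormalized gamma kernel, which evaluates to $\Gamma(\alpha)/\lambda^{\alpha}$ (substitute $u = \lambda x$ and apply the definition of $\Gamma$), so everything cancels:

```latex
\int_{0}^{\infty} x^{\alpha-1} e^{-\lambda x}\,dx
  = \frac{\Gamma(\alpha)}{\lambda^{\alpha}},
\qquad\text{hence}\qquad
E(X) = \frac{\lambda^{\alpha}}{\Gamma(\alpha)} \cdot \frac{\alpha}{\lambda}
       \cdot \frac{\Gamma(\alpha)}{\lambda^{\alpha}}
     = \frac{\alpha}{\lambda}.
```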

Sum of random variables

Considering two continuous random variables $X$ and $Y$ with distribution functions $F_X, F_Y$, I want to find the distribution and distribution function of the sum $Z=X+Y$. \begin{align} P\{Z \leq z\} &= P\{X+Y \leq z\} \\ &= P\{X \leq z - Y\} \\ &= F_X(z - Y) \end{align} I know here that I have to use somehow the […]
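The standard route is to condition on one variable, which leads to the convolution $f_Z(z)=\int f_X(x)\,f_Y(z-x)\,dx$. A numerical sketch of that identity, assuming (as an example) two independent Uniform(0, 1) variables, whose sum has the triangular density on $(0, 2)$:

```python
import numpy as np

# f_Z(z) = integral of f_X(x) * f_Y(z - x) dx  (convolution of densities)
# Example (assumed): X, Y independent Uniform(0, 1) => f_Z triangular on (0, 2)
dx = 0.001
grid = np.arange(0.0, 1.0, dx)
f_X = np.ones_like(grid)  # Uniform(0, 1) density sampled on the grid
f_Y = np.ones_like(grid)

f_Z = np.convolve(f_X, f_Y) * dx  # discretized convolution integral
z = np.arange(len(f_Z)) * dx

# The triangular density peaks at z = 1 with height 1
i = np.argmin(np.abs(z - 1.0))
print(round(f_Z[i], 2))
```

The same grid-convolution trick works for any pair of densities one can tabulate.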

Big Balloon Game

The problem: In this game, you are given empty balloons one by one, and you inflate each balloon with air until you are satisfied. If it does not burst, you gain happiness points proportional to the volume of air in the balloon (say, 1 point per ml). If it bursts, you […]

Central Limit Theorem for exponential distribution

Suppose that $X_1, \dots, X_n$ are a random sample from a population having an exponential distribution with rate parameter $\lambda$. Use the Central Limit Theorem to show that, for large $n$, $\sqrt{n}(\lambda\bar{x}-1) \sim \text{Normal}(0,1)$. My attempt: honestly, I am really not understanding what this question is asking. I can see that for an exponential distribution, […]
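The key facts are $E(\bar{X}) = 1/\lambda$ and $\mathrm{Var}(\bar{X}) = 1/(n\lambda^2)$, so $\lambda\bar{X}$ has mean $1$ and variance $1/n$, and the CLT standardizes it. A quick simulation check, with example values of $\lambda$ and $n$ chosen here for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
lam, n, reps = 2.0, 256, 20_000  # example values (assumed)

# NumPy's exponential takes the scale 1/lambda, not the rate
xbar = rng.exponential(1.0 / lam, size=(reps, n)).mean(axis=1)
T = np.sqrt(n) * (lam * xbar - 1.0)

# By the CLT, T should be approximately standard normal for large n
print(round(T.mean(), 1), round(T.std(), 1))
```

The simulated mean and standard deviation of $T$ land near 0 and 1, as the CLT predicts.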

Finding the marginal posterior distribution of future prediction, $y_{n+1}$

Assume the following bivariate regression model: $y_i = \beta x_i + u_i$, where $u_i$ is i.i.d. $N(0, \sigma^2 = 9)$ for $i = 1, 2, \dots, n$. Assume a noninformative prior of the form $p(\beta) \propto \text{constant}$; then it can be shown that the posterior pdf for $\beta$ is: $p(\beta|\mathbf{y}) = (18\pi)^{-\frac{1}{2}}\left(\sum_{i=1}^n x_i^2\right)^{\frac{1}{2}} \exp\left[-\frac{1}{18}\sum_{i=1}^n x_i^2 […]

Simplifying Multiple Integral for Compound Probability Density Function

Are there any ways to simplify this multiple integral? $$ \hat{f}\left(\left.y\right|\alpha\right)=\int_{-\infty}^{\infty}\cdots\int_{-\infty}^{\infty}\hat{f}\left(\left.y\right|\theta_{1}\right)\hat{h}_{1}\left(\left.\theta_{1}\right|\theta_{2}\right)\cdots\hat{h}_{K}\left(\left.\theta_{K}\right|\alpha\right)d\theta_{1}\cdots d\theta_{K} $$ Here, the density function $\hat{f}\left(\left.y\right|\theta_{1}\right)$ depends on parameter $\theta_{1}$, which is unknown and is governed by another density function, $\hat{h}_{1}\left(\left.\theta_{1}\right|\theta_{2}\right)$, with hyper-parameter $\theta_{2}$, which could again be governed by another density $\hat{h}_{2}\left(\left.\theta_{2}\right|\theta_{3}\right)$ with hyper-parameter $\theta_{3}$, and so on until we have density […]

PDF of $f(x)=1/\sin(x)$?

What is the probability density function (PDF) of $f(x)=1/\sin(x)$ when $x$ is uniformly distributed in $(0, 90^\circ)$? $f(x)=\sin(x)$ has a known PDF, which has the form $2(\pi\sqrt{1-\sin(x)^2})^{-1}$, but I cannot find the PDF for $1/\sin(x)$. The latter would have interesting applications in astronomy, especially for so-called "luminosity functions". Thank you very much, Sebastian
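A change of variables (this derivation is my own sketch, not from the question) suggests $f_W(w) = \frac{2}{\pi w\sqrt{w^2-1}}$ for $w = 1/\sin(x) > 1$, with $x$ uniform on $(0, \pi/2)$ radians. A simulation check of that candidate density:

```python
import numpy as np

rng = np.random.default_rng(0)
# x uniform on (0, pi/2) radians, i.e. (0, 90) degrees; w = 1/sin(x)
x = rng.uniform(0.0, np.pi / 2, size=1_000_000)
w = 1.0 / np.sin(x)

def f_W(t):
    # Candidate density from the change-of-variables sketch (assumed, not
    # taken from the question): f_W(t) = 2 / (pi * t * sqrt(t^2 - 1)), t > 1
    return 2.0 / (np.pi * t * np.sqrt(t * t - 1.0))

# Compare P(1.5 <= W <= 3) empirically vs. the integral of f_W (midpoint rule)
lo, hi, m = 1.5, 3.0, 10_000
mids = lo + (np.arange(m) + 0.5) * (hi - lo) / m
emp = np.mean((w >= lo) & (w <= hi))
num = f_W(mids).sum() * (hi - lo) / m
print(round(emp, 2), round(num, 2))
```

The empirical and numerical probabilities agree closely, which supports the candidate formula without proving it.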

How to show that these random variables are pairwise independent?

Given the arrays $C=[C_1,C_2,\dots,C_N]$ and $S=[S_1,S_2,\dots,S_N]$ of length $N$, with elements that are discrete i.i.d. and take the values $\pm 1$ with equal probability ($p=1/2$). Consider the random variables (for given $l, n, m$): $W=C_lC_mC_n$, $X=S_lS_mC_n$, $Y=C_lS_mS_n$, $Z=S_lC_mS_n$. It can be shown that these random variables ($W, X, Y, Z$) are zero mean, uniform […]
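Because each $C_i, S_i$ takes only the values $\pm 1$, pairwise independence can be verified by brute-force enumeration of all $2^6 = 64$ equally likely sign assignments for the six underlying variables; a sketch for the pair $(W, X)$:

```python
import itertools

# Tabulate the joint distribution of W = C_l*C_m*C_n and X = S_l*S_m*C_n
# over all 64 equally likely assignments of the six underlying signs
counts = {}
for cl, cm, cn, sl, sm, sn in itertools.product([-1, 1], repeat=6):
    W = cl * cm * cn
    X = sl * sm * cn
    counts[(W, X)] = counts.get((W, X), 0) + 1

# Pairwise independence requires P(W=w, X=x) = P(W=w) * P(X=x) = 1/4,
# i.e. each of the four (w, x) cells should hold exactly 16 of 64 outcomes
pairwise_ok = all(c == 64 // 4 for c in counts.values())
print(pairwise_ok)
```

The same enumeration applied to the other five pairs checks the full pairwise claim; note that $WXYZ = 1$ always, so the four variables are not jointly independent.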

Convolution Theorem and Marginal Density Intuition

In terms of marginal density, how does one know that integrating the joint density at $(x, z-x)$ over the $x$ values (or rather, along the line $x+y=z$) gives us the density function of $z$? More importantly, can someone explain this intuitively (aside from proofs)?
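One way to see it: change variables from $(X, Y)$ to $(X, Z)$ with $Z = X + Y$. The Jacobian of this map is 1, so the joint density of $(X, Z)$ is $f_{X,Y}(x, z-x)$, and integrating out $x$ is just the ordinary marginalization of a nuisance coordinate:

```latex
f_Z(z) \;=\; \int_{-\infty}^{\infty} f_{X,Z}(x, z)\,dx
       \;=\; \int_{-\infty}^{\infty} f_{X,Y}(x,\, z - x)\,dx .
```

Intuitively, every point on the line $x + y = z$ contributes probability mass to the single outcome $Z = z$, and the integral collects all of it.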