*The sum of two Gaussian variables is another Gaussian.*

It seems natural, but I could not find a proof using Google.

What’s a short way to prove this?

Thanks!

Edit: Provided the two variables are independent.


I prepared the following as an answer to a question that happened to close just as I was putting the finishing touches on my work. I posted it as a separate (self-answered) question but, following suggestions from Srivatsan Narayanan and Mike Spivey, I am putting it here and deleting my so-called question.

If $X$ and $Y$ are independent standard Gaussian random variables, what is the cumulative distribution function of $\alpha X + \beta Y$?

Let $Z = \alpha X + \beta Y$. We assume without loss of generality that $\alpha$ and $\beta$ are positive real numbers: if, say, $\alpha < 0$, we can replace $X$ by $-X$ and $\alpha$ by $\vert\alpha\vert$, and if $\alpha = 0$ the claim is immediate since $Z = \beta Y$. Then the cumulative distribution function of $Z$ is

$$

F_Z(z) = P\{Z \leq z\} = P\{\alpha X + \beta Y \leq z\} = \iint_{\alpha x + \beta y \leq z} \phi(x)\phi(y)\,dx\,dy

$$

where $\phi(\cdot)$ is the unit Gaussian density function. But, since the integrand $(2\pi)^{-1}\exp(-(x^2 + y^2)/2)$ has circular symmetry, the value of the integral depends only on the distance of the origin from the line $\alpha x + \beta y = z$.

Indeed, by a rotation of coordinates, we can write the integral as

$$

F_Z(z) = \int_{x=-\infty}^d \int_{y=-\infty}^{\infty}\phi(x)\phi(y)\,dy\,dx = \Phi(d)

$$

where $\Phi(\cdot)$ is the standard Gaussian cumulative distribution function.

But $d$ is just the (signed) distance from the origin to the line $\alpha x + \beta y = z$:

$$d = \frac{z}{\sqrt{\alpha^2 + \beta^2}},$$

and thus the cumulative distribution function of $Z$ is that of a zero-mean Gaussian random variable with variance $\alpha^2 + \beta^2$.
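This conclusion is easy to check numerically. Here is a minimal sketch (my own illustration, not part of the original answer; the particular values $\alpha = 1.5$, $\beta = 2$ are arbitrary) comparing the empirical CDF of $\alpha X + \beta Y$ with $\Phi\left(z/\sqrt{\alpha^2+\beta^2}\right)$:

```python
import math
import numpy as np

def std_normal_cdf(t):
    """Standard Gaussian CDF, written via the error function."""
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

rng = np.random.default_rng(0)
alpha, beta = 1.5, 2.0                 # arbitrary positive coefficients
sigma = math.hypot(alpha, beta)        # sqrt(alpha^2 + beta^2) = 2.5

# Independent standard Gaussians, combined into Z = alpha*X + beta*Y.
n = 200_000
z = alpha * rng.standard_normal(n) + beta * rng.standard_normal(n)

# Empirical CDF of Z vs. Phi(t / sigma) at a few test points.
for t in (-2.0, 0.0, 1.0, 3.0):
    print(f"t={t:+.1f}  empirical={np.mean(z <= t):.4f}  "
          f"predicted={std_normal_cdf(t / sigma):.4f}")
```

With $2 \times 10^5$ samples the empirical and predicted values agree to roughly two or three decimal places.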

I don’t know how I missed that one, indeed:

http://en.wikipedia.org/wiki/Sum_of_normally_distributed_random_variables

Thanks Kaestur Hakarl!

I posted the following in response to a question that got closed as a duplicate of this one:

It looks from your comment as if the meaning of your question is different from what I thought at first. My first answer assumed you knew that the sum of independent normals is itself normal.

You have

$$

\exp\left(-\frac12 \left(\frac{x}{\alpha}\right)^2 \right) \exp\left(-\frac12 \left(\frac{z-x}{\beta}\right)^2 \right)

= \exp\left(-\frac12 \left( \frac{\beta^2x^2 + \alpha^2(z-x)^2}{\alpha^2\beta^2} \right) \right).

$$

Then the numerator $\beta^2x^2 + \alpha^2(z-x)^2$, expanded and rearranged, is

$$

\begin{align}
& (\alpha^2+\beta^2)x^2 - 2\alpha^2 xz + \alpha^2 z^2 \\
& = (\alpha^2+\beta^2)\left(x^2 - 2\frac{\alpha^2}{\alpha^2+\beta^2} xz\right) + \alpha^2 z^2 \\
& = (\alpha^2+\beta^2)\left(x^2 - 2\frac{\alpha^2}{\alpha^2+\beta^2} xz + \frac{\alpha^4}{(\alpha^2+\beta^2)^2}z^2\right) + \alpha^2 z^2 - \frac{\alpha^4}{\alpha^2+\beta^2}z^2 \\
& = (\alpha^2+\beta^2)\left(x - \frac{\alpha^2}{\alpha^2+\beta^2}z\right)^2 + \alpha^2 z^2 - \frac{\alpha^4}{\alpha^2+\beta^2}z^2,
\end{align}

$$

and then remember that you still have the $-1/2$ and the $\alpha^2\beta^2$ in the denominator, all inside the “exp” function.

(What was done above is *completing the square*.)
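The completing-the-square identity above can also be verified symbolically. A quick check (my own addition, using `sympy`, which is not part of the original answer):

```python
import sympy as sp

x, z, a, b = sp.symbols('x z alpha beta', positive=True)

# Numerator as it appears before completing the square ...
numerator = b**2 * x**2 + a**2 * (z - x)**2

# ... and after completing the square.
completed = ((a**2 + b**2) * (x - a**2 / (a**2 + b**2) * z)**2
             + a**2 * z**2 - a**4 / (a**2 + b**2) * z**2)

# The difference simplifies to zero, so the two forms agree identically.
print(sp.simplify(numerator - completed))  # -> 0
```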

The factor of $\exp\left(\text{a function of }z\right)$ does not depend on $x$ and so is a “constant” that can be pulled out of the integral.

The remaining integral does not depend on “$z$” for a reason we will see below, and thus becomes part of the normalizing constant.

If $f$ is any probability density function, then

$$

\int_{-\infty}^\infty f(x - \text{something}) \, dx

$$

does not depend on “something”, because one may write $u=x-\text{something}$ and then $du=dx$, and the bounds of integration are still $-\infty$ and $+\infty$, so the integral is equal to $1$.
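This shift-invariance is easy to see numerically as well. The sketch below (my own illustration; the choice of a Gaussian density and the particular shifts are arbitrary) approximates the integral on a wide grid and shows it stays at $1$ regardless of the shift:

```python
import math
import numpy as np

# A concrete probability density f: the standard Gaussian.
def f(x):
    return np.exp(-0.5 * x**2) / math.sqrt(2.0 * math.pi)

# A wide, fine grid standing in for the whole real line.
xs = np.linspace(-50.0, 50.0, 400_001)
dx = xs[1] - xs[0]

# The Riemann sum of f(x - shift) stays at ~1 for every shift.
for shift in (0.0, 3.7, -12.5):
    total = float(np.sum(f(xs - shift)) * dx)
    print(f"shift={shift:+6.1f}  integral={total:.6f}")
```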

Now look at

$$

\alpha^2z^2 - \frac{\alpha^4}{\alpha^2+\beta^2} z^2 = \frac{z^2}{\frac{1}{\beta^2} + \frac{1}{\alpha^2}}.

$$

This was to be divided by $\alpha^2\beta^2$, yielding

$$

\frac{z^2}{\alpha^2+\beta^2}=\left(\frac{z}{\sqrt{\alpha^2+\beta^2}}\right)^2.

$$

So the density is

$$

(\text{constant})\cdot \exp\left( -\frac12 \left(\frac{z}{\sqrt{\alpha^2+\beta^2}}\right)^2 \right) .

$$

Where the standard deviation belongs we now have $\sqrt{\alpha^2+\beta^2}$.
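As a sanity check on the whole convolution computation, one can evaluate the convolution integral numerically and compare it with the claimed $N(0, \alpha^2+\beta^2)$ density. This is a sketch of my own (the scale parameters $\alpha = 1.2$, $\beta = 0.9$ are arbitrary choices):

```python
import math
import numpy as np

def gauss(x, s):
    """Density of a zero-mean Gaussian with standard deviation s."""
    return np.exp(-0.5 * (x / s) ** 2) / (s * math.sqrt(2.0 * math.pi))

alpha, beta = 1.2, 0.9
sigma = math.hypot(alpha, beta)          # sqrt(alpha^2 + beta^2)

# Grid approximation of f_Z(z) = integral of f_X(x) * f_Y(z - x) dx.
xs = np.linspace(-40.0, 40.0, 160_001)
dx = xs[1] - xs[0]

for z in (-1.0, 0.0, 2.5):
    conv = float(np.sum(gauss(xs, alpha) * gauss(z - xs, beta)) * dx)
    direct = float(gauss(z, sigma))
    print(f"z={z:+.1f}  convolution={conv:.6f}  N(0, a^2+b^2) density={direct:.6f}")
```

The two columns agree to the displayed precision, matching the completed-square derivation above.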
