Combinations of characteristic functions: $\alpha\phi_1+(1-\alpha)\phi_2$

Suppose we are given two characteristic functions $\phi_1$ and $\phi_2$, and I want to take a weighted average of them as below:

$\alpha\phi_1+(1-\alpha)\phi_2$ for any $\alpha\in [0,1]$

Can it be proven that the result is also a characteristic function? If so, I am guessing this result could extend to a combination of any number of characteristic functions with weights $\alpha_i$, as long as $\sum_i\alpha_i=1$.


Secondly, if $\phi$ is again a characteristic function, then $\mathfrak{R}e\,\phi(t)=\frac12(\phi(t)+\phi(-t))$ is also a characteristic function. I don’t even know how to begin attempting this proof, as I am not sure what the symbol $\mathfrak{R}$ represents.


Lastly, regarding the symmetry of characteristic functions,

$\phi$ is symmetric about zero iff it is real-valued iff the corresponding distribution is symmetric about zero.

Once again, my lack of familiarity with the complex plane leaves me in the dark here. Why can a complex-valued function not be symmetric about zero?


To prove that these are characteristic functions, arguing with random variables yields simpler, more intuitive proofs.

In the first case, assume that $\phi_1(t)=\mathrm E(\mathrm e^{itX_1})$ and $\phi_2(t)=\mathrm E(\mathrm e^{itX_2})$ for some random variables $X_1$ and $X_2$ defined on the same probability space and introduce a Bernoulli random variable $A$ such that $\mathrm P(A=1)=\alpha$ and $\mathrm P(A=0)=1-\alpha$, independent of $X_1$ and $X_2$. Then:

The function $\alpha\phi_1+(1-\alpha)\phi_2$ is the characteristic function of the random variable $AX_1+(1-A)X_2$.
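Indeed, conditioning on $A$ (which is independent of $(X_1,X_2)$) makes this a one-line computation:

$$\mathrm E\left(\mathrm e^{it(AX_1+(1-A)X_2)}\right)=\mathrm P(A=1)\,\mathrm E\left(\mathrm e^{itX_1}\right)+\mathrm P(A=0)\,\mathrm E\left(\mathrm e^{itX_2}\right)=\alpha\phi_1(t)+(1-\alpha)\phi_2(t).$$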

The extension to more than two random variables is direct. Assume that $\phi_k(t)=\mathrm E(\mathrm e^{itX_k})$ for some random variables $X_k$ defined on the same probability space, and introduce an integer-valued random variable $A$ such that $\mathrm P(A=k)=\alpha_k$ for every $k$, independent of $(X_k)_k$. Then:

The function $\sum\limits_k\alpha_k\phi_k$ is the characteristic function of the random variable $X_A=\sum\limits_kX_k\mathbf 1_{A=k}$.
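As a sanity check, one can compare the empirical characteristic function of the mixture against the convex combination numerically. The sketch below (assuming NumPy, with the illustrative choices $X_1\sim N(0,1)$ and $X_2\sim\mathrm{Uniform}[-1,1]$, whose characteristic functions are $\mathrm e^{-t^2/2}$ and $\sin t/t$) is not part of the proof, just a Monte Carlo verification:

```python
import numpy as np

rng = np.random.default_rng(0)
n, alpha = 200_000, 0.3

# Illustrative components: X1 ~ N(0,1) and X2 ~ Uniform[-1,1],
# whose characteristic functions are exp(-t^2/2) and sin(t)/t.
x1 = rng.standard_normal(n)
x2 = rng.uniform(-1.0, 1.0, n)
a = rng.random(n) < alpha          # A ~ Bernoulli(alpha), independent of X1, X2

mix = np.where(a, x1, x2)          # samples of A*X1 + (1-A)*X2

def phi_emp(x, t):
    """Empirical characteristic function: the sample mean of exp(itX)."""
    return np.mean(np.exp(1j * t * x))

for t in (0.5, 1.0, 2.0):
    expected = alpha * np.exp(-t**2 / 2) + (1 - alpha) * np.sin(t) / t
    assert abs(phi_emp(mix, t) - expected) < 0.02
```

With $2\cdot 10^5$ samples the Monte Carlo error is of order $10^{-3}$, so the tolerance $0.02$ is comfortable.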

In the second case, assume that $\phi(t)=\mathrm E(\mathrm e^{itX})$ for some random variable $X$ and introduce a Bernoulli random variable $A$ such that $\mathrm P(A=1)=\mathrm P(A=-1)=\frac12$, independent of $X$. Then:

The function $t\mapsto\frac12(\phi(t)+\phi(-t))$ is the characteristic function of the random variable $AX$.
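The same conditioning argument gives:

$$\mathrm E\left(\mathrm e^{itAX}\right)=\tfrac12\,\mathrm E\left(\mathrm e^{itX}\right)+\tfrac12\,\mathrm E\left(\mathrm e^{-itX}\right)=\tfrac12\bigl(\phi(t)+\phi(-t)\bigr).$$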

By Bochner’s theorem, a function $\phi : \mathbb{R} \to \mathbb{C}$ is the characteristic function of a probability measure if and only if

  1. $\phi$ is positive definite,
  2. $\phi(0) = 1$, and
  3. $\phi$ is continuous at the origin.

Since these properties are preserved under convex combinations, your conjectured extension holds whenever the $\alpha_i$ are non-negative and $\sum_i\alpha_i=1$.
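To see that positive definiteness survives, fix $t_1,\dots,t_n\in\mathbb{R}$ and $c_1,\dots,c_n\in\mathbb{C}$; then

$$\sum_{j,k=1}^n c_j\bar c_k\sum_i\alpha_i\phi_i(t_j-t_k)=\sum_i\alpha_i\sum_{j,k=1}^n c_j\bar c_k\,\phi_i(t_j-t_k)\geq 0,$$

since each inner double sum is non-negative and $\alpha_i\geq 0$. The other two conditions are immediate from linearity and $\sum_i\alpha_i=1$.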


The symbol $\Re(z)$ denotes the real part of the complex number $z$. If $\phi(t)=\mathrm E(\mathrm e^{itX})$ is a characteristic function, then $\phi(-t)=\mathrm E(\mathrm e^{-itX})=\overline{\mathrm E(\mathrm e^{itX})}=\bar{\phi}(t)$, so we have

$$\Re \phi(t) = \frac{\phi(t)+\bar{\phi}(t)}{2} = \frac{\phi(t)+\phi(-t)}{2}. $$

Now let $\phi$ be the characteristic function of a probability measure $\mu$. Then clearly the mapping $t \mapsto \phi(-t)$ is the characteristic function of the measure $\tilde{\mu} : E \mapsto \mu(-E)$. Thus in view of the first answer, $\Re \phi(t)$ is also a characteristic function.
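A concrete numerical illustration (a sketch assuming NumPy; the choice $X\sim\mathrm{Exp}(1)$, with $\phi(t)=1/(1-it)$ and hence $\Re\phi(t)=1/(1+t^2)$, is only for illustration): the empirical characteristic function of $AX$ with $A=\pm 1$ equiprobable matches $\Re\phi$.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

x = rng.exponential(1.0, n)        # X ~ Exp(1), phi(t) = 1/(1 - it)
a = rng.choice([-1.0, 1.0], n)     # A = +/-1 with probability 1/2 each, independent of X

# The empirical characteristic function of AX should match
# Re phi(t) = 1 / (1 + t^2), which is real.
for t in (0.5, 1.0, 2.0):
    emp = np.mean(np.exp(1j * t * a * x))
    assert abs(emp - 1.0 / (1.0 + t**2)) < 0.02
```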


If your symmetry about zero means $\phi(-t) = \phi(t)$, then the first equivalence follows from our second answer. Now since $\phi(t) = \phi(-t)$ and characteristic functions determine their distributions uniquely, the corresponding measures must coincide, that is, we must have $\mu(E) = \tilde{\mu}(E) = \mu(-E)$ for every Borel measurable $E \subset \mathbb{R}$. Thus $\mu$ is symmetric.