# Is this a Delta Function? (and Delta as limit of Gaussian?)

I have a set of users that generate calls. If I assign the same probability to each user, they all have an identical call-generation probability, which can be denoted $\delta$. Callers are chosen uniformly at random from the set of users. At the end of the generation process, the probability density function of the call rates should be a delta function (hence the shape is similar to a bell, isn't it?).

The probability I assigned to each user is:
$$p_u = \frac{\lambda}{\sum_{i=1}^{N_u} \lambda} = \frac{1}{N_u}$$

where $\lambda = \frac{1}{N_u}$ and $N_u$ is the number of users. In this way the users are equally partitioned between $0$ and $1$, and I can draw a uniformly distributed random number in order to select a random user.

My question is: how can I demonstrate that this is really a delta function? Is the information I wrote above enough to define the delta function (I don't know whether it is possible to formalize the p.d.f.)?

For example, in the figure we have 10000 users with the same generation probability: if I generate ca. 605000 calls, the average is ca. 60.5 calls per user.
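The setup above can be sketched as a small simulation. This is a minimal sketch, not code from the question: the user count and call count are taken from the example, and `pick_user` is a hypothetical helper implementing the "equal slices of $[0,1)$" selection.

```python
import random

N_u = 10_000                 # number of users, as in the example
lam = 1.0 / N_u              # lambda = 1/N_u
p_u = lam / (N_u * lam)      # = 1/N_u: every user gets the same probability

# Each user owns an equal slice of [0, 1); one uniform draw selects a caller.
def pick_user(n_users: int) -> int:
    return int(random.random() * n_users)

counts = [0] * N_u
n_calls = 605_000
for _ in range(n_calls):
    counts[pick_user(N_u)] += 1

avg = sum(counts) / N_u
print(avg)  # 605000 / 10000 = 60.5 calls per user on average
```

Note that the *average* is exactly 60.5 by construction; the interesting object is the spread of `counts` around it, which is what the answers below discuss.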

#### Answers

Can someone explain to me why the Delta function is considered to be a limit of Gaussians?

Delta probability measures are limits of (nondegenerate) Gaussian probability measures and in fact one often defines the class of Gaussian probability measures as the union of the class of positive variance Gaussian probability measures and of the class of Dirac probability measures.

That is, Delta probability measures ARE Gaussian.

One reason is simple, which is that every Delta probability measure $\delta_x$ is the limit of the (nondegenerate) Gaussian probability measures $N(x,\sigma^2)$ when $\sigma^2\to0$:

To see this, recall that the distribution of $X$ is $\delta_x$ if $P(X\in B)=1$ when $x\in B$ and $P(X\in B)=0$ when $x\notin B$, and that the distribution of $Y_\sigma$ is $N(x,\sigma^2)$ if the density of this distribution is the function you know, or, equivalently, if $Y_\sigma$ is distributed like $x+\sigma Z$, where the distribution of $Z$ is $N(0,1)$. As a consequence, for every positive $t$,
$$P(|Y_\sigma -x|\ge t)=P(|x+\sigma Z-x|\ge t)=P(|Z|\ge t/\sigma).$$
One sees that this goes to $0$ when $\sigma^2\to0$ and, likewise, that $P(|Y_\sigma -x|\le t)\to1$. Since $P(|X -x|\ge t)=0$ and $P(|X -x|\le t)=1$, $Y_\sigma$ converges (in distribution) to $X$.
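The tail probability in the display above can be checked numerically. A minimal sketch, with illustrative values $x=2$ and $t=0.5$ (these are assumptions, not from the answer), using the standard normal CDF from Python's standard library:

```python
from statistics import NormalDist

x, t = 2.0, 0.5              # center of the delta and a fixed tolerance (illustrative)
Z = NormalDist(0.0, 1.0)     # standard normal, so Y_sigma ~ x + sigma * Z

tails = []
for sigma in (1.0, 0.1, 0.01):
    # P(|Y_sigma - x| >= t) = P(|Z| >= t / sigma) = 2 * (1 - Phi(t / sigma))
    tails.append(2.0 * (1.0 - Z.cdf(t / sigma)))

print(tails)  # decreases toward 0 as sigma -> 0
```

The value of $x$ is irrelevant here, which is exactly the point of the algebra above: the tail depends only on $t/\sigma$.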

Another (related) reason to include Dirac probability measures is the characterisation of Gaussian families:

To see this, recall that the vector-valued random variable $X=(X_1,\ldots,X_n)$ is Gaussian iff every linear combination $w\cdot X=w_1X_1+\cdots+w_nX_n$ of its coordinates is (real-valued) Gaussian. Here again one may land on degenerate Gaussian random variables, for example, if $X_1$ is $N(0,1)$, one wants $X=(X_1,X_2,X_3)$ to be Gaussian for $X_2=2X_1$ and $X_3=3X_1-4$, hence one is pleased that $2X_1-X_2=w\cdot X$ for $w=(2,-1,0)$ is Gaussian although its distribution is $\delta_0$, and likewise that $3X_1-X_3=v\cdot X$ for $v=(3,0,-1)$ is Gaussian although its distribution is $\delta_4$.

To sum up, Gaussianity is (and should be) a closed property when one considers limits in distribution and linear combinations of random variables.

I have a slightly more analytic explanation of why the delta distribution is a limit of Gaussians than the one by @Did.

The delta distribution is nothing more than a linear functional defined on the space of test functions $C_c^\infty(\mathbf R^d)$. It is given by the following duality pairing
$$\left<\delta, \phi\right> = \phi(0).$$

This view is thanks to Laurent Schwartz, a fabulous mathematician.

Now $\langle f, g\rangle = \int fg$ when an ordinary function $f$ is viewed as a distribution. So we can see that a sequence of Gaussians (there are many other choices as well) approximates the delta distribution as follows:

Define $s_m(x) = \sqrt{\frac{m}{\pi}} e^{-m x^2}$, our sequence of Gaussians.

We can now prove that for a bounded, integrable function $\phi$ which is continuous in $0$ (so a test function certainly satisfies this) we have that

$$\lim_{m \to \infty} \int_{-\infty}^{\infty} \sqrt{\frac{m}{\pi}} e^{-m x^2} \phi(x) \, dx = \phi(0).$$

We have

$$\int_{-\infty}^{\infty} \sqrt{\frac{m}{\pi}} e^{-m x^2} \phi(x) \, dx = \phi(0) + \int_{-\infty}^{\infty} \sqrt{\frac{m}{\pi}} e^{-m x^2} (\phi(x) - \phi(0)) \, dx$$

By the boundedness of $\phi$ we can show that the second integral goes to zero by splitting it as $\int_{-\infty}^{-A} + \int_{-A}^{A} + \int_{A}^{\infty}$ for some finite number $A > 0$. The continuity of $\phi$ at $0$ helps us make the middle integral arbitrarily small.
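The limit can also be observed numerically. A minimal sketch under stated assumptions: the test function is taken to be $\phi = \cos$ (bounded, continuous at $0$, with $\phi(0)=1$; this choice is mine, not the answer's), the integral is truncated to $[-5, 5]$ where the Gaussian tails are negligible, and `pairing` is a hypothetical midpoint-rule helper:

```python
import math

def s(m: int, x: float) -> float:
    """The Gaussian s_m(x) = sqrt(m/pi) * exp(-m x^2)."""
    return math.sqrt(m / math.pi) * math.exp(-m * x * x)

def pairing(m: int, phi, a: float = -5.0, b: float = 5.0, n: int = 200_000) -> float:
    """Midpoint-rule approximation of <s_m, phi> = int s_m(x) phi(x) dx."""
    h = (b - a) / n
    return h * sum(s(m, a + (i + 0.5) * h) * phi(a + (i + 0.5) * h) for i in range(n))

phi = math.cos  # bounded, continuous at 0, phi(0) = 1
values = [pairing(m, phi) for m in (1, 10, 100, 1000)]
print(values)  # tends to phi(0) = 1 as m grows
```

For this particular $\phi$ the exact pairing is $e^{-1/(4m)}$, so the convergence to $1$ is visible already at moderate $m$.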

For me the following made it easier to understand: the delta distribution (or function, whatever you want to call it) is a function, but not on the real line. It is a linear functional on the space of test functions.

There is some confusion here.

When you say “If I assign the same probability to each user, they have identical call generation probability which can be defined as $\delta$”, I interpret this as: “In any given hour, the probability that any given user will call is $\delta$, independently of what the other users do during this hour.” Under this model the number $X$ of users calling in any given hour is *binomially distributed*, which means that if you plot, for each $k$ on the $x$-axis, the probability that exactly $k$ users call during that hour, you will see a bell-shaped staircase with its peak at $x=N\delta$, where $N$ is the total number of users.
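The bell-shaped staircase is easy to verify. A small sketch with illustrative numbers of my own choosing ($N=100$ users, per-user call probability $\delta=0.3$):

```python
from math import comb

N, delta = 100, 0.3   # illustrative: N users, per-user call probability delta

def binom_pmf(k: int) -> float:
    """P(exactly k of the N users call in the hour), i.e. Binomial(N, delta)."""
    return comb(N, k) * delta**k * (1 - delta)**(N - k)

total = sum(binom_pmf(k) for k in range(N + 1))
peak = max(range(N + 1), key=binom_pmf)
print(peak)  # the staircase peaks at N * delta = 30
```

Plotting `binom_pmf(k)` against `k` produces exactly the staircase described above, with the bell centered at $N\delta$.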

Using the letter $\delta$ for a certain probability does not mean that the “$\delta$-function”, which is an official thing, is involved per se. As long as you are dealing with reasonable numbers, as in your example, you remain in the world of Gaussians, the limit of such staircases. In planning of demand (say, for hospital beds) you will have to deal with the width of these Gaussians.

Now the official “$\delta$-function” has width zero; but it can indeed be thought of as a limit of Gaussians. I won’t go into the details here, but if you let the number $N$ of users go to $\infty$ and at the same time scale your time intervals ($1$ hour in your example) appropriately, then you will see the resulting Gaussians converge to a $\delta$-function. In an intuitive sense, this is nothing else but the law of large numbers.