I have a set of users that generate calls. If I assign the same probability to each user, they all have an identical call generation probability, which can be denoted $\delta$. Callers are chosen uniformly from the set of users. At the end of the generation process, the probability density function of the call rates should be a delta function (so the shape is similar to a bell, isn't it?)

The probability I assigned to each user is:

$$p_u = \frac{\lambda}{\sum_{i \in N_u} \lambda}$$

where $\lambda = \frac{1}{N_u}$ and $N_u$ is the number of users. In this way the users are equally partitioned between 0 and 1, and I can draw a uniformly distributed random number in order to select a random user.
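The selection scheme just described can be sketched in a few lines (a minimal illustration; the user count of 10 and the number of draws are assumptions for the demo): partition $[0,1)$ into $N_u$ equal slots of width $\lambda = 1/N_u$ and map a uniform draw to a slot index.

```python
import random
from collections import Counter

def pick_user(n_users: int) -> int:
    """Partition [0, 1) into n_users equal slots of width
    lambda = 1/n_users and map a uniform draw to its slot index."""
    u = random.random()       # uniform in [0, 1)
    return int(u * n_users)   # slot index = selected user

# With 10 users and many draws, each user is selected with
# probability 1/10, so the counts come out roughly equal.
counts = Counter(pick_user(10) for _ in range(100_000))
```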

My question is: how can I demonstrate that this is really a delta function? Is the information I wrote enough to define the delta function (I don't know whether it is possible to formalize the p.d.f.)?

For example, in the figure we have 10,000 users with the same generation probability: if I generate ca. 605,000 calls, the average is ca. 60.5 calls per user.

Can someone also explain to me the fact that the delta function is considered as a limit of Gaussians?

Delta probability measures are limits of (nondegenerate) Gaussian probability measures and in fact one often defines the class of Gaussian probability measures as the union of the class of positive variance Gaussian probability measures and of the class of Dirac probability measures.

That is, Delta probability measures **ARE** Gaussian.

One reason is simple, which is that every Delta probability measure $\delta_x$ is the limit of the (nondegenerate) Gaussian probability measures $N(x,\sigma^2)$ when $\sigma^2\to0$:

- To see this, recall that the distribution of $X$ is $\delta_x$ if $P(X\in B)=1$ when $x\in B$ and $P(X\in B)=0$ when $x\notin B$, and that the distribution of $Y_\sigma$ is $N(x,\sigma^2)$ if the density of this distribution is the function you know or, equivalently, if $Y_\sigma$ is distributed like $x+\sigma Z$, where the distribution of $Z$ is $N(0,1)$. As a consequence, for every positive $t$,

$$
P(|Y_\sigma -x|\ge t)=P(|x+\sigma Z-x|\ge t)=P(|Z|\ge t/\sigma).
$$

One sees that this goes to $0$ when $\sigma^2\to0$ and, likewise, that $P(|Y_\sigma -x|\le t)\to1$. Since $P(|X -x|\ge t)=0$ and $P(|X -x|\le t)=1$, $Y_\sigma$ converges (in distribution) to $X$.
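A quick numeric check of this convergence, sketched with Python's standard library (the particular values of $t$ and $\sigma$ are arbitrary choices for the demo):

```python
from statistics import NormalDist

Z = NormalDist(0.0, 1.0)  # standard normal distribution

def tail_prob(t: float, sigma: float) -> float:
    """P(|Y_sigma - x| >= t) = P(|Z| >= t/sigma) for Y_sigma ~ N(x, sigma^2)."""
    return 2.0 * (1.0 - Z.cdf(t / sigma))

# For fixed t > 0 the tail probability vanishes as sigma -> 0,
# matching P(|X - x| >= t) = 0 for X with distribution delta_x.
t = 0.5
probs = [tail_prob(t, s) for s in (1.0, 0.5, 0.1, 0.01)]
```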

Another (related) reason to include Dirac probability measures is the characterisation of Gaussian *families*:

- To see this, recall that the vector-valued random variable $X=(X_1,\ldots,X_n)$ is Gaussian iff every linear combination $w\cdot X=w_1X_1+\cdots+w_nX_n$ of its coordinates is (real-valued) Gaussian. Here again one may land on degenerate Gaussian random variables, for example, if $X_1$ is $N(0,1)$, one wants $X=(X_1,X_2,X_3)$ to be Gaussian for $X_2=2X_1$ and $X_3=3X_1-4$, hence one is pleased that $2X_1-X_2=w\cdot X$ for $w=(2,-1,0)$ is Gaussian although its distribution is $\delta_0$, and likewise that $3X_1-X_3=v\cdot X$ for $v=(3,0,-1)$ is Gaussian although its distribution is $\delta_4$.
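The degenerate combinations in this example are easy to verify numerically; a small sketch (the sample size is an arbitrary choice):

```python
import random

# X1 ~ N(0,1), X2 = 2*X1, X3 = 3*X1 - 4, as in the example above.
# The combinations w.X and v.X are constant, i.e. have
# distributions delta_0 and delta_4 respectively.
samples = [random.gauss(0.0, 1.0) for _ in range(1_000)]
w_dot_x = [2.0 * x1 - (2.0 * x1) for x1 in samples]        # w = (2, -1, 0)
v_dot_x = [3.0 * x1 - (3.0 * x1 - 4.0) for x1 in samples]  # v = (3, 0, -1)
```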

To sum up, Gaussianity is (and should be) a *closed* property when one considers limits in distribution and linear combinations of random variables.

I have a slightly more analytic explanation than the one by @Did of why the delta distribution is a limit of Gaussians.

The delta distribution is nothing more than a linear functional on the space of test functions $C_c^\infty(\mathbf R^d)$. It is given by the following duality pairing

$$\left<\delta, \phi\right> = \phi(0).$$

This view is thanks to Laurent Schwartz, a fabulous mathematician.

Now $\langle f, g\rangle = \int fg$. So we can see that a sequence of Gaussians (there are many others as well) approximate the delta distribution as follows:

Define $s_m(x) = \sqrt{\frac{m}{\pi}} e^{-m x^2}$, our sequence of Gaussians.

We can now prove that for a bounded, integrable function $\phi$ which is continuous at $0$ (so a test function certainly satisfies this) we have that

$$\lim_{m \to \infty} \int_{-\infty}^{\infty} \sqrt{\frac{m}{\pi}} e^{-m x^2} \phi(x) \, dx = \phi(0).$$

We have

$$\int_{-\infty}^{\infty} \sqrt{\frac{m}{\pi}} e^{-m x^2} \phi(x) \, dx = \phi(0) + \int_{-\infty}^{\infty} \sqrt{\frac{m}{\pi}} e^{-m x^2} (\phi(x) - \phi(0)) \, dx$$

By the boundedness of $\phi$ we can show that the second integral goes to zero by splitting it as $\int_{-\infty}^{-A} + \int_A^\infty + \int_{-A}^A$ for some finite number $A > 0$. The continuity will help us make the last integral arbitrarily small.
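This limit is also easy to check numerically. Below is a minimal sketch using the trapezoid rule with $\phi = \cos$ (an arbitrary bounded choice with $\phi(0) = 1$; for this particular $\phi$ the pairing even has the closed form $e^{-1/(4m)}$):

```python
import math

def s_m(x: float, m: float) -> float:
    """Nascent delta: sqrt(m/pi) * exp(-m x^2), a Gaussian of total mass 1."""
    return math.sqrt(m / math.pi) * math.exp(-m * x * x)

def pairing(m: float, phi, lo=-10.0, hi=10.0, n=100_000) -> float:
    """Trapezoid approximation of the integral of s_m(x) * phi(x)."""
    h = (hi - lo) / n
    total = 0.5 * (s_m(lo, m) * phi(lo) + s_m(hi, m) * phi(hi))
    total += sum(s_m(lo + k * h, m) * phi(lo + k * h) for k in range(1, n))
    return total * h

# For phi = cos the exact value is exp(-1/(4m)), which tends to
# phi(0) = 1 as m -> infinity.
vals = [pairing(m, math.cos) for m in (1, 10, 100, 1000)]
```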

For me the following made it easier to understand: The delta distribution (or function, whatever you want) *is* a function, but not on the real line. It is a linear functional on the space of test functions.

There is some confusion here.

When you say "If I assign the same probability to each user, they have identical call generation probability which can be defined as $\delta$", I interpret this as: "In any given hour, the probability that any given user will call is $\delta$, independently of what the other users do during this hour." Under this model the number $X$ of users calling in any given hour is *binomially distributed*, which means that plotting for each $k$ on the $x$-axis the probability that exactly $k$ users will call during that hour, you will see a bell-shaped staircase with peak at $x=N\delta$, where $N$ is the total number of users.
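A small sketch of this staircase (toy numbers assumed for illustration: $N = 100$ users, $\delta = 0.3$); the probabilities peak at $k = N\delta = 30$:

```python
import math

def binom_pmf(n: int, k: int, p: float) -> float:
    """P(exactly k of n users call), each calling independently w.p. p."""
    return math.comb(n, k) * p**k * (1.0 - p) ** (n - k)

# Toy numbers for illustration: N = 100 users, delta = 0.3.
N, delta = 100, 0.3
pmf = [binom_pmf(N, k, delta) for k in range(N + 1)]
mode = max(range(N + 1), key=lambda k: pmf[k])  # peak of the staircase
```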

Using the letter $\delta$ for a certain probability does not mean that the “$\delta$-function”, which is an official thing, is involved per se. As long as you are dealing with reasonable numbers, as in your example, you remain in the world of Gaussians, the limit of such staircases. In planning of demand (say, for hospital beds) you will have to deal with the width of these Gaussians.

Now the official "$\delta$-function" has width zero; but it can indeed be thought of as a limit of Gaussians. I won't go into the details here, but if you let the number $N$ of users go to $\infty$ and at the same time scale your time intervals ($1$ hour in your example) appropriately, then you will see the resulting Gaussian converge to a $\delta$-function. In an intuitive sense, this is nothing else but the law of large numbers.
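A simulation sketch of this narrowing (all numbers are assumptions for the demo): the fraction of users who call in an hour concentrates at $\delta$ as $N$ grows, with spread shrinking like $\sqrt{\delta(1-\delta)/N}$.

```python
import math
import random

def calling_fraction(n_users: int, delta: float) -> float:
    """One simulated hour: the fraction of users who call."""
    return sum(random.random() < delta for _ in range(n_users)) / n_users

# As N grows the empirical bell narrows: its standard deviation
# shrinks like sqrt(delta*(1-delta)/N), squeezing toward a
# delta function located at delta.
delta, reps = 0.3, 200
spreads = []
for n in (100, 1_000, 10_000):
    fractions = [calling_fraction(n, delta) for _ in range(reps)]
    mean = sum(fractions) / reps
    spreads.append(math.sqrt(sum((f - mean) ** 2 for f in fractions) / reps))
```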
