We know that $l_i=\log \frac{1}{p_i}$ is the solution to Shannon's source compression problem: $\arg \min_{\{l_i\}} \sum_i p_i l_i$, where the minimization is over all possible code-length assignments $\{l_i\}$ satisfying the Kraft inequality $\sum_i 2^{-l_i}\le 1$.

Also, $H(p)=\log \frac{1}{p}$ is additive in the following sense: if $E$ and $F$ are two independent events with probabilities $p$ and $q$ respectively, then $H(pq)=H(p)+H(q)$.

As far as I know, it is mainly for these two reasons that $H(p)=\log \frac{1}{p}$ is considered a measure of the information contained in a random event $E$ with probability $p>0$.
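Both properties are easy to check numerically. Below is a small sketch of my own (the distribution `p` is just an illustrative dyadic example): for a dyadic distribution the ideal lengths $l_i=\log_2 \frac{1}{p_i}$ are integers, the Kraft inequality holds with equality, and the average length equals the Shannon entropy.

```python
import math

# A toy dyadic distribution; the ideal Shannon code lengths are
# l_i = log2(1/p_i), which here happen to be integers.
p = [0.5, 0.25, 0.125, 0.125]
lengths = [math.log2(1 / pi) for pi in p]

# Kraft inequality holds with equality for these lengths...
kraft = sum(2 ** -li for li in lengths)   # 1.0

# ...and the expected length equals the Shannon entropy H(p).
avg_len = sum(pi * li for pi, li in zip(p, lengths))   # 1.75
entropy = -sum(pi * math.log2(pi) for pi in p)         # 1.75
```

For non-dyadic distributions the ideal lengths are not integers, but rounding up to $\lceil \log_2 \frac{1}{p_i} \rceil$ still satisfies Kraft and stays within one bit of the entropy.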

On the other hand, if we instead minimize an average of exponentiated lengths, $\sum_i p_i 2^{t l_i}$ with $t>0$, subject to the same Kraft inequality constraint (this is Campbell's problem), the optimal solution is $l_i=\log \frac{1}{p_i'}$, where $p_i'=\frac{p_i^{\alpha}}{\sum_k p_k^{\alpha}}$ and $\alpha=\frac{1}{1+t}$.

Now $H_{\alpha}(p_i)=\log \frac{1}{p_i'}$ is also additive in the sense that $H_{\alpha}(p_i p_j)=H_{\alpha}(p_i)+H_{\alpha}(p_j)$. Moreover $H_{\alpha}(1)=0$ as in the case of Shannon’s measure.

Also note that when $\alpha=1$ (i.e., as $t \to 0$) we have $p_i'=p_i$, so $H_1(p_i)=\log \frac{1}{p_i}$ and we recover Shannon's measure.
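Both observations can be sketched numerically. The snippet below is my own illustration (the function name `escort` and the sample distributions are assumptions, not from the original problem): at $\alpha=1$ the tilted distribution $p'$ collapses to $p$, and additivity follows because the escort of a product distribution factors into the product of the escorts.

```python
import math

def escort(p, alpha):
    """Escort distribution p'_i = p_i**alpha / sum_k p_k**alpha."""
    w = [pi ** alpha for pi in p]
    z = sum(w)
    return [wi / z for wi in w]

p = [0.7, 0.2, 0.1]
q = [0.6, 0.4]
alpha = 0.5

# At alpha = 1 (t -> 0) the escort distribution is p itself,
# so H_alpha(p_i) = log(1/p'_i) reduces to Shannon's log(1/p_i).
p1 = escort(p, 1.0)

# Additivity: for two independent sources, the escort of the product
# distribution equals the product of the escorts, hence
# log(1/p'_{ij}) = log(1/p'_i) + log(1/q'_j).
prod = [pi * qj for pi in p for qj in q]
lhs = escort(prod, alpha)
rhs = [a * b for a in escort(p, alpha) for b in escort(q, alpha)]
```

The factorization holds for any $\alpha$, since $(p_i q_j)^\alpha = p_i^\alpha q_j^\alpha$ and the normalizer splits accordingly.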

My question is: do these reasons suffice to call $H_{\alpha}(p_i)=\log \frac{1}{p_i'}$ a (generalized) measure of information?

I am not sure whether it makes sense for the measure of information of an event to depend also on the probabilities of the other events.

That is exactly the extension known as Rényi entropy, with a normalization factor of $\frac{1}{1-\alpha}$:

$$H_\alpha(X) = \frac{1}{1-\alpha}\log\Bigg(\sum_{i=1}^n p_i^\alpha\Bigg)$$
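As a quick sanity check (my own sketch; the sample distribution is arbitrary), the Rényi entropy converges to the Shannon entropy as $\alpha \to 1$:

```python
import math

def renyi(p, alpha):
    """Renyi entropy H_alpha(X) = log2(sum p_i**alpha) / (1 - alpha).
    The alpha = 1 case is defined by the limit, which is Shannon entropy."""
    if abs(alpha - 1.0) < 1e-12:
        return -sum(pi * math.log2(pi) for pi in p if pi > 0)
    return math.log2(sum(pi ** alpha for pi in p)) / (1 - alpha)

p = [0.5, 0.25, 0.25]
shannon = -sum(pi * math.log2(pi) for pi in p)  # 1.5 bits

# H_alpha approaches the Shannon entropy as alpha -> 1.
values = [renyi(p, a) for a in (0.5, 0.99, 0.999, 1.0)]
```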

- Rényi, Alfréd (1961). “On measures of information and entropy”. Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability, 1960. pp. 547–561.
