Articles on entropy

Entropy of sum of random variables

Let $x_1,x_2,\dots,x_n$ be random variables which take the values $0$ or $1$ with $P(x_i = 1) = p_i$ and $P(x_i = 0) = 1-p_i$, where $0 \leq p_i \leq 1$ for $i=1,2,\dots,n$. Let $$X= \sum_{i=1}^n x_i.$$ Is it true that the entropy of $X$ is maximal when $p_i = 1/2$ for $i=1,2,\dots,n$?
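A quick way to experiment with this (not a proof) is to build the PMF of $X$ by convolution and compare $H(X)$ at $p_i = 1/2$ against randomly drawn $p$ vectors. The sketch below additionally assumes the $x_i$ are independent, which the excerpt does not state; all function names are hypothetical.

```python
import numpy as np

def pmf_of_sum(ps):
    """PMF of X = x_1 + ... + x_n for independent Bernoulli(p_i)."""
    pmf = np.array([1.0])
    for p in ps:
        pmf = np.convolve(pmf, [1.0 - p, p])
    return pmf

def entropy_bits(pmf):
    pmf = pmf[pmf > 0]
    return -np.sum(pmf * np.log2(pmf))

n = 4
h_half = entropy_bits(pmf_of_sum([0.5] * n))        # all p_i = 1/2
rng = np.random.default_rng(0)
h_rand = max(entropy_bits(pmf_of_sum(rng.uniform(size=n)))
             for _ in range(10_000))                # random p vectors
print(h_half, h_rand)
```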

Question about entropy

Let $(X,A,\nu)$ be a probability space and $T\colon X\to X$ a transformation preserving the measure $\nu$. Take a measurable partition $P=\{P_0,\dots,P_{k-1}\}$. Let $I$ be the set of all possible itineraries, that is, $I=\{(i_1,\dots,i_n,\dots)\in k^{\Bbb N} : \text{there is an } x\in X \text{ such that } T^n(x)\in P_{i_n} \text{ for all } n\in\Bbb N\}$. Suppose that $I$ is countably infinite. Is it true that the […]
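The question is about a general system, but the definition of $I$ is easy to probe on a toy example. The sketch below (hypothetical names) takes the doubling map $T(x)=2x \bmod 1$ on $[0,1)$ with the two-interval partition and enumerates the length-$n$ itinerary words that occur; for this map every word occurs, so $I$ is uncountable, which is exactly the situation the excerpt's hypothesis rules out.

```python
def itinerary(x, n):
    """Length-n itinerary of x under T(x) = 2x mod 1,
    with P_0 = [0, 1/2) and P_1 = [1/2, 1)."""
    word = []
    for _ in range(n):
        word.append(0 if x < 0.5 else 1)
        x = (2.0 * x) % 1.0
    return tuple(word)

n = 8
seen = {itinerary(k / 10_000, n) for k in range(10_000)}
print(len(seen), 2 ** n)   # every 0/1 word of length 8 occurs here
```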

Entropy of a natural number

Let $H(n) = -\sum_{d|n} \frac{d}{\sigma(n)} \log(\frac{d}{\sigma(n)}) = \log(\sigma(n))-\frac{1}{\sigma(n)}\sum_{d|n} d\log(d)$ be an entropy defined on the natural numbers. I can prove that $H$ is additive: $\gcd(n,m) = 1 \implies H(mn) = H(n)+H(m)$. Numerically it appears that $\lim_{\alpha \rightarrow \infty} H(p^\alpha)$ always exists, where $p$ is a prime. However, I am not able to […]
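Both observations are easy to reproduce numerically. A minimal sketch, with naive trial-division divisor enumeration (fine for small $n$) and a separate routine for prime powers, whose divisors are just $1, p, \dots, p^\alpha$; names are hypothetical:

```python
from math import log

def H(n):
    """H(n) = -sum_{d|n} (d/sigma(n)) * log(d/sigma(n)), natural log."""
    ds = [d for d in range(1, n + 1) if n % d == 0]   # small n only
    s = sum(ds)
    return -sum((d / s) * log(d / s) for d in ds)

def H_prime_power(p, a):
    """Same quantity for n = p^a, using divisors 1, p, ..., p^a."""
    ds = [p ** k for k in range(a + 1)]
    s = sum(ds)
    return -sum((d / s) * log(d / s) for d in ds)

# Additivity on coprime arguments:
print(abs(H(9 * 10) - (H(9) + H(10))) < 1e-12)

# H(p^a) appears to settle down as a grows (here p = 2):
for a in (5, 10, 20, 50):
    print(a, H_prime_power(2, a))
```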

Entropy of matrix

I am trying to understand entropy. From what I know, we can compute the entropy of a single variable, say $X$. What I don't understand is how to calculate the entropy of an $m \times n$ matrix. I thought that if the columns are the attributes and the rows are the objects, we could sum the entropies of the individual columns […]
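One reading of the question, sketched below with hypothetical names, is the per-column convention: estimate each column's entropy from its empirical distribution and sum. The caveat worth encoding is that this sum equals the joint entropy of a random row only when the columns are independent; in general it is an upper bound (subadditivity).

```python
import numpy as np

def column_entropy_bits(col):
    """Shannon entropy of one column's empirical distribution."""
    _, counts = np.unique(col, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def matrix_entropy_bits(M):
    """Sum of per-column entropies: the joint entropy of a random row
    if columns are independent, otherwise an upper bound on it."""
    return sum(column_entropy_bits(M[:, j]) for j in range(M.shape[1]))

M = np.array([[0, 1, 1],
              [0, 1, 0],
              [1, 0, 1],
              [1, 0, 0]])
print(matrix_entropy_bits(M))
```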

Approximation of Shannon entropy by trigonometric functions

Define Shannon entropy by $$I(p) = -p \log_2 p.$$ Numerical experimentation shows that $\sin(\pi p)^{1-1/e}$ is a good approximation to $I(p) + I(1-p)$ on $[0,1]$, never differing by more than 3.3%. A little more experimentation shows that the $L^{1}$ norm of the difference between $I(p) + I(1-p)$ and $\sin(\pi p)^{x}$ is minimized for $x = […]
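The 3.3% figure is easy to reproduce. A short sketch comparing the binary entropy $I(p)+I(1-p)$ with $\sin(\pi p)^{1-1/e}$ on a grid, along with a grid estimate of the $L^1$ distance (the quantity the truncated sentence minimizes over $x$):

```python
import numpy as np

p = np.linspace(1e-9, 1 - 1e-9, 200_001)
h = -p * np.log2(p) - (1 - p) * np.log2(1 - p)   # I(p) + I(1-p)
approx = np.sin(np.pi * p) ** (1 - 1 / np.e)

print(np.max(np.abs(h - approx)))    # max gap, about 0.033
print(np.mean(np.abs(h - approx)))   # grid estimate of the L^1 distance
```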

Topological entropy of the general tent map

The linked question ( Measure theoretic entropy of General Tent maps ) made me wonder how to calculate the topological entropy of a general tent map. Let $I=[0,1]$ and $\alpha \in (0,1)$. Define $T: I \rightarrow I$ by $T(x)= x/\alpha$ for $x \in [0,\alpha]$ and $T(x)=(1-x)/(1-\alpha)$ for $x \in [\alpha,1]$. What is the topological entropy of […]
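For this family both branches map onto $[0,1]$, so a generic point $y$ has exactly two preimages and $\#T^{-n}(y)$ grows like $2^n$. Counting preimages gives a quick numerical heuristic (not a proof) suggesting topological entropy $\log 2$ for every $\alpha$; names below are hypothetical.

```python
import math

def preimages(y, alpha):
    """Both branches of T map onto [0,1], so every y has two preimages."""
    return [alpha * y, 1.0 - (1.0 - alpha) * y]

def entropy_estimate(alpha, n=15, y=0.3):
    """(1/n) * log #T^{-n}(y); for piecewise monotone interval maps this
    preimage growth rate reflects the topological entropy (heuristic)."""
    pts = [y]
    for _ in range(n):
        pts = [x for q in pts for x in preimages(q, alpha)]
    return math.log(len(pts)) / n

for alpha in (0.3, 0.5, 0.7):
    print(alpha, entropy_estimate(alpha), math.log(2))
```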

Limit for entropy of prime powers defined by multiplicative arithmetic function

This question is related to my other question ( Entropy of a natural number ). Let $f \ge 0$ be a multiplicative arithmetic function and $F(n) = \sum_{d|n}f(d)$. Define the entropy of $n$ with respect to $f$ to be $H_f(n) = -\sum_{d|n} \frac{f(d)}{F(n)}\log(\frac{f(d)}{F(n)}) = \log(F(n)) - \frac{1}{F(n)}\sum_{d|n} f(d)\log(f(d))$. For instance, in the last question we […]
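For prime powers the divisors are just $1, p, \dots, p^\alpha$, so $H_f(p^\alpha)$ is cheap to tabulate for any candidate $f$. A sketch (hypothetical names), using $f(d)=d$, which recovers $H$ from the earlier question, and $f(d)=d^2$ as a second multiplicative choice:

```python
from math import log

def H_f_prime_power(p, a, f):
    """H_f(p^a), using that the divisors of p^a are 1, p, ..., p^a."""
    w = [f(p ** k) for k in range(a + 1)]
    F = sum(w)
    return -sum((x / F) * log(x / F) for x in w if x > 0)

for a in (1, 5, 10, 30):
    print(a, H_f_prime_power(2, a, lambda d: d),
          H_f_prime_power(2, a, lambda d: d * d))
```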

“Empirical” entropy

Information entropy is usually defined as $$\text{I}_b({\bf p}) = -\sum_{i}p_i\log_b(p_i),$$ i.e. the expected value of the negative logarithm of the probabilities. This is all well and good when we have a finite set of outcomes $i$. It can also be estimated using a histogram, treating all values within each bin as the same outcome. Doing this […]
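A minimal plug-in estimator along these lines, assuming NumPy and hypothetical names. The bin width $w$ matters: for a continuous density the binned entropy behaves like $h(X) - \log_2 w$, so adding $\log_2 w$ back gives an estimate of the differential entropy, checked here against the exact value for a standard normal.

```python
import numpy as np

def empirical_entropy_bits(samples, bins=64):
    """Plug-in estimate: bin the samples, apply -sum p log2 p.
    For a continuous density with bin width w this behaves like
    h(X) - log2(w), so the bin choice shifts the answer."""
    counts, edges = np.histogram(samples, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p)), edges[1] - edges[0]

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)
H_hat, w = empirical_entropy_bits(x)
# differential-entropy estimate vs. the exact 0.5*log2(2*pi*e):
print(H_hat + np.log2(w), 0.5 * np.log2(2 * np.pi * np.e))
```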

Equivalence between uniform and normal distributions

The principle of insufficient reason says that all outcomes are equiprobable when we have no knowledge to suggest otherwise. I understand this, and that it corresponds to the uniform distribution. However, different sources say this is only true in the discrete case; for continuous distributions, the normal distribution corresponds to maximum entropy. Here is Wikipedia: the maximum […]
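The resolution is that the maximizer depends on the constraint: the uniform distribution maximizes entropy under a support constraint, the normal under a variance constraint on all of $\mathbb{R}$. A one-line check at equal variance $\sigma^2$, using the standard closed forms (in nats):

```python
import numpy as np

sigma = 1.0
h_normal = 0.5 * np.log(2 * np.pi * np.e * sigma ** 2)   # nats
h_uniform = np.log(np.sqrt(12.0) * sigma)   # U(a,b) has Var = (b-a)^2/12
print(h_normal, h_uniform)   # ~1.419 vs ~1.242: normal wins at fixed variance
```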

Prove that bitstrings with a 1/0 ratio different from 50/50 are compressible

I’m looking for a proof that $$ \sum_{i=0}^{\lambda n} \binom{n}{i} \le 2^{nH(\lambda)} $$ with $n>0$, $0 \le \lambda \le 1/2$ and $H(\lambda)=-[\lambda \log_2 \lambda + (1-\lambda) \log_2 (1-\lambda)]$. Context: this shows that if there is a bitstring with a ratio of ones and zeros (‘pick i from n’ where i is smaller than […]
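Before proving it, the inequality is easy to stress-test numerically; a brute-force check over small $n$ and a few $\lambda \le 1/2$, with exact binomials via math.comb (hypothetical helper names):

```python
from math import comb, log2

def H(lam):
    """Binary entropy H(lam) in bits, with H(0) = H(1) = 0."""
    if lam in (0.0, 1.0):
        return 0.0
    return -(lam * log2(lam) + (1 - lam) * log2(1 - lam))

def holds(n, lam):
    lhs = sum(comb(n, i) for i in range(int(lam * n) + 1))
    return lhs <= 2 ** (n * H(lam))

print(all(holds(n, lam)
          for n in range(1, 200)
          for lam in (0.0, 0.1, 0.25, 0.4, 0.5)))
```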