Consider this a soft question.
Information theory is a fairly young branch of mathematics (about 60 years old).
I am interested in whether there have been any information-theoretic results that had an impact on other, seemingly 'unrelated' branches of mathematics.
Looking forward to hearing your thoughts.
There is a theorem that every comparison sorting algorithm must make $\Omega(n\log n)$ comparisons in the worst case, and this limit is often referred to as an information-theoretic bound on the running time. (For example, see Leiserson et al., Introduction to Algorithms (2001), pp. 181 and preceding, or Knuth, The Art of Computer Programming 3 (1998), pp. 182–183: “Relation (1) is often called the information-theoretic lower bound…”.)
This is because there are $n!$ possible orders for a list of $n$ items, and therefore $\lceil \log_2 n!\rceil$ bits are required to specify one of these orders. By Stirling’s approximation, $\log_2 n! = \Theta(n\log n)$.
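As a quick numerical check of this step, the quantity $\lceil \log_2 n!\rceil$ can be computed exactly for moderate $n$ and compared against the leading Stirling term $n\log_2 n$. This is just an illustrative sketch; the helper name `bits_to_specify_permutation` is my own:

```python
from math import factorial, log2

def bits_to_specify_permutation(n):
    # ceil(log2(n!)) computed exactly with integer arithmetic:
    # for an integer m >= 1, ceil(log2(m)) == (m - 1).bit_length()
    return (factorial(n) - 1).bit_length()

for n in [10, 100, 1000]:
    exact = bits_to_specify_permutation(n)
    stirling = n * log2(n)  # leading term of Stirling's approximation
    print(n, exact, round(stirling))
```

For example, $8! = 40320$, so $\lceil \log_2 8!\rceil = 16$: sixteen yes/no answers suffice to single out one of the $40320$ orderings of eight items, and no fewer can.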
But a comparison of two elements yields at most one bit of information about the permutation, so $\Omega(n\log n)$ comparisons are required in the worst case to identify which permutation was given.
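One can watch this bound bite on a concrete algorithm. The sketch below (my own illustration, not from the cited texts) runs a comparison-counting merge sort over every permutation of $n = 4$ elements and checks that its worst case is at least $\lceil \log_2 4!\rceil = \lceil \log_2 24\rceil = 5$ comparisons:

```python
from itertools import permutations
from math import factorial

def merge_sort(a):
    """Return (sorted copy of a, number of element comparisons used)."""
    if len(a) <= 1:
        return list(a), 0
    mid = len(a) // 2
    left, c_left = merge_sort(a[:mid])
    right, c_right = merge_sort(a[mid:])
    merged, i, j, count = [], 0, 0, c_left + c_right
    while i < len(left) and j < len(right):
        count += 1  # one comparison, hence at most one bit of information
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged += left[i:] + right[j:]
    return merged, count

n = 4
bound = (factorial(n) - 1).bit_length()  # exact ceil(log2(n!)) = 5
worst = max(merge_sort(list(p))[1] for p in permutations(range(n)))
print(worst, bound)
```

Here merge sort's worst case over the $24$ permutations meets the bound exactly; in general any correct comparison sort's worst case must be at least $\lceil \log_2 n!\rceil$, though on favorable inputs an individual run may use far fewer comparisons.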
The paper “The information theoretic bound for problems on ordered sets and graphs” (Michael Saks, 1985) says: “This is an example of an information theoretic bound (ITB) for a combinatorial decision problem, one of the few general techniques known for obtaining lower bounds on the computational complexity of a problem. … This paper is a survey of a number of computational problems involving ordered sets and graphs in which the ITB can be applied.”