# Information
In the late 1940s, while learning to juggle and ride a unicycle, Claude Shannon was working on what has now become known as information theory.
The central question was how to quantify the amount of information transmitted by a signal, such as a radio transmission or a sequence of 0s and 1s. At some point in this work, Shannon encountered the following problem:
Find all continuous functions \(I: (0,1] \rightarrow \mathbb{R}\) that have the following properties:
- \(I\) is non-negative and decreasing,
- \(I(1) = 0\), and
- \(I(xy) = I(x) + I(y)\) for all \(x, y \in (0,1]\).
It is not difficult to check that \(I(x) = -\log x\) satisfies these desiderata. Show that, in fact, every solution must take the form \(I(x) = -k \log x\) with \(k > 0\). With the choice \(k = 1\), \(I\) is now known as the information function.
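For instance, writing \(\log\) for the natural logarithm (any other base merely rescales the constant \(k\)), the candidate \(I(x) = -\log x\) can be checked directly against all three conditions:

\[
\begin{aligned}
&x \in (0,1] \implies \log x \le 0 \implies I(x) = -\log x \ge 0, \quad \text{and } I \text{ is decreasing since } \log \text{ is increasing;}\\[4pt]
&I(1) = -\log 1 = 0;\\[4pt]
&I(xy) = -\log(xy) = -\log x - \log y = I(x) + I(y).
\end{aligned}
\]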
You may be reminded of the notion of an additive homomorphism introduced in an analysis course. In your solution, you may freely use the fact that every continuous additive homomorphism of the reals, that is, every continuous function \(f:\mathbb{R} \rightarrow \mathbb{R}\) satisfying the functional equation \(f(x+y) = f(x) + f(y)\), takes the form \(f(x) = k \cdot x\) for some constant \(k\).
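One standard way to bring this fact to bear (a sketch of the usual substitution, not a complete solution) is to change variables via \(x = e^{-s}\): define \(f(s) = I(e^{-s})\) for \(s \ge 0\), noting that \(e^{-s} \in (0,1]\). Then \(f\) is continuous, and the multiplicative equation for \(I\) becomes the additive equation for \(f\):

\[
f(s+t) = I\!\left(e^{-(s+t)}\right) = I\!\left(e^{-s} \cdot e^{-t}\right) = I\!\left(e^{-s}\right) + I\!\left(e^{-t}\right) = f(s) + f(t).
\]

Strictly speaking this defines \(f\) only on \([0,\infty)\), while the quoted fact concerns functions on all of \(\mathbb{R}\); bridging that gap (or adapting the argument to the half-line) is part of the exercise, as is recovering \(I(x) = -k \log x\) and pinning down the sign of \(k\) from monotonicity.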