
Entropy: An inequality. (English) Zbl 0685.28010

The authors provide a simple proof of an elementary inequality concerning entropy, which they have found useful in previous work on the Rudin-Shapiro sequence.
Let \((p_k)_{k\ge 0}\) be a probability vector satisfying, for some \(\lambda>0\), \[ \lambda p_n \ge \sum_{k=n+1}^{\infty} p_k \qquad (n=0,1,2,\dots). \] The authors give a bound for \(\sum_{k=0}^{\infty} p_k^{\alpha}\) with \(\alpha<1\) and use it to show \[ \sum_{k=0}^{\infty} p_k \log\frac{1}{p_k} \le \sum_{k=0}^{\infty} q_k \log\frac{1}{q_k}, \] where \((q_k)\) is the probability vector forming a geometric sequence for which \(\lambda q_n = \sum_{k=n+1}^{\infty} q_k\) \((n=0,1,2,\dots)\).
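For illustration (the explicit form below is a routine consequence of the defining relation, not a formula quoted from the paper under review): subtracting consecutive instances of the relation gives \(\lambda q_{n+1} = \lambda q_n - q_{n+1}\), i.e. \(q_{n+1} = \frac{\lambda}{1+\lambda}\, q_n\), so \[ q_k = \frac{1}{1+\lambda}\Bigl(\frac{\lambda}{1+\lambda}\Bigr)^{k} \qquad (k=0,1,2,\dots), \] the value \(q_0 = 1/(1+\lambda)\) being forced by the normalization \(\sum_{k=0}^{\infty} q_k = 1\).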
Reviewer: Y.Peres

MSC:

28D20 Entropy and other invariants
11B99 Sequences and sets
60F99 Limit theorems in probability theory
11K99 Probabilistic theory: distribution modulo \(1\); metric theory of algorithms
26D15 Inequalities for sums, series and integrals
Full Text: DOI