Kolmogorov complexity and information theory. With an interpretation in terms of questions and answers. (English) Zbl 1034.68051
Summary: We compare the elementary theories of Shannon information and Kolmogorov complexity, the extent to which they have a common purpose, and where they are fundamentally different. We discuss and relate the basic notions of both theories: Shannon entropy, Kolmogorov complexity, Shannon mutual information, and Kolmogorov (“algorithmic”) mutual information. We explain how universal coding may be viewed as a middle ground between the two theories. We consider Shannon’s rate-distortion theory, which quantifies information that is useful in a certain sense. We use the communication of information as our guiding motif, and we explain how it relates to sequential question-answer sessions.
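For reference, the basic quantities related in the paper are standardly defined as follows (a sketch in the reviewer's notation, not taken verbatim from the paper; $U$ denotes a fixed universal prefix machine):

```latex
\begin{align*}
H(X)   &= -\sum_x p(x)\log p(x)              && \text{(Shannon entropy)}\\
I(X;Y) &= H(X) + H(Y) - H(X,Y)               && \text{(Shannon mutual information)}\\
K(x)   &= \min\{\,\ell(p) : U(p)=x\,\}       && \text{(Kolmogorov complexity)}\\
I(x:y) &= K(x) + K(y) - K(x,y)               && \text{(algorithmic mutual information,}\\
       &                                     && \text{\phantom{(}up to additive logarithmic terms)}
\end{align*}
```

The formal parallel between the two mutual-information expressions is one instance of the correspondence the paper examines.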
MSC:
68Q30 Algorithmic information theory (Kolmogorov complexity, etc.)
94A15 Information theory (general)