Coding Theorems of Classical and Quantum Information Theory

Posted By: AvaxGenius

Coding Theorems of Classical and Quantum Information Theory By K. R. Parthasarathy
English | PDF | 2013 | 187 Pages | ISBN: 938025041X | 11.43 MB

The logarithmic connection between entropy and probability was first enunciated by L. E. Boltzmann (1844-1906) in his kinetic theory of gases. His famous formula for entropy S is S = k log W (as engraved on his tombstone in Vienna), where k is a constant and W is the number of possible microstates corresponding to the macroscopic state of a system of particles in a gas. Ignoring the constant k and replacing log W by -log P(E), where P(E) is the probability of an event E in the probability space (Ω, F, P) of a statistical experiment, C. E. Shannon (1916-2001) looked upon -log P(E) as a measure of the information gained about the probability space from the occurrence of E. If X is a simple random variable on this probability space assuming the values a_1, a_2, …, a_k from a finite set with P(X = a_j) = p_j for each j, then the famous Shannon entropy H(X) = -∑_j p_j log p_j is the expected information about (Ω, F, P) gained from observing X. Centred around this idea of entropy, a mathematical theory of communication was woven by Shannon in a celebrated pair of papers in the 1948 volume of the Bell System Technical Journal. Here Shannon established two fundamental coding theorems about the optimal compressibility of a text in its storage and the optimal capacity of a channel in communicating a text after encoding.
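As a quick illustration of this definition (a minimal sketch of our own; the book itself contains no code), the entropy H(X) = -∑_j p_j log p_j can be computed directly from the distribution p_1, …, p_k:

```python
import math

def shannon_entropy(probs):
    """H(X) = -sum_j p_j * log2(p_j) of a discrete distribution, in bits.

    Terms with p_j = 0 are skipped, following the convention 0 log 0 = 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # fair coin: 1.0 bit per toss
print(shannon_entropy([0.9, 0.1]))  # biased coin: about 0.47 bits
```

Using log base 2 measures information in bits; the choice of base only rescales H by a constant, much like Boltzmann's constant k above.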
The modern approach to information theory is to view a text in any alphabetic language as a finite time realization of a stochastic process in discrete time with values in a finite set (called an alphabet) and to consider the quantity -(1/n) log P(X_0, X_1, …, X_{n-1}) as the rate at which information is generated by the text X_0, X_1, …, X_{n-1} during the period [0, n-1]. Under fairly general conditions this rate exhibits an asymptotic stability property as n becomes large. Through the papers of B. McMillan, A. Feinstein, L. Breiman, J. Wolfowitz and others it is now known that an appeal to this stability property enlarges the scope of Shannon's coding theorems. This gets enriched further by exploiting the Kryloff-Bogoliouboff theory of disintegrating an invariant probability measure into its ergodic components. The first three chapters of this little book are devoted to Shannon's coding theorems and their enriched versions. However, we have not touched upon the coding theorems in their most general form as presented in the book of Te Sun Han [14].
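The stability property can be seen concretely in the simplest case of an i.i.d. source, where -(1/n) log P(X_0, …, X_{n-1}) converges to the entropy H by the law of large numbers (the general ergodic statement is the Shannon-McMillan-Breiman theorem alluded to above). The following sketch is our illustration under that i.i.d. assumption:

```python
import math
import random

def empirical_rate(probs, n, seed=0):
    """Sample an i.i.d. text X_0, ..., X_{n-1} with symbol distribution
    `probs` and return -(1/n) * log2 P(X_0, ..., X_{n-1})."""
    rng = random.Random(seed)
    text = rng.choices(range(len(probs)), weights=probs, k=n)
    return -sum(math.log2(probs[x]) for x in text) / n

probs = [0.7, 0.2, 0.1]
entropy = -sum(p * math.log2(p) for p in probs)  # H of the source
for n in (100, 10_000, 1_000_000):
    print(n, round(empirical_rate(probs, n), 4), "-> H =", round(entropy, 4))
```

As n grows, the printed rate settles down to H; it is exactly this stability that the enriched versions of the coding theorems exploit.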