Table of Contents for Abstract Methods in Information Theory
Chapter/Section Title                               Page #  Page Count
Preface                                             vii
Entropy                                             1       66
  The Shannon entropy                               1       10
  Conditional expectations                          11      5
  The Kolmogorov-Sinai entropy                      16      13
  Algebraic models                                  29      12
  Entropy functionals                               41      8
  Relative entropy and Kullback-Leibler information 49      18
  Bibliographical notes                             63      4
Information Sources                                 67      54
  Alphabet message spaces and information sources   67      4
  Ergodic theorems                                  71      4
  Ergodic and mixing properties                     75      16
  AMS sources                                       91      8
  Shannon-McMillan-Breiman theorem                  99      7
  Ergodic decompositions                            106     4
  Entropy functionals, revisited                    110     11
  Bibliographical notes                             119     2
Information Channels                                121     68
  Information channels                              121     8
  Channel operators                                 129     7
  Mixing channels                                   136     11
  Ergodic channels                                  147     9
  AMS channels                                      156     10
  Capacity and transmission rate                    166     12
  Coding theorems                                   178     11
  Bibliographical notes                             187     2
Special Topics                                      189     36
  Channels with a noise source                      189     7
  Measurability of a channel                        196     6
  Approximation of channels                         202     5
  Harmonic analysis for channels                    207     7
  Noncommutative channels                           214     11
  Bibliographical notes                             222     3
References                                          225     14
Indices                                             239
  Notation index                                    239     5
  Author index                                      244     3
  Subject index                                     247