Entropy and data compression schemes

Some new ways of defining the entropy of a process by observing a single typical output sequence, as well as a new kind of Shannon-McMillan-Breiman theorem, are presented. This provides a new and conceptually very simple way of estimating the entropy of an ergodic stationary source, as well as new insight into the workings of such well-known data compression schemes as the Lempel-Ziv algorithm.
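
To make the connection to the Lempel-Ziv algorithm concrete, the sketch below estimates the entropy rate of a source from a single observed sequence by counting the distinct phrases produced by an incremental (LZ78-style) parsing: for a stationary ergodic source, the quantity c * log2(c) / n, where c is the number of parsed phrases and n the sequence length, converges to the entropy rate in bits per symbol. This is a minimal illustration of the general idea; the function name and the Bernoulli test source are illustrative choices, not taken from the paper.

```python
import math
import random

def lz78_entropy_estimate(sequence):
    """Estimate the entropy rate (bits per symbol) of a sequence
    via incremental (LZ78-style) parsing into distinct phrases.

    For a stationary ergodic source, c * log2(c) / n converges to the
    entropy rate, where c is the number of parsed phrases and n is the
    sequence length.
    """
    n = len(sequence)
    phrases = set()
    current = ""
    for symbol in sequence:
        current += str(symbol)
        if current not in phrases:
            # A new phrase is the shortest prefix not seen before.
            phrases.add(current)
            current = ""
    c = len(phrases) + (1 if current else 0)  # count a trailing partial phrase
    return c * math.log2(c) / n

# Example: an i.i.d. Bernoulli(0.3) source has entropy ~0.881 bits/symbol;
# the estimate approaches this value as the sequence length grows.
random.seed(0)
seq = [1 if random.random() < 0.3 else 0 for _ in range(200_000)]
print(f"LZ78 phrase-count estimate: {lz78_entropy_estimate(seq):.3f} bits/symbol")
```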