Signals viewed statistically
The entropy of a (digital) signal source measures how "predictable" the signal is (higher entropy means less predictable), and is defined as
H = -\sum_i p_i \log_2 p_i, \qquad 0 \le H \le \log_2 N

where p_i is the probability of signal value i and N is the number of possible signal values.
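As a concrete illustration (not from the original slides), the entropy can be computed directly from the symbol probabilities; the probabilities below are made-up example values.

```python
import math

def entropy(probabilities):
    """Entropy in bits per symbol: H = -sum(p_i * log2(p_i)).

    Terms with p_i == 0 contribute nothing (the limit of p*log2(p) is 0).
    """
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Hypothetical 4-level digital source with unequal symbol probabilities.
p = [0.5, 0.25, 0.125, 0.125]
print(entropy(p))          # 1.75 bits/symbol
print(math.log2(len(p)))   # 2.0 bits/symbol: the maximum for 4 levels (uniform source)
```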
- The entropy gives the minimum average number of bits per symbol needed to code the signal without loss (see the coding sketch after this list)
- If the information rate of the source (entropy in bits per symbol times symbols per second) is less than the capacity of the channel, the signal can be transmitted over the channel with arbitrarily low error probability
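A minimal sketch of how these two statements are typically used, under assumed example values (the symbol names, probabilities, and the 1000 symbols/s rate are illustrative, not from the slides): a Huffman code brings the average codeword length close to the entropy, and the required channel rate follows from the entropy times the symbol rate.

```python
import heapq
import math

def huffman_code_lengths(probs):
    """Return the codeword length of each symbol in a binary Huffman code.

    probs: dict mapping symbol -> probability.
    """
    # Each heap entry: (probability, tie-breaker, list of (symbol, depth so far)).
    heap = [(p, i, [(sym, 0)]) for i, (sym, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        # Merge the two least probable groups; every symbol in them gets one bit deeper.
        p1, _, group1 = heapq.heappop(heap)
        p2, _, group2 = heapq.heappop(heap)
        merged = [(sym, d + 1) for sym, d in group1 + group2]
        heapq.heappush(heap, (p1 + p2, counter, merged))
        counter += 1
    return dict(heap[0][2])

# Hypothetical 4-level source, same probabilities as in the entropy example above.
probs = {'a': 0.5, 'b': 0.25, 'c': 0.125, 'd': 0.125}
lengths = huffman_code_lengths(probs)
H = -sum(p * math.log2(p) for p in probs.values())
avg_len = sum(probs[s] * lengths[s] for s in probs)
print(f"entropy         H = {H:.3f} bits/symbol")       # 1.750
print(f"Huffman length  L = {avg_len:.3f} bits/symbol")  # 1.750 here; in general H <= L < H + 1

# If this source emits, say, 1000 symbols/s, it needs at least H * 1000 = 1750 bit/s
# of channel capacity to be transmitted without loss.
```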
Mathematical models of communication - information theory