Shannon entropy measures the amount of information in a communication. A perfectly random message has Shannon entropy of zero.

January 9, 2017 at 10:37 am

“The concept was introduced by Claude E. Shannon in the paper “A Mathematical Theory of Communication” (1948). Shannon entropy allows one to estimate the average minimum number of bits needed to encode a string of symbols based on the alphabet size and the frequency of the symbols.” How does that square with “totally random”?

I shouldn’t be touching this, but I’m having trouble with even an intuitive, qualitative understanding. The word “average” seems to contradict “totally”, and the word “minimum” seems to contradict “random”.

January 9, 2017 at 10:53 am

“Shannon entropy allows one to estimate the average minimum number of bits needed to encode a string of symbols based on the alphabet size and the frequency of the symbols.”

[ http://www.shannonentropy.netmark.pl ]

“Totally random” in communication would seem to indicate total chaos in content and total chaos in the medium (alphabet & symbol frequency)… and total chaos would seem to indicate infinite entropy. Not zero.
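The quoted definition is easy to check with a short sketch (a hypothetical helper, not taken from the linked page) that estimates entropy in bits per symbol from the observed symbol frequencies:

```python
import math
from collections import Counter

def shannon_entropy(message):
    """Average minimum bits per symbol, H = sum(p * log2(1/p)),
    estimated from the observed symbol frequencies."""
    counts = Counter(message)
    n = len(message)
    return sum((c / n) * math.log2(n / c) for c in counts.values())

# A constant message carries no surprise: 0 bits per symbol.
print(shannon_entropy("aaaaaaaa"))
# Uniform use of a 4-symbol alphabet is "total chaos" for that
# alphabet and hits the maximum, log2(4) = 2 bits per symbol.
print(shannon_entropy("abcdabcd"))
```

So maximal randomness drives the per-symbol entropy up toward log2(alphabet size), not down to zero.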

January 9, 2017 at 2:54 pm

Agreed that if we had a random stream and wished to encode it faithfully, we would need to encode every bit, so an infinite stream would have infinite entropy, as you point out. However, imagine that this stream carried no (useful) information. In that case, all the different random streams could be considered equivalent: there is nothing to encode. The minimum number of bits needed to encode nothing is zero, so an entropy of 0 indicates no information to encode; I took the liberty of equating “no information” with a totally random signal.
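The incompressibility half of this argument can be seen with any general-purpose coder (zlib here, purely as an illustration): a repetitive stream shrinks dramatically, while random bytes stay roughly their original size because every bit must be encoded.

```python
import os
import zlib

repetitive = b"ab" * 50_000          # 100,000 highly predictable bytes
random_bytes = os.urandom(100_000)   # 100,000 random bytes

print(len(zlib.compress(repetitive)))    # a few hundred bytes
print(len(zlib.compress(random_bytes)))  # roughly 100,000 bytes
```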

Maybe I took a little poetic license?
