Shannon Entropy = 0

Shannon entropy is an information-theory measure of the amount of information in a message, measured in shannons. A computer byte is an 8-bit message that can carry at most 8 shannons of information; it carries exactly 8 only when every possible bit pattern is equally likely. A message that communicates no information has a Shannon entropy of zero.
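
For the curious, here is a minimal sketch (my own illustration, not part of the original post) of the formula behind that number: the Shannon entropy of a source is H = -sum(p(x) * log2(p(x))), and a byte whose 256 possible values are all equally likely works out to exactly 8 shannons.

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy H = -sum(p * log2(p)), in shannons (bits)."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A byte with all 256 values equally likely: exactly 8 shannons.
print(shannon_entropy([1 / 256] * 256))  # 8.0
```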

There are two ways to accomplish this. An empty message has 0 shannons. Alternatively, a perfectly predictable message, one whose contents the reader already knows, also has 0 shannons of entropy.
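
To make that concrete, here is a small Python sketch (mine, not the post's; the helper name message_entropy is assumed for illustration) that treats a string as the message and computes its empirical per-symbol entropy:

```python
import math
from collections import Counter

def message_entropy(message):
    """Empirical Shannon entropy per symbol of a string, in shannons."""
    if not message:
        return 0.0  # an empty message communicates nothing
    counts = Counter(message)
    total = len(message)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

print(message_entropy(""))          # 0.0 -- the empty message
print(message_entropy("aaaaaaaa"))  # 0.0 -- perfectly predictable: every symbol is the same
print(message_entropy("abcdefgh"))  # 3.0 -- eight distinct, equally likely symbols
```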

You be the judge of whether this blog reaches its lofty objectives.
