Shannon entropy is an information-theoretic measure of the amount of information in a message, measured in shannons. A computer byte is an 8-bit message: if each of its 256 possible values is equally likely, it communicates 8 shannons of information. A message that communicates no information has a Shannon entropy of zero.
There are two ways to accomplish this. An empty message carries 0 shannons. Alternately, a message whose content is already known in advance also carries 0 shannons, because a perfectly predictable message resolves no uncertainty. (A perfectly random message is the opposite extreme: it has the maximum possible entropy, 8 shannons per byte.)
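To make this concrete, here is a minimal Python sketch of the entropy calculation H = Σ −pᵢ·log₂(pᵢ), where the pᵢ are the observed symbol frequencies; the function name `shannon_entropy` is my own, not a standard library API:

```python
import math
import os
from collections import Counter

def shannon_entropy(message: bytes) -> float:
    """Empirical Shannon entropy of a message, in shannons per byte."""
    if not message:
        return 0.0  # an empty message carries no information
    total = len(message)
    # H = sum over observed symbols of -p * log2(p)
    return sum(-(n / total) * math.log2(n / total)
               for n in Counter(message).values())

# A perfectly predictable message resolves no uncertainty: 0 shannons
print(shannon_entropy(b"aaaaaaaa"))          # 0.0
# A uniformly random byte stream approaches the maximum of 8 shannons/byte
print(shannon_entropy(os.urandom(1 << 20)))  # ~8.0
```

Note that this estimates entropy from observed byte frequencies, so it measures statistical unpredictability, not whether the content is meaningful to a reader.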
You be the judge of whether this blog reaches its lofty objectives.