In the second scenario I do my two flips with a normal coin - heads on one side, tails on the other. We can communicate the result using binary code: 0 for heads, 1 for tails. There are four possible messages - 00, 11, 01, 10 - and each requires two bits of information. So, what’s the point? In the first scenario you had complete certainty about the contents of the message, and it took zero bits to transmit it. In the second you had a 1-in-4 chance of guessing the right answer - 25% certainty - and the message needed two bits of information to resolve that ambiguity. More generally, the less you know about what the message will say, the more information it takes to convey. Shannon was the first person to make this relationship mathematically precise. He captured it in a formula that calculates the minimum number of bits - a threshold later called the Shannon entropy - required to communicate a message.
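As a rough sketch of how that formula plays out on the two coin-flip scenarios, here is Shannon's entropy, H = -Σ p·log₂(p), computed in a few lines of Python (the function name `shannon_entropy` is just an illustrative choice, not from the original text):

```python
import math

def shannon_entropy(probs):
    """Minimum average number of bits needed to communicate a message,
    given the probability of each possible message."""
    # Terms with p == 0 contribute nothing, so they are skipped.
    return sum(-p * math.log2(p) for p in probs if p > 0)

# A certain outcome (e.g. two flips of a two-headed coin):
# full certainty, so zero bits are needed.
print(shannon_entropy([1.0]))        # 0.0

# Two flips of a normal coin: four equally likely messages
# (00, 01, 10, 11), so two bits are needed.
print(shannon_entropy([0.25] * 4))   # 2.0
```

The pattern matches the article's point: the more uncertain you are about the message, the higher the entropy and the more bits it takes to convey.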