The two existing answers contradict each other.

Quinton's answer is correct: The message carried $2$ bits of information, and the remaining uncertainty is $4$ bits.

The entropy calculations in Adriano's answer are also correct. However, the conclusion that the information "gained" is the difference of these entropies, and thus negative, is incorrect. The information gained is determined only by the probability of the message received, not by the rest of the distribution. The entropy of the distribution gives the *expected* information gain upon receiving a message distributed accordingly.

The fact that the conditional distribution after receiving the message has a higher entropy, and thus more uncertainty, than the prior one does not contradict the fact that a positive amount of information was received. It's just that before receiving the message, we expected the race result to contain little information, whereas now that we've received the message, we expect it to contain much information.

We can see in more detail how this fits together by adding the information from the two steps in which we're receiving the race result, and comparing this to the information that would have been transferred if we'd received the race result in a single step. The message in the first step, that the winner isn't the first person, carried $2$ bits of information, since its probability was $\frac14$. If in the second step we learn who the winner is, this will now carry an additional $4$ bits of information, since no matter who it is, the probability for this will be $\frac1{16}$. Together that's

$$-\log_2\left(\frac14\cdot\frac1{16}\right)=-\log_2\frac14-\log_2\frac1{16}=2+4=6$$

bits, exactly the information we would have received in a single step, since each of the $16$ remaining people had prior probability $\frac14\cdot\frac1{16}=\frac1{64}$ of winning.
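A minimal numerical check of these quantities, assuming the prior inferred from the bit counts above (the favourite wins with probability $\frac34$, each of the $16$ other people with probability $\frac1{64}$):

```python
from math import log2

def info(p):
    """Self-information of an event with probability p, in bits."""
    return -log2(p)

def entropy(dist):
    """Shannon entropy of a distribution, in bits."""
    return sum(p * info(p) for p in dist if p > 0)

# Assumed prior: favourite at 3/4, sixteen outsiders at 1/64 each,
# so the message "the winner isn't the favourite" has probability 1/4.
prior = [3 / 4] + [1 / 64] * 16
posterior = [1 / 16] * 16      # uniform over the 16 remaining people

step1 = info(1 / 4)            # first message: 2 bits
step2 = info(1 / 16)           # naming the winner afterwards: 4 bits
single = info(1 / 64)          # the same news in a single step: 6 bits

print(step1, step2, single)        # 2.0 4.0 6.0
print(entropy(posterior))          # 4.0 -- the remaining uncertainty
print(round(entropy(prior), 3))    # 1.811 -- lower than 4.0, yet the
                                   # information gained is a positive 2 bits
```

The last two lines exhibit the point of the answer: the posterior entropy ($4$ bits) exceeds the prior entropy ($\approx1.811$ bits), even though the message itself carried $+2$ bits.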