In 1948, Claude Shannon came up with a remarkable measure of information: given possible outcomes with probabilities p_1, ..., p_n, the entropy

H = -Σ p_i log₂(p_i)

This measure was not new, and he in fact recognizes this in his paper:

"The form of H will be recognized as that of entropy as defined in certain formulations of statistical mechanics, where p_i is the probability of a system being in cell i of its phase space. H is then, for example, the H in Boltzmann's famous H theorem."

So how does this serve as a measure of information? What was remarkable was the relevance of the same measure to information theory, a feat for which Shannon is often referred to as the "father of information theory."

Take a coin toss. There are 2 choices - heads (1) or tails (0) - essentially a "bit" of information. Each of the 2 choices has a probability of 1/2, giving a Shannon entropy of 1 bit, i.e. you need 1 bit to encode this information. (Fun fact: Shannon's article was one of the first to use the concept of bits of information.) Now consider you have 4 equally likely options: 11, 00, 10, 01. If you go through the math, you find H = 2, meaning you need 2 bits to encode the 4 possibilities. (Both values are checked in the first sketch at the end of this post.)

Now let's get to why entropy is *the* measure of uncertainty and, by the same token, the measure of information. Suppose you have 2 possibilities, one with probability p and the other with probability q = 1 - p. If both possibilities are equally likely, the entropy turns out to be a maximum of 1, meaning you need 1 bit to encode the information (i.e. the coin toss example). Whereas if one possibility has probability 1 (i.e. it always occurs), the entropy turns out to be 0. This means there is no uncertainty - we know the outcome 100%, as with a weighted coin or a loaded die.

But why this measure and not something else? As Shannon mentions, only an H of this form satisfies 3 key laws:

1. H should be continuous in the probabilities.
2. With an increase in the number of possible outcomes, there should be an increase in H - there is more uncertainty with more possible events.
3. If a choice is broken down into successive choices, the original H should be the weighted sum of the individual values of H (see the second sketch at the end of this post).

Shannon proves that only his H satisfies all three, yet he is careful not to overstate the result:

"This theorem, and the assumptions required for its proof, are in no way necessary for the present theory. It is given chiefly to lend a certain plausibility to some of our later definitions. The real justification of these definitions, however, will reside in their implications."

This brings us to a practical definition of entropy - a measure of uncertainty in the outcome of a process.
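The post's numbers are easy to verify. Here's a minimal sketch in Python; the `shannon_entropy` helper is my naming for illustration, not anything from Shannon's paper.

```python
from math import log2

def shannon_entropy(probs):
    """Shannon's H = -sum(p_i * log2(p_i)), measured in bits.
    Outcomes with p == 0 are skipped, since p*log2(p) -> 0 as p -> 0."""
    return sum(-p * log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))   # fair coin toss: 1.0 bit
print(shannon_entropy([0.25] * 4))   # 11, 00, 10, 01 equally likely: 2.0 bits
print(shannon_entropy([1.0]))        # a certain outcome: 0.0 bits

# Two possibilities p and q = 1 - p: H peaks at 1 bit when p = 0.5
# and falls toward 0 as p approaches 1 (the weighted-coin case).
for p in (0.5, 0.7, 0.9, 0.99):
    print(f"p = {p}: H = {shannon_entropy([p, 1 - p]):.3f} bits")
```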
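The third law is the least intuitive, so here's the special case Shannon works through in the paper: a choice among probabilities 1/2, 1/3, 1/6 can be decomposed into a fair 1/2-1/2 choice followed, half the time, by a 2/3-1/3 choice. A quick numerical check, under the same assumptions as the sketch above:

```python
from math import log2

def H(probs):
    """Shannon entropy in bits (same helper as above, shortened to H)."""
    return sum(-p * log2(p) for p in probs if p > 0)

# One direct choice among three outcomes ...
direct = H([1/2, 1/3, 1/6])

# ... versus a fair coin flip whose second branch (taken with weight 1/2)
# leads to a further 2/3-vs-1/3 choice. Law 3 demands the weighted sum match.
staged = H([1/2, 1/2]) + (1/2) * H([2/3, 1/3])

print(direct, staged)   # both ~1.459 bits
```

This additivity over successive choices is what lets you read H as a number of bits: encoding a choice in stages costs the same, on average, as encoding it in one shot.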