Self-information

Within the context of information theory, self-information is defined as the amount of information that knowledge of the outcome of a certain event adds to someone's overall knowledge. Self-information is expressed in the basic unit of information: the bit.

By definition, the amount of self-information contained in a probabilistic event depends only on the probability p that the event happens. More specifically: the smaller this probability, the larger the self-information associated with learning that the event has indeed occurred.

Further, by definition, the measure of self-information is additive: if an event C is composed of two mutually independent events A and B, then the amount of information conveyed by the announcement that C has happened equals the sum of the amounts of information conveyed by the announcements of event A and event B individually.

Taking into account these properties, the self-information H(A) associated with an event A of probability p is defined as:

H(A) = log2(1 / p)

bits. This definition, using the binary logarithm function, complies with the above conditions.

This definition can be rewritten as:

H(A) = - log2(p) (bits).
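The definition above can be sketched as a small function; this is a minimal illustration (the function name `self_information` is our choice, not from the original text):

```python
import math

def self_information(p: float) -> float:
    """Self-information H(A) = -log2(p) of an event with probability p, in bits."""
    if not 0 < p <= 1:
        raise ValueError("probability must lie in (0, 1]")
    return -math.log2(p)
```

For example, `self_information(0.5)` returns exactly 1.0, matching the coin toss in the examples below.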

Examples

  • On tossing a fair coin, the probability of 'tails' is 0.5. When it is announced that 'tails' indeed occurred, this amounts to
H('tails') = log2 (1/0.5) = log2 2 = 1 bit of information.
  • When throwing a fair die, the probability of 'four' is 1/6. When it is announced that 'four' has been thrown, the amount of self-information is
H('four') = log2 (1/(1/6)) = log2 (6) ≈ 2.585 bits.
  • When two dice are thrown independently, the amount of information associated with {throw 1 = 'two' & throw 2 = 'four'} equals
H('throw 1 is two & throw 2 is four') = log2 (1/Pr(throw 1 = 'two' & throw 2 = 'four')) = log2 (1/(1/36)) = log2 (36) ≈ 5.170 bits.
This outcome equals the sum of the individual amounts of self-information associated with {throw 1 = 'two'} and {throw 2 = 'four'}: 2.585 + 2.585 = 5.170 bits.
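The two-dice example can be checked numerically; the sketch below (names `H`, `h_joint`, etc. are ours) confirms that the self-information of the joint event equals the sum of the individual amounts:

```python
import math

# Self-information in bits, per the definition H(A) = -log2(p)
def H(p: float) -> float:
    return -math.log2(p)

# Two independent dice: the joint probability is the product of the marginals
h_two = H(1/6)           # throw 1 = 'two'
h_four = H(1/6)          # throw 2 = 'four'
h_joint = H(1/6 * 1/6)   # both events together, probability 1/36

# Additivity: log2(36) = log2(6) + log2(6)
print(round(h_joint, 3))           # ≈ 5.17
print(round(h_two + h_four, 3))    # ≈ 5.17
```

The equality holds exactly in theory because log2(1/(pq)) = log2(1/p) + log2(1/q) for independent events; in floating point the two sums agree to machine precision.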


Last updated: 10-24-2004 05:10:45