information entropy

Noun.  (information theory) A measure of the uncertainty associated with a random variable; a measure of the average information content one is missing when one does not know the value of the random variable (usually in units such as bits); the amount of information (measured in, say, bits) contained, on average, per character in a stream of characters.
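In symbols (a standard formulation of Shannon entropy, added here for reference and not part of the quoted definition): for a discrete random variable X taking values x with probabilities p(x),

    H(X) = -\sum_{x} p(x) \log_2 p(x)

measured in bits when the logarithm is taken base 2. For example, a fair coin flip has two equally likely outcomes, so H = -(1/2)\log_2(1/2) - (1/2)\log_2(1/2) = 1 bit.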

This is an unmodified, but possibly outdated, definition from Wiktionary, used here under the Creative Commons license.

This entry was last updated on RefTopia from its source on 3/20/2012.