information entropy definition

  • noun:
    • a measure of the uncertainty associated with a random variable; a measure of the average information content one is missing when one does not know the value of the random variable (usually in units such as bits); the amount of information (measured in, say, bits) contained per average instance of a character in a stream of characters.
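As an illustration of the "bits per character" sense of the definition, here is a minimal sketch that estimates the Shannon entropy of a character stream from its symbol frequencies; the function name `shannon_entropy` is chosen here for clarity and is not part of any standard library.

```python
import math
from collections import Counter

def shannon_entropy(text: str) -> float:
    """Estimate the average information content, in bits per character,
    using the empirical symbol frequencies of `text`."""
    counts = Counter(text)
    n = len(text)
    # H = -sum over symbols x of p(x) * log2(p(x))
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A stream drawn uniformly from 4 symbols carries log2(4) = 2 bits per character.
print(shannon_entropy("abcd" * 100))  # → 2.0
```

A stream of a single repeated character yields 0 bits per character, since the next symbol is never in doubt.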

