Entropy quantifies the uncertainty in a random variable X. Given a discrete random variable X with range 𝒳 and probability mass function p(x), the entropy of X is defined as:

H(X) = -∑_{x ∈ 𝒳} p(x) log_b p(x)

The base b of the logarithm determines the units of entropy:

  1. Base 2: entropy is in bits

    • Most common in information theory
    • Measures uncertainty in terms of binary decisions
    • Standard when talking about data compression, communication, etc.
  2. Base e: entropy is in nats

    • Used more in mathematics and physics (e.g., statistical mechanics)
    • The natural logarithm simplifies calculus-based derivations
  3. Base 10: entropy is in hartleys

    • Rarely used today, but historically significant (the unit is named after Ralph Hartley)
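To make the definition and the three units concrete, here is a minimal sketch of the entropy formula in Python; the function name `entropy` and the fair-coin example are illustrative choices, not from the original text:

```python
import math

def entropy(probs, base=2):
    """Shannon entropy H(X) = -sum(p * log_b(p)) of a discrete distribution.

    base=2 gives bits, base=math.e gives nats, base=10 gives hartleys.
    Terms with p = 0 are skipped, following the convention 0 * log 0 = 0.
    """
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin is maximally uncertain over two outcomes: exactly 1 bit.
fair_coin = [0.5, 0.5]
print(entropy(fair_coin, base=2))        # 1.0 bits
print(entropy(fair_coin, base=math.e))   # ln 2 ≈ 0.693 nats
print(entropy(fair_coin, base=10))       # log10 2 ≈ 0.301 hartleys
```

The three results describe the same uncertainty; only the unit changes, since log_b(x) = ln(x) / ln(b) rescales the formula by a constant factor.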