Entropy quantifies the uncertainty in a random variable $X$. Given a discrete random variable $X$ with range $\mathcal{X}$ and probability mass function $p(x)$, the entropy of $X$ is defined as:

$$H(X) = -\sum_{x \in \mathcal{X}} p(x) \log_b p(x)$$
Values of $b$:

- $b = 2$ → Entropy is in bits
  - Most common in information theory
  - Measures uncertainty in terms of binary decisions
  - Standard when talking about data compression, communication, etc.
- $b = e$ → Entropy is in nats
  - Used more in mathematics and physics (e.g., statistical mechanics)
  - The natural logarithm simplifies calculus-based derivations
- $b = 10$ → Entropy is in hartleys
  - Rarely used, but historically relevant
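The definition and the three unit conventions above can be sketched directly; `entropy` here is a hypothetical helper (not from any particular library), and it follows the standard convention that terms with $p(x) = 0$ contribute nothing to the sum:

```python
import math

def entropy(probs, base=2.0):
    """Shannon entropy H(X) = -sum p(x) * log_b(p(x)).

    Zero-probability outcomes are skipped, since p*log(p) -> 0 as p -> 0.
    """
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin: the same distribution, expressed in each unit.
fair_coin = [0.5, 0.5]
print(entropy(fair_coin, base=2))        # bits   (1.0)
print(entropy(fair_coin, base=math.e))   # nats   (ln 2 ≈ 0.693)
print(entropy(fair_coin, base=10))       # hartleys (log10 2 ≈ 0.301)
```

Changing the base only rescales the result by a constant factor ($\log_b x = \ln x / \ln b$), which is why the choice of unit is a convention rather than a substantive difference.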