Mathematical Psychology

Shannon Entropy

Shannon entropy quantifies the average uncertainty or information content of a random variable, forming the foundation of information theory and its applications in psychology.


Claude Shannon's 1948 paper "A Mathematical Theory of Communication" introduced entropy as a measure of the average uncertainty in a random variable. In psychology, Shannon entropy has become a fundamental tool for quantifying the information content of stimuli, the capacity of human information processing, and the predictability of behavioral sequences.

Shannon Entropy H(X) = −Σ p(xᵢ) · log₂ p(xᵢ)

Measured in bits (base 2)
Maximum when all outcomes equally likely: H_max = log₂(n)

Properties

Entropy is maximized when all outcomes are equally probable (maximum uncertainty) and minimized (H = 0) when one outcome is certain. For a fair coin, H = 1 bit; for a fair die, H = log₂(6) ≈ 2.58 bits. The gap between the maximum possible entropy and the observed entropy gives the redundancy of the source, usually expressed as R = 1 − H/H_max, which measures how predictable the distribution is.
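
These properties can be checked directly with a short sketch (the helper names below are illustrative, not from a particular library):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(X) = -sum p * log2(p), in bits.
    Terms with p = 0 contribute nothing (the limit of p*log p is 0)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def redundancy(probs):
    """Redundancy R = 1 - H / H_max, with H_max = log2(n)."""
    return 1 - shannon_entropy(probs) / math.log2(len(probs))

fair_coin = [0.5, 0.5]
fair_die = [1 / 6] * 6
biased_coin = [0.9, 0.1]

print(shannon_entropy(fair_coin))    # 1.0 bit
print(shannon_entropy(fair_die))     # ≈ 2.585 bits
print(shannon_entropy(biased_coin))  # ≈ 0.469 bits: less uncertain than a fair coin
print(redundancy(biased_coin))       # ≈ 0.531: the biased coin is fairly predictable
```

Note that a certain outcome (p = 1) gives H = 0, matching the minimum described above.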

Applications in Psychology

George Miller's seminal 1956 paper "The Magical Number Seven, Plus or Minus Two" used information-theoretic measures to characterize the capacity of human information processing. Hick's Law (RT = a + b·H) links Shannon entropy directly to choice reaction time. In language processing, entropy-based measures predict reading times: words that are less predictable in context (higher surprisal, −log₂ p(word|context)) produce longer fixation durations. Entropy measures are also used in EEG analysis to quantify the complexity and regularity of neural signals.
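
A minimal sketch of these two relationships follows; the Hick's Law coefficients `a` and `b` are illustrative placeholders, not fitted values from any study:

```python
import math

def surprisal(p):
    """Surprisal (information content) of an event: -log2(p), in bits."""
    return -math.log2(p)

# A moderately predictable word vs. a highly unexpected one:
print(surprisal(0.5))   # 1.0 bit
print(surprisal(0.01))  # ≈ 6.64 bits; longer fixation durations expected

def hick_rt(n_alternatives, a=0.2, b=0.15):
    """Hick's Law RT = a + b*H for n equally likely alternatives.
    a, b are hypothetical intercept/slope values in seconds."""
    H = math.log2(n_alternatives)  # entropy of a uniform choice set
    return a + b * H

print(hick_rt(2))  # ≈ 0.35 s
print(hick_rt(8))  # ≈ 0.65 s: more alternatives, more entropy, slower response
```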

Interactive Calculator

Each row provides an event label and its probability. The calculator computes the Shannon entropy H = −Σ p(x)·log₂ p(x), the information content (surprisal) of each event, and the redundancy of the distribution.
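
The calculator's outputs can be reproduced with a short sketch (the function name and return structure are illustrative assumptions):

```python
import math

def entropy_stats(events):
    """Given {label: probability}, return (entropy, per-event surprisal,
    redundancy), mirroring the three quantities described above."""
    H = -sum(p * math.log2(p) for p in events.values() if p > 0)
    # Information content (surprisal) of each individual event, in bits
    info = {label: -math.log2(p) for label, p in events.items() if p > 0}
    H_max = math.log2(len(events))  # entropy of a uniform distribution
    R = 1 - H / H_max if H_max > 0 else 0.0
    return H, info, R

H, info, R = entropy_stats({"left": 0.7, "right": 0.2, "center": 0.1})
print(f"H = {H:.3f} bits, redundancy = {R:.3f}")  # H ≈ 1.157, R ≈ 0.270
for label, bits in info.items():
    print(f"  {label}: {bits:.3f} bits")  # rarer events carry more information
```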


References

  1. Shannon, C. E. (1948). A mathematical theory of communication. The Bell System Technical Journal, 27(3), 379–423. https://doi.org/10.1002/j.1538-7305.1948.tb01338.x
  2. Miller, G. A. (1956). The magical number seven, plus or minus two: Some limits on our capacity for processing information. Psychological Review, 63(2), 81–97. https://doi.org/10.1037/h0043158
  3. Cover, T. M., & Thomas, J. A. (2006). Elements of information theory (2nd ed.). Wiley. https://doi.org/10.1002/047174882X
  4. Norwich, K. H. (1993). Information, sensation, and perception. Academic Press. https://doi.org/10.1016/C2009-0-03028-1