Claude Shannon's 1948 paper "A Mathematical Theory of Communication" introduced entropy as a measure of the average uncertainty in a random variable. In psychology, Shannon entropy has become a fundamental tool for quantifying the information content of stimuli, the capacity of human information processing, and the predictability of behavioral sequences.
For a discrete random variable with n possible outcomes occurring with probabilities p₁, …, pₙ, Shannon entropy is defined as H = −Σᵢ pᵢ log₂ pᵢ. With base-2 logarithms, entropy is measured in bits, and it reaches its maximum, H_max = log₂(n), when all outcomes are equally likely.
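The definition above translates directly into code. A minimal sketch (function name and structure are illustrative):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)), skipping zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A uniform distribution over 4 outcomes attains the maximum H_max = log2(4) = 2 bits.
print(shannon_entropy([0.25] * 4))  # → 2.0
print(math.log2(4))                 # → 2.0
```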
Properties
Entropy is maximized when all outcomes are equally probable (maximum uncertainty) and minimized (H = 0) when one outcome is certain. For a fair coin, H = 1 bit; for a fair die, H ≈ 2.58 bits. The difference between the maximum possible entropy and the observed entropy gives the redundancy of the source: the larger the redundancy, the more predictable the source.
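These benchmark values and the redundancy calculation can be checked numerically; the biased-coin probabilities below are arbitrary illustrative numbers:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

coin = shannon_entropy([0.5, 0.5])   # fair coin: 1.0 bit
die = shannon_entropy([1/6] * 6)     # fair die: log2(6) ≈ 2.585 bits

# Redundancy of a biased coin: how far it falls short of the 1-bit maximum.
biased = shannon_entropy([0.9, 0.1])
redundancy = math.log2(2) - biased   # ≈ 0.53 bits of redundancy
```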
Applications in Psychology
George Miller's seminal 1956 paper "The Magical Number Seven" used information-theoretic measures to characterize the capacity of human information processing. Hick's Law (RT = a + b·H) directly links Shannon entropy to reaction time. In language processing, entropy predicts reading times: words that are less predictable in context (higher surprisal = −log p(word|context)) produce longer fixation durations. Entropy measures are also used in EEG analysis to quantify the complexity and regularity of neural signals.
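Hick's Law and word surprisal can both be sketched from these formulas. The intercept a and slope b below are illustrative placeholders, not fitted empirical values:

```python
import math

def hicks_rt(n_alternatives, a=0.2, b=0.15):
    """Hick's Law sketch: RT = a + b*H for a choice among n equally likely
    alternatives, where H = log2(n) is the entropy of the choice in bits.
    a (intercept) and b (slope) are hypothetical coefficients."""
    h = math.log2(n_alternatives)
    return a + b * h

def surprisal(p_word_given_context):
    """Surprisal in bits: -log2 p(word|context). Less predictable words
    carry more information and tend to produce longer fixations."""
    return -math.log2(p_word_given_context)

print(hicks_rt(2))      # 1 bit of choice → 0.2 + 0.15 = 0.35
print(surprisal(0.25))  # a word with p = 0.25 in context → 2.0 bits
```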