Hick's Law (also known as the Hick-Hyman Law) was independently discovered by William Edmund Hick (1952) and Ray Hyman (1953). It states that the time to make a decision increases logarithmically with the number of choices, establishing a direct link between Shannon's information theory and human reaction time.
RT = a + b · log₂(n + 1), or in the general form RT = a + b · H (where H = stimulus entropy in bits)
a = base reaction time (intercept)
b = rate of information processing (ms/bit)
n = number of equally likely alternatives
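The formula above can be sketched directly in code. This is a minimal illustration, not a calibrated model; the intercept and slope values below are assumed placeholders chosen from the 150–200 ms/bit range discussed later in the text.

```python
import math

def hick_rt(n, a=200.0, b=150.0):
    """Predicted reaction time (ms) for n equally likely alternatives.

    a: base RT intercept in ms (illustrative value).
    b: information-processing rate in ms/bit (illustrative value).
    Uses log2(n + 1) to account for stimulus-onset uncertainty.
    """
    return a + b * math.log2(n + 1)
```

For example, with these placeholder parameters, going from 1 alternative (1 bit) to 3 alternatives (2 bits) adds exactly one slope's worth of time, reflecting the logarithmic rather than linear growth the law predicts.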
Information-Theoretic Interpretation
The +1 in log₂(n+1) accounts for the temporal uncertainty of stimulus onset (an additional "alternative" is that no stimulus has appeared yet). When stimulus probabilities are unequal, the generalization uses Shannon entropy H rather than log₂(n). The slope b reflects the human information processing rate, typically around 150–200 ms per bit for simple choice reactions.
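The entropy generalization for unequal stimulus probabilities can be sketched as follows; again the intercept and slope are assumed illustrative values, not empirical estimates.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H in bits for a probability distribution
    over the possible stimuli (zero-probability terms contribute nothing)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def hick_rt_general(probs, a=200.0, b=150.0):
    """RT = a + b * H for unequal stimulus probabilities
    (a, b are placeholder values in ms and ms/bit)."""
    return a + b * shannon_entropy(probs)
```

A skewed distribution (e.g. one stimulus appearing 90% of the time) carries less entropy than a uniform one over the same alternatives, so the generalized form predicts a faster mean reaction time, consistent with the text.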
Applications
Hick's Law is widely applied in human-computer interaction and UX design: it predicts that adding more options to a menu increases selection time logarithmically, not linearly. In sports psychology, it explains why feints and fakes are effective — they increase the number of possible stimuli the defender must discriminate, slowing their reaction. The law breaks down, however, when choices are highly practiced or when stimulus-response mappings are highly compatible.