Hick's law, formulated by William Edmund Hick (1952) and generalized by Ray Hyman (1953), establishes that the time to choose among n equally likely alternatives increases logarithmically with n, or, equivalently, linearly with the Shannon entropy H of the stimulus set. This elegant result connects information theory directly to the chronometry of decision making, implying that the human cognitive system processes information at a roughly constant rate, measured in bits per second.
The Hick-Hyman Equation
RT = a + b · log₂(n)
For unequal probabilities (Hyman, 1953):
RT = a + b · H(X) = a + b · [−Σ pᵢ · log₂(pᵢ)]
Information rate: R = H(X) / (RT − a) = 1/b bits/second
Typical values: b ≈ 150 ms/bit, hence R ≈ 6–7 bits/second
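A minimal numeric sketch of these equations. The parameter values a = 200 ms and b = 150 ms/bit are illustrative assumptions in the typical range, not fitted constants:

```python
import math

def hick_rt(n, a=0.200, b=0.150):
    """Predicted choice reaction time (seconds) for n equally likely
    alternatives under the Hick-Hyman equation RT = a + b * log2(n)."""
    return a + b * math.log2(n)

# Doubling the number of alternatives adds a constant b seconds:
rt2 = hick_rt(2)   # 0.200 + 0.150 * 1 = 0.350 s
rt4 = hick_rt(4)   # 0.200 + 0.150 * 2 = 0.500 s
rt8 = hick_rt(8)   # 0.200 + 0.150 * 3 = 0.650 s

# Information rate R = H / (RT - a) = 1/b, about 6.7 bits/second here
rate = 1 / 0.150
```

Note that each doubling of the stimulus set adds the same increment b to the predicted RT, which is the signature of the logarithmic law.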
Hick (1952) proposed the logarithmic relationship based on experiments in which subjects responded to stimulus lamps with corresponding response keys. Hyman (1953) generalized the result by showing that RT depends on the entropy of the stimulus distribution, not merely the number of alternatives. When stimuli are unequally probable, RT is better predicted by H(X) = −Σ pᵢ · log₂(pᵢ) than by log₂(n), confirming the information-theoretic basis of the effect. The slope b, typically around 150 ms/bit, represents the reciprocal of the human processing rate.
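Hyman's generalization can be sketched by comparing a uniform and a skewed stimulus distribution over the same four alternatives (again with illustrative a and b values):

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum p_i * log2(p_i), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def hyman_rt(probs, a=0.200, b=0.150):
    """RT predicted from stimulus entropy rather than from the
    number of alternatives (Hyman, 1953)."""
    return a + b * entropy(probs)

uniform = [0.25] * 4                 # H = log2(4) = 2 bits
skewed  = [0.70, 0.10, 0.10, 0.10]   # H ≈ 1.357 bits

# Same number of alternatives, but the skewed set carries less
# information, so the predicted mean RT is shorter.
```

This is exactly the contrast Hyman exploited: n is held constant while H(X) varies, and RT tracks H(X).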
Theoretical Interpretations
Several theoretical accounts have been proposed. A serial self-terminating search, in which the subject compares alternatives one at a time with a fixed time per comparison, predicts mean RT that grows linearly rather than logarithmically with n, so it cannot explain the law on its own. The iterative dichotomization model proposes that the subject narrows down the response by successive binary decisions, each taking time b, which yields roughly log₂(n) steps and hence the logarithmic form directly. The accumulator model (Usher & McClelland, 2001) treats choice as a race among evidence accumulators, one per alternative, with the logarithmic increase arising from mutual inhibition among the accumulators.
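The iterative dichotomization account can be illustrated as a binary search: repeatedly halving the candidate set isolates one of n alternatives in about log₂(n) binary decisions. This is a toy illustration of the counting argument, not a process model fitted to data:

```python
def dichotomize(n_alternatives, target):
    """Count the binary decisions needed to isolate `target` from
    n alternatives by repeatedly halving the candidate range."""
    lo, hi = 0, n_alternatives  # candidate range is [lo, hi)
    steps = 0
    while hi - lo > 1:
        mid = (lo + hi) // 2
        if target < mid:
            hi = mid
        else:
            lo = mid
        steps += 1
    return steps

# For n a power of two, every target needs exactly log2(n) decisions,
# reproducing the logarithmic form of Hick's law:
assert all(dichotomize(8, t) == 3 for t in range(8))
```

If each binary decision costs a fixed time b, total decision time is a + b · log₂(n), matching the Hick-Hyman equation.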
Hick's law has practical applications in interface design. The time to select an item from a menu of n options is often modeled as proportional to log₂(n + 1), where the +1 accounts for the option of not selecting. This principle informs the design of software menus, web navigation, and remote controls. Note, however, that the pure logarithmic model does not by itself favor hierarchical menus: because logarithms add, the per-level decision times sum to at least the time for the equivalent flat menu, and each extra level adds another intercept a. Hierarchical menus with fewer options at each level win in practice when scanning a long flat list costs more than the logarithm predicts, for example when visual search time grows roughly linearly with list length.
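Under the log₂(n + 1) model, the predicted selection times for a flat menu and a two-level hierarchy can be compared directly. The a and b values below are illustrative assumptions:

```python
import math

def menu_time(n_options, a=0.200, b=0.150):
    """Predicted time to pick from a menu of n options; the +1 models
    the option of selecting nothing (a and b are illustrative)."""
    return a + b * math.log2(n_options + 1)

flat = menu_time(64)                  # one decision over 64 items
nested = menu_time(8) + menu_time(8)  # two decisions over 8 items each

# The log terms nearly cancel (log2(65) vs 2 * log2(9)), but the second
# level pays the intercept a again, so the pure model slightly favors
# the flat menu; hierarchy pays off only when scanning a long flat
# list is slower than logarithmic (e.g., linear visual search).
```

Running this with the defaults gives a smaller predicted time for the flat menu, which is why the decision-time argument alone does not settle the flat-versus-deep menu question.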
Boundary Conditions
Hick's law has well-documented boundary conditions. It holds most cleanly for arbitrary stimulus-response mappings but breaks down when responses are highly compatible with the stimuli (e.g., reaching toward the spatial location of a visual target), in which case RT increases little with n. Practice reduces the slope b, suggesting that overlearned responses bypass the information-processing bottleneck. The law also does not apply to go/no-go tasks, which involve detection rather than choice among alternatives and are better analyzed with Donders' subtraction method.
Despite these limitations, Hick's law remains one of the most reliable quantitative laws in experimental psychology and one of the clearest demonstrations that human cognition can be productively analyzed using information-theoretic tools. The linear relationship between RT and entropy provides a direct psychophysical scaling of the "speed of thought."