Mathematical Psychology

Hick-Hyman Law

Hick's law states that choice reaction time increases linearly with the Shannon entropy of the stimulus set, implying that the human decision process operates at a roughly constant information rate.

RT = a + b · log₂(n)

Hick's law, independently formulated by William Edmund Hick (1952) and Ray Hyman (1953), establishes that the time to choose among n equally likely alternatives increases logarithmically with n — or, equivalently, linearly with the Shannon entropy H of the stimulus set. This elegant result connects information theory directly to the chronometry of decision making, implying that the human cognitive system processes information at a roughly constant rate, measured in bits per second.

The Hick-Hyman Equation

For equally likely alternatives:
RT = a + b · log₂(n)

For unequal probabilities (Hyman, 1953):
RT = a + b · H(X) = a + b · [−Σ pᵢ · log₂(pᵢ)]

Information rate: R = H(X) / (RT − a) ≈ 1/b bits/second
Typical values: b ≈ 150 ms/bit, R ≈ 6–7 bits/second
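The two forms of the law above can be collapsed into one function of the stimulus probabilities, since log₂(n) is just the entropy of a uniform distribution over n alternatives. A minimal sketch, using the illustrative values a = 200 ms and b = 150 ms/bit (not fitted constants):

```python
import math

def predicted_rt(probs, a=200.0, b=150.0):
    """Predicted mean RT (ms) from the Hick-Hyman law.

    probs: stimulus probabilities summing to 1. The intercept a and
    slope b (ms/bit) are illustrative values, not empirical constants.
    """
    entropy = -sum(p * math.log2(p) for p in probs if p > 0)
    return a + b * entropy

# Four equally likely alternatives: H = log2(4) = 2 bits
rt_equal = predicted_rt([0.25] * 4)            # 200 + 150*2 = 500.0 ms

# Unequal probabilities lower the entropy, so the predicted RT drops
rt_skewed = predicted_rt([0.7, 0.1, 0.1, 0.1])
```

With a single certain stimulus (H = 0) the prediction reduces to the intercept a, matching Hyman's finding that simple RT is the zero-entropy baseline.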

Hick (1952) proposed the logarithmic relationship based on experiments in which subjects responded to stimulus lamps with corresponding response keys. Hyman (1953) generalized the result by showing that RT depends on the entropy of the stimulus distribution, not merely the number of alternatives. When stimuli are unequally probable, RT is better predicted by H(X) = −Σ pᵢ · log₂(pᵢ) than by log₂(n), confirming the information-theoretic basis of the effect. The slope b, typically around 150 ms/bit, represents the reciprocal of the human processing rate.

Theoretical Interpretations

Several theoretical accounts explain Hick's law. The serial self-terminating search model assumes that the subject searches through alternatives one at a time, with each comparison taking a fixed amount of time. The iterative dichotomization model proposes that the subject narrows down the response by successive binary decisions, each taking time b. The accumulator model (Usher & McClelland, 2001) treats choice as a race between evidence accumulators, one per alternative, with the logarithmic increase arising from mutual inhibition among accumulators.
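The iterative dichotomization account can be made concrete: halving the candidate set on each binary decision isolates one of n alternatives in ⌈log₂ n⌉ steps, which is where the logarithmic RT growth comes from. A toy sketch (the function name is my own, not from the literature):

```python
import math

def dichotomization_steps(n):
    """Number of binary decisions needed to isolate one of n alternatives
    by repeatedly halving the candidate set (illustrative model only)."""
    steps = 0
    while n > 1:
        n = math.ceil(n / 2)  # keep the half containing the target
        steps += 1
    return steps

# If each step costs roughly b ms, RT grows as a + b*ceil(log2(n))
steps = [dichotomization_steps(n) for n in (2, 4, 8, 16)]  # [1, 2, 3, 4]
```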

Hick's Law in Human-Computer Interaction

Hick's law has practical applications in interface design. The time to select an item from a menu of n options is predicted by log₂(n + 1), where the +1 accounts for the option of not selecting. This principle guides the design of software menus, web navigation, and remote controls: hierarchical menus with fewer options at each level can reduce selection time compared to a single flat menu with many options, because each individual decision carries less entropy (the information summed across levels is comparable, but the per-decision load and the visual search within each screen are smaller).
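The per-decision comparison can be worked out directly from the log₂(n + 1) form. A small sketch, assuming a two-level hierarchy of 7 options per level versus one flat menu of 63 options:

```python
import math

def selection_bits(n):
    """Entropy (bits) of choosing among n menu options, with +1 for
    the option of not selecting, per the HCI form of Hick's law."""
    return math.log2(n + 1)

flat = selection_bits(63)        # log2(64) = 6 bits in one decision
per_level = selection_bits(7)    # log2(8)  = 3 bits at each of two levels
```

Note that 2 × 3 bits over two levels equals the flat menu's 6 bits in total; the hierarchical advantage lies in the smaller entropy of each individual decision, not in the summed information.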

Boundary Conditions

Hick's law has well-documented boundary conditions. It holds most cleanly for arbitrary stimulus-response mappings but breaks down when responses are highly compatible with stimuli (e.g., reaching toward the spatial location of a visual target), where RT increases little with n. Practice effects reduce the slope b, suggesting that overlearned responses bypass the information-processing bottleneck. The law also fails for tasks involving go/no-go decisions, where Donders' subtraction method is more appropriate.

Despite these limitations, Hick's law remains one of the most reliable quantitative laws in experimental psychology and one of the clearest demonstrations that human cognition can be productively analyzed using information-theoretic tools. The linear relationship between RT and entropy provides a direct psychophysical scaling of the "speed of thought."

Interactive Calculator: Hick's Law Fit

Each row records n_alternatives (the number of stimulus-response choices) and mean_rt (mean reaction time in ms). The calculator fits Hick's law: RT = a + b·log₂(n+1).

Click Calculate to see results, or Animate to watch the statistics update one record at a time.
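The fit the calculator performs is ordinary least squares on the transformed predictor log₂(n + 1). A self-contained sketch (the field names n_alternatives and mean_rt follow the description above; the implementation itself is my own, not the site's):

```python
import math

def fit_hick(data):
    """Ordinary least-squares fit of RT = a + b*log2(n+1).

    data: list of (n_alternatives, mean_rt_ms) pairs.
    Returns (a, b) with the slope b in ms/bit.
    """
    xs = [math.log2(n + 1) for n, _ in data]
    ys = [rt for _, rt in data]
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx          # slope: ms per bit
    a = my - b * mx        # intercept: residual (non-decision) time
    return a, b

# Noise-free data generated with a=200, b=150 recovers the parameters
demo = [(n, 200 + 150 * math.log2(n + 1)) for n in (1, 3, 7, 15)]
a, b = fit_hick(demo)      # a ≈ 200.0, b ≈ 150.0
```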

Interactive Calculator: Mutual Information

Each row provides a joint observation: x (stimulus category) and y (response category). The calculator computes mutual information I(X;Y) = H(X) + H(Y) − H(X,Y) from the observed frequencies.

Click Calculate to see results, or Animate to watch the statistics update one record at a time.
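The identity I(X;Y) = H(X) + H(Y) − H(X,Y) translates directly into code: estimate each entropy from the observed frequencies and combine. A minimal sketch of that computation (my own implementation, not the calculator's):

```python
import math
from collections import Counter

def mutual_information(pairs):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) in bits, from raw (x, y) pairs."""
    n = len(pairs)

    def entropy(counts):
        return -sum(c / n * math.log2(c / n) for c in counts.values())

    hx = entropy(Counter(x for x, _ in pairs))   # marginal H(X)
    hy = entropy(Counter(y for _, y in pairs))   # marginal H(Y)
    hxy = entropy(Counter(pairs))                # joint H(X,Y)
    return hx + hy - hxy

# Perfectly coupled stimulus and response: I(X;Y) = H(X) = 1 bit
mutual_information([(0, 0), (1, 1), (0, 0), (1, 1)])   # → 1.0
```

For independent X and Y the three entropies satisfy H(X,Y) = H(X) + H(Y), so the estimate falls to zero, as expected.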

References

  1. Hick, W. E. (1952). On the rate of gain of information. Quarterly Journal of Experimental Psychology, 4(1), 11–26. doi:10.1080/17470215208416600
  2. Hyman, R. (1953). Stimulus information as a determinant of reaction time. Journal of Experimental Psychology, 45(3), 188–196. doi:10.1037/h0056940
  3. Usher, M., & McClelland, J. L. (2001). The time course of perceptual choice: The leaky, competing accumulator model. Psychological Review, 108(3), 550–592. doi:10.1037/0033-295X.108.3.550
  4. Proctor, R. W., & Schneider, D. W. (2018). Hick's law for choice reaction time: A review. Quarterly Journal of Experimental Psychology, 71(6), 1281–1299. doi:10.1080/17470218.2017.1322622
