Mathematical Psychology

Richard Golden

Richard Golden has contributed to the mathematical foundations of neural network models, statistical methods for model selection in mathematical psychology, and information-theoretic approaches to cognitive modeling.

Richard M. Golden, working at the University of Texas at Dallas, has made contributions at the intersection of mathematical psychology, neural network theory, and statistical methodology. His work spans the mathematical analysis of neural network learning, the development of statistical methods for evaluating cognitive models, and the application of information-theoretic principles to understanding cognitive processes.

Statistical Methods for Model Selection

The Generalized Information Criterion (GIC) takes the form

    GIC = -2 log L(theta_hat) + 2 trace(J^(-1) K)

where

    L(theta_hat) = likelihood function evaluated at the maximum-likelihood estimate
    J = expected negative Hessian of the log-likelihood (sensitivity matrix)
    K = expected outer product of the score vectors (variability matrix)

When the model is correctly specified, J = K, so trace(J^(-1) K) equals the number of free parameters and GIC reduces to the Akaike Information Criterion (AIC).

Golden has developed generalized information-theoretic criteria for model selection that extend the Akaike Information Criterion (AIC) to settings where the candidate models may be misspecified, a common situation in cognitive modeling, where all models are known to be approximations. These methods provide robust model comparison tools that account for both model fit and complexity without assuming that any candidate model is the true data-generating process.
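As an illustrative sketch (not Golden's published implementation), the GIC above can be computed for a Gaussian model fit by maximum likelihood: J is estimated by averaging per-observation negative Hessians and K by averaging score outer products. When the data really are Gaussian, trace(J^(-1) K) is close to the parameter count (here 2), so the penalty approaches that of AIC.

```python
import numpy as np

def gic_normal(x):
    """GIC for a Gaussian model (parameters mu, sigma^2) fit by MLE.
    Illustrative sketch: J and K are estimated empirically from
    per-observation Hessians and score outer products."""
    n = len(x)
    mu = x.mean()
    s = x.var()                        # MLE of sigma^2
    r = x - mu

    loglik = -0.5 * n * np.log(2 * np.pi * s) - np.sum(r ** 2) / (2 * s)

    # Per-observation scores: d log f / d(mu, sigma^2)
    scores = np.column_stack([r / s, -1 / (2 * s) + r ** 2 / (2 * s ** 2)])
    K = scores.T @ scores / n          # variability matrix

    # Average negative Hessian of the log-likelihood (sensitivity matrix);
    # the off-diagonal term is exactly zero at the MLE since mean(r) = 0.
    J = np.array([[1 / s, np.mean(r) / s ** 2],
                  [np.mean(r) / s ** 2, -1 / (2 * s ** 2) + np.mean(r ** 2) / s ** 3]])

    return -2 * loglik + 2 * np.trace(np.linalg.solve(J, K))
```

For misspecified data (e.g. skewed samples fit by a Gaussian), trace(J^(-1) K) no longer equals the parameter count, and GIC and AIC diverge.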

Neural Network Models and Cognition

Golden's work on the mathematical analysis of neural network models has clarified the conditions under which networks converge to optimal solutions, the relationship between network architecture and representational capacity, and the statistical properties of network learning algorithms. This work provides rigorous mathematical foundations for connectionist models that are often analyzed primarily through simulation.
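A minimal sketch of this style of convergence analysis, assuming the simplest case of a single-layer linear network trained by batch gradient descent on squared error: because the objective is quadratic, the iterates provably converge to the least-squares optimum whenever the step size stays below 2 / lambda_max of the input correlation matrix. This illustrates the general flavor of such results, not Golden's specific theorems.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5))              # inputs
w_true = rng.normal(size=5)
y = X @ w_true                             # noise-free targets for clarity

# Stability condition: for the quadratic MSE objective, gradient descent
# converges iff the step size eta < 2 / lambda_max(X^T X / n).
H = X.T @ X / len(X)
eta = 1.0 / np.linalg.eigvalsh(H).max()    # safely inside the stable range

w = np.zeros(5)
for _ in range(2000):
    w -= eta * (X.T @ (X @ w - y)) / len(X)   # gradient of (1/2n)||Xw - y||^2

# The fixed point is the least-squares solution.
w_opt = np.linalg.lstsq(X, y, rcond=None)[0]
```

With a step size above 2 / lambda_max, the same iteration diverges, which is why such conditions matter for connectionist models usually studied only by simulation.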

Information Theory in Cognition

Golden has applied information-theoretic methods to problems in cognitive modeling, including the development of entropy-based measures for assessing model adequacy and the use of Kullback-Leibler divergence for quantifying the information loss when a model approximates the true data-generating process. These methods connect the practical task of model evaluation to deep theoretical concepts about the nature of information and statistical inference.
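The Kullback-Leibler idea can be made concrete with a small sketch (hypothetical distributions, chosen only for illustration): KL(p || q) measures the expected log information loss when a model distribution q is used to approximate the true distribution p, so a better model approximation has a smaller divergence.

```python
import numpy as np

def kl_divergence(p, q):
    """KL(p || q) for discrete distributions: the expected information
    loss, in nats, when q is used to approximate p."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0                     # 0 * log 0 = 0 by convention
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# A hypothetical "true" response distribution and two candidate models:
p_true  = [0.5, 0.3, 0.2]
q_close = [0.45, 0.35, 0.2]
q_far   = [0.2, 0.3, 0.5]
```

The divergence is zero only when the model matches the truth exactly, and the closer approximation q_close loses less information about p_true than q_far does.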

Legacy and Impact

Golden's contributions span the theoretical foundations of cognitive modeling. His textbook Statistical and Mathematical Methods for Data Analysis has provided researchers with rigorous treatments of the statistical methods needed for evaluating formal cognitive models. His work exemplifies the importance of sound statistical methodology in mathematical psychology, ensuring that conclusions drawn from model fitting are justified by the data.

References

  1. Golden, R. M. (1996). Mathematical methods for neural network analysis and design. MIT Press.
  2. Golden, R. M. (2000). Statistical tests for comparing possibly misspecified and nonnested models. Journal of Mathematical Psychology, 44(1), 153-170. doi:10.1006/jmps.1999.1281
  3. Golden, R. M. (2003). Discrepancy risk model selection test theory for comparing possibly misspecified or nonnested models. Psychometrika, 68(2), 229-249. doi:10.1007/BF02294799