Mathematical Psychology

Self-Organizing Maps

Kohonen's self-organizing map is an unsupervised neural network that learns a topographic mapping from high-dimensional input space to a low-dimensional grid, preserving neighborhood relationships through competitive learning.

wᵢ(t+1) = wᵢ(t) + α(t) · h(i, i*, t) · [x(t) − wᵢ(t)]

The self-organizing map (SOM), developed by Teuvo Kohonen (1982, 2001), is an unsupervised neural network algorithm that produces a low-dimensional (typically two-dimensional) discretized representation of the input space. The SOM consists of a grid of nodes, each associated with a weight vector of the same dimensionality as the input data. Through a process of competitive learning and topographic ordering, the map comes to reflect the statistical structure of the input, with similar inputs mapping to nearby nodes on the grid.

The Learning Algorithm

SOM Learning Rule

1. Present input vector x(t)
2. Find best-matching unit (BMU): i* = argminᵢ ‖x(t) − wᵢ(t)‖
3. Update weights: wᵢ(t+1) = wᵢ(t) + α(t) · h(i, i*, t) · [x(t) − wᵢ(t)]

where α(t) is the learning rate (decreasing over time) and h(i, i*, t) is the neighborhood function, typically a Gaussian over the grid positions rᵢ of the nodes:

h(i, i*, t) = exp(−‖rᵢ − rᵢ*‖² / (2σ(t)²))

Learning proceeds through competition and cooperation. When an input is presented, the node whose weight vector is most similar to the input "wins" the competition — it is the best-matching unit (BMU). The BMU and its topographic neighbors then adjust their weight vectors toward the input, with the degree of adjustment decreasing with distance from the BMU according to the neighborhood function. Both the learning rate α(t) and the neighborhood radius σ(t) decrease over time, so early learning establishes the global topology while later learning fine-tunes the local structure.
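The competition–cooperation loop above can be sketched in a few lines of NumPy. This is a minimal illustrative implementation, not Kohonen's reference code; the grid size, linear decay schedules, and iteration count are arbitrary choices for the example.

```python
import numpy as np

def train_som(data, grid_h=10, grid_w=10, n_iters=2000,
              alpha0=0.5, sigma0=5.0, seed=0):
    """Online SOM training: one randomly drawn input per iteration."""
    rng = np.random.default_rng(seed)
    dim = data.shape[1]
    # One weight vector w_i per grid node, randomly initialized.
    weights = rng.random((grid_h * grid_w, dim))
    # Grid position r_i of each node, used by the neighborhood function.
    coords = np.array([(r, c) for r in range(grid_h)
                       for c in range(grid_w)], dtype=float)
    for t in range(n_iters):
        x = data[rng.integers(len(data))]
        # Competition: the BMU i* minimizes the distance to x(t).
        bmu = np.argmin(np.linalg.norm(weights - x, axis=1))
        # Decay the learning rate alpha(t) and radius sigma(t) linearly.
        frac = t / n_iters
        alpha = alpha0 * (1.0 - frac)
        sigma = sigma0 * (1.0 - frac) + 0.5
        # Cooperation: Gaussian neighborhood h(i, i*, t) on the grid.
        d2 = np.sum((coords - coords[bmu]) ** 2, axis=1)
        h = np.exp(-d2 / (2.0 * sigma ** 2))
        # Adaptation: move every weight toward the input, scaled by h.
        weights += alpha * h[:, None] * (x - weights)
    return weights
```

Because σ(t) starts large, early updates drag most of the map toward each input (establishing global order), while the shrinking radius confines later updates to the BMU's immediate neighbors (local fine-tuning), exactly as described above.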

Topographic Organization and Cognitive Relevance

The key emergent property of the SOM is topographic organization: inputs that are similar in the high-dimensional input space are mapped to nearby locations on the two-dimensional grid. This property mirrors the topographic maps observed throughout the cortex — retinotopic maps in visual cortex, tonotopic maps in auditory cortex, and somatotopic maps in somatosensory cortex — suggesting that SOM-like competitive learning could be a mechanism by which cortical maps self-organize during development.

SOMs in Categorization Research

In mathematical psychology, SOMs have been used to model category formation, phonetic category learning, and the development of semantic maps. The unsupervised nature of SOMs makes them suitable for modeling how organisms discover categorical structure in continuous sensory input without explicit feedback. For example, SOMs trained on acoustic speech features develop regions corresponding to phonetic categories, mirroring how infants learn to partition continuous acoustic space into discrete phonemic categories during the first year of life.

Unlike supervised networks trained with backpropagation, SOMs discover structure without labeled training data, making them natural models of perceptual learning and unsupervised category discovery. The SOM's reliance on competitive learning, in which units compete to respond to each input, connects it to biological winner-take-all mechanisms mediated by lateral inhibition. Theoretical analyses have shown that, as the neighborhood radius shrinks, the SOM approximately performs a form of vector quantization that minimizes distortion, linking it to information-theoretic principles of efficient coding.
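The vector-quantization reading can be made concrete: treating the map's weight vectors as a codebook, the distortion in question is the mean distance from each input to its best-matching unit. A brief sketch (the function name and the small example codebook are illustrative, not standard terminology):

```python
import numpy as np

def quantization_error(data, codebook):
    """Mean Euclidean distance from each input to its nearest
    codebook (weight) vector -- the distortion a SOM approximately
    minimizes once its neighborhood radius has shrunk."""
    # Pairwise distances: data (n, d) vs codebook (k, d) -> (n, k).
    dists = np.linalg.norm(data[:, None, :] - codebook[None, :, :], axis=2)
    return dists.min(axis=1).mean()
```

Tracking this quantity over training gives a simple check that the map is actually fitting the input distribution, complementing visual inspection of the grid.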

References

  1. Kohonen, T. (1982). Self-organized formation of topologically correct feature maps. Biological Cybernetics, 43(1), 59–69. doi:10.1007/BF00337288
  2. Kohonen, T. (2001). Self-organizing maps (3rd ed.). Springer. doi:10.1007/978-3-642-56927-2
  3. Miikkulainen, R., Bednar, J. A., Choe, Y., & Sirosh, J. (2005). Computational maps in the visual cortex. Springer. doi:10.1007/0-387-28806-6
  4. Guenther, F. H., & Gjaja, M. N. (1996). The perceptual magnet effect as an emergent property of neural map formation. Journal of the Acoustical Society of America, 100(2), 1111–1121. doi:10.1121/1.416296