Foundations
Hebbian learning
Hebbian learning is a fundamental concept in neuroscience and artificial intelligence that describes how synaptic connections between neurons strengthen when they are activated simultaneously. In simpler terms, it's often summarized as "neurons that fire together, wire together."
Explanation
Hebbian learning, proposed by Donald Hebb in 1949, provides a biological model for associative learning. The core idea is that if two neurons are active at the same time, the strength of the connection (synapse) between them increases. Mathematically, this is often represented as Δwᵢⱼ = η · xᵢ · xⱼ, where Δwᵢⱼ is the change in synaptic weight between neurons i and j, η is the learning rate, and xᵢ and xⱼ are the activation levels of the neurons. This simple rule leads to the formation of neural networks that can learn patterns and associations from data.

In AI, Hebbian learning is used in various neural network architectures, particularly in unsupervised learning scenarios. It helps networks discover patterns in the input data without explicit labels, creating feature representations based on co-occurrence.

While the original Hebbian rule can lead to unbounded weight growth, variations and extensions incorporate weight decay or normalization techniques to stabilize learning and prevent divergence. Modern implementations often involve more complex learning rules, but the underlying principle of strengthening connections between simultaneously active neurons remains central to understanding how brains and artificial neural networks can learn and adapt.
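The ideas above can be sketched in a few lines of NumPy. This is an illustrative example, not a canonical implementation: it shows the plain Hebbian update Δw = η · x · y, and Oja's rule, one well-known normalized variant that curbs the unbounded weight growth mentioned above. The learning rate, data distribution, and function names here are all chosen for the demo.

```python
import numpy as np

def hebbian_update(w, x, y, eta=0.1):
    """Plain Hebbian rule: Δw = η · x · y. Weights can grow without bound."""
    return w + eta * x * y

def oja_update(w, x, eta=0.1):
    """Oja's rule: Hebbian term plus a decay that keeps ||w|| bounded."""
    y = w @ x                        # post-synaptic activation
    return w + eta * y * (x - y * w)

rng = np.random.default_rng(0)
# Synthetic inputs whose dominant direction of variation is along [1, 1]
data = rng.normal(size=(500, 2)) @ np.array([[2.0, 1.5], [1.5, 2.0]])

w = rng.normal(size=2)
for x in data:
    w = oja_update(w, x, eta=0.01)

# With Oja's decay term, the weight vector settles near unit length and
# aligns with the first principal component of the data.
print(np.linalg.norm(w))
```

Run repeatedly with the plain `hebbian_update` instead and the weight norm diverges, which is exactly why practical Hebbian systems add decay or normalization.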