Neural Networks

Jump connections

Jump connections, also known as skip connections, are connections in neural networks that allow the output of one layer to bypass one or more intermediate layers and feed directly into a later layer. This creates a 'shortcut' through the network. They are a key component of architectures such as ResNet and DenseNet.

Explanation

Jump connections address the vanishing gradient problem, which can hinder the training of very deep neural networks. During backpropagation, gradients can become increasingly small as they pass through many layers, making it difficult for earlier layers to learn effectively. By allowing activations to 'jump' over layers, jump connections provide an alternative path for gradients to flow, mitigating the vanishing gradient problem and enabling the training of much deeper networks. They also help preserve the original input information as it propagates through the network.

The two best-known variants combine the shortcut with the layer output differently. In ResNet, the input to a block is added element-wise to the block's output (addition), which requires the two to have matching shapes. In DenseNet, the input is concatenated with the block's output along the feature dimension (concatenation), so each layer has access to the feature maps of all preceding layers.

Jump connections have been shown to significantly improve the performance and trainability of deep neural networks across a range of tasks, including image recognition, natural language processing, and speech recognition.
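The addition and concatenation variants described above can be sketched in a few lines of NumPy. This is a minimal illustration with made-up weights and shapes, not the actual ResNet or DenseNet implementations: note how the ResNet-style block preserves the feature dimension, while the DenseNet-style block grows it.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def residual_block(x, W1, W2):
    # ResNet-style: the block's input is added element-wise to its
    # output (an identity shortcut), so shapes must match.
    out = relu(x @ W1)
    out = out @ W2
    return relu(out + x)

def dense_block(x, W):
    # DenseNet-style: the input is concatenated with the new features,
    # so later layers see the feature maps of all preceding layers.
    new_features = relu(x @ W)
    return np.concatenate([x, new_features], axis=-1)

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))          # batch of 4 examples, 8 features
W1 = rng.standard_normal((8, 8)) * 0.1   # hypothetical weights
W2 = rng.standard_normal((8, 8)) * 0.1
W = rng.standard_normal((8, 8)) * 0.1

res_out = residual_block(x, W1, W2)      # shape preserved: (4, 8)
dense_out = dense_block(x, W)            # features grow: (4, 16)
```

Because the shortcut carries the input through unchanged, the gradient of the loss with respect to `x` always has a direct path that skips the weighted layers, which is why these connections ease training in deep stacks.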

Related Terms