AI Glossary

ResNet

A deep CNN architecture using skip connections that enabled training of networks with 100+ layers.

Overview

ResNet (Residual Network), introduced by He et al. in 2015, revolutionized deep learning by addressing the degradation problem — the observation that beyond a certain depth, adding layers to a plain network increased even the training error, indicating an optimization difficulty rather than overfitting. ResNet's key innovation is the skip connection (residual connection): the input to a block is added to its output, so the stacked layers only need to learn the residual F(x) = H(x) − x rather than the full mapping H(x).
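The residual computation described above can be sketched in a few lines of plain Python. This is a minimal toy, not the real architecture: F here is a single hypothetical weighted ReLU transform standing in for a block's conv/BN/ReLU layers, and `weight` is an assumed scalar parameter.

```python
def relu(v):
    return [max(0.0, x) for x in v]

def residual_block(x, weight):
    # F(x): a toy stand-in for the block's learned transformation
    fx = relu([weight * xi for xi in x])
    # Skip connection: add the input back onto the block's output
    return [fxi + xi for fxi, xi in zip(fx, x)]

x = [1.0, -2.0, 3.0]
# With weight = 0, F(x) = 0 and the block reduces to the identity:
print(residual_block(x, 0.0))  # -> [1.0, -2.0, 3.0]
print(residual_block(x, 0.5))  # -> [1.5, -2.0, 4.5]
```

The identity case illustrates why residual learning eases optimization: a block can "do nothing" simply by driving F toward zero, which a stack of plain nonlinear layers finds hard to represent.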

Key Details

Skip connections enable gradient flow through very deep networks (experiments in the original paper reached over 1,000 layers), alleviating the vanishing gradient problem. ResNet-50 and ResNet-101 became standard backbones for image classification, object detection, and segmentation. The residual connection has since been adopted in virtually all modern architectures, including transformers.

Related Concepts

CNN · skip connection · vanishing gradient


Last updated: March 5, 2026