
ResNet – Residual Network

I’m building a super tall tower out of Lego blocks. Each block is a layer in a neural network. The taller the tower, the more complex patterns it can learn.

But there’s a problem: tall towers collapse.

As I stack more layers, the network starts forgetting what it learned earlier. It gets confused and performs worse — even though it’s “deeper.”

“That’s called the vanishing gradient problem — the signal fades as it travels backward through the layers.”

“But why not give the signal a shortcut?” the dragon said. Bibo, surprised, asked, “What’s that?”

“Use ResNet. Just like in Mario, ResNet adds shortcut paths (called SKIP or RESIDUAL connections) that let information jump over layers.”

“So, instead of forcing every layer to learn something new, ResNet lets layers pass along what they already know?”

Yes. A residual block adds the layer’s original input x back to the layer’s output F(x), so the block produces F(x) + x. Like saying: “Here’s what I learned, plus what I already knew.”
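The F(x) + x idea can be sketched in a few lines of plain Python. This is a minimal toy, not real ResNet code: the “layer” here is a made-up linear-plus-ReLU stand-in (actual ResNet blocks use convolutions and batch normalization), but the skip connection works the same way.

```python
import numpy as np

def layer(x, w):
    # A toy layer F(x): linear transform followed by ReLU.
    # (An assumption for illustration; real blocks use convolutions.)
    return np.maximum(0, w @ x)

def residual_block(x, w):
    # The skip connection: add the original input x back to the
    # layer's output, so the block computes F(x) + x.
    return layer(x, w) + x

x = np.ones(3)
w = np.zeros((3, 3))          # a layer that has learned nothing: F(x) = 0
print(residual_block(x, w))   # the input still passes through unchanged
```

Notice what happens when the layer contributes nothing: the block simply hands the input forward. That is exactly why stacking more residual blocks can’t make the network worse than a shallower one — at worst, a block learns to do nothing.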

“So cool. Then I can build deeper networks without forgetting earlier knowledge!”

“And faster training because gradients flow more easily.”
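The “gradients flow more easily” claim follows directly from the formula: the block’s output is F(x) + x, so its derivative is F′(x) + 1. Even when F′(x) shrinks toward zero in a deep stack, the “+1” from the skip path keeps the backward signal alive. A tiny sketch with a hypothetical scalar layer:

```python
def f(x):
    # A "dead" layer whose gradient is zero everywhere
    # (hypothetical worst case for illustration).
    return 0.0 * x

def block(x):
    # Residual block: F(x) + x
    return f(x) + x

# Finite-difference gradient of the block at x = 1.0
eps = 1e-6
grad = (block(1.0 + eps) - block(1.0 - eps)) / (2 * eps)
print(round(grad, 4))  # the skip path alone keeps the gradient at 1.0
```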

Hello, my tall tower!
