Weight Decay
Concept
Weight decay is a regularization technique used in neural networks to prevent overfitting. It adds a penalty to the loss function proportional to the magnitude of the weights (typically their squared L2 norm), discouraging large weights.
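A minimal sketch of how weight decay can enter a gradient-descent update, assuming a decoupled formulation where each step shrinks the weights toward zero on top of the usual gradient step (the learning rate and decay coefficient below are illustrative values, not values from the source):

```python
import numpy as np

def sgd_step_with_weight_decay(w, grad, lr=0.1, wd=0.01):
    # Usual gradient step, plus a shrinkage term that pulls
    # every weight toward zero by a factor of lr * wd per step.
    return w - lr * grad - lr * wd * w

w = np.array([1.0, -2.0, 3.0])
grad = np.zeros_like(w)          # zero loss gradient: only the decay acts
w_next = sgd_step_with_weight_decay(w, grad)
# With grad = 0, each weight is multiplied by (1 - lr * wd) = 0.999
```

With the loss gradient zeroed out, the update isolates the decay: weights shrink geometrically toward zero, which is the regularizing effect named above.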
Videos Mentioning Weight Decay

Jeremy Howard: fast.ai Deep Learning Courses and Research | Lex Fridman Podcast #35
Lex Fridman

Foundations of Deep Learning (Hugo Larochelle, Twitter)
Lex Fridman
A term often used interchangeably with L2 regularization, which penalizes large weights.
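The interchangeability mentioned above holds exactly for plain SGD: the gradient of an L2 penalty (lambda / 2) * ||w||^2 is lambda * w, so adding the penalty to the loss produces the same update as decaying the weights directly. A small sketch, with illustrative hyperparameter values:

```python
import numpy as np

lr, lam = 0.1, 0.01
w = np.array([0.5, -1.5])
grad_loss = np.array([0.2, -0.3])   # hypothetical loss gradient

# L2 regularization: the penalty contributes lam * w to the gradient
w_l2 = w - lr * (grad_loss + lam * w)

# Weight decay: shrink the weights directly in the update rule
w_wd = w - lr * grad_loss - lr * lam * w

assert np.allclose(w_l2, w_wd)  # identical updates under plain SGD
```

For adaptive optimizers such as Adam the two formulations are no longer equivalent, since the penalty gradient gets rescaled by the adaptive terms while direct decay does not; this is the distinction AdamW makes explicit.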