ReLU
Rectified Linear Unit, a common non-linearity used in neural networks that allows for faster training than sigmoid and tanh activations.
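For reference, the function and its derivative can be written as follows (a standard textbook formulation, not quoted from any of the videos below):

\mathrm{ReLU}(x) = \max(0, x), \qquad \frac{d}{dx}\,\mathrm{ReLU}(x) = \begin{cases} 1 & x > 0 \\ 0 & x \le 0 \end{cases}

The derivative at x = 0 is undefined and is conventionally taken as 0 in implementations. Because the gradient is exactly 1 for every positive input, it does not vanish the way saturating sigmoid and tanh gradients do, which is the usual explanation for the faster training.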
Videos Mentioning ReLU

Deep Learning for Computer Vision (Andrej Karpathy, OpenAI)
Lex Fridman

Foundations of Deep Learning (Hugo Larochelle, Twitter)
Lex Fridman
Rectified Linear Unit activation function, which outputs 0 for negative inputs and the input value for positive inputs. It's popular due to its computational simplicity and ability to introduce sparsity.
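A minimal sketch of this behavior in plain NumPy (an illustration, not code from the lecture):

import numpy as np

def relu(x):
    # Element-wise max(0, x): negative entries become exactly 0,
    # positive entries pass through unchanged.
    return np.maximum(0.0, x)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))                  # [0.  0.  0.  1.5 3. ]

# The exact zeros are the "sparsity" mentioned above: a fraction of the
# units in a layer output exactly 0 and are inactive for a given input.
print(np.mean(relu(x) == 0.0))  # 0.6 for this toy input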

Kevin Systrom: Instagram | Lex Fridman Podcast #243
Lex Fridman
Rectified Linear Unit, an activation function frequently used in neural networks, described in the conversation as a 'general good idea' in network architecture.

Stanford CS336 Language Modeling from Scratch | Spring 2026 | Lecture 5: GPUs, TPUs
Stanford Online
Rectified Linear Unit, an activation function in neural networks, mentioned in the context of control divergence and low-precision computation (as an element-wise max, ReLU can be computed without data-dependent branching, so it does not cause threads within a GPU warp to diverge).