ReLU
Concept
Rectified Linear Unit, a common non-linearity used in neural networks that typically trains faster than the sigmoid and tanh activations.
Mentioned in 2 videos
Videos Mentioning ReLU

Deep Learning for Computer Vision (Andrej Karpathy, OpenAI)
Lex Fridman
Rectified Linear Unit, a common non-linearity used in neural networks that typically trains faster than the sigmoid and tanh activations.

Foundations of Deep Learning (Hugo Larochelle, Twitter)
Lex Fridman
Rectified Linear Unit activation function, which outputs 0 for negative inputs and passes positive inputs through unchanged. It is popular for its computational simplicity and for the sparsity it introduces in activations.
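
The piecewise behavior described above (0 for negative inputs, the input itself for positive inputs) can be sketched in a few lines of NumPy; the function name relu here is illustrative, not from any of the videos:

```python
import numpy as np

def relu(x):
    # ReLU: clamp negative values to 0, pass positive values through unchanged
    return np.maximum(0, x)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))  # -> [0.  0.  0.  1.5 3. ]
```

Because the operation is a single elementwise max, it is cheap to compute and its gradient is simply 0 or 1, which is one reason training tends to be faster than with sigmoid or tanh.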