ReLU
Concept / Technique
Rectified Linear Unit, an activation function that outputs max(0, x), widely used in deep neural networks.
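A minimal sketch of the function itself, using NumPy purely for illustration (the library choice is an assumption, not something taken from the videos):

```python
import numpy as np

def relu(x):
    # ReLU keeps positive values and clamps everything below zero to 0: max(0, x).
    return np.maximum(0.0, x)

print(relu(np.array([-2.0, -0.5, 0.0, 1.5, 3.0])))
# -> [0.  0.  0.  1.5 3. ]
```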
Mentioned in 3 videos
Videos Mentioning ReLU

Ep 18: Petaflops to the People — with George Hotz of tinycorp
Latent Space
George Hotz dislikes the object-oriented implementation of ReLU in PyTorch, preferring a functional approach.
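As a rough illustration of the distinction Hotz draws, PyTorch exposes ReLU both as an nn.Module and as a plain function; the snippet below contrasts the two styles (the example tensor is made up for illustration):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

x = torch.tensor([-1.0, 0.0, 2.0])

# Object-oriented style: ReLU wrapped in a stateless nn.Module layer.
relu_layer = nn.ReLU()
print(relu_layer(x))   # tensor([0., 0., 2.])

# Functional style: a plain function call, no layer object to construct.
print(F.relu(x))       # tensor([0., 0., 2.])
```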

deeplearning.ai's Heroes of Deep Learning: Yoshua Bengio
DeepLearningAI
Rectified Linear Unit, an activation function that worked better than expected in deep nets, surprising Bengio.

MIT 6.S094: Recurrent Neural Networks for Steering Through Time
Lex Fridman
Rectified Linear Unit, a popular activation function in neural networks, discussed in the context of potential issues such as the zero gradient it produces for negative inputs.
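To make the zero-gradient point concrete, here is a small PyTorch sketch (the input values are hypothetical) showing that inputs in the negative regime receive no gradient through ReLU:

```python
import torch

x = torch.tensor([-2.0, 3.0], requires_grad=True)
y = torch.relu(x).sum()
y.backward()

# The gradient is 0 for the negative input and 1 for the positive one,
# so a unit stuck in the negative regime stops receiving updates.
print(x.grad)   # tensor([0., 1.])
```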