ReLU

Concept

Rectified Linear Unit, a common non-linearity used in neural networks, defined as f(x) = max(0, x). Because its gradient does not saturate for positive inputs, it typically allows faster training than sigmoid and tanh activations.
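
For illustration, a minimal NumPy sketch of the function (the name relu and the sample inputs are placeholders, not from the source):

    import numpy as np

    def relu(x):
        # Pass positive values through unchanged; clamp negatives to zero.
        return np.maximum(0.0, x)

    x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
    print(relu(x))  # [0.  0.  0.  1.5 3. ]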

Mentioned in 2 videos