Rectified Linear Units (ReLU) in Deep Learning

The Rectified Linear Unit (ReLU) is the most commonly used activation function in deep learning models. The function returns 0 for any negative input; for any positive input x, it returns x unchanged. It can therefore be written as f(x) = max(0, x).
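As a minimal sketch, ReLU can be implemented in a single line with NumPy; the function name `relu` and the sample inputs below are illustrative, not part of any particular library:

```python
import numpy as np

def relu(x):
    """Rectified Linear Unit: zeros out negative inputs, passes positive ones through."""
    return np.maximum(0, x)

# Negative values map to 0; positive values are returned unchanged.
inputs = np.array([-3.0, -0.5, 0.0, 2.0, 7.1])
print(relu(inputs))  # [0.  0.  0.  2.  7.1]
```

Note that `np.maximum(0, x)` computes the elementwise maximum against 0, so the same function works on scalars, vectors, or whole activation tensors.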
