The Rectified Linear Unit (ReLU) is the most commonly used activation function in deep learning models. The function returns 0 for any negative input; for any positive input x, it returns that value unchanged. It can therefore be written as f(x) = max(0, x).
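As a concrete illustration, here is a minimal NumPy sketch of ReLU; the function name `relu` and the sample inputs are illustrative, not taken from the original.

```python
import numpy as np

def relu(x):
    # Element-wise max(0, x): negative entries become 0, positive entries pass through.
    return np.maximum(0, x)

# Illustrative inputs: mix of negative, zero, and positive values.
inputs = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(inputs))  # [0.  0.  0.  1.5 3. ]
```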