Question-48

calculus
derivatives
neural networks
activation function
DA-2025

Which of the following statements is/are correct about the rectified linear unit (ReLU) activation function defined as \(\text{ReLU}(x)=\max(x,0)\), where \(x\in\mathbb{R}\)?

ReLU is piecewise linear. It is continuous everywhere but not differentiable at \(x=0\): the left-hand derivative there is \(0\) while the right-hand derivative is \(1\), so the two one-sided derivatives disagree. The last option is clearly incorrect.
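As a quick numerical illustration of the rationale above, here is a minimal NumPy sketch (the function name `relu` and the step size `h` are illustrative choices, not part of the question) that compares the one-sided difference quotients of ReLU at \(x=0\):

```python
import numpy as np

def relu(x):
    # ReLU(x) = max(x, 0), applied elementwise
    return np.maximum(x, 0.0)

# One-sided difference quotients at x = 0 expose the kink:
# the left-hand slope is 0 and the right-hand slope is 1,
# so ReLU is continuous at 0 but not differentiable there.
h = 1e-6
left_slope = (relu(0.0) - relu(-h)) / h    # -> 0.0
right_slope = (relu(h) - relu(0.0)) / h    # -> 1.0
print(left_slope, right_slope)
```

Running this prints `0.0 1.0`, matching the mismatched one-sided derivatives at the origin.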