TorchedUp
ReLU & Variants (Easy)

ReLU

Implement ReLU (Rectified Linear Unit).

Signature: def relu(x: np.ndarray) -> np.ndarray

Math

ReLU(x) = max(0, x), applied element-wise.


Test Results

- mixed values
- all negative
- all positive
- output is non-negative
- idempotent: relu(relu(x)) == relu(x)
- preserves argmax for positive inputs
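One minimal sketch that satisfies the signature and the properties listed above, using `np.maximum` for the element-wise max against zero:

```python
import numpy as np

def relu(x: np.ndarray) -> np.ndarray:
    """Element-wise Rectified Linear Unit: max(0, x).

    Negative entries are clamped to 0; non-negative entries pass
    through unchanged, which also makes the function idempotent.
    """
    return np.maximum(x, 0)
```

Example: `relu(np.array([-2.0, 0.0, 3.0]))` returns `array([0., 0., 3.])`. Because non-negative inputs are unchanged, applying `relu` twice gives the same result as applying it once, and the argmax of an all-positive input is preserved.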