TorchedUp · Problems

Gradient Clipping (Easy)

Implement gradient clipping by global norm.

Signature: def clip_gradients(grads: list, max_norm: float) -> list

Compute the global L2 norm across all gradient arrays. If it exceeds max_norm, scale all gradients by max_norm / global_norm.
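One way the description above can be sketched in NumPy (an illustrative implementation, not the site's reference solution):

```python
import numpy as np

def clip_gradients(grads: list, max_norm: float) -> list:
    # Global L2 norm: sqrt of the sum of squared entries
    # across every gradient array.
    global_norm = np.sqrt(sum(float(np.sum(np.square(g))) for g in grads))
    if global_norm > max_norm:
        # Rescale every array by the same factor so the
        # global norm of the result equals max_norm.
        scale = max_norm / global_norm
        return [g * scale for g in grads]
    # No clipping needed: return the gradients unchanged.
    return grads
```

For example, `clip_gradients([np.array([3.0, 4.0])], 1.0)` has global norm 5, so it returns `[array([0.6, 0.8])]`, whose norm is exactly 1.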

Math

Let g_1, …, g_n be the gradient arrays. The global norm is

  global_norm = sqrt(Σ_i ‖g_i‖²)

If global_norm > max_norm, each gradient is rescaled:

  g_i ← g_i · (max_norm / global_norm)
Language: Python (NumPy)
Test Results

- clip needed
- no clip needed
- two grad arrays (Premium)
- clipped grads have norm equal to max_norm when input norm exceeds it
- idempotent when input norm ≤ max_norm (no clipping)