TorchedUp › Problems › Layer Normalization (Medium)

Layer Normalization

Implement the layer normalization forward pass.

Signature: def layer_norm(x: np.ndarray, gamma: np.ndarray, beta: np.ndarray, eps: float = 1e-5) -> np.ndarray

Normalize across the feature dimension (last axis) for each sample independently.
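One possible implementation matching the signature above — a minimal NumPy sketch that computes the per-sample mean and (population) variance over the last axis, normalizes, then applies the learned scale and shift:

```python
import numpy as np

def layer_norm(x: np.ndarray, gamma: np.ndarray, beta: np.ndarray,
               eps: float = 1e-5) -> np.ndarray:
    # Statistics over the feature (last) axis; keepdims=True so they
    # broadcast back against x for any number of leading batch dims.
    mean = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)  # population variance (ddof=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta
```

With identity gamma (ones) and beta (zeros), each row of the output has near-zero mean, and adding a constant to a row leaves its output unchanged — the two invariance properties the tests below check.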

Math
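The equation here did not survive extraction; the standard layer-normalization forward pass, consistent with the description above (statistics over the last axis of each sample of dimension d), is:

```latex
\mu = \frac{1}{d}\sum_{i=1}^{d} x_i, \qquad
\sigma^2 = \frac{1}{d}\sum_{i=1}^{d}\left(x_i - \mu\right)^2, \qquad
y_i = \gamma_i \, \frac{x_i - \mu}{\sqrt{\sigma^2 + \epsilon}} + \beta_i
```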


Language: Python (NumPy)

Test Results

- identity scale/shift 1D
- 2D batch
- with scale and shift (Premium)
- shift-invariant: ln(x) == ln(x + c) (per-row mean subtracted)
- per-row output sums to 0 with identity gamma/beta (1D)