KL Divergence (Easy)

Implement the KL divergence KL(P || Q).

Signature: def kl_divergence(P: np.ndarray, Q: np.ndarray) -> float

Clip Q values to avoid log(0). Both P and Q are valid probability distributions (sum to 1).

Math

KL(P || Q) = Σ_i P(i) · log(P(i) / Q(i)), with the convention that terms where P(i) = 0 contribute 0.
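
A minimal sketch of one way to satisfy the signature, assuming the clipping hint above; the constant EPS and the __main__ usage example are illustrative choices, not part of the problem statement.

import numpy as np

EPS = 1e-12  # assumed small constant used to guard against log(0)

def kl_divergence(P: np.ndarray, Q: np.ndarray) -> float:
    """Compute KL(P || Q) = sum_i P[i] * log(P[i] / Q[i])."""
    Q = np.clip(Q, EPS, None)  # avoid log(0) / division by zero
    # Terms with P[i] == 0 contribute 0 by convention, so mask them out.
    mask = P > 0
    return float(np.sum(P[mask] * np.log(P[mask] / Q[mask])))

if __name__ == "__main__":
    P = np.array([0.5, 0.5])
    Q = np.array([0.9, 0.1])
    print(kl_divergence(P, P))  # 0.0 for identical distributions
    print(kl_divergence(P, Q))  # positive for differing distributions

Because Q is clipped away from zero and the zero-probability terms of P are masked out, the sum is always finite and non-negative, which is what the non-negativity test cases below check.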

Language: Python (numpy)

Test Results

- identical distributions
- P degenerate vs uniform
- non-trivial divergence
- KL is non-negative
- KL is non-negative (large divergence)
- KL(P || P) = 0