TorchedUp

ML Basics

Start here if you want to build intuition for the math primitives every ML engineer should be able to implement from scratch. These are the layers and loss functions you'll find inside most neural networks, and many of them are interview classics.

12 problems · suggested order

  1. Numerically Stable Softmax (#1) · easy
  2. LogSumExp Trick (#214) · easy
  3. Sigmoid (#2) · easy
  4. ReLU (#3) · easy
  5. Cross-Entropy Loss (#4) · easy
  6. MSE Loss (#5) · easy
  7. KL Divergence (#16) · easy
  8. Dropout Forward (#11) · medium
  9. Batch Normalization (#6) · medium
  10. Layer Normalization (#7) · medium
  11. Adam Optimizer Step (#9) · medium
  12. AdamW (Decoupled Weight Decay) (#212) · medium
Tracks are curated by hand. The order above is the suggested learning progression — feel free to skip around if you already know a topic.
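As a taste of the first problem in the track, here is a minimal sketch of a numerically stable softmax, assuming a NumPy-style implementation (the function name and signature are illustrative, not the site's required interface):

```python
import numpy as np

def softmax(x: np.ndarray) -> np.ndarray:
    # Subtracting the row-wise max before exponentiating keeps exp() from
    # overflowing; the shift cancels out, so the probabilities are unchanged.
    z = x - np.max(x, axis=-1, keepdims=True)
    e = np.exp(z)
    return e / np.sum(e, axis=-1, keepdims=True)

# With large logits, a naive exp(1000) would overflow to inf;
# the shifted version still returns a valid distribution.
probs = softmax(np.array([1000.0, 1000.0, 1000.0]))
```

The same max-shift idea underlies the LogSumExp trick that appears next in the track.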

© 2026 TorchedUp. All rights reserved.
