Machine Learning Module Wiki
Map of the module’s weekly themes and key concepts.
Weeks
- week 01 — What Does It Mean to Learn from Data?
- week 02 — Solving Logistic Regression: From MLE to IRLS
- week 03 — Beyond the Line: Basis Expansion and Maximum Margin
- week 04 — Going Dual: Lagrangians, Kernels, and the Trick That Makes Them Tractable
- week 05 — Allowing Mistakes: Soft Margins and How to Actually Solve the QP
- week 06 —
- week 07 — Linear Regression and Why Squared Error Isn’t Arbitrary
- week 08 — From Bayesian Priors to Generalization Bounds
- week 09 —
- week 10 —
- week 11 —
- week 12 —
Cross-Week Topics
- optimization algorithms — GD vs Newton-Raphson/IRLS vs SMO: when each is the right tool, and why
- classification approaches — Logistic regression vs hard-margin SVM vs soft-margin SVM: the same linear hypothesis class, trained under different criteria
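The optimization thread above can be made concrete with a small sketch (not course code; the dataset, learning rate, and step counts are illustrative assumptions): plain gradient descent takes many cheap first-order steps and needs a step size, while Newton-Raphson/IRLS takes a few expensive second-order steps and needs none. Both fit the same logistic regression on toy data and land on the same weights.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_gd(X, y, lr=0.1, steps=10000):
    """First-order: many cheap steps, tuned step size."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        # gradient of the negative log-likelihood, averaged over samples
        w -= lr * X.T @ (sigmoid(X @ w) - y) / len(y)
    return w

def fit_irls(X, y, steps=20):
    """Second-order (Newton-Raphson / IRLS): few expensive steps, no step size."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = sigmoid(X @ w)
        W = p * (1 - p)                 # per-sample Hessian weights
        H = X.T @ (W[:, None] * X)      # Hessian of the negative log-likelihood
        w -= np.linalg.solve(H, X.T @ (p - y))
    return w

# Toy 1-D data with an intercept column; deliberately NOT linearly
# separable, so the MLE is finite and the Hessian stays invertible.
X = np.column_stack([np.ones(6), np.arange(6.0)])
y = np.array([0., 0., 1., 0., 1., 1.])

w_gd, w_irls = fit_gd(X, y), fit_irls(X, y)
```

Running both shows `w_gd` and `w_irls` agreeing to a few decimal places, with IRLS needing orders of magnitude fewer iterations, which is the trade-off the cross-week page develops.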
Exam Prep
- flashcards
- past papers
- revision guide