Hands-on reimplementations from scratch; each entry below is a self-contained, annotated notebook.


Backpropagation & Autograd Engine

Backpropagation is the algorithm that computes gradients through a neural network by applying the chain rule backwards through the computation graph. Built here as a scalar-valued autograd engine — each operation tracks its inputs and knows how to propagate gradients — plus a minimal neural net library on top. ~200 lines of Python, zero dependencies.
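A minimal sketch of the core idea (class and method names here are illustrative, not necessarily the notebook's API): each value remembers its inputs and a closure that pushes its gradient into them, and `backward()` replays those closures in reverse topological order.

```python
import math

class Value:
    """A scalar that remembers how it was computed, so gradients can flow back."""
    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None   # how to push this node's grad into its parents
        self._prev = set(_children)

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            self.grad += out.grad          # d(a+b)/da = 1
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            self.grad += other.data * out.grad   # d(a*b)/da = b
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def tanh(self):
        t = math.tanh(self.data)
        out = Value(t, (self,))
        def _backward():
            self.grad += (1 - t ** 2) * out.grad   # d tanh(x)/dx = 1 - tanh(x)^2
        out._backward = _backward
        return out

    def backward(self):
        # topological order: every parent is visited before its _backward runs
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

# tiny check: y = tanh(w*x + b); gradients match the analytic chain rule
x, w, b = Value(0.5), Value(-2.0), Value(1.0)
y = (w * x + b).tanh()
y.backward()
print(x.grad, w.grad, b.grad)
```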

Bigram Language Model & Neural Net Equivalent

A bigram model predicts the next character based only on the current one — the simplest possible language model. Built first as a counting model (character pair frequencies → probabilities), then rebuilt as an equivalent single-layer neural network trained with gradient descent, showing both approaches converge to the same solution.
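The counting half in miniature (the toy word list and names below are placeholders, not the notebook's data): build a matrix of character-pair counts, normalize each row into a probability distribution, and sample from it. The neural version replaces this count matrix with a weight matrix learned by gradient descent on the same negative log-likelihood.

```python
import numpy as np

# toy corpus; the notebook presumably trains on a much larger word list
words = ["emma", "olivia", "ava", "isabella", "sophia"]

chars = sorted(set("".join(words)))
stoi = {c: i + 1 for i, c in enumerate(chars)}
stoi["."] = 0                      # '.' marks both the start and end of a word
itos = {i: c for c, i in stoi.items()}
V = len(stoi)

# count how often each character follows each other character
N = np.zeros((V, V), dtype=np.int64)
for w in words:
    seq = ["."] + list(w) + ["."]
    for a, b in zip(seq, seq[1:]):
        N[stoi[a], stoi[b]] += 1

# rows -> probabilities, with add-1 smoothing so no pair has zero probability
P = (N + 1) / (N + 1).sum(axis=1, keepdims=True)

# sample a new "word" one character at a time until the end token comes up
rng = np.random.default_rng(0)
ix, out = 0, []
while True:
    ix = rng.choice(V, p=P[ix])
    if ix == 0:
        break
    out.append(itos[ix])
print("".join(out))
```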

JPEG Compression

JPEG is a lossy image compression standard that discards fine detail the eye barely notices. Built here from scratch: colour space conversion, 8×8 DCT blocks, quantization, zig-zag scanning, run-length encoding, and Huffman coding.
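A sketch of the lossy core, the 8×8 DCT plus quantization, assuming a single greyscale block (the table is the standard JPEG luminance quantization table; zig-zag scanning, run-length encoding, and Huffman coding are omitted here).

```python
import numpy as np

def dct_matrix(n=8):
    # orthonormal DCT-II basis: C @ block @ C.T gives the 2-D DCT of an n x n block
    k = np.arange(n)
    C = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n)) * np.sqrt(2.0 / n)
    C[0] /= np.sqrt(2.0)
    return C

# standard JPEG luminance quantization table
Q_LUMA = np.array([
    [16, 11, 10, 16,  24,  40,  51,  61],
    [12, 12, 14, 19,  26,  58,  60,  55],
    [14, 13, 16, 24,  40,  57,  69,  56],
    [14, 17, 22, 29,  51,  87,  80,  62],
    [18, 22, 37, 56,  68, 109, 103,  77],
    [24, 35, 55, 64,  81, 104, 113,  92],
    [49, 64, 78, 87, 103, 121, 120, 101],
    [72, 92, 95, 98, 112, 100, 103,  99],
])

def encode_block(block, C, Q):
    # centre on zero, transform, then quantize (the lossy step)
    coeffs = C @ (block.astype(np.float64) - 128.0) @ C.T
    return np.round(coeffs / Q).astype(np.int32)

def decode_block(q, C, Q):
    # dequantize and inverse-transform back to pixel values
    return np.clip(C.T @ (q * Q) @ C + 128.0, 0, 255).astype(np.uint8)

C = dct_matrix()
block = (np.arange(64).reshape(8, 8) * 3 + 20).astype(np.uint8)  # fake 8x8 luma block
q = encode_block(block, C, Q_LUMA)
print("non-zero coefficients:", np.count_nonzero(q))  # most high-frequency terms vanish
print(decode_block(q, C, Q_LUMA))
```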

Audio Fingerprinting

Audio fingerprinting identifies a song from a short, noisy clip by hashing the time-frequency peaks in its spectrogram. Built here from scratch — the same approach Shazam uses: FFT, STFT spectrogram, peak detection, and a hash-based song database.
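A toy end-to-end sketch (the parameters, function names, and synthetic signal are illustrative, not the notebook's or Shazam's): STFT magnitude spectrogram, naive local-maximum peak picking, and hashes formed by pairing each peak with a few later peaks.

```python
import numpy as np
from collections import defaultdict

def spectrogram(x, n_fft=1024, hop=512):
    # STFT magnitude: Hann-windowed frames, FFT of each, keep positive frequencies
    win = np.hanning(n_fft)
    frames = [x[i:i + n_fft] * win for i in range(0, len(x) - n_fft, hop)]
    return np.abs(np.fft.rfft(np.array(frames), axis=1))   # shape (time, freq)

def peaks(S, thresh_db=20):
    # crude peak picking: a bin must beat its 8 neighbours and a global floor
    floor = S.max() * 10 ** (-thresh_db / 20)
    found = []
    for t in range(1, S.shape[0] - 1):
        for f in range(1, S.shape[1] - 1):
            v = S[t, f]
            if v > floor and v == S[t-1:t+2, f-1:f+2].max():
                found.append((t, f))
    return found

def fingerprints(pks, fan_out=5):
    # pair each peak with a few later peaks; (f1, f2, dt) becomes the hash key
    pks = sorted(pks)
    for i, (t1, f1) in enumerate(pks):
        for t2, f2 in pks[i + 1:i + 1 + fan_out]:
            yield (f1, f2, t2 - t1), t1

# index a synthetic "song", then look up a clip taken from its middle
sr = 8000
t = np.arange(0, 5, 1 / sr)
song = np.sin(2 * np.pi * 440 * t) + 0.5 * np.sin(2 * np.pi * 880 * t ** 1.1)
db = defaultdict(list)
for h, t1 in fingerprints(peaks(spectrogram(song))):
    db[h].append(("song_a", t1))

clip = song[2 * sr:3 * sr]
hits = [m for h, _ in fingerprints(peaks(spectrogram(clip))) for m in db.get(h, [])]
print("hash matches for song_a:", sum(1 for name, _ in hits if name == "song_a"))
```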