This repo contains from-scratch NumPy implementations of several ML algorithms, initially designed for the MNIST digit classification task (~3% error, averaged over 20 runs of 5-fold cross-validation):
- kNN
- 3-layer MLP (ReLU and softmax activations) with cross-entropy loss
- Least Squares
- Winnow
- One-vs-One and One-vs-All multiclass kernel Perceptron
- Random train/test split
- Cross Validation
- Gram Matrix (for polynomial and Gaussian kernels)
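As a rough illustration of the Gram matrix utility, a minimal NumPy sketch is below. The function name, argument order, and hyperparameter defaults are assumptions for this example, not necessarily the repo's actual signatures:

```python
import numpy as np

def gram_matrix(X, Y, kernel="gaussian", degree=3, gamma=0.05):
    """Pairwise kernel evaluations between rows of X (n, d) and Y (m, d).

    Note: names and defaults here are illustrative, not the repo's API.
    """
    if kernel == "polynomial":
        # Homogeneous polynomial kernel: (x . y)^degree
        return (X @ Y.T) ** degree
    # Gaussian (RBF) kernel: exp(-gamma * ||x - y||^2), computed via the
    # expansion ||x - y||^2 = ||x||^2 + ||y||^2 - 2 x.y
    sq_dists = (X**2).sum(1)[:, None] + (Y**2).sum(1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * sq_dists)
```

Precomputing the full Gram matrix once lets the kernel Perceptron's training loop reduce to indexing rows of `K` instead of re-evaluating the kernel on every mistake.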
TODO: change the output type from error rate to predictions