MNIST Digit Classifier
Neural network implementation from scratch with 98.32% accuracy on the MNIST handwritten digit dataset.

Overview
This project implements a neural network from scratch, using NumPy (and PyTorch only for matrix operations), and achieves 98.32% accuracy on the MNIST handwritten digit classification task. I built the entire network architecture, including forward propagation, backpropagation, and several optimization algorithms, to understand the fundamental concepts behind neural networks before using higher-level frameworks.
Key Features
- Custom implementation of backpropagation algorithm
- Multiple activation functions (ReLU, Softmax)
- Mini-batch gradient descent with momentum
- Learning rate scheduling
- Regularization techniques (e.g. dropout)
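The core pieces listed above can be sketched in plain NumPy. This is a minimal, hypothetical illustration, not the project's actual code: the layer sizes (784 → 128 → 10), learning rate, and momentum value are assumptions, and dropout and learning-rate scheduling are omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical architecture: 784-pixel MNIST input, 128 hidden units, 10 classes.
W1 = rng.normal(0, np.sqrt(2 / 784), (784, 128))  # He initialization for ReLU
b1 = np.zeros(128)
W2 = rng.normal(0, np.sqrt(2 / 128), (128, 10))
b2 = np.zeros(10)

# One momentum buffer per parameter, for SGD with momentum.
vel = {name: np.zeros_like(p)
       for name, p in [("W1", W1), ("b1", b1), ("W2", W2), ("b2", b2)]}

def forward(X):
    z1 = X @ W1 + b1
    a1 = np.maximum(z1, 0)                      # ReLU
    z2 = a1 @ W2 + b2
    z2 = z2 - z2.max(axis=1, keepdims=True)     # shift logits for stability
    probs = np.exp(z2) / np.exp(z2).sum(axis=1, keepdims=True)  # softmax
    return z1, a1, probs

def backward(X, y, z1, a1, probs):
    n = X.shape[0]
    # Cross-entropy + softmax gradient w.r.t. logits: probs - one_hot(y).
    dz2 = probs.copy()
    dz2[np.arange(n), y] -= 1
    dz2 /= n
    grads = {"W2": a1.T @ dz2, "b2": dz2.sum(axis=0)}
    da1 = dz2 @ W2.T
    dz1 = da1 * (z1 > 0)                        # ReLU derivative
    grads["W1"] = X.T @ dz1
    grads["b1"] = dz1.sum(axis=0)
    return grads

def sgd_momentum_step(params, grads, lr=0.05, mu=0.9):
    # Classic momentum: velocity accumulates a decaying sum of past gradients.
    for name, p in params.items():
        vel[name] = mu * vel[name] - lr * grads[name]
        p += vel[name]                          # update in place
```

A training loop would repeatedly call `forward`, `backward`, and `sgd_momentum_step` on mini-batches, decaying `lr` on a schedule.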
Challenges & Solutions
Implementing backpropagation from scratch was challenging: there are many moving parts (weight initialization, dropout rates, etc.), and every gradient had to be derived by hand and computed correctly. I verified my implementation by comparing its results against PyTorch's autograd on simple cases.
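One way to do that kind of check (a sketch, not the project's actual test code) is to compute a hand-derived gradient for a tiny softmax classifier and compare it term by term against the gradient PyTorch's autograd produces for the same loss:

```python
import numpy as np
import torch

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 3))      # 4 samples, 3 features
y = np.array([0, 2, 1, 2])       # class labels
W = rng.normal(size=(3, 3))      # 3-class linear classifier

# Hand-derived gradient of mean cross-entropy after softmax:
# dL/dW = X^T (probs - one_hot(y)) / n
logits = X @ W
logits = logits - logits.max(axis=1, keepdims=True)   # numerical stability
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
dlogits = probs.copy()
dlogits[np.arange(4), y] -= 1
dlogits /= 4
manual_grad = X.T @ dlogits

# Reference gradient from PyTorch autograd for the same loss.
Wt = torch.tensor(W, requires_grad=True)
loss = torch.nn.functional.cross_entropy(torch.tensor(X) @ Wt,
                                         torch.tensor(y))
loss.backward()
autograd_grad = Wt.grad.numpy()
```

If `manual_grad` and `autograd_grad` agree to within floating-point tolerance, the hand derivation is almost certainly correct for that case.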
Status
Completed
Tech Stack
Year
2024
Related Projects
Heida
An AI command center that unifies 220+ AI models with your own API keys, featuring document intelligence, interactive tools, and persistent knowledge graphs.
Contextual Retrieval System
A hybrid retrieval system combining semantic search and BM25 with context enrichment, achieving a (naively scored) 2.92/3.0 average accuracy on complex queries.
UBC Metrics
Course difficulty prediction system with 4.84% error rate based on historical grade distributions.