MNIST Digit Classifier
Neural network implementation from scratch with 98.32% accuracy on the MNIST handwritten digit dataset.

Overview
This project implements a neural network from scratch, using NumPy (with PyTorch only for low-level matrix operations), and achieves 98.32% accuracy on the MNIST handwritten digit classification task. I built the entire network myself, including forward propagation, backpropagation, and several optimization algorithms, to understand the fundamentals of neural networks before moving on to higher-level frameworks.
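As a rough sketch of what "from scratch" means here, the core of such a network is a forward pass and a hand-derived backward pass in plain NumPy. The layer sizes and variable names below are illustrative assumptions, not the project's actual code:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical architecture: 784 inputs (28x28 pixels), 128 hidden units, 10 classes.
W1 = rng.normal(0, 0.01, (784, 128)); b1 = np.zeros(128)
W2 = rng.normal(0, 0.01, (128, 10));  b2 = np.zeros(10)

def forward(X):
    z1 = X @ W1 + b1
    a1 = np.maximum(z1, 0)                 # ReLU
    z2 = a1 @ W2 + b2
    z2 = z2 - z2.max(axis=1, keepdims=True)  # shift logits for numerical stability
    p = np.exp(z2) / np.exp(z2).sum(axis=1, keepdims=True)  # softmax probabilities
    return z1, a1, p

def backward(X, y, z1, a1, p):
    n = X.shape[0]
    dz2 = p.copy()
    dz2[np.arange(n), y] -= 1              # gradient of softmax + cross-entropy
    dz2 /= n
    dW2 = a1.T @ dz2; db2 = dz2.sum(axis=0)
    dz1 = (dz2 @ W2.T) * (z1 > 0)          # ReLU passes gradient only where z1 > 0
    dW1 = X.T @ dz1; db1 = dz1.sum(axis=0)
    return dW1, db1, dW2, db2
```

The combined softmax-plus-cross-entropy gradient (`p - one_hot(y)`) is what makes the hand derivation tractable, since the two derivatives cancel into a single clean expression.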
Key Features
- Custom implementation of backpropagation algorithm
- Multiple activation functions (ReLU, Softmax)
- Mini-batch gradient descent with momentum
- Learning rate scheduling
- Regularization techniques (e.g., dropout)
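Two of the features above, mini-batch gradient descent with momentum and learning rate scheduling, can be sketched as follows. The update rule is classical momentum and the schedule is a simple step decay; the exact hyperparameters here are assumptions, not the ones used in the project:

```python
import numpy as np

def sgd_momentum_step(params, grads, velocities, lr, beta=0.9):
    """One mini-batch update with classical momentum (in-place)."""
    for p, g, v in zip(params, grads, velocities):
        v *= beta          # decay the running velocity
        v -= lr * g        # accumulate the current gradient
        p += v             # move parameters along the velocity

def step_decay(lr0, epoch, drop=0.5, every=10):
    """Halve the learning rate every `every` epochs (one common schedule)."""
    return lr0 * (drop ** (epoch // every))
```

Momentum smooths updates across noisy mini-batches, and the decaying schedule lets training take large steps early and settle into a minimum later.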
Challenges & Solutions
Implementing backpropagation from scratch was challenging: many moving parts interact (weight initialization, dropout rates, etc.), and every gradient had to be derived by hand and computed correctly, which was painful. I verified my implementation by comparing its gradients with PyTorch's autograd on simple cases.
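A minimal version of that verification looks like this: compute the hand-derived softmax-plus-cross-entropy gradient in NumPy, then run the same loss through PyTorch's autograd and compare. The specific shapes and names are illustrative, not the project's actual test harness:

```python
import numpy as np
import torch

rng = np.random.default_rng(0)
logits = rng.normal(size=(4, 10))
labels = np.array([3, 1, 4, 0])

# Hand-derived gradient: softmax(logits) - one_hot(labels), averaged over the batch.
p = np.exp(logits - logits.max(axis=1, keepdims=True))
p /= p.sum(axis=1, keepdims=True)
manual = p.copy()
manual[np.arange(4), labels] -= 1
manual /= 4

# The same computation through autograd.
t = torch.tensor(logits, requires_grad=True)
loss = torch.nn.functional.cross_entropy(t, torch.tensor(labels))
loss.backward()

assert np.allclose(manual, t.grad.numpy(), atol=1e-6)
```

Agreement on small cases like this gives good confidence that the hand-written chain rule is correct before scaling up to the full network.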
Status
Completed
Tech Stack
NumPy, PyTorch
Year
2024