MNIST Digit Classifier

Neural network implementation from scratch with 98.32% accuracy on the MNIST handwritten digit dataset.

[Screenshot: MNIST Digit Classifier]

Overview

This project implements a neural network from scratch, using NumPy for matrix operations (PyTorch is used only to cross-check gradients), and achieves 98.32% accuracy on the MNIST handwritten digit classification task. I built the entire network pipeline, including forward propagation, backpropagation, and several optimization algorithms, to understand the fundamentals of neural networks before relying on higher-level frameworks.
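As a rough sketch of the forward pass described above, here is a one-hidden-layer network with ReLU and softmax in pure NumPy. The layer sizes and initialization scale are illustrative, not the project's actual configuration:

```python
import numpy as np

def relu(z):
    # Elementwise max(0, z)
    return np.maximum(0.0, z)

def softmax(z):
    # Subtract the row max for numerical stability before exponentiating
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def forward(x, W1, b1, W2, b2):
    """x -> hidden ReLU layer -> softmax class probabilities."""
    h = relu(x @ W1 + b1)
    return softmax(h @ W2 + b2)

# Illustrative shapes: 784 input pixels, 128 hidden units, 10 digit classes
rng = np.random.default_rng(0)
W1 = rng.normal(0.0, 0.01, (784, 128)); b1 = np.zeros(128)
W2 = rng.normal(0.0, 0.01, (128, 10)); b2 = np.zeros(10)

probs = forward(rng.normal(size=(4, 784)), W1, b1, W2, b2)
```

Each row of `probs` is a distribution over the ten digits, so every row sums to one.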

Key Features

  • Custom implementation of backpropagation algorithm
  • Multiple activation functions (ReLU, Softmax)
  • Mini-batch gradient descent with momentum
  • Learning rate scheduling
  • Regularization techniques (e.g., dropout)

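The optimizer-related features above can be sketched in a few lines. This is a minimal illustration of classical momentum and step-based learning-rate decay; the function names and hyperparameter values are hypothetical, not the project's exact implementation:

```python
import numpy as np

def sgd_momentum_step(w, grad, velocity, lr, beta=0.9):
    # Classical momentum: v <- beta*v - lr*grad, then w <- w + v
    velocity = beta * velocity - lr * grad
    return w + velocity, velocity

def step_decay(lr0, epoch, drop=0.5, every=10):
    # Halve the learning rate every `every` epochs
    return lr0 * (drop ** (epoch // every))

# One illustrative update on a scalar "weight"
w, v = 1.0, 0.0
w, v = sgd_momentum_step(w, grad=1.0, velocity=v, lr=step_decay(0.1, epoch=0))
```

Momentum accumulates a running velocity so updates smooth out noisy mini-batch gradients; the decay schedule shrinks the step size as training converges.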
Challenges & Solutions

Implementing backpropagation from scratch was challenging: many pieces interact (weight initialization, dropout rates, and so on), and every gradient had to be derived by hand and computed correctly, which was painful. I verified my implementation by comparing its gradients with PyTorch's autograd on simple cases.
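The same verification idea can be shown without PyTorch via a finite-difference check: compare the hand-derived softmax cross-entropy gradient against a numerical estimate. This is a self-contained sketch of the technique, not the project's actual test code:

```python
import numpy as np

def cross_entropy(logits, labels):
    # Mean negative log-probability of the correct class
    z = logits - logits.max(axis=1, keepdims=True)
    log_probs = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(labels)), labels].mean()

def analytic_grad(logits, labels):
    # Hand-derived: dL/dlogits = (softmax(logits) - one_hot(labels)) / batch
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    probs = e / e.sum(axis=1, keepdims=True)
    probs[np.arange(len(labels)), labels] -= 1.0
    return probs / len(labels)

def numeric_grad(logits, labels, eps=1e-5):
    # Central finite differences over every logit entry
    g = np.zeros_like(logits)
    for idx in np.ndindex(*logits.shape):
        bump = np.zeros_like(logits)
        bump[idx] = eps
        g[idx] = (cross_entropy(logits + bump, labels)
                  - cross_entropy(logits - bump, labels)) / (2 * eps)
    return g

rng = np.random.default_rng(0)
logits = rng.normal(size=(4, 10))
labels = np.array([3, 1, 4, 1])
max_err = np.abs(analytic_grad(logits, labels)
                 - numeric_grad(logits, labels)).max()
```

If the derivation is correct, `max_err` should be tiny (on the order of the finite-difference error, well below 1e-6).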

Status

Completed

Tech Stack

Python · NumPy · PyTorch · Neural Networks · Machine Learning · Computer Vision

Year

2024