
# MLP

This repo contains a complete, extensible implementation of a multilayer perceptron (MLP). The problem is formulated as a series of matrix multiplications, implemented using NumPy only.
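As a rough sketch of that formulation, a forward pass reduces to one matrix multiplication plus an activation per layer. The function names and the choice of sigmoid below are illustrative assumptions, not the repo's actual API:

```python
import numpy as np

def sigmoid(z):
    # Element-wise logistic activation (assumed here for illustration).
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, weights, biases):
    # One matrix multiplication followed by an activation per layer.
    a = x
    for W, b in zip(weights, biases):
        a = sigmoid(a @ W + b)
    return a
```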

## MNIST

MNIST is a standard benchmark for neural networks, and even with this simple implementation we get good results on the dataset.

## Handwritten digit classification

After training the network, we classify digits drawn by the user. In this case the classification is live and interactive.

(Screenshots of the live classification demo.)
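A minimal sketch of what classifying a user drawing might look like, assuming the drawing arrives as a 28x28 grayscale NumPy array and reusing the hypothetical `forward` function sketched above:

```python
import numpy as np

def classify_drawing(drawing, weights, biases):
    # `drawing` is assumed to be a 28x28 grayscale array with values in [0, 255].
    x = drawing.astype(np.float32).reshape(1, 784) / 255.0   # flatten and normalize
    probs = forward(x, weights, biases)   # hypothetical forward pass from the sketch above
    return int(np.argmax(probs))          # predicted digit, 0-9
```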

## About: MLP NN

Our multilayer perceptron uses two hidden layers with 100 units each.
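For MNIST, those layer sizes correspond to 784 inputs, two hidden layers of 100 units, and 10 output classes. A minimal sketch of the resulting weight shapes; the initialization scheme shown is an assumption, not necessarily the repo's:

```python
import numpy as np

rng = np.random.default_rng(0)
layer_sizes = [784, 100, 100, 10]  # input, two hidden layers of 100 units, 10 classes

# Small random weights and zero biases per layer (initialization is an assumption).
weights = [rng.normal(0.0, 0.1, size=(m, n))
           for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]
biases = [np.zeros((1, n)) for n in layer_sizes[1:]]
```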

## Special Thanks

- Brian Dolhansky: his tutorials gave me a great understanding of the backpropagation algorithm.
- Siraj Raval: the live classification was inspired by one of his tutorials.