Python Perceptron Tutorial

Neural networks have been a popular topic lately, and they come up quite often in conversations throughout tech circles.

The basic building block of any neural network is the perceptron. It has actually been around for some time in computer science.

To learn about how they work, let's build our own perceptron class from scratch in Python.

Check out the code below for a sample implementation of a perceptron.


# import numpy to help with math operations
import numpy as np

class Perceptron(object):
    
    def __init__(self, eta=0.01, n_iter=10):
        # in the class initializer we set the learning rate (eta)
        # and the number of training iterations (n_iter)
        self.eta = eta
        self.n_iter = n_iter
    
    def fit(self, X, y):
        # pass in the X and y to fit our new perceptron model
        # initialize our numpy array of weights (bias at index 0) and a list of per-epoch errors
        self.w_ = np.zeros(1 + X.shape[1])
        self.errors_ = []
        
        # train our model by looping n_iter times
        for _ in range(self.n_iter):
            errors = 0
            for xi, target in zip(X, y):
                # for each individual (xi, target) pair, update our model
                # weights by learning rate * prediction error
                update = self.eta * (target - self.predict(xi))
                self.w_[1:] += update * xi
                self.w_[0] += update
                errors += int(update != 0.0)
            self.errors_.append(errors)

        # return our model after it has passed n_iter times through our training dataset
        return self
    
    def net_input(self, X):
        # helper function used by the predict method
        # returns the net input: weights * x + weight_0 (the bias term)
        return np.dot(X, self.w_[1:]) + self.w_[0]
    
    def predict(self, X):
        # predict on unseen data with our trained model
        return np.where(self.net_input(X) >= 0.0, 1, -1)
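
Before we move on to real data, here is a quick sanity check of the class on a tiny, made-up dataset (the points and labels below are purely hypothetical, just for illustration): two linearly separable clusters labeled 1 and -1. The errors_ list should drop to zero once the weights converge.


# quick sanity check on a tiny, made-up linearly separable dataset
X_toy = np.array([[ 2.0,  1.0],
                  [ 1.5,  2.0],
                  [-1.0, -1.5],
                  [-2.0, -1.0]])
y_toy = np.array([1, 1, -1, -1])

# fit a perceptron on the toy data
toy_model = Perceptron(eta=0.1, n_iter=10)
toy_model.fit(X_toy, y_toy)

# errors_ holds the number of misclassifications per pass over the data;
# for separable data like this it should reach 0 once the weights converge
print(toy_model.errors_)

# predict on two new points; we expect the output [ 1 -1]
print(toy_model.predict(np.array([[1.0, 1.0], [-1.0, -1.0]])))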

Now that we have created our perceptron class, let's use it to classify some flowers from the Iris dataset, which we will load from sklearn. Since there are actually 3 types of irises in the dataset, we will keep only 2 of the flower classes so that we have a binary classification problem.


# import the iris dataset from sklearn and other helpful functions
from sklearn import datasets
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

# load our data
iris = datasets.load_iris()
X = iris.data
y = iris.target

# preprocess our X and y so that we only keep data for 2 of the irises
# (the samples labeled with classes 0 and 1)
X = X[(y == 0) | (y == 1)]
y = y[(y == 0) | (y == 1)]

# relabel class 0 as -1 so the targets match the perceptron's -1/1 predictions
y = np.where(y == 0, -1, y)

# make a 70/30 train/test split so we can test our perceptron performance
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)

# create our perceptron object
perceptron = Perceptron(eta=0.1, n_iter=10)

# lets fit our model on the training data
perceptron.fit(X_train, y_train)

# predict labels for the irises with our test dataset
y_predict = perceptron.predict(X_test) 

# get the root mean squared error for our model with this train/test dataset
rmse = np.sqrt(mean_squared_error(y_test, y_predict))

# it's easy to predict on a single sample as well.
# let's use our model to predict the label for one sample's feature values
perceptron.predict([ 5.1,  3.5,  1.4,  0.2])
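
Since the labels here are just -1 and 1, plain classification accuracy is often easier to interpret than RMSE. Here is a short optional sketch that reports both, using accuracy_score from sklearn.metrics (an extra import not used above):


# optional: report both metrics on the test set
from sklearn.metrics import accuracy_score

print("RMSE:", rmse)
print("Accuracy:", accuracy_score(y_test, y_predict))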

I hope this post has been enlightening. The perceptron is the basic building block of a neural net, and it really isn't that hard to understand 🙂
