# Implement Perceptron

## Goal

This post introduces how to implement the Perceptron, the building block of neural networks: a simple gate function that returns 0 (no signal) or 1 (signal) for a given input.

In this post, the following gate functions are implemented:

- AND
- NAND
- OR
- XOR

Each gate is a perceptron with weights $\mathbf{w}$ and bias $b$:

$$y = f(\mathbf{x})=\begin{cases} 0 & (b + \mathbf{wx} \le 0)\\ 1 & (b + \mathbf{wx} \gt 0) \end{cases}$$

Note that the implementations below write the condition as a threshold, $\mathbf{wx} \gt b$, which is the same decision rule with bias $-b$.
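As a minimal sketch of the formula above (the helper name `perceptron` and its signature are my own, not from the original gates below):

```python
def perceptron(x, w, b):
    """Step-activation perceptron: 1.0 if b + w.x > 0, else 0.0."""
    s = b + sum(wi * xi for wi, xi in zip(w, x))
    return 1.0 if s > 0 else 0.0

# AND with bias b = -0.6, equivalent to the threshold form w.x > 0.6
print(perceptron((1, 1), (0.5, 0.5), b=-0.6))  # → 1.0
print(perceptron((0, 1), (0.5, 0.5), b=-0.6))  # → 0.0
```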

## Implement AND gate

```python
def AND(x0, x1, w0=0.5, w1=0.5, b=0.6):
    return ((x0 * w0 + x1 * w1) > b) * 1.0
```

```python
for x0, x1 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(f"AND(x0={x0}, x1={x1}) = {AND(x0, x1)}")
```

```
AND(x0=0, x1=0) = 0.0
AND(x0=0, x1=1) = 0.0
AND(x0=1, x1=0) = 0.0
AND(x0=1, x1=1) = 1.0
```


## Implement NAND gate

```python
def NAND(x0, x1, w0=-0.5, w1=-0.5, b=-0.6):
    return ((x0 * w0 + x1 * w1) > b) * 1.0
```

```python
for x0, x1 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(f"NAND(x0={x0}, x1={x1}) = {NAND(x0, x1)}")
```

```
NAND(x0=0, x1=0) = 1.0
NAND(x0=0, x1=1) = 1.0
NAND(x0=1, x1=0) = 1.0
NAND(x0=1, x1=1) = 0.0
```


## Implement OR gate

```python
def OR(x0, x1, w0=0.5, w1=0.5, b=0.2):
    return ((x0 * w0 + x1 * w1) > b) * 1.0
```

```python
for x0, x1 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(f"OR(x0={x0}, x1={x1}) = {OR(x0, x1)}")
```

```
OR(x0=0, x1=0) = 0.0
OR(x0=0, x1=1) = 1.0
OR(x0=1, x1=0) = 1.0
OR(x0=1, x1=1) = 1.0
```


## Implement XOR gate

XOR is not linearly separable, so a single perceptron cannot represent it. Instead, it is composed from the gates above as a two-layer network:

```python
def XOR(x0, x1):
    n0 = NAND(x0, x1)  # first layer
    n1 = OR(x0, x1)    # first layer
    return AND(n0, n1)  # second layer
```

```python
for x0, x1 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(f"XOR(x0={x0}, x1={x1}) = {XOR(x0, x1)}")
```

```
XOR(x0=0, x1=0) = 0.0
XOR(x0=0, x1=1) = 1.0
XOR(x0=1, x1=0) = 1.0
XOR(x0=1, x1=1) = 0.0
```
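As a final sketch (assuming NumPy is available; the helper name `gate` is illustrative), the same gates can be evaluated over the whole truth table at once by treating the weights as a vector:

```python
import numpy as np

# All four input combinations, one row per case
X = np.array([(0, 0), (0, 1), (1, 0), (1, 1)], dtype=float)

def gate(X, w, theta):
    """Vectorized gate: fires when the weighted sum exceeds the threshold theta."""
    return (X @ np.asarray(w) > theta).astype(float)

and_out = gate(X, (0.5, 0.5), 0.6)      # [0, 0, 0, 1]
nand_out = gate(X, (-0.5, -0.5), -0.6)  # [1, 1, 1, 0]
or_out = gate(X, (0.5, 0.5), 0.2)       # [0, 1, 1, 1]
# XOR: feed the NAND and OR outputs into an AND gate
xor_out = gate(np.stack([nand_out, or_out], axis=1), (0.5, 0.5), 0.6)  # [0, 1, 1, 0]
```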