Implement Perceptron
Goal¶
This post introduces how to implement a perceptron, which is the foundation of neural networks: a simple gate function that returns 0 (no signal) or 1 (signal) given a certain input.
In this post, the following gate functions are implemented, each built on the perceptron form sketched after this list:
- AND
- NAND
- OR
- XOR
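AND, NAND, and OR are each a single perceptron: a weighted sum of the two inputs compared against a bias (threshold). A minimal sketch of that shared form (the perceptron helper name below is only illustrative and does not appear in the cells that follow):

def perceptron(x0, x1, w0, w1, b):
    # Fire (return 1.0) when the weighted sum of the inputs exceeds the bias b,
    # otherwise return 0.0; multiplying the boolean by 1.0 converts it to a float.
    return ((x0 * w0 + x1 * w1) > b) * 1.0

XOR is then obtained by combining these perceptrons, as shown in the last section.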
Implement AND gate¶
In [10]:
def AND(x0, x1, w0=0.5, w1=0.5, b=0.6):
    # The weighted sum exceeds b = 0.6 only when both inputs are 1.
    return ((x0 * w0 + x1 * w1) > b) * 1.0
In [11]:
for x0, x1 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(f"AND(x0={x0}, x1={x1}) = {AND(x0, x1)}")
Implement NAND gate¶
In [24]:
def NAND(x0, x1, w0=-0.5, w1=-0.5, b=-0.6):
    # Negating the AND weights and bias inverts its output.
    return ((x0 * w0 + x1 * w1) > b) * 1.0
In [25]:
for x0, x1 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(f"NAND(x0={x0}, x1={x1}) = {NAND(x0, x1)}")
Implement OR gate¶
In [34]:
def OR(x0, x1, w0=0.5, w1=0.5, b=0.2):
    # The lower bias means a single active input (0.5 > 0.2) is enough to fire.
    return ((x0 * w0 + x1 * w1) > b) * 1.0
In [35]:
for x0, x1 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(f"OR(x0={x0}, x1={x1}) = {OR(x0, x1)}")
Implement XOR gate¶
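XOR is not linearly separable, so no choice of weights and bias lets a single perceptron express it. Instead, it is built by stacking the gates above in two layers: NAND and OR are computed first, and their outputs are fed into AND.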
In [36]:
def XOR(x0, x1):
    n0 = NAND(x0, x1)  # 0.0 only when both inputs are 1
    n1 = OR(x0, x1)    # 1.0 when at least one input is 1
    return AND(n0, n1)
In [37]:
for x0, x1 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(f"XOR(x0={x0}, x1={x1}) = {XOR(x0, x1)}")