3 ways of creating a neural network in PyTorch

Goal

This post introduces three ways to create a neural network using PyTorch:

Three ways:

  • nn.Module
  • nn.Sequential
  • nn.ModuleList

Libraries

In [11]:
import torch
from torch import nn
import torch.nn.functional as F

Using nn.Module

In this approach, the network class inherits from nn.Module; each layer is defined in __init__, and the order of layers and operations is defined in forward.

Template

In [12]:
class ABC(nn.Module):
    def __init__(self, param1, param2, param3):
        # execute the superclass's __init__()
        super().__init__()
        
        # Instantiate nn.Module subclasses and assign them as members
        self.abc = nn.XYZ(param1, param2)
        self.edf = nn.PQR(param3)
        
        
    def forward(self, x):
        # write the sequence of layers and processes
        # x -> abc -> edf -> output
        x = self.abc(x)
        x = self.edf(x)
        return x

Example

In [21]:
class NeuralNetwork(nn.Module):
    def __init__(self, n_input, n_unit1, n_output):
        super().__init__()
        
        # Linear transformation from the inputs to the 1st hidden layer
        self.hidden = nn.Linear(n_input, n_unit1)
        
        self.sigmoid = nn.Sigmoid()

        # Output layer 
        self.output = nn.Linear(n_unit1, n_output)
        self.softmax = nn.Softmax(dim=1)

        
    def forward(self, x):
        x = self.hidden(x) 
        x = self.sigmoid(x)
        x = self.output(x)
        x = self.softmax(x)
        
        return x
In [22]:
model = NeuralNetwork(n_input=10, n_unit1=30, n_output=2)
model
Out[22]:
NeuralNetwork(
  (hidden): Linear(in_features=10, out_features=30, bias=True)
  (sigmoid): Sigmoid()
  (output): Linear(in_features=30, out_features=2, bias=True)
  (softmax): Softmax()
)
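
As a quick sanity check (this cell is an added illustration, not from the original notebook; the batch size of 5 and the torch.randn input are arbitrary), a random batch can be passed through the model:

In [ ]:
# Added sketch: run a random batch through the model to confirm the shapes
x = torch.randn(5, 10)   # batch of 5 samples, 10 features each
y = model(x)             # calls forward() under the hood
y.shape                  # torch.Size([5, 2]); each row sums to 1 due to Softmax(dim=1)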

Using nn.Sequential

Template

In [ ]:
model = nn.Sequential(
    nn.ABC(n_inputs, param1),
    nn.DEF(),
    nn.GHI()
)

Example

In [23]:
model = nn.Sequential(
    nn.Linear(10, 30),
    nn.Sigmoid(),
    nn.Linear(30, 2),
    nn.Softmax(dim=1)
)
model
Out[23]:
Sequential(
  (0): Linear(in_features=10, out_features=30, bias=True)
  (1): Sigmoid()
  (2): Linear(in_features=30, out_features=2, bias=True)
  (3): Softmax()
)
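
nn.Sequential also accepts an OrderedDict, which gives the layers readable names instead of numeric indices. This cell is an added illustration (the layer names are arbitrary):

In [ ]:
from collections import OrderedDict

model = nn.Sequential(OrderedDict([
    ('hidden',  nn.Linear(10, 30)),
    ('sigmoid', nn.Sigmoid()),
    ('output',  nn.Linear(30, 2)),
    ('softmax', nn.Softmax(dim=1))
]))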

Using nn.ModuleList

Template

In [24]:
class ABC(nn.Module):
    def __init__(self, param1, param2, param3):
        # execute the superclass's __init__()
        super().__init__()
        
        # Instantiate nn.Module subclasses, collect them in a list,
        # and wrap the list in nn.ModuleList assigned as a member
        abc = nn.XYZ(param1, param2)
        edf = nn.PQR(param3)
        layers = [abc, edf]
        self.module_list = nn.ModuleList(layers)
        
    def forward(self, x):
        # write the sequence of layers and processes
        # x -> abc -> edf -> output
        for f in self.module_list:
            x = f(x)
        return x

Example

In [32]:
class NeuralNetwork(nn.Module):
    def __init__(self, n_inputs, n_hidden_unit, n_output):
        super().__init__()
        l1 = nn.Linear(n_inputs, n_hidden_unit)
        a1 = nn.Sigmoid()
        l2 = nn.Linear(n_hidden_unit, n_output)
        s = nn.Softmax(dim=1)
        
        layers = [l1, a1, l2, s]
        self.module_list = nn.ModuleList(layers)

    def forward(self, x):
        for f in self.module_list:
            x = f(x)
        return x
In [33]:
model = NeuralNetwork(n_inputs=10, n_hidden_unit=30, n_output=2)
model
Out[33]:
NeuralNetwork(
  (module_list): ModuleList(
    (0): Linear(in_features=10, out_features=30, bias=True)
    (1): Sigmoid()
    (2): Linear(in_features=30, out_features=2, bias=True)
    (3): Softmax()
  )
)
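
The main advantage of nn.ModuleList is that layers can be built programmatically, for example in a loop. The class below is an added sketch (the name VariableDepthNetwork and the n_hidden_layers parameter are illustrative, not from the original notebook):

In [ ]:
class VariableDepthNetwork(nn.Module):
    def __init__(self, n_inputs, n_hidden_unit, n_output, n_hidden_layers=2):
        super().__init__()
        layers = [nn.Linear(n_inputs, n_hidden_unit), nn.Sigmoid()]
        # stack additional hidden layers in a loop
        for _ in range(n_hidden_layers - 1):
            layers.append(nn.Linear(n_hidden_unit, n_hidden_unit))
            layers.append(nn.Sigmoid())
        layers.append(nn.Linear(n_hidden_unit, n_output))
        layers.append(nn.Softmax(dim=1))
        self.module_list = nn.ModuleList(layers)

    def forward(self, x):
        for f in self.module_list:
            x = f(x)
        return x

model = VariableDepthNetwork(n_inputs=10, n_hidden_unit=30, n_output=2, n_hidden_layers=3)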
