
What is TanhLayer in PyBrain

In this article, we will look at the TanhLayer in PyBrain with worked examples. In PyBrain, layer classes define the activation function applied by a layer of the network; TanhLayer applies the hyperbolic tangent (tanh) squashing function, which maps each neuron's activation into the range (-1, 1).
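To see what the tanh squashing function does on its own, here is a small standalone sketch using Python's built-in math.tanh (no PyBrain required): every real-valued input is mapped into the open interval (-1, 1), with large inputs saturating near the bounds.

```python
import math

# tanh squashes any real input into the open interval (-1, 1);
# large positive inputs saturate near 1, large negative near -1
for x in [-5.0, -1.0, 0.0, 1.0, 5.0]:
    y = math.tanh(x)
    assert -1.0 < y < 1.0
    print("tanh(%5.1f) = %7.4f" % (x, y))
```

This saturating, zero-centered behaviour is what makes tanh a common choice for hidden layers.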

Syntax:

Import TanhLayer: from pybrain.structure import TanhLayer

Use in Python code: net = buildNetwork(2, 3, 1, bias=True, hiddenclass=TanhLayer)

Example 1:

In this example, we import TanhLayer and build a network with buildNetwork(), passing TanhLayer as the hiddenclass. We then create two SupervisedDataSet objects with matching input and output sizes: a training set with the samples ((0, 0), (0,)) and ((0, 1), (1,)), and a test set containing the same two samples. We train the network with BackpropTrainer() for 2500 iterations, then call testOnData(), which prints the output for each test sample along with error statistics (average error, max error, median error, etc.).

Python
from pybrain.tools.shortcuts import buildNetwork
from pybrain.structure import TanhLayer
from pybrain.datasets import SupervisedDataSet
from pybrain.supervised.trainers import BackpropTrainer
  
# two inputs, three hidden units, and one output
net = buildNetwork(2, 3, 1, bias=True, hiddenclass=TanhLayer)
gate_set = SupervisedDataSet(2, 1)
test_dataset = SupervisedDataSet(2, 1)
  
# training samples
gate_set.addSample((0, 0), (0,))
gate_set.addSample((0, 1), (1,))
  
  
# test samples (same mapping as the training set)
test_dataset.addSample((0, 0), (0,))
test_dataset.addSample((0, 1), (1,))
  
# train the network on gate_set
backpr_tr = BackpropTrainer(net, gate_set)

# 2500 training iterations
for i in range(2500):
    backpr_tr.train()

# test the trained network
backpr_tr.testOnData(dataset=test_dataset, verbose=True)


Output:
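Under the hood, a 2-3-1 network with a TanhLayer hidden layer computes hidden = tanh(W1·x + b1) followed by a linear output layer. The sketch below replicates that forward pass in plain Python; the weights W1, b1, W2, b2 are made-up values for illustration only, not taken from a trained PyBrain network.

```python
import math

def forward(x, W1, b1, W2, b2):
    # hidden layer: tanh applied to each weighted sum (the TanhLayer step)
    hidden = [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
              for row, b in zip(W1, b1)]
    # linear output layer
    return sum(w * h for w, h in zip(W2, hidden)) + b2

# hypothetical weights for a 2-3-1 network
W1 = [[0.5, -0.3], [0.8, 0.2], [-0.6, 0.9]]
b1 = [0.1, -0.1, 0.0]
W2 = [1.0, -0.5, 0.7]
b2 = 0.2

print(forward([0.0, 1.0], W1, b1, W2, b2))  # approx. 0.4542
```

Training with BackpropTrainer adjusts exactly these weights and biases to reduce the error on the training samples.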

Example 2:

In this example, the training samples map ((0, 0), (1,)) and ((0, 1), (1,)), while the test samples map ((0, 0), (0,)) and ((0, 1), (0,)). After training, testOnData() prints the test output together with the average error, max error, median error, etc.

Python
from pybrain.tools.shortcuts import buildNetwork
from pybrain.structure import TanhLayer
from pybrain.datasets import SupervisedDataSet
from pybrain.supervised.trainers import BackpropTrainer
  
# two inputs, three hidden units, and one output
net = buildNetwork(2, 3, 1, bias=True, hiddenclass=TanhLayer)
gate_set = SupervisedDataSet(2, 1)
test_dataset = SupervisedDataSet(2, 1)
  
# training samples
gate_set.addSample((0, 0), (1,))
gate_set.addSample((0, 1), (1,))
  
# test samples
test_dataset.addSample((0, 0), (0,))
test_dataset.addSample((0, 1), (0,))
  
# train the network on gate_set
backpr_tr = BackpropTrainer(net, gate_set)

# 2500 training iterations
for i in range(2500):
    backpr_tr.train()

# test the trained network
backpr_tr.testOnData(dataset=test_dataset, verbose=True)


Output:
