How to customize a Neural Network's activation function

646 views (last 30 days)
Shahar Hochma on 18 Nov 2016
Answered: David Willingham on 19 May 2022
Hi, I would like to implement, using MATLAB, a neural network with 3 hidden layers, each using the ReLU activation function. How can I do this?
Currently, I know I can set the activation function using:
net.layers{i}.transferFcn = 'logsig';
but this only allows choosing from a predefined set of transfer functions (like logsig), and ReLU is not one of them.
Is there a way to change a layer to use ReLU? Thanks

Answers (5)

Darío Pérez on 24 Oct 2017
As far as I am aware, you can use the predefined function 'poslin' (positive linear), which is a ReLU:
net.layers{i}.transferFcn = 'poslin';
but "other differentiable transfer functions can be created and used if desired": Multilayer Neural Network Architecture.
I am not sure how the non-differentiability at x = 0 would affect the training stage. In addition, recent articles state that ReLU should be used for regression problems, but it achieved worse results than 'tansig' or 'logsig' in one of my examples. Does anyone have any thoughts or conclusions in this regard?
Regards!
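To make this concrete, here is a minimal sketch of the original poster's 3-hidden-layer setup using 'poslin' in the shallow-network framework. The layer sizes and the random training data are illustrative assumptions, not taken from the thread:

```matlab
% Sketch: 3 hidden ReLU layers with feedforwardnet (illustrative sizes/data).
x = rand(10, 200);                  % 10 inputs, 200 samples (made up)
t = rand(1, 200);                   % 1 target per sample (made up)

net = feedforwardnet([20 20 20]);   % 3 hidden layers of 20 neurons each
for i = 1:3
    net.layers{i}.transferFcn = 'poslin';   % positive linear = ReLU
end

net = train(net, x, t);             % output layer keeps its default 'purelin'
y = net(x);
```

The output layer's transfer function is left at its default, which is the usual choice for regression.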

daniel on 30 Nov 2017
It's useful for deeper nets, so it depends on your number of layers; ReLU mostly helps mitigate the vanishing-gradient problem.
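A quick way to see why: the derivative of a saturating activation like logsig never exceeds 0.25, so backpropagated gradients shrink with every layer, while the ReLU derivative is exactly 1 for active units. A small illustrative check (not from the thread):

```matlab
% Sketch: compare activation derivatives that get multiplied during backprop.
x = -5:0.1:5;

s = logsig(x);
dlogsig = s .* (1 - s);     % logsig derivative, peaks at 0.25 at x = 0
drelu   = double(x > 0);    % poslin/ReLU derivative: 0 or 1

max(dlogsig)   % at most 0.25 -> gradients shrink by >= 4x per logsig layer
max(drelu)     % 1 -> gradients pass through unchanged for active units
```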


peter chevo on 31 Aug 2018
Edited: peter chevo on 31 Aug 2018
1. Copy a folder and its file from C:\Program Files\MATLAB\MATLAB Production Server\R2015a\toolbox\nnet\nnet\nntransfer\ (e.g. the +tansig folder and tansig.m) to the current path.
2. Rename the file, e.g. tansig.m to my_transfer.m.
3. Rename the folder, e.g. +tansig to +my_transfer.
4. Edit the last line in apply.m to your formula/equation.
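As a sketch of what step 4 might look like for ReLU, assuming the copied package was renamed +my_transfer (hypothetical names), the core edit in +my_transfer/apply.m would be:

```matlab
% Hypothetical +my_transfer/apply.m after step 4 (sketch, not a full port).
% Note: the other files in the package (e.g. the derivative functions)
% must be edited consistently as well, or training will use wrong gradients.
function a = apply(n, param)
a = max(n, 0);   % ReLU: pass positive inputs through, zero out the rest
end
```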
Abdelwahab Afifi on 3 Mar 2021
This method doesn't work, because each activation function has its own files with their own structure/values/equations.



Maria Duarte Rosa on 5 Apr 2019
For deep learning networks, one can create a custom activation layer using: Define Custom Deep Learning Layers

David Willingham on 19 May 2022
Hi,
@Maria Duarte Rosa gave a good answer on how to create a custom activation layer; see this page: Define Custom Deep Learning Layers
This extended answer addresses how to define a ReLU function in MATLAB.
In MATLAB's current deep learning framework, ReLU is a standard layer you can use directly.
Here is an example:
Create a ReLU layer with the name 'relu1'.
layer = reluLayer('Name','relu1')
layer =
ReLULayer with properties:
Name: 'relu1'
Include a ReLU layer in a Layer array.
layers = [ ...
imageInputLayer([28 28 1])
convolution2dLayer(5,20)
reluLayer
maxPooling2dLayer(2,'Stride',2)
fullyConnectedLayer(10)
softmaxLayer
classificationLayer]
layers =
7x1 Layer array with layers:
1 '' Image Input 28x28x1 images with 'zerocenter' normalization
2 '' Convolution 20 5x5 convolutions with stride [1 1] and padding [0 0 0 0]
3 '' ReLU ReLU
4 '' Max Pooling 2x2 max pooling with stride [2 2] and padding [0 0 0 0]
5 '' Fully Connected 10 fully connected layer
6 '' Softmax softmax
7 '' Classification Output crossentropyex
For more information on the ReLU layer, see this page: reluLayer
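The same operation the layer performs, max(x, 0), is also available as the relu function for dlarray data in Deep Learning Toolbox, which can be handy for quick checks outside a layer array (the input values here are made up for illustration):

```matlab
% Sketch: apply ReLU directly to dlarray data with the relu function.
X = dlarray([-2 -1 0 1 2]);
Y = relu(X);          % negative entries become 0, positives are unchanged
extractdata(Y)        % back to a plain numeric array: 0 0 0 1 2
```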
