Basic implementation of a neural network with fully connected layers (multi-layer perceptron) for regression analysis.
Currently it supports an arbitrary number of fully connected layers, all sharing the same hidden activation function ('tanh', 'sigmoid' or 'relu') and a single L2 regularization factor. Additional activation functions can be registered via an internal dictionary. The output layer uses a linear activation, as is standard for regression.
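To illustrate the structure described above (shared hidden activation chosen from a dictionary, linear output layer, L2 penalty on the weights), here is a minimal Python/NumPy sketch of such a multi-layer perceptron trained by gradient descent. This is a generic illustration, not the NeurNetRegr API: the class name, constructor arguments, and training loop are all assumptions made for this example.

```python
import numpy as np

# Dictionary of activations: (function, derivative expressed via the output a),
# mirroring the extensible internal dictionary mentioned in the description.
ACTIVATIONS = {
    "tanh":    (np.tanh,                                lambda a: 1.0 - a ** 2),
    "sigmoid": (lambda z: 1.0 / (1.0 + np.exp(-z)),     lambda a: a * (1.0 - a)),
    "relu":    (lambda z: np.maximum(z, 0.0),           lambda a: (a > 0).astype(float)),
}

class MLPRegressor:
    """Hypothetical sketch of an MLP for regression (not the NeurNetRegr API)."""

    def __init__(self, sizes, activation="tanh", l2=1e-4, lr=0.1, seed=0):
        rng = np.random.default_rng(seed)
        self.act, self.dact = ACTIVATIONS[activation]
        self.l2, self.lr = l2, lr
        # One weight matrix and bias vector per layer; sizes = [n_in, ..., n_out].
        self.W = [rng.normal(0.0, 1.0 / np.sqrt(n), (n, m))
                  for n, m in zip(sizes[:-1], sizes[1:])]
        self.b = [np.zeros(m) for m in sizes[1:]]

    def _forward(self, X):
        # Returns the list of layer outputs; the last layer is linear.
        a = [X]
        for W, b in zip(self.W[:-1], self.b[:-1]):
            a.append(self.act(a[-1] @ W + b))
        a.append(a[-1] @ self.W[-1] + self.b[-1])   # linear output layer
        return a

    def fit(self, X, y, epochs=5000):
        n = len(X)
        for _ in range(epochs):
            a = self._forward(X)
            delta = (a[-1] - y) / n                  # gradient of mean squared error
            for i in range(len(self.W) - 1, -1, -1):
                gW = a[i].T @ delta + self.l2 * self.W[i]   # L2 penalty on weights
                gb = delta.sum(axis=0)
                if i > 0:                            # backpropagate before updating W[i]
                    delta = (delta @ self.W[i].T) * self.dact(a[i])
                self.W[i] -= self.lr * gW
                self.b[i] -= self.lr * gb
        return self

    def predict(self, X):
        return self._forward(X)[-1]
```

As a quick usage example, fitting the line y = 2x with one tanh hidden layer should drive the mean squared error well below its initial value after a few thousand full-batch gradient steps.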
Alberto Comin (2019). NeurNetRegr (https://www.mathworks.com/matlabcentral/fileexchange/69084-neurnetregr), MATLAB Central File Exchange. Retrieved .