Neural net backpropagation as per EasyNN-plus

Hi all. I'm not sure if anyone has heard of EasyNN-plus, but I'm trying to build something similar in MATLAB. According to the FAQ, EasyNN-plus uses a backpropagation neural net with a logistic transfer function, i.e. 1.0 / (1.0 + e^(-net input)). I'm trying to design something in MATLAB that does the same. It doesn't seem to be possible in the Deep Network Designer, as that doesn't allow data input of the type used in the Regression Learner, for example 9 inputs with one output, i.e. not image or time-series input. I may have completely missed what is already possible, so if someone could point me in the right direction it would be most appreciated.
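For reference, the transfer function quoted from the EasyNN-plus FAQ can be written directly in MATLAB; it is the same S-shaped curve that the Deep Learning Toolbox provides as the built-in 'logsig' function (a sketch, not EasyNN-plus's actual code):

```matlab
% Logistic (sigmoid) transfer function: 1 / (1 + e^(-net input)).
netInput = -5:0.1:5;                    % example net-input values
out = 1.0 ./ (1.0 + exp(-netInput));   % element-wise logistic function

% Equivalent using the built-in transfer function:
outBuiltin = logsig(netInput);

plot(netInput, out)                    % classic S-shaped curve from 0 to 1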

Accepted Answer

Stephen Gray
Stephen Gray on 11 Jul 2019
Looking at this again, I think I need to be using feedforwardnet. The problem is fitting my data to the lines
net = feedforwardnet(10);
net = train(net,x,t);
The input data is 9 fields * 1000 lines and the target data is one field * 1000 lines, i.e. 9 inputs and 1 output. The problem is getting this data into the train function, as I'm not quite sure what this part of the documentation means:
"The cell array format is more general, and more convenient for networks with multiple inputs and outputs, allowing sequences of inputs to be presented. Each element X{i,ts} is an Ri-by-Q matrix, where Ri = net.inputs{i}.size."
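The cell array format is only needed for networks with multiple inputs/outputs or time-series data. For a single static input and output like this, train also accepts plain matrices where each column is one sample, so the 1000-by-9 table just needs transposing. A minimal sketch (the variable names data and target are assumptions for the 9-field input and 1-field target):

```matlab
% Assumed shapes: data is 1000-by-9, target is 1000-by-1.
% train expects samples as columns, so transpose both.
x = data';     % 9-by-1000 input matrix (9 features, 1000 samples)
t = target';   % 1-by-1000 target row vector

net = feedforwardnet(10);               % one hidden layer with 10 neurons
net.layers{1}.transferFcn = 'logsig';   % logistic hidden units, as in EasyNN-plus
net = train(net, x, t);                 % backpropagation-based training
y = net(x);                             % 1-by-1000 network predictions
```

By default feedforwardnet uses 'tansig' hidden units and trainlm (Levenberg-Marquardt) training; setting transferFcn to 'logsig' matches the EasyNN-plus FAQ, and net.trainFcn could be changed to a gradient-descent variant such as 'traingd' for a closer match to plain backpropagation.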

More Answers (0)

