Neural net backpropagation as per EasyNN-plus

Stephen Gray on 11 Jul 2019
Hi all. I'm not sure if anyone has heard of EasyNN-plus, but I'm trying to build something similar in MATLAB. According to the FAQ, EasyNN-plus uses a backpropagation neural net with a logistic transfer function, that is 1.0 / (1.0 + e^(-net input)). I'm trying to design something in MATLAB that would do the same. It doesn't seem to be possible in the Deep Network Designer, as it doesn't allow data input of the type used in the Regression Learner, for example 9 inputs with one output, i.e. not graphical input or time series. I may have completely missed what is already possible, so if someone could point me in the right direction it would be most appreciated.
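For reference, the logistic function from the EasyNN-plus FAQ matches MATLAB's built-in logsig transfer function (Deep Learning Toolbox). A quick sketch comparing the two:

```matlab
% Logistic transfer function: f(n) = 1 / (1 + exp(-n))
n = -5:0.1:5;
a = 1 ./ (1 + exp(-n));   % hand-rolled logistic
a2 = logsig(n);           % MATLAB's built-in equivalent
% a and a2 agree to within floating-point rounding
```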

Accepted Answer

Stephen Gray on 11 Jul 2019
Looking at this again, I think I need to be using feedforwardnet. The problem is fitting my data to the lines
net = feedforwardnet(10);
net = train(net,x,t);
The input data is 9 fields * 1000 lines and the target data is one field * 1000 lines, i.e. 9 inputs and 1 output. The problem is getting this data into the train function, as I'm not quite sure what this means:
The cell array format is more general, and more convenient for networks with multiple inputs and outputs,
allowing sequences of inputs to be presented. Each element X{i,ts} is an Ri-by-Q matrix, where Ri = net.inputs{i}.size.
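For a static fitting problem like this one, the cell-array format isn't needed; train also accepts plain matrices, with one column per sample. A sketch using placeholder data (the rand call stands in for the real 9-field/1-field dataset):

```matlab
% Assume the data is 1000 rows by 10 columns:
% columns 1-9 are the inputs, column 10 is the target.
data = rand(1000, 10);          % placeholder for the real data
x = data(:, 1:9)';              % 9-by-1000: one column per sample
t = data(:, 10)';               % 1-by-1000: matching targets

net = feedforwardnet(10);              % one hidden layer, 10 neurons
net.layers{1}.transferFcn = 'logsig';  % logistic, as in EasyNN-plus
                                       % (feedforwardnet defaults to tansig)
net = train(net, x, t);
y = net(x);                     % network outputs for the training inputs
```

The cell-array format described in the documentation is only required for networks with multiple input/output layers or for time sequences; a single static input/target pair works fine as ordinary matrices.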

More Answers (0)




