Time complexity of neural network

MATLAB offers many training algorithms for ANNs, such as Levenberg-Marquardt (LM), conjugate gradient (CG), and steepest descent (SD). I am looking for the time complexity of the LM- and CG-based ANN implementations in MATLAB. Specifically, what is the time complexity of the LM and CG methods as implemented in MATLAB? Are they faster than O(n^3)?
Thanks in advance for your help.

Accepted Answer

Greg Heath
Greg Heath on 5 Jan 2013
Edited: Greg Heath on 5 Jan 2013


It depends on the network efficiency settings that control the speed/storage tradeoff.
For example, the current functions for regression and classification are FITNET and PATTERNNET, respectively. Both call FEEDFORWARDNET.
The default settings are 1 but can be overridden.
net = feedforwardnet;
% To verify unity defaults
efficiency = net.efficiency
cacheDelayedInputs = net.efficiency.cacheDelayedInputs
flattentime = net.efficiency.flattenTime
memoryReduction = net.efficiency.memoryReduction
% To override defaults (the *0 variables are values of your choosing)
net.efficiency.cacheDelayedInputs = cacheDelayedInputs0;
net.efficiency.flattenTime = flattentime0;
net.efficiency.memoryReduction = memoryReduction0;
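As a concrete sketch of how one of these settings might be used before training (the hidden-layer size and memoryReduction value below are arbitrary example choices, not recommendations):

```matlab
% Hypothetical example: trade training speed for lower memory use.
[x, t] = simplefit_dataset;          % toy dataset shipped with the toolbox
net = feedforwardnet(10);            % 10 hidden neurons, trainlm by default
net.efficiency.memoryReduction = 50; % split the Jacobian computation into 50 chunks
net = train(net, x, t);
y = net(x);
```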
Search the newsgroup and answers using
neural speed memory
Hope this helps.
Thank you for formally accepting my answer.
Greg

More Answers (1)

Matt J
Matt J on 30 Dec 2012
Edited: Matt J on 30 Dec 2012


As a disclaimer, I have no background in neural networks. I do have some background in function optimization, though, and I think I can confidently say,
  1. The time complexity will depend on the structure of your network, i.e., how densely things are interconnected. If connections are sparse, then sparse math can be used for the gradient computations and so on, leading to reduced complexity.
  2. In general, the worst-case complexity won't be better than O(N^3). LM requires the solution of an NxN system of equations in every iteration, which can be O(N^3). CG can require O(N^2) per gradient computation per iteration, and N iterations to converge, for a total of O(N^3). These are worst-case bounds, of course, and might not reflect what you see in practice.
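To illustrate where the O(N^3) in LM comes from (this is only a toy sketch of the standard LM update, not MATLAB's internal code; the sizes and damping factor are made up):

```matlab
% Toy Levenberg-Marquardt step for N parameters (illustrative only).
N  = 200;
J  = randn(1000, N);   % Jacobian of residuals w.r.t. the N parameters
e  = randn(1000, 1);   % residual vector
mu = 0.01;             % LM damping factor
% Solving this dense N-by-N system costs O(N^3) with standard methods:
dp = (J'*J + mu*eye(N)) \ (J'*e);   % parameter update
```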

5 comments

Walter Roberson
Walter Roberson on 30 Dec 2012
I thought I had read that the solution of an NxN system of linear equations could be done better than O(N^3). I do not recall the optimum at the moment, though. O(N^e) or O(N^2*log(N)) or something like that.
Matt J
Matt J on 30 Dec 2012
For structured matrices, maybe (e.g., Toeplitz). I'm not aware of it being true for general matrices.
David
David on 5 Jan 2013
Thanks for the answers.
I know that NN solutions depend on network architecture. Research papers are also available where an NN with backpropagation has been implemented in O(N) with more storage (IJNS). I am interested in knowing how the MATLAB programmers coded these algorithms and with what complexity.
Matt J
Matt J on 5 Jan 2013
Since this information isn't given in the documentation, it stands to reason that TMW doesn't want the internal details of the code exposed.
However, you could test the complexity yourself just by running examples with increasing N and plotting the trend in the computation time as a function of N.
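One rough way to probe the scaling empirically might look like this (a sketch only; the network sizes, epoch count, and the log-log slope fit are arbitrary choices, and timings will be noisy):

```matlab
% Time trainlm for growing hidden-layer sizes and inspect the trend.
[x, t] = simplefit_dataset;
sizes = [5 10 20 40];
times = zeros(size(sizes));
for k = 1:numel(sizes)
    net = feedforwardnet(sizes(k));     % trainlm by default
    net.trainParam.epochs = 20;
    net.trainParam.showWindow = false;  % suppress the training GUI
    tic; train(net, x, t); times(k) = toc;
end
% Slope of log(time) vs log(size) estimates the exponent p in O(N^p):
p = polyfit(log(sizes), log(times), 1);
fprintf('Estimated exponent p = %.2f\n', p(1));
```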
Greg Heath
Greg Heath on 5 Jan 2013
The current version of the source code is hard to decipher:
type trainlm
Although obsolete versions of the code may be easier to understand, they may not be relevant re what the new code does.
