Sparse Autoencoder with Adam optimization
Hello!
I have a dataset (NSL-KDD) that contains four parts: 1. Train Attribute (121x125973 double), 2. Train Label (1x125973 double), 3. Test Attribute (121x22544 double), 4. Test Label (1x22544 double). It is ready for running the algorithm.
I applied a sparse autoencoder and it works without any problem:
% minFunc options: batch L-BFGS
options.Method = 'lbfgs';
options.maxIter = maxIter;
options.useMex = 0;
% Minimize the sparse autoencoder cost to learn opttheta
[opttheta, cost] = minFunc( @(p) sparseAutoencoderCost(p, inputSize, ...
    hs, l1, sp, beta, trainAttr), theta, options);
% Encode the train and test attributes with the learned weights
trainFeatures = feedForwardAutoencoder(opttheta, hs, inputSize, ...
    trainAttr);
testFeatures = feedForwardAutoencoder(opttheta, hs, inputSize, ...
    testAttr);
But when I try to optimize the result using the Adam optimizer, I get this error: "Unrecognized property 'GRADIENTDECAYFACTOR' for class 'nnet.cnn.TrainingOptionsADAM'."
This is my code:
% Training options object from the Deep Learning Toolbox
options = trainingOptions('adam', ...
    'InitialLearnRate', 3e-4, ...
    'SquaredGradientDecayFactor', 0.99, ...
    'MaxEpochs', 20, ...
    'MiniBatchSize', 64, ...
    'Plots', 'training-progress');
% Same minFunc call as above, but now passed the trainingOptions object
[opttheta, cost] = minFunc( @(p) sparseAutoencoderCost(p, inputSize, ...
    hs, l1, sp, beta, trainAttr), theta, options);
trainFeatures = feedForwardAutoencoder(opttheta, hs, inputSize, ...
    trainAttr);
testFeatures = feedForwardAutoencoder(opttheta, hs, inputSize, ...
    testAttr);
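From what I can tell, trainingOptions builds an options object for trainNetwork in the Deep Learning Toolbox, while minFunc expects a plain options struct like the one in my working code, and as far as I know minFunc does not implement Adam at all. A quick comparison of the two kinds of options (illustrative, not from my script):

mfOpts = struct('Method', 'lbfgs', 'maxIter', 400);  % the plain struct minFunc understands
class(trainingOptions('adam'))  % 'nnet.cnn.TrainingOptionsADAM' -- meant for trainNetwork

The error text also names 'GRADIENTDECAYFACTOR' even though my snippet sets 'SquaredGradientDecayFactor', so my MATLAB release may simply not have that property.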
So I wonder: how can I apply a sparse autoencoder with Adam optimization?
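The direction I am considering is a hand-written Adam update around sparseAutoencoderCost. A minimal sketch (assuming sparseAutoencoderCost returns [cost, grad] as in the UFLDL starter code; the learning rate, decay factors, and iteration count are placeholders, and beta1/beta2 are named to avoid clashing with my sparsity weight beta):

alpha   = 3e-4;   % step size (placeholder)
beta1   = 0.9;    % gradient decay factor
beta2   = 0.99;   % squared-gradient decay factor
epsAdam = 1e-8;   % numerical stabilizer
numIter = 400;    % placeholder iteration budget

m = zeros(size(theta));  % first-moment estimate
v = zeros(size(theta));  % second-moment estimate
for t = 1:numIter
    % Full-batch gradient; mini-batches would sample columns of trainAttr
    [cost, grad] = sparseAutoencoderCost(theta, inputSize, ...
        hs, l1, sp, beta, trainAttr);
    m = beta1*m + (1 - beta1)*grad;
    v = beta2*v + (1 - beta2)*grad.^2;
    mHat = m / (1 - beta1^t);  % bias correction
    vHat = v / (1 - beta2^t);
    theta = theta - alpha * mHat ./ (sqrt(vHat) + epsAdam);
    if mod(t, 50) == 0
        fprintf('iter %4d: cost %.6f\n', t, cost);
    end
end
opttheta = theta;  % then call feedForwardAutoencoder as before

Does this look like a reasonable way to get Adam behaviour with the UFLDL-style sparse autoencoder, or is there a supported way to combine trainingOptions('adam') with this code?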