hyperparameter optimization (deep learning) using bayesopt
Following the answer here, I am trying to select the best hyperparameters for my recurrent neural network (RNN).
I want to optimize the hyperparameters below in the given code using bayesopt().
How do I define these parameters for bayesopt() using optimizableVariable?
training_functions = {'traingd','traingda','traingdm','traingdx'};
optimizers = {'SGD','RMSprop','Adam'};
activation_functions = {'ReLU','Dropout'};
transfer_functions = {'tansig','tanh'};
The complete code is:
% Make some data
Daten = rand(100, 3);
Daten(:,3) = Daten(:,1) + Daten(:,2) + .1*randn(100, 1); % Minimum asymptotic error is .1
[m,n] = size(Daten) ;
% Split into train and test
P = 0.7 ;
Training = Daten(1:round(P*m),:) ;
Testing = Daten(round(P*m)+1:end,:);
XTrain = Training(:,1:n-1);
YTrain = Training(:,n);
XTest = Testing(:,1:n-1);
YTest = Testing(:,n);
% Define a train/validation split to use inside the objective function
cv = cvpartition(numel(YTrain), 'Holdout', 1/3);
% Define hyperparameters to optimize
vars = [optimizableVariable('hiddenLayerSize', [1,20], 'Type', 'integer');
optimizableVariable('epochs', [20,200], 'Type', 'integer')
optimizableVariable('lr', [1e-3 1], 'Transform', 'log')];
% ----------------------------------
% ADD THE ABOVE HYPERPARAMETERS HERE
% ----------------------------------
% Optimize (kfoldLoss is the helper function defined in the linked answer)
minfn = @(T)kfoldLoss(XTrain', YTrain', cv, T.hiddenLayerSize, T.lr);
results = bayesopt(minfn, vars,'IsObjectiveDeterministic', false,...
'AcquisitionFunctionName', 'expected-improvement-plus');
T = bestPoint(results)
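For reference, kfoldLoss above is not a toolbox function but the helper from the linked answer; a minimal reconstruction (assuming a fitnet shallow network scored by RMSE on the holdout fold) looks like this:
function rmse = kfoldLoss(x, y, cv, numHid, lr)
% Reconstructed helper: train a shallow network on the training fold
% and report RMSE on the holdout fold defined by the cvpartition.
net = fitnet(numHid, 'traingd');      % gradient-descent training function
net.trainParam.lr = lr;               % learning rate being optimized
net.trainParam.showWindow = false;    % my addition: suppress the training GUI
net = train(net, x(:, cv.training), y(:, cv.training));
ypred = net(x(:, cv.test));           % predict on the holdout fold
rmse = sqrt(mean((ypred - y(cv.test)).^2));
end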
Answers (1)
Sammit Jain
29 Jan 2020
Hello Ali,
It appears you're looking to create a BayesianOptimization object for your set of hyperparameters. The following link has some examples that will help you customize your code:
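In the meantime, here is a minimal sketch of the categorical declarations. The variable names are my own, and I've substituted 'logsig' for 'tanh' because 'tansig' already is the hyperbolic tangent transfer function for shallow networks:
% Categorical hyperparameters: pass the candidate values as a cell
% array of character vectors and set 'Type' to 'categorical'.
vars = [optimizableVariable('hiddenLayerSize', [1,20], 'Type', 'integer');
        optimizableVariable('epochs', [20,200], 'Type', 'integer');
        optimizableVariable('lr', [1e-3 1], 'Transform', 'log');
        optimizableVariable('trainFcn', ...
            {'traingd','traingda','traingdm','traingdx'}, 'Type', 'categorical');
        optimizableVariable('transferFcn', ...
            {'tansig','logsig'}, 'Type', 'categorical')];
% Inside the objective function, categorical values arrive as
% categorical scalars; convert with char() before use, e.g.:
%   net = fitnet(T.hiddenLayerSize, char(T.trainFcn));
%   net.layers{1}.transferFcn = char(T.transferFcn);
%   net.trainParam.lr = T.lr;
%   net.trainParam.epochs = T.epochs;
Note that 'SGD'/'RMSprop'/'Adam' and the ReLU/dropout layers belong to the trainNetwork/trainingOptions deep learning workflow rather than to fitnet; they can be declared the same way (the solver names there are 'sgdm', 'rmsprop', and 'adam'), but the objective function then needs a separate trainNetwork code path.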