
trainNetwork reports too many input arguments in 2024a

64 views (last 30 days)
Peter on 21 August 2024 at 16:07
Commented: Peter about 6 hours ago
Transfer learning code, based on the help example, that runs in 2023b, fails in 2024a
Error using trainNetwork (line 191)
Too many input arguments.
What has changed in the 2024a version? I see that trainnet is now recommended and I can switch to it going forward, but I would expect old code to still run.
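For reference, the failing code follows the transfer-learning pattern from the help example. The following is only a minimal sketch of that pattern, not the actual script: imageFolder is a placeholder path, the resnet50 layer names may need checking with analyzeNetwork, and the variable names augimdsTrain, lgraph and options match the error trace in the comments below.
% Sketch only: standard transfer-learning setup with a pretrained resnet50
imds = imageDatastore(imageFolder, 'IncludeSubfolders', true, 'LabelSource', 'foldernames');
[imdsTrain, imdsValidation] = splitEachLabel(imds, 0.7, 'randomized');
net = resnet50;                                   % pretrained network (support package required)
lgraph = layerGraph(net);
numClasses = numel(categories(imdsTrain.Labels));
% Swap the final layers for the new classification task
lgraph = replaceLayer(lgraph, 'fc1000', fullyConnectedLayer(numClasses, 'Name', 'new_fc'));
lgraph = replaceLayer(lgraph, 'ClassificationLayer_fc1000', classificationLayer('Name', 'new_output'));
inputSize = net.Layers(1).InputSize;
augimdsTrain = augmentedImageDatastore(inputSize(1:2), imdsTrain);
options = trainingOptions('sgdm', 'MaxEpochs', 6, 'InitialLearnRate', 1e-4, 'Verbose', false);
netTransfer = trainNetwork(augimdsTrain, lgraph, options);   % the call that errors in R2024a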
3 comments
Peter on 21 August 2024 at 19:51
Moved: Voss on 21 August 2024 at 19:54
Error using trainNetwork (line 191)
Too many input arguments.
Error in train_faces2c_resnet50 (line 97)
netTransfer = trainNetwork(augimdsTrain,lgraph,options);
Caused by:
Error using gather
Too many input arguments.
Cris LaPierre on 22 August 2024 at 13:18
I'm not able to duplicate the error given the information you've shared. Can you provide a working example we can test with? If not, then I'd suggest contacting support: https://www.mathworks.com/support/contact_us.html


Accepted Answer

Hitesh on 22 August 2024 at 11:56
Edited: Hitesh on 23 August 2024 at 11:59
Hello Peter!
I've replicated your scenario in MATLAB R2024a, and the "trainNetwork" function produces the expected results. Please refer to the following example code:
numImages = 100; % Number of images
imageSize = [28, 28, 1]; % Image size (e.g., 28x28 pixels, 1 channel for grayscale)
numClasses = 10; % Number of classes (for classification)
X = rand(imageSize(1), imageSize(2), imageSize(3), numImages); % Random images
Y = categorical(randi([1, numClasses], numImages, 1)); % Random labels
imds = arrayDatastore(X, 'IterationDimension', 4); % Create an image datastore
lds = arrayDatastore(Y); % Create a label datastore
combinedDS = combine(imds, lds); % Combine the image and label datastores
% Define a simple network architecture
layers = [
    imageInputLayer(imageSize)
    convolution2dLayer(3, 8, 'Padding', 'same')
    batchNormalizationLayer
    reluLayer
    maxPooling2dLayer(2, 'Stride', 2)
    fullyConnectedLayer(numClasses)
    softmaxLayer
    classificationLayer];
% Set training options
options = trainingOptions('sgdm', ...
    'MaxEpochs', 5, ...
    'InitialLearnRate', 0.01, ...
    'Verbose', false, ...
    'Plots', 'training-progress');
% Train the network using the combined datastore
net = trainNetwork(combinedDS, layers, options);
Here, the command "net = trainNetwork(combinedDS, layers, options)" trains a network for a classification problem and returns it in "net". "combinedDS" is a combined datastore of images and categorical labels, "layers" is an array of network layers or a LayerGraph, and "options" is a set of training options.
So the "trainNetwork" function is working as expected in R2024a, even though the newer "trainnet" function, which takes a loss function argument for training, is now recommended.
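For comparison, here is a rough sketch of how the same training call could be written with "trainnet". This is only an assumption about the migration, not part of the original example: "trainnet" takes the loss as an argument (here 'crossentropy' for classification) and does not accept output layers, so the classificationLayer is dropped.
% Sketch: equivalent call using trainnet (the loss is passed instead of an output layer)
layers2 = [
    imageInputLayer(imageSize)
    convolution2dLayer(3, 8, 'Padding', 'same')
    batchNormalizationLayer
    reluLayer
    maxPooling2dLayer(2, 'Stride', 2)
    fullyConnectedLayer(numClasses)
    softmaxLayer];
net2 = trainnet(combinedDS, layers2, 'crossentropy', options);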
Please refer to the following documentation of the "trainNetwork" function: (Not recommended) Train neural network - MATLAB trainNetwork (mathworks.com)
I hope this addresses your issue. Please refer to the attached images, which include the results from my replication using the "trainNetwork" function.
3 comments
Hitesh on 23 August 2024 at 5:44
Hi Peter!
Could you share the code or a file that reproduces the error you are facing? The "trainNetwork" function seems to be operating as intended.
Peter about 6 hours ago
Hi Hitesh, thanks for your time on this. Having rebooted the whole machine, rather than just MATLAB, my code now runs in 2024a too. Something pretty weird, since it stopped working in 2024a but not in 2023b; I guess we'll never know.
Thanks, Peter


More Answers (0)
