What is causing an "Undefined function or variable 'oldVariableNames'" error when using trainNetwork?
I am getting the following error when calling trainNetwork from the script facetrain.m, which is shown further down.
>> facetrain
Error using trainNetwork (line 150)
Undefined function or variable 'oldVariableNames'.

Error in facetrain (line 87)
net = trainNetwork(train_patch_ds, lgraph, options);

Caused by:
    Undefined function or variable 'oldVariableNames'.
The crash does not occur immediately: a window with the text 'imageNormalization' is shown for quite some time, and the error appears when that window disappears.
I have attempted, without success, to follow suggestions from other threads that recommend removing any custom paths via the following steps:

restoredefaultpath % This will remove any custom paths
rehash toolboxcache
savepath
The problem started when I switched to using imageDatastores instead of passing X and Y variables directly to trainNetwork. The datastores used in facetrain.m for training and validation are each a merge of two imageDatastores, combined via randomPatchExtractionDatastore. The datastores are augmented.
What could be causing this error to appear?
%=================================
% facetrain.m
%=================================
dataroot = '/hd1/Data/FaceNNData/';
num_epochs = 10000;
batch_size = 30;
img_width = 128;
img_height = 128;
num_channels = 3;
layers = [ imageInputLayer([img_width img_height num_channels],'Name','input_face')
convolution2dLayer(3,128,'Padding','same','Name','conv128_enc')
batchNormalizationLayer('Name','bn128_enc')
reluLayer('Name','relu128_enc')
maxPooling2dLayer(2,'Stride',2,'Name','pool128_enc','HasUnpoolingOutputs',true)
convolution2dLayer(3,64,'Padding','same','Name','conv64_enc')
batchNormalizationLayer('Name','bn64_enc')
reluLayer('Name','relu64_enc')
maxPooling2dLayer(2,'Stride',2,'Name','pool64_enc','HasUnpoolingOutputs',true)
convolution2dLayer(3,32,'Padding','same','Name','conv32')
batchNormalizationLayer('Name','bn32_enc')
reluLayer('Name','relu32_enc')
maxPooling2dLayer(2,'Stride',2,'Name','pool32_enc','HasUnpoolingOutputs',true)
maxUnpooling2dLayer('Name','pool32_dec')
convolution2dLayer(3,64,'Padding','same','Name','conv32_dec')
batchNormalizationLayer('Name','bn32_dec')
reluLayer('Name','relu32_dec')
maxUnpooling2dLayer('Name','pool64_dec')
convolution2dLayer(3,128,'Padding','same','Name','conv64_dec')
batchNormalizationLayer('Name','bn64_dec')
reluLayer('Name','relu64_dec')
maxUnpooling2dLayer('Name','pool128_dec')
convolution2dLayer(1,3,'Padding','same','Name','conv128_dec')
regressionLayer('Name','output_depthmap')
];
lgraph = layerGraph(layers);
lgraph = connectLayers(lgraph,'pool32_enc/indices','pool32_dec/indices');
lgraph = connectLayers(lgraph,'pool32_enc/size','pool32_dec/size');
lgraph = connectLayers(lgraph,'pool64_enc/indices','pool64_dec/indices');
lgraph = connectLayers(lgraph,'pool64_enc/size','pool64_dec/size');
lgraph = connectLayers(lgraph,'pool128_enc/indices','pool128_dec/indices');
lgraph = connectLayers(lgraph,'pool128_enc/size','pool128_dec/size');
train_face_ds  = imageDatastore([dataroot '/face/train'],'ReadFcn',@myreadfcn);
train_world_ds = imageDatastore([dataroot '/world/train'],'ReadFcn',@myreadfcn);
valid_face_ds  = imageDatastore([dataroot '/face/valid'],'ReadFcn',@myreadfcn);
valid_world_ds = imageDatastore([dataroot '/world/valid'],'ReadFcn',@myreadfcn);
train_augmenter = imageDataAugmenter('RandXReflection', true, ...
    'RandRotation', [-5.0 5.0], ...
    'RandScale', [0.98 1.2], ...
    'RandXTranslation', [-8 8], ...
    'RandYTranslation', [-8 8]);
valid_augmenter = imageDataAugmenter('RandXReflection', true, ...
    'RandRotation', [-5.0 5.0], ...
    'RandScale', [0.98 1.2], ...
    'RandXTranslation', [-8 8], ...
    'RandYTranslation', [-8 8]);
train_patch_ds = randomPatchExtractionDatastore(train_face_ds, train_world_ds, ...
    [img_width img_height], ...
    'DataAugmentation', train_augmenter, ...
    'PatchesPerImage', 32);
valid_patch_ds = randomPatchExtractionDatastore(valid_face_ds, valid_world_ds, ...
    [img_width img_height], ...
    'DataAugmentation', valid_augmenter, ...
    'PatchesPerImage', 32);
options = trainingOptions('adam', ...
    'MiniBatchSize', batch_size, ...
    'MaxEpochs', num_epochs, ...
    'InitialLearnRate', 1e-4, ...
    'Shuffle', 'every-epoch', ...
    'ValidationData', valid_patch_ds, ...
    'ValidationFrequency', 100, ...
    'ValidationPatience', Inf, ...
    'Plots', 'training-progress', ...
    'ExecutionEnvironment', 'gpu', ...
    'Verbose', false);

net = trainNetwork(train_patch_ds, lgraph, options);
function J = myreadfcn(filename)
    I = imread(filename);
    J = imresize(I,[128 128]);
end
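As a debugging aid (not part of the original script), it may help to inspect what the merged datastore actually yields before training, since trainNetwork identifies input and response data in a randomPatchExtractionDatastore by the variable names of the table it returns; a minimal sketch, assuming the datastores defined above:

```matlab
% Sanity check: preview one small batch from the patch datastore.
% randomPatchExtractionDatastore returns a table of patch pairs, and
% trainNetwork matches input/response data by the table's variable names,
% so confirming them can help localize name-related failures.
sample = preview(train_patch_ds);        % small table of patch pairs
disp(sample.Properties.VariableNames)    % input/response column names
% Assuming the patches are stored in cell columns, check the patch size,
% which should match the imageInputLayer ([128 128 3]):
disp(size(sample{1,1}{1}))
```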
Thanks /Mats
Mats Åhlander
21 Sep 2018