I'm getting an "error using tall/cellfun" error while working through the MathWorks example "Denoise Speech Using Deep Learning Networks".

为
on 2 Dec 2024
Edited: 7 Dec 2024 at 16:42
I don't know what happened. I see that many people in the community hit errors because they did not add the 'HelperGenerateSpeechDenoisingFeatures' file, but I still get an error even after adding it.
[noise_chaiYouJi,Fs]=audioread("noise.mp3");
datafolder = "dataset";
%ads0 = audioDatastore(fullfile(datafolder,"clips"));
ads0 = audioDatastore(fullfile(datafolder,"clips"),IncludeSubfolders=true);
%-------------------------------
windowLength = 256;
win = hamming(windowLength,"periodic");
overlap = round(0.75*windowLength);
fftLength = windowLength;
inputFs = 48e3;
fs = 8e3;
numFeatures = fftLength/2 + 1;
numSegments = 8;
%-------------------------------
%src = dsp.SampleRateConverter("InputSampleRate",44100,"OutputSampleRate",8000, "Bandwidth",7920);
src = dsp.SampleRateConverter(InputSampleRate=inputFs,OutputSampleRate=fs,Bandwidth=7920);
%-------------------------------
reset(ads0)
numSamples = numel(ads0.Files)
trainsamples=600;
ads0=subset(ads0,1:trainsamples);
T = tall(ads0)
[targets,predictors] = cellfun(@(x)HelperGenerateSpeechDenoisingFeatures(x,noise_chaiYouJi,src),T,UniformOutput=false);
[targets,predictors] = gather(targets,predictors);
predictors = cat(3,predictors{:});
noisyMean = mean(predictors(:));
noisyStd = std(predictors(:));
predictors(:) = (predictors(:) - noisyMean)/noisyStd;
targets = cat(2,targets{:});
cleanMean = mean(targets(:));
cleanStd = std(targets(:));
targets(:) = (targets(:) - cleanMean)/cleanStd;
predictors = reshape(predictors,size(predictors,1),size(predictors,2),1,size(predictors,3));
targets = reshape(targets,1,1,size(targets,1),size(targets,2));
inds = randperm(size(predictors,4));
L = round(0.99*size(predictors,4));
trainPredictors = predictors(:,:,:,inds(1:L));
trainTargets = targets(:,:,:,inds(1:L));
validatePredictors = predictors(:,:,:,inds(L+1:end));
validateTargets = targets(:,:,:,inds(L+1:end));
layers = [imageInputLayer([numFeatures,numSegments])
convolution2dLayer([9 8],18,Stride=[1 100],Padding="same")
batchNormalizationLayer
reluLayer
repmat( ...
[convolution2dLayer([5 1],30,Stride=[1 100],Padding="same")
batchNormalizationLayer
reluLayer
convolution2dLayer([9 1],8,Stride=[1 100],Padding="same")
batchNormalizationLayer
reluLayer
convolution2dLayer([9 1],18,Stride=[1 100],Padding="same")
batchNormalizationLayer
reluLayer],4,1)
convolution2dLayer([5 1],30,Stride=[1 100],Padding="same")
batchNormalizationLayer
reluLayer
convolution2dLayer([9 1],8,Stride=[1 100],Padding="same")
batchNormalizationLayer
reluLayer
convolution2dLayer([129 1],1,Stride=[1 100],Padding="same")
];
miniBatchSize = 128; % miniBatchSize was used below but never defined; 128 is the value in the MathWorks example
options = trainingOptions("adam", ...
MaxEpochs=3, ...
InitialLearnRate=1e-5, ...
MiniBatchSize=miniBatchSize, ...
Shuffle="every-epoch", ...
Plots="training-progress", ...
Verbose=false, ...
ValidationFrequency=floor(size(trainPredictors,4)/miniBatchSize), ...
LearnRateSchedule="piecewise", ...
LearnRateDropFactor=0.9, ...
LearnRateDropPeriod=1, ...
ValidationData={validatePredictors,permute(validateTargets,[3 1 2 4])});
denoiseNetFullyConvolutional = trainnet(trainPredictors,permute(trainTargets,[3 1 2 4]),layers,"mse",options);
filename = 'denoiseNet.mat';
save(filename, 'denoiseNetFullyConvolutional');
summary(denoiseNetFullyConvolutional)
%-------------------------------

Answers (1)

Govind KM on 4 Dec 2024 at 9:32
Hi @为,
From what I understand using Google Translate, the error message in the provided image when calling tall/gather seems to be "Insufficient memory".
One difference between tall arrays and in-memory arrays in MATLAB is that operations on tall arrays typically remain unevaluated until you request that the calculations be performed, which lets you work with large data sets quickly without waiting for each command to execute. The gather function evaluates the queued operations on a tall array and returns the result as an in-memory array, so the usual memory considerations apply to its output.
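As a minimal, self-contained sketch of this deferred-evaluation workflow (using the airlinesmall.csv sample file that ships with MATLAB):

```matlab
% Create a datastore and a tall table backed by it
ds = tabularTextDatastore("airlinesmall.csv","TreatAsMissing","NA");
t  = tall(ds);

% This line only queues the computation; nothing is evaluated yet
m = mean(t.ArrDelay,"omitnan");

% gather triggers the actual pass over the data and returns an
% ordinary in-memory double, so the result must fit in memory
m = gather(m);
```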
A possible reason for this error is that the dataset used in your code is too large, causing MATLAB to run out of memory for the output of gather. To check whether the result can fit in memory, you can use
gather(head(X))
%or
gather(tail(X))
to perform the full calculation, but bring only the first or last few rows of the result into memory.
A subset of the dataset can be used to test the same code with smaller memory requirements. Another possible workaround is mentioned in this related MATLAB Answers post, which suggests bypassing the use of tall:
Hope this is helpful!
1 comment
为
on 7 Dec 2024 at 16:36
Edited: 7 Dec 2024 at 16:42
Thank you for your support, sir.
I didn't expect to receive help so quickly; thank you sincerely.
I have already solved the problem with a method provided by one of my friends, but to be honest, I don't know why it works.
The official example provides the 'HelperGenerateSpeechDenoisingFeatures.m' function, which generates the target and predictor data from inputs such as a tall array.
The following is the part of the main training program that calls the "HelperGenerateSpeechDenoisingFeatures" function:
[targets,predictors] = cellfun(@(x)HelperGenerateSpeechDenoisingFeatures(x,noise,src),T,UniformOutput=false);
But the problem lies here: part of this function obtains a segment of noise with the same length as the original audio. It originally looked like this:
randind = randi(numel(noise) - numel(audio),[1 1]);
noiseSegment = noise(randind : randind + numel(audio) - 1);
The "out of memory" problem in my code comes from here. Changing it to the following solved the problem:
ls = length(audio);
ln = length(noise);
noiseSegment = noise(randi([0 ln-ls]) + (1:ls)');
The modified program achieves the expected effect, and the subsequent offline and real-time denoising tests confirm this.
But if I change this part of the code back to its original state, the "out of memory" error returns.
To be honest, I don't understand why this change solves the problem.

