Estimation of NegEntropy in MATLAB
I am trying to evaluate different ICA algorithms in MATLAB. One of the measures I use is an estimate of non-Gaussianity based on negentropy. I am trying to find a function that can compute this on time-series EEG data. I would appreciate your help.
Thanks
Kind regards,
Mari
Answers (2)
William Rose
on 27 May 2021
Edited: William Rose on 27 May 2021
pentropy(), in the Signal Processing Toolbox, computes the entropy of a signal's power spectrum, or of its spectrogram (i.e., its time-varying spectrum). By default, it divides the signal into segments and computes a spectrum and an entropy for each segment, which can be useful in the analysis of speech signals. If you want to estimate the entropy (H) of the entire signal, without segmentation, specify 'Instantaneous',false, as shown below:
y=(rand(1,256)-.5)*sqrt(12); %uniform random noise with mean=0, st.dev.=1
H=pentropy(y,1,'Instantaneous',false)
The entropy returned is normalized, by default, to the entropy of a white-noise signal of equal length. You can test pentropy() by computing H for Gaussian white noise. In the following example, 100 Gaussian random signals are generated, each of length N=256. The entropy of each is computed and saved; at the end, the mean, standard deviation, minimum, and maximum entropies are displayed.
N=256; H=zeros(1,100);
for i=1:100
    H(i)=pentropy(randn(1,N),1,'Instantaneous',false);
end
disp([mean(H),std(H),min(H),max(H)]);
As the signal duration increases, H approaches unity. The example below is like the one above, but with N=32768.
N=32768; H=zeros(1,100);
for i=1:100
    H(i)=pentropy(randn(1,N),1,'Instantaneous',false);
end
disp([mean(H),std(H),min(H),max(H)]);
A value of H less than unity indicates non-Gaussianity: the smaller H is, the more non-Gaussian y is.
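For example (a quick sketch along the same lines, using the same pentropy() settings as above): a pure sinusoid is strongly non-Gaussian, and its power is concentrated at a single frequency, so its spectral entropy comes out far below unity.
t=0:255; %256 samples = 8 full cycles of a period-32 sinusoid
H=pentropy(sin(2*pi*t/32),1,'Instantaneous',false) %expect H well below 1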
However, pentropy() is not sensitive to some important forms of non-Gaussianity. The spectral entropy of uniform random noise (as in the first example above) is almost identical to that of Gaussian random noise. This should not happen, because in theory a Gaussian signal has the maximum entropy of any signal with the same variance.
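A sketch of that comparison (same pentropy() settings as above): with unit-variance uniform and Gaussian white noise side by side, the two entropies come out nearly identical, because both signals have flat spectra.
N=4096;
Hu=pentropy(sqrt(12)*(rand(1,N)-.5),1,'Instantaneous',false) %uniform white noise
Hg=pentropy(randn(1,N),1,'Instantaneous',false) %Gaussian white noise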
Another potential disadvantage of pentropy() is the difficulty of controlling how the spectrum is computed internally. When I turned off the automatic scaling and varied the signal duration, the entropy of the supposedly unscaled signal did not change as I expected. I suspect this is because the power-spectrum routine called by pentropy() has default settings for the number of frequencies in the spectrum. See the examples below.
N=4096;
H=pentropy(randn(1,N),1,'Instantaneous',false,'Scaled',false)
I expected H=log2(N)=12.
N=256;
H=pentropy(randn(1,N),1,'Instantaneous',false,'Scaled',false)
I expected H=log2(N)=8.
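A possible workaround, sketched below (it assumes periodogram() from the Signal Processing Toolbox, and it is only one way to do it), is to compute the unscaled spectral entropy directly, so that the number of frequency bins is explicit:
N=4096;
pxx=periodogram(randn(1,N),[],N); %one-sided periodogram with N/2+1 bins
p=pxx/sum(pxx); %normalize the spectrum so it sums to 1
H=-sum(p(p>0).*log2(p(p>0))) %unscaled spectral entropy, in bits
For white noise, H then scales roughly as log2(N/2+1), which makes the dependence on spectral resolution explicit. This controls the resolution issue, but it still cannot distinguish uniform from Gaussian noise, since both are white.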
The shortcomings of pentropy() show that a time-domain entropy function is needed. Such a function would construct a histogram of the time-domain values of the signal, normalize the histogram so that it is a probability density, and then compute the entropy of that density. A search of the MATLAB File Exchange for negentropy returns two hits, but neither one looks like what we want.
William Rose
on 28 May 2021
Here is code to estimate the entropy and negentropy of a time-domain signal y. In this example, y is uniform random noise, but y could be any signal. Negentropy is the entropy of Gaussian noise with the same variance as y, minus the entropy of y, so it is zero when y is Gaussian and positive otherwise. The units of Hent(y) and J(y) are bits.
N=1024; %signal length
y=sqrt(12)*(rand(1,N)-.5); %uniform random noise, mean=0, stdev=1
h=histogram(y,'Normalization','pdf'); %estimate f(x)=the p.d.f.
Hent=-h.BinWidth*sum(h.Values(h.Values>0).*log(h.Values(h.Values>0))/log(2)); %entropy of y, in bits
J=log(std(y))/log(2)+2.0471-Hent; %negentropy; 2.0471=0.5*log2(2*pi*e)=entropy of unit-variance Gaussian noise
fprintf('Uniform random noise: Hent=%.4f, J=%.4f\n',Hent,J); %display results
The justification for the code above is in the attached document.
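As a sanity check (a sketch reusing the same histogram-based estimator): for Gaussian noise the true negentropy is zero, so J should come out near 0 (typically slightly positive, because the histogram estimate of the entropy is biased low).
N=1024; %signal length
y=randn(1,N); %Gaussian random noise, mean=0, stdev approx. 1
h=histogram(y,'Normalization','pdf'); %estimate the p.d.f.
Hent=-h.BinWidth*sum(h.Values(h.Values>0).*log(h.Values(h.Values>0))/log(2)); %entropy, in bits
J=log(std(y))/log(2)+2.0471-Hent; %negentropy; expect J near 0
fprintf('Gaussian random noise: Hent=%.4f, J=%.4f\n',Hent,J);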