Issue Regarding KL divergence Implementation in MATLAB
Muhammad Nauman Nasir
on 4 Aug 2017
Answered: Muhammad Nauman Nasir
on 21 Aug 2017
I have a question about KL divergence, as I am not sure whether I am approaching it the right way.
My real data set is too big to post, so for clarity of the concept I am illustrating the issue with a small example.
I have one reference sensor signal and one measured sensor signal.
I want to find the error, or difference, between the reference and measured sensor values.
So I am using KL divergence.
First I normalized the histograms of the reference and measured signals, and then applied KL divergence.
My real data is large and complicated: it contains many zeroes, negative values, and small values such as 0.001.
When I applied KL divergence I was unfortunately unable to get good results, so I wonder whether I have misunderstood the concept of KL divergence or made a mistake somewhere in the code.
It would be nice if someone could help me out with this. I would be grateful.
Am I on the right track, or is there a fault in my understanding?
Thanks a lot in advance.
Code
ref = [2 3 4 5 6 7 8 9 -2 -3 -4];
measured_sensor = [3 3 4 5 7 8 9 9 -1 -2 -3];
% Use one common set of bin edges so both histograms partition the same
% range -- with separate hist() calls each signal gets its own bins, and
% a bin-wise comparison of the two histograms is then meaningless.
edges = linspace(min([ref measured_sensor]), max([ref measured_sensor]), 11);
C = histcounts(ref, edges);
C1 = C ./ sum(C);                  % normalized reference histogram
D = histcounts(measured_sensor, edges);
D1 = D ./ sum(D);                  % normalized measured histogram
figure(1)
ax11 = subplot(321);
bar(C1)
ax12 = subplot(322);
bar(D1)
% KL divergence is only defined over bins where both probabilities are
% non-zero, so restrict the sums to those bins.
goodIdx = C1 > 0 & D1 > 0;
d1 = sum(C1(goodIdx) .* log(C1(goodIdx) ./ D1(goodIdx)))   % KL(C1 || D1)
d2 = sum(D1(goodIdx) .* log(D1(goodIdx) ./ C1(goodIdx)))   % KL(D1 || C1)
d = d1 + d2                        % symmetric KL divergence (a scalar)
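As a cross-check outside MATLAB, the same bin-wise symmetric KL computation can be sketched in Python with NumPy. The choice of 10 shared bins is an assumption (MATLAB's hist defaults to 10 bins, but with different edges per signal, which is the bug the comments above point out):

```python
import numpy as np

ref = np.array([2, 3, 4, 5, 6, 7, 8, 9, -2, -3, -4])
measured = np.array([3, 3, 4, 5, 7, 8, 9, 9, -1, -2, -3])

# Shared bin edges so both histograms partition the same range
edges = np.histogram_bin_edges(np.concatenate([ref, measured]), bins=10)
p, _ = np.histogram(ref, bins=edges)
q, _ = np.histogram(measured, bins=edges)
p = p / p.sum()   # normalized reference histogram
q = q / q.sum()   # normalized measured histogram

# Restrict to bins where both probabilities are non-zero
mask = (p > 0) & (q > 0)
d1 = np.sum(p[mask] * np.log(p[mask] / q[mask]))   # KL(p || q)
d2 = np.sum(q[mask] * np.log(q[mask] / p[mask]))   # KL(q || p)
sym_kl = d1 + d2                                   # symmetric KL (scalar)
```

Each term of the symmetric sum has the form (p_i - q_i) * log(p_i / q_i), which is never negative, so `sym_kl` is always >= 0.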
Mean-Based Gaussian Hysteresis (Mean Error Finding)
ref = [5 6 7 5 8 7 8 9 -2 -3 -4];
measured_sensor = [3 3 4 5 7 8 9 9 -1 -2 -3];
sig_diff = ref - measured_sensor;
m = mean(sig_diff)
deviation = std(sig_diff);
pos = sig_diff(sig_diff > 0)       % positive errors
neg = sig_diff(sig_diff < 0)       % negative errors
m_pos = mean(pos)
m_neg = mean(neg)
hysteresis = abs(m_pos) + abs(m_neg)
figure(6)
ax11 = subplot(321);
histfit(sig_diff)                  % histogram with fitted normal curve
hold on
plot([m m], [0 5000], 'r')                       % mean
plot([m-deviation m-deviation], [0 5000], 'r')   % mean - std
plot([m+deviation m+deviation], [0 5000], 'r')   % mean + std
hold off
The error (hysteresis) value that I get with the mean-based Gaussian approach is 3.2500.
So I am expecting the error value from KL divergence to be near 3.2500, or within that range with some tolerance.
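The 3.2500 figure can be verified with a short Python sketch of the same arithmetic (variable names mirror the MATLAB code above):

```python
import numpy as np

ref = np.array([5, 6, 7, 5, 8, 7, 8, 9, -2, -3, -4])
measured = np.array([3, 3, 4, 5, 7, 8, 9, 9, -1, -2, -3])

sig_diff = ref - measured          # [2, 3, 3, 0, 1, -1, -1, 0, -1, -1, -1]
pos = sig_diff[sig_diff > 0]       # positive errors: [2, 3, 3, 1]
neg = sig_diff[sig_diff < 0]       # negative errors: [-1, -1, -1, -1, -1]
hysteresis = abs(pos.mean()) + abs(neg.mean())
# pos.mean() = 2.25, neg.mean() = -1.0, so hysteresis = 3.25
```

Note, however, that this quantity is an average signal-level error in the sensor's units, while KL divergence measures dissimilarity between probability distributions, so the two numbers live on different scales and need not agree.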
Accepted Answer
Chris Perkins
on 9 Aug 2017
Edited: Chris Perkins
on 9 Aug 2017
Hi Muhammad,
KL divergence produces a non-negative number: 0 indicates that the two distributions behave identically, and larger values indicate increasingly different behavior (note that it is not bounded above by 1). See more here: https://en.wikipedia.org/wiki/Kullback-Leibler_divergence
As your two sample signals mirror each other very closely, we would expect a KL divergence value close to zero.
I hope this helps clear up some confusion.
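For reference, a small Python check of how KL divergence behaves at the extremes (the `kl` helper below is my own illustration, not from this thread); for very dissimilar distributions the value can in fact exceed 1:

```python
import numpy as np

def kl(p, q):
    """KL(p || q) for two discrete distributions with non-zero entries."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum(p * np.log(p / q)))

same = kl([0.5, 0.5], [0.5, 0.5])    # identical distributions -> 0
far = kl([0.5, 0.5], [0.99, 0.01])   # very different -> greater than 1
```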