Can anyone recognize what distance this is?

Kamil Kacer on 13 Nov 2020
Commented: Kamil Kacer on 13 Nov 2020
Can anyone explain to me what this code does?
Could it be that the Euclidean distance is calculated after the first if, and that the else branch computes a second distance called the Mahalanobis distance?

Accepted Answer

Walter Roberson on 13 Nov 2020
The clue is in the name of the variable: useL1Distance. If it is true, then the L1 norm is used, which is also called the "taxi-cab" distance.
If it is false, then the L2 norm is used, which is also called the Euclidean distance.
Neither branch has anything to do with the Mahalanobis distance.
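For illustration, here is a minimal sketch of the two branches on made-up data (the sizes and variable names below are just for the example):

testSample = randn(1, 13);   % 1 x numOfDims test sample (row vector)
Fi = randn(13, 40);          % numOfDims x numOfTrainSamples feature matrix

% L1 ("taxi-cab") distance to every training sample: sum of absolute differences
dL1 = sum(abs(repmat(testSample, [40 1]) - Fi.'), 2);

% L2 (Euclidean) distance: square root of the sum of squared differences
% (ranking by the squared L2, without the sqrt, picks the same nearest neighbours)
dL2 = sqrt(sum((repmat(testSample, [40 1]) - Fi.').^2, 2));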
  4 comments
Walter Roberson on 13 Nov 2020
d{i} = pdist2(F{i}.', testSample, 'mahalanobis');
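For context, a rough sketch of what that call produces (the sizes here are made up): F{i}.' is numOfTrainSamples x numOfDims and testSample is 1 x numOfDims, so pdist2 returns a numOfTrainSamples x 1 vector holding the Mahalanobis distance from testSample to each training sample of class i.

Fi = randn(13, 40);                             % hypothetical numOfDims x numOfTrainSamples
testSample = randn(1, 13);                      % one test feature vector (row)
di = pdist2(Fi.', testSample, 'mahalanobis');   % 40 x 1 vector of Mahalanobis distances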
Kamil Kacer on 13 Nov 2020
Thank you, it worked. I am assuming it calculates the distance between testSample and every audio segment in F in the dataset and stores it into d.
Am I right?


More Answers (1)

Kamil Kacer on 13 Nov 2020
I have this function where I am classifying a sample, and I just want to use another distance, based on which the sample is classified.
I have Euclidean and I want to add Mahalanobis as you showed me, but it doesn't work. Any suggestions, please?
function [Ps, winnerClass] = classifyKNN_D_Multi(F, testSample, k, NORMALIZE, useL1distance)
% function [Ps, winnerClass] = classifyKNN_D_Multi(F, testSample, k, NORMALIZE, useL1distance);
%
% This function is used for classifying an unknown sample using the kNN
% algorithm, in its multi-class form.
%
% ARGUMENTS:
% - F: a CELL array that contains the feature values for each class, i.e.
%   F{1} is a matrix of size numOfDimensions x numOfSamples FOR THE FIRST
%   CLASS, etc.
%
% - testSample: the input sample to be classified
% - k: the kNN parameter
% - NORMALIZE: use class priors to weight results
% - useL1distance: use L1 instead of L2 distance
%
% RETURNS:
% - Ps: an array that contains the classification probabilities for each class
% - winnerClass: the label of the winner class
error(nargchk(4, 5, nargin))
if (nargin < 5)
    useL1distance = false;
end
numOfClasses = length(F);
if (size(testSample, 2) == 1)
    testSample = testSample';
end
% initialization of distance vectors:
numOfDims = zeros(1, numOfClasses);
numOfTrainSamples = zeros(1, numOfClasses);
d = cell(numOfClasses, 1);
% d{i} is a vector whose elements represent the distance of the testing
% sample from all the samples of the i-th class
testSample(isnan(testSample)) = 0.0;
for i = 1:numOfClasses
    [numOfDims(i), numOfTrainSamples(i)] = size(F{i});
    d{i} = inf * ones(max(numOfTrainSamples), 1);   % we fill it with inf values
    F{i}(isnan(F{i})) = 0.0;
end
if (length(testSample) > 1)
    for i = 1:numOfClasses   % for each class:
        if (numOfTrainSamples(i) > 0)
            if (useL1distance)
                d{i} = sum(abs(repmat(testSample, [numOfTrainSamples(i) 1]) - F{i}'), 2);   % L1
            else
                % [size(repmat(testSample, [numOfTrainSamples(i) 1])) size(F{i}')]
                % sum(sum(isnan(F{i}')))
                % d{i} = sum(((repmat(testSample, [numOfTrainSamples(i) 1]) - F{i}').^2), 2);   % L2
                d{i} = pdist(testSample, F{i}, 'mahalanobis');
            end
            d{i} = sort(d{i});
            d{i}(end+1:max(numOfTrainSamples)) = inf;
        else
            d{i} = inf;
        end
    end
else   % single dimension (NO SUM required!!!)
    for i = 1:numOfClasses
        if (numOfTrainSamples(i) > 0)
            d{i} = (abs(repmat(testSample, [numOfTrainSamples(i) 1]) - F{i}')');
            d{i} = sort(d{i});
            d{i}(end+1:max(numOfTrainSamples)) = inf;
        else
            d{i} = inf;
        end
    end
end
% kNN voting: k times, pick the class that holds the overall nearest
% remaining sample and give it one vote
kAll = zeros(numOfClasses, 1);
for j = 1:k
    curArray = zeros(numOfClasses, 1);
    for i = 1:numOfClasses
        curArray(i) = d{i}(kAll(i) + 1);
    end
    [MIN, IMIN] = min(curArray);
    kAll(IMIN) = kAll(IMIN) + 1;
end
if (NORMALIZE == 0)
    Ps = (kAll ./ k);
else
    Ps = kAll ./ numOfTrainSamples';
    Ps = Ps / sum(Ps);
end
[MAX, IMAX] = max(Ps);
winnerClass = IMAX;
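For reference, a minimal call sketch with made-up data (two classes, 13-dimensional features); with useL1distance set to true the function runs, it is the Mahalanobis branch above that fails:

rng(0);
F = { randn(13, 40), randn(13, 40) + 1 };   % F{c} is numOfDims x numOfSamples for class c
testSample = randn(13, 1);                  % one sample to classify
[Ps, winnerClass] = classifyKNN_D_Multi(F, testSample, 5, 0, true);   % k = 5, no prior weighting, L1 distance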
  1 comment
John D'Errico on 13 Nov 2020
Please don't answer your question with a followup question.
