
How to apply Explainable AI to user-defined classification models (without using built-in classifiers)?

8 views (in the last 30 days)
I am working on a classification model where I have used LIBSVM for classification. I want to investigate the key features responsible for the classification using Explainable AI. Kindly suggest a solution.

Answers (2)

Drew on 23 Aug 2023
Edited: Walter Roberson on 30 Aug 2023
You can use the MATLAB explainable AI functions shapley, lime, and partialDependence while specifying the model using a function handle to the predict method for your LIBSVM model. In more detail:
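As a concrete starting point, here is a minimal, untested sketch of what such a function handle could look like with the LIBSVM MATLAB interface (svmtrain/svmpredict). It assumes the model is trained with probability estimates enabled ('-b 1') so that svmpredict returns per-class probabilities to use as scores; the names Xtrain, Ytrain, and libsvmScore are illustrative placeholders, not part of LIBSVM:
% Train a LIBSVM model with probability estimates ('-b 1') so that
% svmpredict can return per-class probabilities to use as scores.
libsvmModel = svmtrain(Ytrain, Xtrain, '-b 1');
% Function handle that returns the score of the first class (as listed in
% libsvmModel.Label) for each row of x. svmpredict requires a label vector
% even at prediction time, so a dummy vector of zeros is passed.
f = @(x) libsvmScore(libsvmModel, x);
explainer = shapley(f, Xtrain);           % reference data for the explainer
explainer = fit(explainer, Xtrain(1,:));  % explain one query point
plot(explainer)
% Helper used by the function handle above (local function at the bottom
% of the script file).
function score = libsvmScore(model, X)
[~, ~, prob] = svmpredict(zeros(size(X,1),1), X, model, '-b 1');
score = prob(:, 1);   % probability of the first class in model.Label
end
The same handle can also be passed to lime and partialDependence, which likewise accept function handles in place of a built-in classifier object.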
If this answer helps you, please consider accepting it.
  5 comments
Walter Roberson on 30 Aug 2023
Note that libsvm is a third-party product not supported by MathWorks. There was a period during which MathWorks used a modified version of libsvm, but that was completely replaced about 5 or so years ago.
Drew on 30 Aug 2023
Edited: Drew on 31 Aug 2023
You will need classifier scores from the predict function of libsvm in order to do shapley or lime analysis. Explainability analysis cannot be done with only the predicted labels.
Here is an example which uses fitcsvm, but with a function handle. Note that the classifier scores from prediction are used, and that the function definition for the function handle appears at the very bottom of the code here.
% Load Fisher's iris data set. Remove all observed setosa irises.
% Leave versicolor and virginica, which are not linearly separable.
load fisheriris.mat
inds = ~strcmp(species,'setosa');
X = meas(inds,:);
Ylabels = species(inds);
% For classification problems, for each query point, there are
% Shapley Values for each class and for each predictor.
% When using a function handle, need to get the shapley values
% for one class at a time. Start with the first class, which is versicolor.
rng(1);
mdl = fitcsvm(X,Ylabels);
f= @(x) getFirstClassScore(mdl,x); % See definition at bottom of this file
explainer = shapley(f,X);
explainer = fit(explainer,X(1,:));
plot(explainer)
% Look at the explainer and ShapleyValues for one class, when using function handle
explainer
explainer =
  shapley with properties:
            BlackboxModel: @(x)getFirstClassScore(mdl,x)
               QueryPoint: [7 3.2000 4.7000 1.4000]
           BlackboxFitted: 1.7136
            ShapleyValues: [4×2 table]
               NumSubsets: 16
                        X: [100×4 double]
    CategoricalPredictors: []
                   Method: 'interventional-kernel'
                Intercept: -0.0188
% The first class is versicolor, so these are the versicolor shapley values
explainer.ShapleyValues
ans = 4×2 table
    Predictor    ShapleyValue
    _________    ____________
      "x1"          0.4394
      "x2"          0.32068
      "x3"          0.4188
      "x4"          0.55359
Now, here is an example of how this looks when using a fitcsvm ClassificationSVM object with shapley:
% Much more convenient to load fisher iris with the predictor labels, and
% use builtin shapley command to get shapley values for both classes at
% once.
t = readtable("fisheriris.csv");
inds = ~strcmp(t{:,5},'setosa');
t_selected = t(inds,:);
rng(1);
mdl = fitcsvm(t_selected,"Species");
explainer = shapley(mdl,t_selected);
explainer = fit(explainer,t_selected(1,:));
plot(explainer)
% Or you can plot shapley values for multiple classes at once
plot(explainer,ClassNames=mdl.ClassNames);
% Look at the explainer and ShapleyValues for both classes when using
% ClassificationSVM
% explainer also reports that it is using the 'interventional-linear'
% method, which is faster than 'interventional-kernel' (especially when
% the number of predictors is large).
explainer
explainer =
  shapley with properties:
            BlackboxModel: [1×1 ClassificationSVM]
               QueryPoint: [1×5 table]
           BlackboxFitted: {'versicolor'}
            ShapleyValues: [4×3 table]
               NumSubsets: 16
                        X: [100×5 table]
    CategoricalPredictors: []
                   Method: 'interventional-linear'
                Intercept: [-0.0188 0.0188]
explainer.ShapleyValues
ans = 4×3 table
      Predictor      versicolor    virginica
    _____________    __________    _________
    "SepalLength"      0.4394       -0.4394
    "SepalWidth"       0.32068      -0.32068
    "PetalLength"      0.4188       -0.4188
    "PetalWidth"       0.55359      -0.55359
% Create function to return the score for the first class.
% This is used in the function handle example above.
function score = getFirstClassScore(mdl, X)
[~, score] = predict(mdl, X);
score = score(:, 1); % Extract scores of the first class
end



BHARTI RANA on 29 Dec 2023
Can you help me resolve the following error:
"Error using shapley (line 226)
The value of 'X' is invalid. Expected input to be two-dimensional."
I am trying to run shapley as:
shapley(f,data3D);
where f is a function handle and data3D is a 3D matrix whose 3rd dimension indicates the feature (say n x n x d, where n is the number of samples and d is the number of features).
My intent is to predict the importance of the features.
Kindly suggest.
  4 comments
Walter Roberson on 5 Jan 2024
When you specify a blackbox model using a function handle, the data matrix you supply must have rows corresponding to observations and columns corresponding to variables. 3D data is not supported.
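For illustration, here is a minimal sketch of the flattening that shapley expects, assuming the observations lie along the first dimension of data3D and the remaining two dimensions hold each observation's features (variable names are illustrative, and this may not map directly onto a composite-kernel representation):
% Flatten an n-by-p-by-d array into an n-by-(p*d) matrix:
% one row per observation, one column per feature.
[n, p, d] = size(data3D);
X2d = reshape(data3D, n, p*d);   % row i contains data3D(i,:,:) laid out as a vector
explainer = shapley(f, X2d);     % f must accept the same flattened layout
explainer = fit(explainer, X2d(1,:));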
BHARTI RANA on 5 Jan 2024
No, I am modelling using a composite kernel, where I have 3D data (n x n x d) and the 3rd dimension corresponds to observations.


