Inconsistencies in functions called by incremental regression kernel

Yasmine on 25 Jul 2024
Regarding classreg.learning.rkeutils.featureMapper, which is called by the incremental learning kernel models: there are inconsistencies I am unable to resolve.
1. In the help text of the file you state:
NU = [nu_1 nu_2 ... nu_(n/d)]
nu_i = diag(S(:,i))*H*diag(G(:,i))*PM*H*diag(B(:,i)) ./ (sigma*sqrt(d))
This means the maximum value of i is n/d, which cannot be right, because S, G and B have dimensions d-by-n/(2*d), so i cannot exceed n/(2*d) (see the sketch after point 2 below).
2. You state that Z = [cos(X*NU) sin(X*NU)]
This means that if Z has dimensions 1-by-ncols, for example, then
Z(1,1:ncols/2).^2 + Z(1,ncols/2+1:ncols).^2 = ones(1,ncols/2)
(the sum of cosine squared and sine squared for each frequency). If some scaling were applied, the ones would be replaced by the scaling factor. However, this is not what happens when the map function is used to obtain Z. The mapfwht function gives a different result that does satisfy the cos^2 + sin^2 = 1 rule, but it is not the one used by default. The sketch below illustrates the identity I would expect.
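
To make points 1 and 2 concrete, here is a minimal sketch that builds NU directly from the quoted formula and then checks the cos^2 + sin^2 identity. The sizes of S, G and B, the use of hadamard() for H, and all parameter values are placeholder assumptions for illustration only, not the internal featureMapper implementation.

% Minimal sketch based only on the formulas quoted above; S, G, B, PM and
% their sizes are placeholder assumptions, not the internal implementation.
rng(0);
d = 8;                          % predictor dimension (power of 2 for hadamard)
n = 64;                         % expansion dimension (number of columns of Z)
sigma = 1;                      % kernel scale
nBlocks = n/(2*d);              % i runs to n/(2*d), matching size(S,2) etc.
H = hadamard(d);                % Walsh-Hadamard matrix
S = rand(d, nBlocks);           % placeholder scaling parameters
G = randn(d, nBlocks);          % placeholder Gaussian parameters
B = sign(randn(d, nBlocks));    % placeholder +/-1 parameters
NU = zeros(d, d*nBlocks);       % d-by-n/2, so Z below has n columns
for i = 1:nBlocks
    PM = eye(d);
    PM = PM(randperm(d), :);    % random permutation matrix
    NU(:, (i-1)*d+1:i*d) = ...
        diag(S(:,i))*H*diag(G(:,i))*PM*H*diag(B(:,i)) ./ (sigma*sqrt(d));
end
X = randn(1, d);
Z = [cos(X*NU) sin(X*NU)];                 % 1-by-n
check = Z(1,1:n/2).^2 + Z(1,n/2+1:n).^2;   % expected: ones(1, n/2)
disp(max(abs(check - 1)))                  % ~eps for this construction

With i running only to n/(2*d), NU is d-by-n/2 and Z has n columns, and the column-wise identity above holds exactly for this construction.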
Finally, a special request: in future MATLAB versions, please expose the feature mapping, the Beta coefficients, and the Bias of the SVM regression plane as explicit public properties of the model returned by updateMetricsAndFit. This is very important for us as researchers.
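
For reference, this is a minimal sketch of the documented incremental workflow the request refers to (incrementalRegressionKernel with updateMetricsAndFit). The property names in the final comments (FeatureMapper, Beta, Bias) are hypothetical and only illustrate what such public properties might look like.

rng(1);
X = randn(1000, 4);
Y = X*[1; -2; 0.5; 3] + 0.1*randn(1000, 1);
Mdl = incrementalRegressionKernel('Learner', 'svm');   % SVM kernel regression learner
for k = 1:100:size(X, 1)
    idx = k:min(k+99, size(X, 1));
    Mdl = updateMetricsAndFit(Mdl, X(idx, :), Y(idx));  % documented update call
end
% Hypothetical public properties this request asks for:
% mapping = Mdl.FeatureMapper;   % the random feature mapping
% beta    = Mdl.Beta;            % coefficients in the expanded feature space
% bias    = Mdl.Bias;            % bias term of the regression plane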

Answers (0)

Version: R2024a
