soft-margin SVM optimization

18 views (in the last 30 days)
Ece Sureyya Birol on 21 Apr 2020
Hello
I am trying to minimize the cost function of the binary soft-margin SVM optimization problem in its unconstrained form, which is given by g(θ) = f0(θ) + Σj fj(θ). Here f0(θ) = 0.5*||w||^2 and fj(θ) = C*max(0, 1 − yj*θ'*xj), j = 1, . . . , n, and the subgradients are ∇θ f0(θ) = (w 0)' and ∇θ fj(θ) = −C*yj*xj if yj*θ'*xj < 1 and 0 otherwise.
I cannot implement fj(θ) = C*max(0, 1 − yj*θ'*xj), j = 1, . . . , n, because I don't know how to compute the maximum. Is there a built-in function for finding the max?
yj is a vector of size 105 by 1, the label vector.
xj is a matrix of size 105 by 3, the feature matrix of the training data.
θ is a 3 by 1 vector, θ = (w b)', the parameter vector of the soft-margin binary SVM classifier.
C is just a scalar value.
If you have any tips or tricks, I would really appreciate them.
Thank you,
AJ

Answers (1)

Hiro Yoshino on 21 Apr 2020
Are you using MATLAB or other open-source software?
Either way, I bet there is a package for your purpose, i.e., you do not need to implement it yourself. If you really want to, you should consult a proper book; the algorithm is not that complex.
For MATLAB, check this out for a brief explanation:
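As for the specific question: MATLAB's built-in `max` works element-wise, so `max(0, v)` clips a whole vector at zero in one call. A minimal sketch of evaluating the objective and its subgradient, assuming the setup from the question (a 105-by-3 feature matrix whose last column is the bias term, a 105-by-1 label vector in {−1, +1}, and a chosen C; the synthetic data here is only for illustration):

```matlab
% Illustrative data standing in for the question's training set
rng(0);
X = [randn(105, 2), ones(105, 1)];    % 105-by-3 features, last column = bias
y = sign(X(:, 1) + 0.5*randn(105, 1)); % 105-by-1 labels in {-1, +1}

theta = zeros(3, 1);                   % theta = (w b)'
C = 1;                                 % example trade-off value

% Hinge-loss terms: max(0, v) is applied element-wise to the whole vector
margins = y .* (X * theta);            % yj*theta'*xj for every j at once
fj = C * max(0, 1 - margins);          % n-by-1 vector of fj(theta)
f0 = 0.5 * norm(theta(1:2))^2;         % 0.5*||w||^2, bias excluded
g  = f0 + sum(fj);                     % unconstrained objective g(theta)

% Subgradient: [w; 0] from f0 plus -C*yj*xj over the violated margins
active = margins < 1;
grad = [theta(1:2); 0] - C * (X(active, :)' * y(active));
```

With `g` and `grad` in hand, a plain subgradient-descent step is just `theta = theta - step * grad` inside a loop; the vectorized `margins` line is what avoids writing the j = 1, . . . , n loop by hand.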
