Computing Cross Entropy and the derivative of Softmax
Hi everyone,
I am trying to manually code a three-layer multiclass neural net with softmax activation in the output layer and cross-entropy loss. I think my code for the derivative of softmax is correct; currently I have
function delta_softmax = grad_softmax(z)
    % Jacobian of softmax for a column vector z:
    % delta_softmax(i,j) = s(i)*(delta_ij - s(j))
    s = ssmax(z);
    delta = eye(numel(z));
    delta_softmax = s.*(delta - s.');
end
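For comparison, the same Jacobian can be sketched in NumPy and checked against finite differences (the `ssmax` here is an assumed standard stable softmax, standing in for the MATLAB helper of the same name):

```python
import numpy as np

def ssmax(z):
    # numerically stable softmax for a 1-D input
    e = np.exp(z - np.max(z))
    return e / e.sum()

def grad_softmax(z):
    # Jacobian of softmax: J[i, j] = s[i] * (delta_ij - s[j])
    s = ssmax(z)
    return np.diag(s) - np.outer(s, s)
```

Each column (and row) of this Jacobian sums to zero, since the softmax outputs always sum to one; that makes a cheap sanity check alongside a finite-difference comparison.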
However, I am having some trouble converting the Python code for cross-entropy loss to MATLAB. In Python, the code is
import numpy as np

def cross_entropy(X, y):
    """
    X is the output from the fully connected layer (num_examples x num_classes)
    y is the labels (num_examples x 1)
    """
    m = y.shape[0]
    p = softmax(X)
    # pick the predicted probability of the true class for each example
    log_likelihood = -np.log(p[range(m), y])
    loss = np.sum(log_likelihood) / m
    return loss
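As a self-contained sanity check of the Python version (the `softmax` helper here is an assumed standard row-wise implementation, and the data is made up for illustration):

```python
import numpy as np

def softmax(X):
    # row-wise softmax, shifted by the row max for numerical stability
    e = np.exp(X - X.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def cross_entropy(X, y):
    m = y.shape[0]
    p = softmax(X)
    # p[np.arange(m), y] picks one probability per row: that of the true class
    return -np.log(p[np.arange(m), y]).sum() / m

# two examples, three classes; labels are 0-based class indices
X = np.array([[2.0, 1.0, 0.1],
              [0.5, 2.5, 0.3]])
y = np.array([0, 1])
loss = cross_entropy(X, y)
```

The loss is always positive and shrinks as the logit of the true class grows relative to the others.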
whereas my MATLAB code is
function cross_entropy = cross_entropy(X, y)
    m = size(y, 1);
    v = ssmax(X);
    y = y + ones(m, 1);   % shift 0-based labels to 1-based MATLAB indices
    v = v(1:m, y);        % this is the line that throws the size error
    llhood = -log(v);
    cross_entropy = sum(llhood) / m;
end
However, I run into an issue when I try to use the same indexing convention as the Python code: I am working with a large data set, and MATLAB throws a size error.
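The size error is consistent with a difference in indexing semantics: in NumPy, indexing with two equal-length integer arrays picks one element per (row, column) pair, whereas MATLAB's `v(1:m, y)` takes the full cross product of the two index sets and returns an m-by-m submatrix. A sketch of that difference, using NumPy for both behaviours (`np.ix_` reproduces the MATLAB-style submatrix indexing; the data is made up for illustration):

```python
import numpy as np

m = 4
p = np.arange(m * 3, dtype=float).reshape(m, 3)  # 4 examples, 3 classes
y = np.array([0, 2, 1, 2])                       # one 0-based label per example

pairs = p[np.arange(m), y]           # one entry per row        -> shape (4,)
block = p[np.ix_(np.arange(m), y)]   # MATLAB-style v(1:m, y)   -> shape (4, 4)
```

In MATLAB, the per-example entries can instead be picked out with linear indexing, e.g. `v(sub2ind(size(v), (1:m)', y))` once `y` has been shifted to 1-based indices.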
Answers (1)
Greg Heath
on 6 May 2018
Search both comp.soft-sys.matlab and ANSWERS for greg crossentropy.
Hope this helps.
Thank you for formally accepting my answer
Greg