What is the Normalized MSE algorithm for the NN performance?

13 views (last 30 days)
Hugo Mendonça on 14 June 2015
Hi, everyone!
To verify the performance of a neural network, the NN toolbox calculates the MSE (mean squared error). In addition, the same MSE can be normalized by setting the normalization to 'standard' or 'percent'.
I have looked for the algorithm used to calculate both of them with no success. So, does anyone know how MATLAB normalizes the MSE?
Many thanks in advance!
Hugo

Answers (1)

Greg Heath on 16 June 2015
The purpose of a regression or curve-fitting net is, given the input signal variations, to model the corresponding target variations.
The average biased (i.e., divide by N) target variance is
MSE00 = mean(var(t',1))
When adjusted for the bias of using the mean estimated from the same data (i.e., dividing by N-1), the unbiased target variance is
MSE00a = mean(var(t',0))
It is not difficult to show that MSE00 is the minimum mean-square-error resulting from a naïve constant output model. Of course, the minimum occurs when the constant is just the mean of the target. Consequently, the result is the variance.
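For a quick numerical check of that claim (plain MATLAB; the target values below are made up purely for illustration):
t = [1.2 3.4 0.7 2.9 1.8 4.1]; % example targets, one row per output
MSE00 = mean(var(t',1)) % biased target variance (divide by N)
mseConst = mean((t - mean(t,2)).^2, 2) % MSE of the constant model y = mean(t)
% mseConst equals MSE00 (up to round-off), confirming the constant-output reference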
When trying to model target variations, the constant output model is probably the most useful reference. This results in the scale-free entities
NMSE = mse(t-y)/MSE00 % Normalized MSE
and
R2 = 1 - NMSE % R-squared (a.k.a. R^2, the coefficient of determination)
R-squared is interpreted as the fraction of the target variance that is modelled by the net.
https://www.google.com/?gws_rd=ssl#q=r+squared+wikipedia
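Putting the pieces together, here is a minimal end-to-end sketch (assuming the toolbox's simplefit_dataset example data and a fitnet with an arbitrarily chosen hidden layer size; the plain-arithmetic line is just the mse(t-y) formula written out):
[x, t] = simplefit_dataset; % example curve-fitting data shipped with the toolbox
net = fitnet(10); % small fitting net, 10 hidden neurons (arbitrary choice)
net = train(net, x, t);
y = net(x); % network outputs
MSE00 = mean(var(t',1)); % biased target variance, the naive reference MSE
NMSE = mean((t(:)-y(:)).^2) / MSE00 % normalized MSE, i.e. mse(t-y)/MSE00
R2 = 1 - NMSE % fraction of target variance modelled by the net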
Thank you for formally accepting my answer
Greg
