Detecting writing mistakes using GRNN or NEWFF?
Hi all, I have been training a neural network on 3 letters (A, B, C) for a system that should be able to identify mistakes in writing. For this purpose I trained the network on quite a lot of examples of the letters in a few different styles. I used newff and achieved 95% classification accuracy. But if I give it a wrong sample as test data, one that should be classified as belonging to none of the classes I trained on, the system does not behave well: it identifies the sample as the closest match among the trained classes. For example, if I give it an 'A' without the horizontal bar (i.e. the same as a V upside down), it should be flagged as a mistake, and my system should respond that it belongs to none of the three classes it was trained on. Instead it predicts class A, which I don't want.
On the other hand, if I train the system with only 2 classes, with all of the correct letters as class 1 and all of the wrong letters as class 0, the system gives 0% accuracy on the correct ones and 100% on the wrong ones, which means it classifies all of the data as class 0.
Can anyone tell me how to solve this problem? What if I use NEWGRNN instead of NEWFF, since this looks more like a regression problem than a classification one? If so, how can I use it efficiently, and how do I determine the value of the spread parameter in GRNN? Please help, thanks.
Accepted Answer
Greg Heath
on 31 Mar 2012
1. Order the classes in block form, i.e. [all A's, all B's, etc.].
2. Train on 4 classes without a threshold and test. If the results are satisfactory, thresholds are not necessary.
3. Make multiple single-class test runs using nontraining data for 0.25 < thresh < 1. Record the results and choose the best threshold for each class.
y = sim(net, input);
[maxP, classy] = max(y);
if classy < 4 && maxP < thresh(classy)
    classy = 4;   % below threshold: assign to the "unknown" class
end
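Step 3 above can be sketched as a threshold sweep on held-out data. This is only an illustration of the idea, not code from the thread; the variable names Xval (features-by-samples validation inputs) and tval (true class indices 1..4, with 4 meaning none of A/B/C) are assumptions.

```matlab
% Sweep candidate thresholds on nontraining data and, for each known
% class, keep the threshold that best separates genuine wins from
% samples that should be rejected to the "unknown" class 4.
y = sim(net, Xval);                  % 4-by-N matrix of outputs
[maxP, pred] = max(y, [], 1);        % winning class and its score
threshGrid = 0.25:0.05:0.95;         % candidate thresholds (step 3)
thresh = zeros(1, 3);
for c = 1:3
    idx = (pred == c);               % samples the net calls class c
    bestAcc = -Inf;
    for t = threshGrid
        predT = repmat(c, 1, nnz(idx));
        predT(maxP(idx) < t) = 4;    % reject low-confidence wins
        acc = mean(predT == tval(idx));
        if acc > bestAcc
            bestAcc = acc;
            thresh(c) = t;
        end
    end
end
```

For the sweep to be meaningful, the validation set must contain examples of the unknown class as well; otherwise the lowest threshold always wins.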
Hope this helps.
Greg
More Answers (2)
Greg Heath
on 14 Mar 2012
This is a conditional classification problem. Use 4 outputs with LOGSIG or SOFTMAX output activation functions. The training targets for [ A, B, C, ~(A|B|C)] are the four columns of eye(4) respectively. The net output is then an estimate of the class posterior probabilities, conditional on the input.
If you want the probabilities to have a unit sum, either use SOFTMAX or use LOGSIG and divide the outputs by the sum.
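The LOGSIG case can be sketched in one line; x here is an assumed single input column, not a name from the thread.

```matlab
% With LOGSIG outputs, divide by the sum so the four values can be
% read as posterior probability estimates that sum to 1.
y = sim(net, x);      % 4-by-1 vector of LOGSIG outputs for input x
p = y ./ sum(y);      % normalized estimates; sum(p) is 1
```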
The input is assigned to the "Unknown" category (4) UNLESS the largest output value corresponds to one of the other categories AND that output value exceeds a specified threshold.
Construct examples of the last category and train without a threshold.
Use a separate validation set with the trained net to determine what the threshold should be.
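A minimal end-to-end sketch of the 4-output setup described above, using PATTERNNET (one of the functions mentioned in this thread). The names X and labels, and the hidden layer size, are assumptions for illustration.

```matlab
% Hypothetical sketch: targets for [A, B, C, ~(A|B|C)] are the four
% columns of eye(4). X is features-by-samples; labels is a 1-by-N
% vector of true class indices 1..4 (class 4 = none of A/B/C).
T = eye(4);
T = T(:, labels);              % one target column per sample
net = patternnet(20);          % hidden layer size is an arbitrary choice
net = train(net, X, T);
y = net(X);                    % each column estimates the posteriors
[maxP, pred] = max(y, [], 1);  % winning class and its output value
```

The threshold from the separate validation set is then applied to maxP before accepting pred, as in the accepted answer.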
Hope this helps.
Greg
Greg Heath
on 14 Mar 2012
I have designed successful classifiers using NEWFF on the 26 capital letters and 10 digits in the MATLAB alphabet dataset. The difference between my classifiers was the level of SNR they could tolerate.
Although I used NEWFF, alternatives include NEWFIT, PATTERNNET, and NEWRB.
Hope this helps.
Greg