Simulink neural network producing different outputs to workspace network

william edeg on 10 Jan 2020
Commented: zhu a on 24 Nov 2024 at 17:13
I have trained a network, and when I test it with plotresponse I get the graph in the plotresponse image below. But when I create a Simulink block of this network and test it with the same input, I get the graph in the scope.png file below (yellow is the target). I thought it was a problem with normalisation, but now I don't know what could be causing it.
Thanks in advance.
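For reference, a rough sketch of this kind of workflow (not the poster's exact code; the dataset and layer sizes are placeholders):

% Train a NARX network, check its response, then export it to Simulink.
[x, t] = simplenarx_dataset;                 % placeholder dataset
net = narxnet(1:2, 1:2, 10);                 % delays/hidden size are arbitrary
[Xs, Xi, Ai, Ts] = preparets(net, x, {}, t);
net = train(net, Xs, Ts, Xi, Ai);
y = net(Xs, Xi, Ai);
plotresponse(Ts, y)                          % response in the workspace
gensim(net, 0.001)                           % Simulink block, 0.001 s sample time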

Accepted Answer

william edeg on 12 Feb 2020
If anyone has the same problem and finds this, pay attention to the number of data points you are using for training.
I was using To Workspace blocks with sample times of 0.001 to collect my training data, but they didn't collect at anywhere near the proper times or time intervals (intervals of 0.001 s over 200 s should obviously produce 200,000 data points, but I was collecting something like 66,667).
I switched to using Scope blocks to collect my data instead, and now I have the correct data, and my gensim network responds identically to the network it was generated from (when using identical inputs).
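A quick way to check for this (a sketch, assuming the To Workspace block logs a timeseries named simout over a 200 s simulation with an intended 0.001 s interval):

% Verify how many points were logged and whether they are uniformly spaced.
t = simout.Time;
fprintf('Expected ~%d points, got %d\n', 200/0.001, numel(t));
dt = diff(t);
fprintf('Sample intervals range from %g to %g s\n', min(dt), max(dt));
% With a variable-step solver the intervals vary. Either set the block's
% Sample time parameter to 0.001, use a fixed-step solver, or resample:
tu = (0:0.001:200)';
xu = interp1(t, simout.Data, tu);            % uniformly spaced training data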

More Answers (3)

Nima SALIMI on 25 Jan 2020
I assume that when you are using the Simulink block you are training a new network from scratch. Any time you train a network the results will be different due to the random initialization of the weight and bias values (and/or a different split of the training and test datasets), even when using the same dataset and hyperparameters. For this reason, a good practice is to train and test the model (whether using Simulink or toolbox functions) a number of times, so you can make a more convincing decision about the performance of your model.
  3 comments
Nima SALIMI on 3 Feb 2020
My short answer to your question: nothing is wrong with getting different results one time using Simulink and another time not using Simulink (even with the same network and the same dataset)!
My long answer: as I said in my previous answer, it is normal behavior for any neural network to produce different results each time you train and test the same network on the same dataset. Even if you use only the command line and not a Simulink block, with the same network and exactly the same dataset you will get n different results from training and testing the model n different times (so it's absolutely normal and nothing is wrong!). For further reading on the reason for this behaviour: https://machinelearningmastery.com/reproducible-results-neural-networks-keras/
So what I suggest is:
  1. To get exactly the same results each time you train and test the model, use the rng() function (e.g. rng(2)) for the sake of reproducibility, and you will see that you get the same results. :)
  2. But as I said, when you want to choose between several models (let's say 2 models), you should run both models several times (30+, let's say 40 times). That way you will have 40 accuracy values for each model/network. Then take the mean and std of those 40 values for the two models to pick the better one (an even better way is to apply a statistical significance test to those accuracy values for model selection); see the sketch after this list.
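For example, a minimal sketch of this procedure (fitnet and the hidden-layer size are placeholders; it assumes inputs x and targets t are already in the workspace):

% Train the same network 40 times and summarize its test performance.
nRuns = 40;
perf = zeros(nRuns, 1);
for k = 1:nRuns
    net = fitnet(10);                    % fresh random initialization
    net.trainParam.showWindow = false;   % suppress the training GUI
    [net, tr] = train(net, x, t);        % random data division each run
    perf(k) = tr.best_tperf;             % test-set MSE at the best epoch
end
fprintf('Test MSE: mean %g, std %g over %d runs\n', mean(perf), std(perf), nRuns);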
I hope I could answer your question in more detail in this comment.
Thanks for formally accepting my answer.
Best,
Nima
william edeg on 3 Feb 2020
Oh, I see. Thanks for your help.



Greg Heath on 25 Jan 2020
A simpler solution is to ALWAYS begin the program by resetting the random number generator. For example, choose your favorite NONNEGATIVE INTEGER as a seed and begin your program with
rng(seed)
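For instance (a sketch; fitnet is just a placeholder for whatever network you train, and x and t for your data):

seed = 0;                        % any nonnegative integer
rng(seed)                        % fixes weight init and data division
net = fitnet(10);
[net, tr] = train(net, x, t);    % now identical on every run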
Hope this helps.
Thank you for formally accepting my answer
Greg
  1 comment
william edeg on 26 Jan 2020
Edited: william edeg on 26 Jan 2020
Thanks for the response. I think I misunderstood what you meant for a moment. Do you mean the random number generator for the initial network weights? The Simulink network was created using the gensim function, so I think it should be identical to the workspace network. The inputs are also identical, which is why I'm confused about getting different responses.



Nima SALIMI on 25 Jan 2020
From a machine learning perspective it is better practice to train the model several times and compare the results accordingly (rather than fixing the random seed), since we are interested in making the effect of randomness as negligible as possible. The approach I proposed can also be found in the MATLAB documentation (https://au.mathworks.com/help/deeplearning/gs/classify-patterns-with-a-neural-network.html, second-last paragraph).
Anyway, if your time is very limited and you want to check the effect of some variables on model performance (depending on the problem at hand), then you can just fix the seed!
  4 comments
william edeg on 26 Jan 2020
Thanks again for your response. I think I might not have explained my problem well, sorry. It seems like you and Greg have read my problem as being different responses from different networks, but the Simulink net was made using the gensim function, so it should be identical to the other network I'm comparing it to.
I've successfully trained networks on simpler NARX functions and used gensim to create Simulink networks that respond identically to their workspace versions, but for some reason it's not working for this more complex function.
zhu a on 24 Nov 2024 at 17:13
Hello, may I ask if this issue has been resolved? I have encountered a similar problem: the simulation results of an Elman neural network run from a MATLAB script are inconsistent with the results of the gensim-exported model.

