code generation for LSTM NN

Giuseppe Menga on 10 Feb 2022
I generated the C/C++ code for a trained LSTM NN with MATLAB R2021b.
The type of mynet, /* '<Root>/MATLAB Function' */,
c_coder_ctarget_DeepLearningN_T,
is missing in the generated code.
How can I solve this problem?
Giuseppe
  1 comment
Hariprasad Ravishankar on 10 Feb 2022
Can you provide some files so that we can reproduce the issue at our end?
In particular, I'd like to take a look at the entry-point function and how the network is loaded.
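For reference, a minimal sketch of what such an entry-point often looks like for a network trained with trainNetwork (the MAT-file name 'mynet.mat' and the function name myLSTMPredict below are placeholders, not taken from your project):
function out = myLSTMPredict(in) %#codegen
% Load the trained network once and keep it in a persistent variable across calls
persistent net
if isempty(net)
    net = coder.loadDeepLearningNetwork('mynet.mat'); % placeholder MAT-file holding the trained LSTM
end
out = predict(net, in);
end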
Hari


Answers (3)

Giuseppe Menga on 10 Feb 2022
Hi Hari,
Thank you for your interest in my problem. I have solved that particular aspect; it was my mistake.
However, the code generation for an LSTM NN raises several other problems.
I have to run this NN in the kernel of a Linux operating system, so I don't want to use any external library or mathematical processor.
1) I have to remove by hand any reference to real-time functions, monitoring, time handling, etc.
2) To optimize performance, when the input signal is a vector, pairs of data from contiguous iterations are packed together and operated on through functions such as _mm_add_pd, _mm_mul_pd, _mm_div_pd and _mm_sub_pd.
Is it possible to ask the code generator to create pure C/C++ code?
In particular, the documentation is unclear about the role of the file grt.tlc and how it can be inspected and modified.
Giuseppe
  1 comment
Hariprasad Ravishankar on 10 Feb 2022
Yes, we can generate vanilla C/C++ code for deep learning models by using the 'none' deep learning target library.
The link below showcases an example using MATLAB Coder:
Here is another example using LSTMs:
If you are using Simulink, you can refer to the example below:
By default, the code generator calls into OpenMP to parallelize loops. This can be disabled by setting the EnableOpenMP option on the coder config to false, as shown here:
To generate code for SIMD and packed data types, you may refer to the following page:
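Putting those settings together, a minimal command-line sketch could look like the following (the entry-point name myLSTMPredict and the 3-feature variable-length input are assumptions used only for illustration):
cfg = coder.config('lib');                                  % plain static library target
cfg.TargetLang = 'C++';                                     % or 'C'
cfg.DeepLearningConfig = coder.DeepLearningConfig('none');  % generic code, no MKL-DNN / ARM Compute
cfg.EnableOpenMP = false;                                   % do not emit OpenMP pragmas or locks
codegen -config cfg myLSTMPredict -args {coder.typeof(single(0),[3 Inf],[false true])} -report
Here coder.typeof simply declares an example input with 3 features and a variable sequence length.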
Please let me know if this answer was helpful. I would also recommend trying out the new enhancements to the code generator when R2022a goes live, as there have been some performance optimizations for the matrix multiplication operations in RNNs in single-threaded applications.
Hari



Giuseppe Menga on 3 Mar 2022
Dear Hari,
Your suggestions were very helpful. It took me some time to go through the pages you indicated, but finally I succeeded.
Most of the technical problems were solved: building and training an LSTM NN and generating C/C++ software with MATLAB Coder, almost library-free.
Only one small problem remains, which I imagine is a software bug in MATLAB Coder: in spite of choosing no libraries and no OpenMP in the configuration, the initialize and terminate functions of the net still contained the calls omp_init_nest_lock(&function_c_lstm_No_nestLockGlobal) and omp_destroy_nest_lock(&function_c_lstm_No_nestLockGlobal).
I don't know what they do, and I simply commented them out in the code.
However, I would like to ask you a more conceptual question.
I'm using the NN to predict, in real time and one step ahead for control, an output time series from an input time series. I recorded input and output in three sets of experiments:
the first for training, the others, recorded in slightly different operating conditions of the system, for testing and validation.
Originally I was using NARX nets, where the three sets are incorporated in the optimization algorithm.
The training on the first set was somehow taking the other two sets into account (the error on them was reduced as well).
This showed that the net was robust enough to explain the behaviour of the system even in slightly different operating conditions.
I also tried merging all three sets into the training. I noticed the difference in the longer time the algorithm took to reach convergence.
In this case the overall performance was slightly better than in the previous case, but not by much.
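For context, a rough sketch of the NARX setup I am describing (the delays, layer size and index variables are illustrative only, not my actual values):
% Open-loop NARX net; X and T are the concatenated input and target series (placeholders)
net = narxnet(1:2, 1:2, 10);
[Xs, Xi, Ai, Ts] = preparets(net, X, {}, T);
net.divideFcn = 'divideind';             % assign timesteps to the three sets by index
net.divideParam.trainInd = trainIdx;     % first experiment: drives the weight updates
net.divideParam.valInd   = valIdx;       % second experiment: early stopping
net.divideParam.testInd  = testIdx;      % third experiment: only monitored
net = train(net, Xs, Ts, Xi, Ai);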
For the same application, the LSTM network was set up for sequence-to-sequence regression.
This net also accepts training and validation sets. In the validation set I included both the previous test and validation sets.
However, the training was not reducing the error on the validation set. To obtain a satisfactory result all around I had to merge the validation set into the training set as well.
I'm not able to go into the optimization algorithms used to train the two kinds of nets.
Can you explain these different behaviours to me?
Giuseppe
  1 comment
Hariprasad Ravishankar on 3 Mar 2022 (edited 9 Mar 2022)
Hi Giuseppe,
>> in spite of no libraries and no OpenMP enabled in the configuration, the initialize and terminate functions of the net contained calls to omp_init_nest_lock and omp_destroy_nest_lock.
This is not expected. Can you please share any simple reproduction scripts and commands so that we can reproduce this issue? We will look into it.
With reference to your other question on training the LSTM, I'm afraid I do not have the expertise to answer it. However, based on your description it seems you are training on the validation set, which may cause issues such as overfitting on the validation set. It is best to keep these sets mutually exclusive.
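For instance, a minimal sketch of keeping the two sets separate when training the LSTM (XTrain, YTrain, XVal, YVal and layers are placeholders for your own data and network):
% Validation data is only monitored (and can drive early stopping); it never updates the weights
options = trainingOptions('adam', ...
    'MaxEpochs', 200, ...
    'ValidationData', {XVal, YVal}, ...
    'ValidationPatience', 5, ...
    'Shuffle', 'never', ...
    'Plots', 'training-progress');
net = trainNetwork(XTrain, YTrain, layers, options);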
I would recommend posting this question as a new MATLAB Answers post and tagging the "Deep Learning Toolbox" product. This will help make your question more visible to experts in the Deep Learning/Neural Networks domain.
Hari



Giuseppe Menga on 3 Mar 2022
Hari,
I discovered the cause of the problem related to the OpenMP library.
When you run Coder, in one of the first windows it asks you to set the configuration; I did so, selecting no libraries and no OpenMP library.
Before generating the code it offers the icon to check the configuration again.
The first time I didn't pay attention, confident in the first configuration setting, and I moved forward to generate the code.
This time, before generating the code, I checked the configuration again and found that, in spite of the first setting, the OpenMP library had been set back to yes.
In other words, the configuration was reset between the first and the last window.
Giuseppe
  1 comment
Hariprasad Ravishankar on 9 Mar 2022
Thanks for clarifying the issue. You are right that the settings from the "Check for Run-Time Issues" step do not carry over to the "Generate Code" step, and you have to set the same configuration again before generating code.
I will forward this feedback to the teams concerned.
Hari

