MATLAB DNN support for TensorFlow Lite 2.7 with on-device training

Peter Balazovic
Peter Balazovic on 16 Dec 2021
Commented: Peter Balazovic on 30 Dec 2021
I would like to ask whether there is MATLAB DNN support for deploying TF 2.7 models. TensorFlow Lite 2.7 supports models with on-device training, in addition to running inference. This on-device training allows personalization, where models are fine-tuned on the device itself.
I wonder whether there is any particular recommendation on usage within the MATLAB DNN toolbox, and:
  • How is this on-device training feature supported?
  • Are MATLAB Coder and Embedded Coder (if applicable) able to generate code with this feature?
  • Is there any MATLAB DNN example with the on-device training feature?
Thank you.

Answers (1)

Sayan Saha
Sayan Saha on 21 Dec 2021
Hi Peter,
We are adding support for deploying TensorFlow Lite (version 2.4.1) models using MATLAB in the R2022a release. We have not tested models created with TensorFlow Lite 2.7, so they may not work. This deployment support is only for inference applications (calling predict on the neural network model). You will be able to load TensorFlow Lite models in MATLAB and generate code from them for deployment, and you will also be able to simulate the models in MATLAB.
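For reference, a minimal sketch of that inference workflow, assuming the R2022a TensorFlow Lite interface (loadTFLiteModel and predict); the model file name and the 224x224x3 input size are hypothetical placeholders:

% Load a TensorFlow Lite model and run inference (simulation) in MATLAB.
% Untested sketch -- "mobilenet_v2.tflite" and the input size are placeholders.
net = loadTFLiteModel("mobilenet_v2.tflite");
img = single(rand(224, 224, 3));   % dummy input with the model's expected shape
scores = predict(net, img);        % inference only; the model is not trained

The same predict call can then be placed inside an entry-point function for code generation, as described above.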
Currently, there is no on-device training feature available in MATLAB that supports deployment to a device. So we'd like to know more about your requirements:
  1. How does on-device training fit in your workflow?
  2. What kind of device are you targeting for this workflow?
  3. What kind of networks are you using?
  4. Are there any specific layers that you are interested in training on the device? For example, for transfer learning it might be enough to re-train only the last fully-connected layer on-device instead of the entire network (see the sketch after this list).
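As a rough illustration of that last point, here is a minimal MATLAB sketch (run in desktop MATLAB, not on-device) of training only a final fully connected layer while the earlier layers stay frozen; the tiny network, random data, and options are hypothetical placeholders:

% Freeze the feature-extraction layer by setting its learn-rate factors to 0,
% so training only updates the final fully connected layer.
layers = [
    imageInputLayer([28 28 1])
    convolution2dLayer(3, 16, "Padding", "same", ...
        "WeightLearnRateFactor", 0, "BiasLearnRateFactor", 0)   % frozen
    reluLayer
    fullyConnectedLayer(10)      % only this layer's weights are updated
    softmaxLayer
    classificationLayer];

% Hypothetical training data: 500 random 28x28 grayscale images, 10 classes.
XTrain = rand(28, 28, 1, 500, "single");
YTrain = categorical(randi(10, 500, 1), 1:10);

opts = trainingOptions("adam", "MaxEpochs", 2, "Verbose", false);
net = trainNetwork(XTrain, YTrain, layers, opts);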
Thanks,
Sayan
  1 comment
Peter Balazovic
Peter Balazovic on 30 Dec 2021
This is generic TFLite enablement targeting mainly the i.MX8 (Linux-based), since the on-device training feature applies to TensorFlow Lite only (not TFLite Micro). The feature should allow models to be incrementally trained and improved by the application itself. I would like to generate this on-device training code and find out whether the feature can be deployed with a quantization scheme. On-device training is a newly added feature that I am currently applying to MobileNet (v1, v2, SSD).
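For the inference part of this workflow (not the on-device training part), a rough sketch of what code generation for a TFLite MobileNet could look like with the R2022a interface; the model file name, the 224x224x3 input size, and the configuration are hypothetical, and cross-compilation settings for the i.MX8 toolchain are omitted:

% tflite_predict.m -- inference-only entry point around a TFLite model.
function out = tflite_predict(in)  %#codegen
persistent net;
if isempty(net)
    net = loadTFLiteModel('mobilenet_v2.tflite');
end
out = predict(net, in);
end

% Generate C++ library code from the entry point (placeholder settings).
cfg = coder.config('lib');
cfg.TargetLang = 'C++';
codegen -config cfg tflite_predict -args {ones(224, 224, 3, 'single')} -report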

