Why is there nnet.layer.Formattable in Deep Learning Toolbox?

4 views (last 30 days)
Jack Xiao on 18 Apr 2022
I defined a custom layer based on the example "Define Custom Recurrent Deep Learning Layer", which defines peepholeLstmLayer.
I removed nnet.layer.Formattable because I need to operate on the data directly; it does not need formatting and already follows my own settings.
However, it does not work. I wonder why nnet.layer.Formattable exists in Deep Learning Toolbox. Is inheriting only from nnet.layer.Layer not effective on its own? Why are there so many settings for data? This forces coders to be extra careful and cautious.
I think too much branching and too many prescriptions will make the toolbox bloated.
This brings too much trouble and inconvenience. I think Deep Learning Toolbox needs pruning and should be concise and plain.

Answers (1)

Maksym Tymchenko on 21 Jul 2023
When you create a custom deep learning layer, inheriting from nnet.layer.Formattable gives you several advantages:
  • The input data to the forward function will be a formatted dlarray object. This means that each dimension of the input is marked with a label: one of Spatial (S), Channel (C), Batch (B), Time (T), or Unknown (U).
  • You can use this dimension information in the forward function to easily reshape, flatten, or unflatten your data.
  • You can define layers where the inputs and outputs have different formats.
As you observed, inheriting from nnet.layer.Formattable is optional. However, if you remove the inheritance, you might need to update your forward function to work with unlabelled data, which is probably what caused the error that you mentioned.
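For illustration, here is a minimal sketch of a custom layer that relies on the formatted input that nnet.layer.Formattable provides. The class name channelMeanLayer and its behaviour are hypothetical examples chosen for this sketch, not part of the peepholeLstmLayer demo:

classdef channelMeanLayer < nnet.layer.Layer & nnet.layer.Formattable
    methods
        function layer = channelMeanLayer(name)
            % Hypothetical layer that averages over the channel dimension.
            layer.Name = name;
            layer.Description = "Mean over the channel (C) dimension";
        end
        function Z = predict(layer, X)
            % Because the layer is Formattable, X arrives as a formatted
            % dlarray, so the channel dimension can be located by its
            % label instead of a hard-coded index.
            cDim = finddim(X, "C");
            Z = mean(X, cDim);   % Z keeps its dimension labels
        end
    end
end

A quick check with a formatted input (10 channels, batch of 4):

layer = channelMeanLayer("chmean");
X = dlarray(rand(10, 4), "CB");   % channel x batch
Z = predict(layer, X);            % Z is 1x4 with format "CB"

If you remove the nnet.layer.Formattable inheritance, the network passes X to predict as an unformatted dlarray, finddim returns empty, and the layer errors. You would then have to hard-code the dimension index yourself, which is exactly the kind of update to the forward function described above.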
