How to convert .mat deep learning model to ONNX without placeholder?

28 views (last 30 days)
Sneha Sharma
Sneha Sharma on 6 Aug 2025
Answered: Ronit on 11 Aug 2025 at 9:05
I used the function
exportONNXNetwork(net, filename, "OpsetVersion", 11);
to get my ONNX model. Now when I load this model in Python, I get the error: Fatal error: com.mathworks:Placeholder(-1) is not a registered function/op
Is there a way to export to ONNX without the placeholders?
I appreciate any help.
  1 comment
Sneha Sharma
Sneha Sharma on 6 Aug 2025
It would also help if there was a way to remove the placeholder layer from my network.


Answers (1)

Ronit
Ronit on 11 Aug 2025 at 9:05
When you use exportONNXNetwork in MATLAB to convert a deep learning model to ONNX, you may see errors in Python. This happens because your MATLAB network contains one or more layers that are not supported by the ONNX format. During export, MATLAB inserts "placeholder" layers for these unsupported components. Python ONNX tools cannot interpret these placeholders, so loading the model fails with the error you are seeing.
To resolve this, you can use the following suggestions:
  • Identify the unsupported layers by running "analyzeNetwork(net)", then remove them with "removeLayers" or replace them with ONNX-compatible alternatives.
  • Ensure that your final network contains only layers supported for ONNX export.
  • Export your updated network to ONNX again using "exportONNXNetwork".
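The steps above can be sketched in MATLAB as follows. This is a minimal example, assuming your network is stored in a .mat file as a layerGraph or dlnetwork; the file, variable, and layer names here are hypothetical placeholders for your own:

```matlab
% Hypothetical file and variable names -- replace with your own.
load("myModel.mat", "net");

% 1. Inspect the network; unsupported layers are flagged in the analysis.
analyzeNetwork(net)

% 2. Remove an unsupported layer (the layer name is an example).
net = removeLayers(net, "myCustomLayer");

% If removing the layer breaks connectivity, reconnect the neighbors:
% net = connectLayers(net, "precedingLayer", "followingLayer");

% 3. Re-export once the network contains only ONNX-supported layers.
exportONNXNetwork(net, "model.onnx", "OpsetVersion", 11);
```

Note that removeLayers operates on a layerGraph (or, in recent releases, a dlnetwork); if your model is a DAGNetwork, convert it first with layerGraph(net) before editing.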
Please refer to the documentation pages for "analyzeNetwork", "removeLayers", and "exportONNXNetwork" for more information on these functions.
I hope this resolves your query!

Version: R2025a
