exportONNXNetwork/importONNXNetwork functions not working correctly

Luke Hubbard on 27 Mar 2023
I'm having trouble exporting to ONNX. As a test case, I tried https://www.mathworks.com/help/deeplearning/ug/classify-sequence-data-using-lstm-networks.html. I added the following section between the training and testing blocks and got a data format error on import.
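Roughly, the added section looks like this (a minimal sketch of the round trip; it assumes the trained network from the example is in a variable called net):
exportONNXNetwork(net,'onnxEx1.onnx')         % export the trained LSTM network to ONNX
netONNX = importONNXNetwork('onnxEx1.onnx')   % import it back; this is where the error appears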
exportONNXNetwork doesn't raise any issues, but importONNXNetwork does.
If I import with the following option set,
importONNXNetwork('onnxEx1.onnx','OutputDataFormats','TBC')
it works fine in MATLAB. All is well, right? Except that the same ONNX file doesn't work with onnxruntime elsewhere: loading it in another environment (Python, Java) results in a 'ShapeInferenceError' complaining that the tensor must have rank 3.
It seems that exportONNXNetwork is failing to define something that is required. Are there any known workarounds for this issue? I've tried this locally with R2022b and online, which uses R2023a; both give me the same issue.
  1 comment
Sivylla Paraskevopoulou on 4 Apr 2023
Does it help to specify the BatchSize name-value argument when you use the exportONNXNetwork function?
Also, if you are planning to use the exported network in TensorFlow, you can export directly to TensorFlow with the exportNetworkToTensorFlow function, without first converting to ONNX format.
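For example, something like this (a sketch only; the batch size of 1 is just an illustration, and net is the trained network variable):
exportONNXNetwork(net,'onnxEx1.onnx','BatchSize',1)   % export with a fixed batch size
Fixing the batch size gives the exported graph a fully defined input shape, which may be what onnxruntime's shape inference is missing.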


Answers (0)
