
ONNX export to OpenVINO

Markus Walser on 9 August 2019
Commented: Markus Walser on 12 August 2019
Hi,
  • MATLAB R2019a
  • DeeplabV3+
  • OpenVINO 2019 R2
  • Ubuntu 16.04 LTS
For MATLAB's ONNX export I tested opset versions 6, 7, and 9. All of them give the following error during the Model Optimizer step:
vision@ubuntu:~/wam0101$ /opt/intel/openvino_2019.2.242/deployment_tools/model_optimizer/mo.py --input_model ~/wam0101/skynet-v7.onnx --data_type FP16
Model Optimizer arguments:
Common parameters:
- Path to the Input Model: /home/vision/wam0101/skynet-v7.onnx
- Path for generated IR: /home/vision/wam0101/.
- IR output name: skynet-v7
- Log level: ERROR
- Batch: Not specified, inherited from the model
- Input layers: Not specified, inherited from the model
- Output layers: Not specified, inherited from the model
- Input shapes: Not specified, inherited from the model
- Mean values: Not specified
- Scale values: Not specified
- Scale factor: Not specified
- Precision of IR: FP16
- Enable fusing: True
- Enable grouped convolutions fusing: True
- Move mean values to preprocess section: False
- Reverse input channels: False
ONNX specific parameters:
Model Optimizer version: 2019.2.0-436-gf5827d4
[ ERROR ] Concat input shapes do not match
[ ERROR ] Shape is not defined for output 0 of "dec_cat1".
[ ERROR ] Cannot infer shapes or values for node "dec_cat1".
[ ERROR ] Not all output shapes were inferred or fully defined for node "dec_cat1".
For more information please refer to Model Optimizer FAQ (https://docs.openvinotoolkit.org/latest/_docs_MO_DG_prepare_model_Model_Optimizer_FAQ.html), question #40.
[ ERROR ]
[ ERROR ] It can happen due to bug in custom shape infer function <function concat_infer at 0x7fac2be730d0>.
[ ERROR ] Or because the node inputs have incorrect values/shapes.
[ ERROR ] Or because input shapes are incorrect (embedded to the model or passed via --input_shape).
[ ERROR ] Run Model Optimizer with --log_level=DEBUG for more information.
[ ERROR ] Exception occurred during running replacer "REPLACEMENT_ID" (<class 'extensions.middle.PartialInfer.PartialInfer'>): Stopped shape/value propagation at "dec_cat1" node.
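The two leading errors point at the decoder's first Concat node: Model Optimizer can only infer a Concat output when every input shape is defined and all inputs agree on every dimension except the concatenation axis. A minimal sketch of that check (a hypothetical re-implementation for illustration, not Model Optimizer's actual concat_infer):

```python
# Hypothetical re-implementation of the shape check a Concat inference
# performs; for illustration only, not Model Optimizer's actual code.

def concat_infer(shapes, axis):
    """Infer the Concat output shape, or raise like Model Optimizer does."""
    if any(s is None for s in shapes):
        # Mirrors: [ ERROR ] Shape is not defined for output 0 of "dec_cat1".
        raise ValueError('Shape is not defined for a Concat input')
    ref = shapes[0]
    out = list(ref)
    for s in shapes[1:]:
        if len(s) != len(ref) or any(
                s[i] != ref[i] for i in range(len(ref)) if i != axis):
            # Mirrors: [ ERROR ] Concat input shapes do not match
            raise ValueError('Concat input shapes do not match')
        out[axis] += s[axis]
    return tuple(out)

# Channel-wise concat of two NCHW feature maps with matching H and W:
print(concat_infer([(1, 64, 32, 32), (1, 128, 32, 32)], axis=1))
# → (1, 192, 32, 32)
```

A spatial mismatch (e.g. 32x32 vs 33x33 feature maps) or an input whose shape never got inferred fails this check exactly the way "dec_cat1" does above.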
Attached is the full debug output.
  1. Is it possible the shapes are not correctly defined in MATLAB's ONNX export?
  2. What could be done to solve this issue?
Regards, Markus
  2 comments
Maria Duarte Rosa on 9 August 2019
Hi Markus,
Are you able to reproduce the tutorial if you use one of our pretrained models, such as googlenet? Is this just a problem when you use ONNX?
Thanks.
Markus Walser on 12 August 2019
Hi Maria,
Thank you for taking the time to look at my problem!
googlenet exports to ONNX and imports into OpenVINO fine; see the examples at the bottom. What is really strange, and I only realized it just now: exporting the pretrained deeplabv3+ network from the MathWorks example
'https://www.mathworks.com/supportfiles/vision/data/deeplabv3plusResnet18CamVid.mat';
to ONNX and importing it into OpenVINO also works:
vision@ubuntu:~$ /opt/intel/openvino_2019.2.242/deployment_tools/model_optimizer/mo.py --input_model ~/wam0101/deeplabv3.onnx
Model Optimizer arguments:
Common parameters:
- Path to the Input Model: /home/vision/wam0101/deeplabv3.onnx
- Path for generated IR: /home/vision/.
- IR output name: deeplabv3
- Log level: ERROR
- Batch: Not specified, inherited from the model
- Input layers: Not specified, inherited from the model
- Output layers: Not specified, inherited from the model
- Input shapes: Not specified, inherited from the model
- Mean values: Not specified
- Scale values: Not specified
- Scale factor: Not specified
- Precision of IR: FP32
- Enable fusing: True
- Enable grouped convolutions fusing: True
- Move mean values to preprocess section: False
- Reverse input channels: False
ONNX specific parameters:
Model Optimizer version: 2019.2.0-436-gf5827d4
[ SUCCESS ] Generated IR model.
[ SUCCESS ] XML file: /home/vision/./deeplabv3.xml
[ SUCCESS ] BIN file: /home/vision/./deeplabv3.bin
[ SUCCESS ] Total execution time: 6.07 seconds.
However, the transfer-learned deeplabv3+ export, which was generated by the helperDeeplabv3PlusResnet18() method and trained with the trainNetwork() method, complains about an undefined shape. In MATLAB this transfer-learned network runs fine.
Is there a MATLAB tool to compare two ONNX export files? They should actually have identical structure.
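One way to compare the two exports outside MATLAB is the Python `onnx` package: `onnx.load` exposes the graph's node list, which can then be diffed with plain Python. A sketch of the diff step, assuming the (op_type, output) pairs have already been extracted (the extraction lines in the comment are an assumption about your file names, not something from this thread):

```python
# Sketch: diff two ONNX graphs once their node lists have been extracted.
# Extraction itself needs the Python `onnx` package, e.g. (hypothetical paths):
#   import onnx
#   nodes = [(n.op_type, n.output[0]) for n in onnx.load('a.onnx').graph.node]
# The diff logic below is plain stdlib and works on such (op_type, output) pairs.
from collections import Counter

def diff_graphs(nodes_a, nodes_b):
    """Return op types that occur a different number of times in each graph."""
    ca = Counter(op for op, _ in nodes_a)
    cb = Counter(op for op, _ in nodes_b)
    return {op: (ca[op], cb[op]) for op in ca.keys() | cb.keys()
            if ca[op] != cb[op]}

# Toy example: the second graph contains an extra Concat node.
working = [('Conv', 'c1'), ('Relu', 'r1'), ('Concat', 'cat1')]
broken = [('Conv', 'c1'), ('Relu', 'r1'), ('Concat', 'cat1'),
          ('Concat', 'dec_cat1')]
print(diff_graphs(working, broken))  # → {'Concat': (1, 2)}
```

Comparing per-node shapes the same way (after running `onnx.shape_inference.infer_shapes` on each file) would show directly whether the transfer-learned export is missing shape annotations that the pretrained one has.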
MATLAB googlenet export
>> net = googlenet;
>> exportONNXNetwork(net,'C:\Temp\googlenet.onnx')
OpenVINO googlenet import
vision@ubuntu:~$ /opt/intel/openvino_2019.2.242/deployment_tools/model_optimizer/mo.py --input_model ~/wam0101/googlenet.onnx
Model Optimizer arguments:
Common parameters:
- Path to the Input Model: /home/vision/wam0101/googlenet.onnx
- Path for generated IR: /home/vision/.
- IR output name: googlenet
- Log level: ERROR
- Batch: Not specified, inherited from the model
- Input layers: Not specified, inherited from the model
- Output layers: Not specified, inherited from the model
- Input shapes: Not specified, inherited from the model
- Mean values: Not specified
- Scale values: Not specified
- Scale factor: Not specified
- Precision of IR: FP32
- Enable fusing: True
- Enable grouped convolutions fusing: True
- Move mean values to preprocess section: False
- Reverse input channels: False
ONNX specific parameters:
Model Optimizer version: 2019.2.0-436-gf5827d4
[ SUCCESS ] Generated IR model.
[ SUCCESS ] XML file: /home/vision/./googlenet.xml
[ SUCCESS ] BIN file: /home/vision/./googlenet.bin
[ SUCCESS ] Total execution time: 5.08 seconds.
resnet18 also exports to ONNX and imports into OpenVINO.
resnet18 export from MATLAB
>> net = resnet18;
>> exportONNXNetwork(net,'C:\Temp\resnet18.onnx')
resnet18 import to OpenVINO
vision@ubuntu:~$ /opt/intel/openvino_2019.2.242/deployment_tools/model_optimizer/mo.py --input_model ~/wam0101/resnet18.onnx
Model Optimizer arguments:
Common parameters:
- Path to the Input Model: /home/vision/wam0101/resnet18.onnx
- Path for generated IR: /home/vision/.
- IR output name: resnet18
- Log level: ERROR
- Batch: Not specified, inherited from the model
- Input layers: Not specified, inherited from the model
- Output layers: Not specified, inherited from the model
- Input shapes: Not specified, inherited from the model
- Mean values: Not specified
- Scale values: Not specified
- Scale factor: Not specified
- Precision of IR: FP32
- Enable fusing: True
- Enable grouped convolutions fusing: True
- Move mean values to preprocess section: False
- Reverse input channels: False
ONNX specific parameters:
Model Optimizer version: 2019.2.0-436-gf5827d4
[ SUCCESS ] Generated IR model.
[ SUCCESS ] XML file: /home/vision/./resnet18.xml
[ SUCCESS ] BIN file: /home/vision/./resnet18.bin
[ SUCCESS ] Total execution time: 5.32 seconds.


Answers (0)

Release: R2019a
