exportONNXNetwork

Export network to ONNX model format

Description

exportONNXNetwork(net,filename) exports the deep learning network net with weights to the ONNX™ format file filename. If filename exists, then exportONNXNetwork overwrites the file.

This function requires the Deep Learning Toolbox™ Converter for ONNX Model Format support package. If this support package is not installed, then the function provides a download link.

exportONNXNetwork(net,filename,Name=Value) exports a network using additional options specified by one or more name-value arguments.

Examples

Load the pretrained SqueezeNet convolutional neural network.

net = squeezenet
net = 
  DAGNetwork with properties:

         Layers: [68×1 nnet.cnn.layer.Layer]
    Connections: [75×2 table]
     InputNames: {'data'}
    OutputNames: {'ClassificationLayer_predictions'}

Export the network net as an ONNX format file called squeezenet.onnx. Save the file to the current folder. If the Deep Learning Toolbox Converter for ONNX Model Format support package is not installed, then exportONNXNetwork provides a link to the required support package in the Add-On Explorer. To install the support package, click the link, and then click Install.

filename = "squeezenet.onnx";
exportONNXNetwork(net,filename)

Now you can import the squeezenet.onnx file into any deep learning framework that supports ONNX import.
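
One way to sanity-check the exported file is to import it back into MATLAB. The sketch below assumes that you use importONNXNetwork, which is included in the same Deep Learning Toolbox Converter for ONNX Model Format support package; the reimported network can differ in structure from the original (see Limitations).

% Reimport the exported model and inspect it. Depending on your release,
% the imported network might not match the original layer for layer.
net2 = importONNXNetwork("squeezenet.onnx");
analyzeNetwork(net2)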

Export a layer graph with or without an output layer to the ONNX format by using exportONNXNetwork.

Load a pretrained SqueezeNet convolutional neural network, and convert the pretrained network to a layer graph.

net = squeezenet;
lgraph1 = layerGraph(net)
lgraph1 = 
  LayerGraph with properties:

         Layers: [68×1 nnet.cnn.layer.Layer]
    Connections: [75×2 table]
     InputNames: {'data'}
    OutputNames: {'ClassificationLayer_predictions'}

Analyze the layer graph. analyzeNetwork displays an interactive plot of the network architecture and a table containing information about the network layers. You can also detect errors and issues in the layer graph lgraph1 before exporting to the ONNX format. lgraph1 is error free.

analyzeNetwork(lgraph1)

Export the layer graph lgraph1 as an ONNX format file in the current folder called squeezeLayers1.onnx.

exportONNXNetwork(lgraph1,"squeezeLayers1.onnx")

Now, you can import the squeezeLayers1.onnx file into any deep learning framework that supports ONNX import.

Remove the output layer of lgraph1.

lgraph2 = removeLayers(lgraph1,lgraph1.Layers(end).Name)
lgraph2 = 
  LayerGraph with properties:

         Layers: [67×1 nnet.cnn.layer.Layer]
    Connections: [74×2 table]
     InputNames: {'data'}
    OutputNames: {1×0 cell}

Analyze the layer graph lgraph2 by using analyzeNetwork. The layer graph analysis detects a missing output layer and an unconnected output. You can still export lgraph2 to the ONNX format.

analyzeNetwork(lgraph2)

Export the layer graph lgraph2 as an ONNX format file in the current folder called squeezeLayers2.onnx.

exportONNXNetwork(lgraph2,"squeezeLayers2.onnx")

Now, you can import the squeezeLayers2.onnx file into any deep learning framework that supports ONNX import.

Input Arguments

net

Trained network or graph of network layers, specified as a SeriesNetwork, DAGNetwork, dlnetwork, or LayerGraph object.

You can get a trained network (SeriesNetwork, DAGNetwork, or dlnetwork) in these ways:

  • Import a pretrained network. For example, use the googlenet function.

  • Train your own network. Use trainNetwork to train a SeriesNetwork or DAGNetwork. Use trainnet or a custom training loop to train a dlnetwork object.
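
For example, the following sketch trains a small classification dlnetwork with trainnet and then exports it. The layer sizes, the loss, and the output file name are arbitrary, and XTrain (an N-by-10 numeric array of features) and TTrain (categorical labels with three classes) are placeholder variables that you supply yourself.

% Define and train a small dlnetwork, then export it to ONNX.
layers = [
    featureInputLayer(10)
    fullyConnectedLayer(16)
    reluLayer
    fullyConnectedLayer(3)
    softmaxLayer];
net = dlnetwork(layers);

options = trainingOptions("adam",MaxEpochs=5,Verbose=false);
net = trainnet(XTrain,TTrain,net,"crossentropy",options);

exportONNXNetwork(net,"trainedDlnetwork.onnx")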

A LayerGraph object is a graph of network layers. Some of the layer parameters of this graph might be empty (for example, the weights and bias of convolution layers, and the mean and variance of batch normalization layers). Before using the layer graph as an input argument to exportONNXNetwork, initialize the empty parameters by assigning random values. Alternatively, you can do one of the following before exporting:

  • Convert a LayerGraph object to a dlnetwork object by using the layer graph as an input argument to dlnetwork. The empty parameters are automatically initialized (see the sketch after this list).

  • Convert a LayerGraph object to a trained DAGNetwork object by using trainNetwork. Use the layer graph as the layers input argument to trainNetwork.
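
For example, this minimal sketch follows the first option in the preceding list: it converts an untrained layer graph to a dlnetwork, which initializes the empty weights, and then exports the result. The layer sizes and file name are arbitrary.

% Build a layer graph whose convolution and fully connected weights are still empty.
layers = [
    imageInputLayer([28 28 1],Normalization="none")
    convolution2dLayer(3,8)
    reluLayer
    fullyConnectedLayer(10)
    softmaxLayer];
lgraph = layerGraph(layers);

% Converting to a dlnetwork initializes the empty learnable parameters.
dlnet = dlnetwork(lgraph);

exportONNXNetwork(dlnet,"initializedNet.onnx")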

You can detect errors and issues in a trained network or graph of network layers before exporting to an ONNX network by using analyzeNetwork. exportONNXNetwork requires SeriesNetwork, DAGNetwork, and dlnetwork objects to be error free. exportONNXNetwork permits exporting a LayerGraph object with a missing or unconnected output layer.

filename

Name of file, specified as a character vector or string scalar.

Example: "network.onnx"

Name-Value Arguments

Specify optional pairs of arguments as Name1=Value1,...,NameN=ValueN, where Name is the argument name and Value is the corresponding value. Name-value arguments must appear after other arguments, but the order of the pairs does not matter.

Example: exportONNXNetwork(net,filename,NetworkName="my_net") exports a network and specifies "my_net" as the network name in the saved ONNX network.
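
For example, you can combine several of the name-value arguments described below in a single call. In this sketch the file name, network name, and option values are arbitrary.

% Export with a custom network name, a specific operator set, and a fixed batch size.
exportONNXNetwork(net,"my_net.onnx",NetworkName="my_net",OpsetVersion=9,BatchSize=1)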

NetworkName

Name of ONNX network to store in the saved file, specified as a character vector or a string scalar.

Example: NetworkName="my_squeezenet"

OpsetVersion

Version of ONNX operator set to use in the exported model, specified as a positive integer in the range [6 14]. If the default operator set does not support the network you are trying to export, then try using a later version. If you import the exported network into another framework and you used an operator set during export that the importer does not support, then the import can fail.

To ensure that you use the appropriate operator set version, consult the ONNX operator documentation. For example, OpsetVersion=9 exports the maxUnpooling2dLayer to the MaxUnpool-9 ONNX operator.

Example: OpsetVersion=6
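
To illustrate the maxUnpooling2dLayer case mentioned above, the following sketch builds a small network that contains an unpooling layer and exports it with OpsetVersion=9. The layer names, sizes, and file name are arbitrary; note that the pooling indices and size outputs must be connected to the unpooling layer manually.

% A small network with max unpooling, exported with operator set 9 so that
% the unpooling layer maps to the MaxUnpool-9 ONNX operator.
layers = [
    imageInputLayer([28 28 1],Normalization="none",Name="in")
    convolution2dLayer(3,8,Padding="same",Name="conv")
    reluLayer(Name="relu")
    maxPooling2dLayer(2,Stride=2,HasUnpoolingOutputs=true,Name="pool")
    maxUnpooling2dLayer(Name="unpool")];
lgraph = layerGraph(layers);
lgraph = connectLayers(lgraph,"pool/indices","unpool/indices");
lgraph = connectLayers(lgraph,"pool/size","unpool/size");

net = dlnetwork(lgraph);
exportONNXNetwork(net,"unpoolNet.onnx",OpsetVersion=9)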

BatchSize

Batch size of the ONNX network, specified as [] or as a positive integer. If you specify BatchSize as [], the ONNX network has a dynamic batch size. If you specify BatchSize as a positive integer k, the ONNX network has a fixed batch size of k.

Example: BatchSize=10
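
For example, this sketch contrasts the two settings (the file names are arbitrary):

% Dynamic batch size: the exported model accepts batches of any size.
exportONNXNetwork(net,"squeezenet_dynamic.onnx",BatchSize=[])

% Fixed batch size: the exported model expects exactly 10 observations per batch.
exportONNXNetwork(net,"squeezenet_batch10.onnx",BatchSize=10)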

Limitations

  • exportONNXNetwork supports ONNX versions as follows:

    • The function supports ONNX intermediate representation version 7.

    • The function supports ONNX operator sets 6 to 14.

  • exportONNXNetwork does not export settings or properties related to network training such as training options, learning rate factors, or regularization factors.

  • If you export a network containing a layer that the ONNX format does not support (see Layers Supported for ONNX Export), then exportONNXNetwork saves a placeholder ONNX operator in place of the unsupported layer and returns a warning. You cannot import an ONNX network with a placeholder operator into other deep learning frameworks.

  • Because of architectural differences between MATLAB® and ONNX, an exported network can have a different structure compared to the original network.

Note

If you import an exported network, layers of the reimported network might differ from the original network and might not be supported.

More About

Layers Supported for ONNX Export

exportONNXNetwork can export the following:

Tips

  • You can export a trained MATLAB deep learning network that includes multiple inputs and multiple outputs to the ONNX model format. To learn about a multiple-input and multiple-output deep learning network, see Multiple-Input and Multiple-Output Networks.
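
For example, the following sketch builds a small untrained dlnetwork with two image inputs and a single output and exports it. The layer names and sizes are arbitrary placeholders.

% Two input branches combined by an addition layer, exported to ONNX.
lgraph = layerGraph([
    imageInputLayer([28 28 1],Normalization="none",Name="in1")
    convolution2dLayer(3,8,Name="conv1")]);
lgraph = addLayers(lgraph,[
    imageInputLayer([28 28 1],Normalization="none",Name="in2")
    convolution2dLayer(3,8,Name="conv2")]);
lgraph = addLayers(lgraph,additionLayer(2,Name="add"));
lgraph = connectLayers(lgraph,"conv1","add/in1");
lgraph = connectLayers(lgraph,"conv2","add/in2");

net = dlnetwork(lgraph);
exportONNXNetwork(net,"twoInputNet.onnx")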

References

[1] Open Neural Network Exchange. https://github.com/onnx/.

[2] ONNX. https://onnx.ai/.

Version History

Introduced in R2018a