exportONNXNetwork
Export network to ONNX model format
Description
exportONNXNetwork(net,filename) exports the deep learning network net with weights to the ONNX™ format file filename. If filename exists, then exportONNXNetwork overwrites the file.
This function requires the Deep Learning Toolbox™ Converter for ONNX Model Format support package. If this support package is not installed, then the function provides a download link.
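As an illustration of this syntax, the following sketch builds a small untrained network and exports it. The layer choices and file name are hypothetical, not from this page; any dlnetwork the function supports would work the same way.

```matlab
% Minimal sketch: build a small image classification network and
% export it to ONNX. The architecture and file name are illustrative.
layers = [
    imageInputLayer([28 28 1])
    convolution2dLayer(3,8,Padding="same")
    reluLayer
    fullyConnectedLayer(10)
    softmaxLayer];
net = dlnetwork(layers);

% Write the network, with its (here untrained) weights, to simpleNet.onnx.
% If simpleNet.onnx already exists, exportONNXNetwork overwrites it.
exportONNXNetwork(net,"simpleNet.onnx")
```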
exportONNXNetwork(net,filename,Name=Value) exports a network using additional options specified by one or more name-value arguments.
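For example, the OpsetVersion name-value argument selects which ONNX operator set the exported model targets; the network and value below are illustrative (see Limitations for the supported operator set range).

```matlab
% Illustrative sketch: export a small network targeting ONNX operator set 13.
% OpsetVersion must fall within the operator set range the function supports.
layers = [
    featureInputLayer(4)
    fullyConnectedLayer(3)
    softmaxLayer];
net = dlnetwork(layers);
exportONNXNetwork(net,"myNet.onnx",OpsetVersion=13)
```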
Examples
Input Arguments
Limitations
exportONNXNetwork supports ONNX versions as follows:
The function supports ONNX intermediate representation version 7.
The function supports ONNX operator sets 6 to 14.
exportONNXNetwork does not export settings or properties related to network training, such as training options, learning rate factors, or regularization factors.
If you export a network containing a layer that the ONNX format does not support (see Layers Supported for ONNX Export), then exportONNXNetwork saves a placeholder ONNX operator in place of the unsupported layer and returns a warning. You cannot import an ONNX network with a placeholder operator into other deep learning frameworks.
Because of architectural differences between MATLAB® and ONNX, an exported network can have a different structure compared to the original network.
Note
If you import an exported network, the layers of the reimported network might differ from those of the original network, and some layers might not be supported.
More About
Tips
You can export a trained MATLAB deep learning network that includes multiple inputs and multiple outputs to the ONNX model format. To learn about a multiple-input and multiple-output deep learning network, see Multiple-Input and Multiple-Output Networks.
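As a sketch of the multiple-output case, a dlnetwork whose layer graph ends in two unconnected branches exports both outputs. The layer names and architecture here are hypothetical.

```matlab
% Hypothetical sketch: one input, two outputs
% (a classification branch and a regression branch).
lgraph = layerGraph([
    imageInputLayer([28 28 1],Name="in")
    convolution2dLayer(3,16,Padding="same",Name="conv")
    reluLayer(Name="relu")]);
lgraph = addLayers(lgraph,[fullyConnectedLayer(10,Name="fc_class") softmaxLayer(Name="sm")]);
lgraph = addLayers(lgraph,fullyConnectedLayer(1,Name="fc_reg"));
lgraph = connectLayers(lgraph,"relu","fc_class");
lgraph = connectLayers(lgraph,"relu","fc_reg");

net = dlnetwork(lgraph);   % two network outputs: "sm" and "fc_reg"
exportONNXNetwork(net,"multiOutputNet.onnx")
```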
References
[1] Open Neural Network Exchange. https://github.com/onnx/.
[2] ONNX. https://onnx.ai/.
[3] ONNX Operators. https://github.com/onnx/onnx/blob/master/docs/Operators.md.
Version History
Introduced in R2018a