The Open Neural Network Exchange (ONNX) team released ONNX 1.3 last week. The release introduces an experimental function concept, along with several other improvements.

ONNX is an open ecosystem that lets Artificial Intelligence developers choose the right set of tools as their project evolves. It provides an open-source format for deep learning models, so a model trained in one framework can easily be transferred to another.

Let’s explore the changes in ONNX 1.3.

ONNX 1.3 Key Updates

The control flow operators in operator set 8 of ONNX 1.3 have graduated from the experimental phase. A new Expand operator has been added, and the Max, Min, Mean, and Sum operators have been updated to support multidirectional (numpy-style) broadcasting. In addition, the MaxPool operator now supports output indices.
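The broadcasting involved here is multidirectional, numpy-style broadcasting, so its behavior can be previewed directly with NumPy. A minimal sketch (the arrays and shapes are purely illustrative):

```python
import numpy as np

# Multidirectional broadcasting aligns shapes from the right; size-1 (or
# missing) axes are stretched to match, as in numpy.
a = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])   # shape (2, 3)
b = np.array([10.0, 20.0, 30.0])  # shape (3,), broadcast across both rows

m = np.maximum(a, b)              # what a broadcasting Max computes
s = a + b                         # element-wise Sum with broadcasting

# The new Expand operator behaves like numpy's broadcast_to: it stretches a
# tensor to a given target shape according to the same broadcasting rules.
e = np.broadcast_to(np.array([[1.0], [2.0]]), (2, 3))
```

With broadcasting, a `(3,)` bias can combine with a `(2, 3)` batch without being tiled explicitly in the graph.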

ONNX 1.3 introduces an experimental function concept for representing composed operators; the MeanVarianceNormalization operator uses this feature.
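A function in this sense is an operator defined as a composition of primitive operators rather than as a new built-in. As a rough illustration only (not the normative ONNX definition; the axes and epsilon defaults below are assumptions), MeanVarianceNormalization can be written in terms of reduce-mean, subtraction, square root, and division:

```python
import numpy as np

def mean_variance_normalization(x, axes=(0, 2, 3), eps=1e-9):
    """Sketch of MVN as a composition of primitive ops
    (ReduceMean, Sub, Mul, Sqrt, Div). Axes and eps are
    illustrative defaults, not the ONNX spec."""
    mean = x.mean(axis=axes, keepdims=True)
    var = ((x - mean) ** 2).mean(axis=axes, keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

# Normalize a small NCHW batch per channel.
x = np.random.randn(2, 3, 4, 4).astype(np.float32)
y = mean_variance_normalization(x)
```

Expressing composed operators this way means a backend that only implements the primitives can still run the composite.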

Shape inference in ONNX 1.3 has been enhanced with support for the Reshape operator when the new shape is a constant, and more ONNX optimization passes are now available.
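To see what constant-shape Reshape inference involves, here is a pure-Python sketch. It follows the ONNX Reshape convention that a 0 in the target shape copies the corresponding input dimension and a single -1 is inferred from the remaining element count (the helper name itself is my own, not an ONNX API):

```python
def infer_reshape_shape(input_shape, new_shape):
    # Sketch of static shape inference for Reshape with a constant target
    # shape: 0 copies the input dim at that position, and at most one -1
    # is filled in so the total element count is preserved.
    out = []
    for i, d in enumerate(new_shape):
        out.append(input_shape[i] if d == 0 else d)

    total = 1
    for d in input_shape:
        total *= d

    if -1 in out:
        known = 1
        for d in out:
            if d != -1:
                known *= d
        out[out.index(-1)] = total // known
    return out
```

For example, `infer_reshape_shape([2, 3, 4], [0, -1])` resolves to `[2, 12]`, letting downstream operators know their input shape at export time.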

In addition, more operator backend tests are now available, along with a newly added test coverage stats page. The opset version converter now supports operators such as Add, Mul, Gemm, Relu, BatchNorm, Concat, Reshape, Sum, MaxPool, AveragePool, and Dropout.

All the models in the model zoo have been covered, except tiny-yolo-v2.

For more information, check out the official ONNX 1.3 release notes.

Read Next

Amazon, Facebook and Microsoft announce the general availability of ONNX v0.1

ONNX for MXNet: Interoperability across deep learning models made easy

Baidu announces ClariNet, a neural network for text-to-speech synthesis

Tech writer at the Packt Hub. Dreamer, book nerd, lover of scented candles, karaoke, and Gilmore Girls.