
ONNX 1.3 is here with experimental function concept

2018-09-06 07:58:50


The Open Neural Network Exchange (ONNX) team released ONNX 1.3 last week. The latest release introduces an experimental function concept, along with other related improvements.

ONNX is an open ecosystem that lets artificial intelligence developers select the right set of tools as their project evolves. It provides an open-source format for deep learning models, so a model trained in one framework can easily be transferred to another.
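
As a rough illustration of that interoperability (a minimal sketch, not taken from the release notes; the model and file name are made up), a network defined in PyTorch can be exported to the ONNX format and then loaded and validated with the onnx package:

```python
import torch
import torch.nn as nn
import onnx

# A tiny PyTorch model standing in for a trained network.
model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))
model.eval()

# Export to ONNX; the dummy input fixes the graph's input shape.
dummy_input = torch.randn(1, 4)
torch.onnx.export(model, dummy_input, "tiny_model.onnx")

# The exported file can be consumed by any framework or runtime that
# understands ONNX; here we simply load it back and validate it.
onnx_model = onnx.load("tiny_model.onnx")
onnx.checker.check_model(onnx_model)
```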

Let’s explore the changes in ONNX 1.3.

ONNX 1.3 Key Updates


The control flow operators in Operator Set 8 have graduated from their experimental phase in ONNX 1.3. A new Expand operator has been added, the Max, Min, Mean, and Sum operators now support broadcasting, and the MaxPool operator adds support for output indices.
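
As a small, illustrative sketch (the graph below is made up for this example), the new Expand operator broadcasts an input tensor to a target shape supplied as a second input:

```python
import onnx
from onnx import helper, TensorProto

# One-node graph using the new Expand operator: broadcast a (3, 1) input
# to (3, 4), with the target shape supplied as a constant initializer.
target_shape = helper.make_tensor("shape", TensorProto.INT64, [2], [3, 4])
expand = helper.make_node("Expand", inputs=["x", "shape"], outputs=["y"])

graph = helper.make_graph(
    [expand],
    "expand_example",
    inputs=[helper.make_tensor_value_info("x", TensorProto.FLOAT, [3, 1])],
    outputs=[helper.make_tensor_value_info("y", TensorProto.FLOAT, [3, 4])],
    initializer=[target_shape],
)

# Target Operator Set 8, where Expand and the promoted control flow ops live.
model = helper.make_model(graph, opset_imports=[helper.make_opsetid("", 8)])
onnx.checker.check_model(model)
```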

An experimental function concept is introduced in ONNX 1.3 for representing composed operators. MeanVarianceNormalization uses this feature.
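
In practice, a function-backed operator is authored like any other node; backends without a native kernel can fall back to the function's decomposition into primitive operators such as ReduceMean, Sub, and Div. A minimal, hypothetical sketch:

```python
from onnx import helper, TensorProto

# MeanVarianceNormalization is defined as a function (a composition of
# primitive ops), but from the model author's side it is just another node.
mvn = helper.make_node("MeanVarianceNormalization", inputs=["x"], outputs=["y"])

graph = helper.make_graph(
    [mvn],
    "mvn_example",
    inputs=[helper.make_tensor_value_info("x", TensorProto.FLOAT, [1, 3, 8, 8])],
    outputs=[helper.make_tensor_value_info("y", TensorProto.FLOAT, [1, 3, 8, 8])],
)
model = helper.make_model(graph)
```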

Shape inference in ONNX 1.3 has been enhanced with added support for the Reshape operator with a constant new shape. More ONNX optimization passes are also available.
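
A small sketch of what this enables (the graph is illustrative): when the Reshape target shape is a constant initializer, onnx.shape_inference can propagate the reshaped dimensions through the graph:

```python
import onnx
from onnx import helper, shape_inference, TensorProto

# Reshape a (2, 6) tensor to (3, 4); because the new shape is a constant
# initializer, shape inference can propagate (3, 4) to the intermediate "r".
new_shape = helper.make_tensor("new_shape", TensorProto.INT64, [2], [3, 4])
reshape = helper.make_node("Reshape", inputs=["x", "new_shape"], outputs=["r"])
relu = helper.make_node("Relu", inputs=["r"], outputs=["y"])

graph = helper.make_graph(
    [reshape, relu],
    "reshape_example",
    inputs=[helper.make_tensor_value_info("x", TensorProto.FLOAT, [2, 6])],
    outputs=[helper.make_tensor_value_info("y", TensorProto.FLOAT, [3, 4])],
    initializer=[new_shape],
)
model = helper.make_model(graph)

inferred = shape_inference.infer_shapes(model)
print(inferred.graph.value_info)  # now includes "r" with shape (3, 4)
```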

In addition, more operator backend tests are now available, along with a newly added test coverage stats page. The opset version converter supports operators such as Add, Mul, Gemm, Relu, BatchNorm, Concat, Reshape, Sum, MaxPool, AveragePool, and Dropout.

With these operators, the converter covers all the models in the model zoo except tiny-yolo-v2.
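
The converter is exposed through the onnx Python package; a minimal sketch, assuming an existing model.onnx file that only uses the supported operators:

```python
import onnx
from onnx import version_converter

# Load an existing model and convert its default-domain opset to version 8.
# This works only if the model sticks to operators the converter supports
# (Add, Mul, Gemm, Relu, BatchNorm, Concat, Reshape, Sum, MaxPool,
# AveragePool, Dropout).
model = onnx.load("model.onnx")
converted = version_converter.convert_version(model, 8)
onnx.save(converted, "model_opset8.onnx")
```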

For more information, check out the official ONNX 1.3 release notes.

Amazon, Facebook and Microsoft announce the general availability of ONNX v0.1

ONNX for MXNet: Interoperability across deep learning models made easy

Baidu announces ClariNet, a neural network for text-to-speech synthesis
