Article ID: 000058526 Content Type: Product Information & Documentation Last Reviewed: 08/03/2022

Is It Possible to Convert from Intermediate Representation (IR) Format Back to Open Neural Network Exchange* (ONNX*), or Other, File Format Using Model Optimizer?

Summary

Optimization of neural network models for execution on Inference Engine.

Description

An ONNX* model can be converted into the IR format using the Model Optimizer tool, but it is unclear whether a model in IR format can be converted back to the ONNX* file format.
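As a sketch of the supported direction only, the ONNX*-to-IR conversion can be run from the command line with the `mo` entry point available in recent OpenVINO™ releases; the model filename and output directory below are placeholders:

```shell
# Convert an ONNX model to OpenVINO IR with Model Optimizer.
# "model.onnx" and "ir_output/" are placeholder names.
mo --input_model model.onnx --output_dir ir_output/
```

This produces a pair of IR files (an .xml topology file and a .bin weights file); no analogous command exists to turn those files back into an .onnx file.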

Resolution

The OpenVINO™ workflow does not support converting from the IR format back to ONNX* or any other file format. Model Optimizer loads a model into memory, reads it, builds an internal representation of the model, optimizes it, and produces the IR format, which is the only format the Inference Engine accepts.
