Validated EfficientNet models that are supported by OpenVINO™.
- The model was generated by using this code (note that pooling must be passed as the string "max", not the Python builtin max):
model = tf.keras.applications.EfficientNetB0(
    include_top=True,
    weights=None,
    pooling="max",
    classes=2,
    classifier_activation="softmax"
)
- Converted the model to SavedModel format
- Ran the Model Optimizer command:
mo --saved_model_dir model
- Received errors:
[ ERROR ] Cannot infer shapes or values for node "StatefulPartitionedCall".
[ ERROR ] Error converting shape to a TensorShape: Failed to convert 'masked_array(data=[--, 224, 224, 3],
mask=[ True, False, False, False],
fill_value=-1000000007)' to a shape: 'masked' could not be converted to a dimension. A shape should either be single dimension (e.g. 10), or an iterable of dimensions (e.g. [1, 10, None]).
[ ERROR ]
[ ERROR ] It can happen due to bug in custom shape infer function <function tf_native_tf_node_infer at 0x7ff1ef133310>.
[ ERROR ] Or because the node inputs have incorrect values/shapes.
[ ERROR ] Or because input shapes are incorrect (embedded to the model or passed via --input_shape).
[ ERROR ] Run Model Optimizer with --log_level=DEBUG for more information.
[ ERROR ] Exception occurred during running replacer "REPLACEMENT_ID" (<class 'openvino.tools.mo.middle.PartialInfer.PartialInfer'>): Stopped shape/value propagation at "StatefulPartitionedCall" node.
For more information please refer to Model Optimizer FAQ, question #38. (https://docs.openvino.ai/latest/openvino_docs_MO_DG_prepare_model_Model_Optimizer_FAQ.html?question=38#question-38)
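For reference, the reproduction steps above can be sketched end-to-end as follows. This is a minimal sketch assuming TensorFlow 2.x; the --input_shape flag shown in the comment is taken from the error output's own hint about input shapes, since the exported SavedModel carries an undefined (masked) batch dimension:

```python
import tensorflow as tf

# Rebuild the model as in the original report, with pooling passed
# as the string "max" rather than the Python builtin max.
model = tf.keras.applications.EfficientNetB0(
    include_top=True,
    weights=None,
    pooling="max",
    classes=2,
    classifier_activation="softmax",
)

# Export to SavedModel format for Model Optimizer.
model.save("model")

# Model Optimizer is then invoked from the shell, e.g.:
#   mo --saved_model_dir model --input_shape [1,224,224,3]
# Passing an explicit --input_shape pins the batch dimension that the
# SavedModel leaves undefined (the masked '--' in the error above).
```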
The encountered errors occur because some layers in this custom model are not supported by the Model Optimizer.
The validated Intel Public Pre-Trained EfficientNet models from the Open Model Zoo are as follows: