Error: “Unsupported Layer Type FakeQuantize” When Inferencing INT8 Model with MYRIAD Plugin
Content Type: Error Messages | Article ID: 000058516 | Last Reviewed: 05/20/2022
Ran inference on the face-detection-adas-0001 (INT8) model downloaded from the Open Model Zoo, with the following results:
CPU plugin - successful
GPU plugin - successful
MYRIAD plugin - error: unsupported layer type FakeQuantize
During quantization to INT8, FakeQuantize layers are inserted on the activations and weights of most layers. The MYRIAD plugin does not support the FakeQuantize layer and therefore cannot run inference on INT8 models.
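To illustrate what such a layer does, below is a minimal pure-Python sketch of the quantize-dequantize arithmetic performed by a FakeQuantize operation (with levels=256 for INT8). The function name and the example ranges are illustrative, not taken from the OpenVINO API; the formula follows the general shape of the FakeQuantize operation: clip the input to its range, snap it to one of `levels` discrete steps, then rescale to the output range.

```python
def fake_quantize(x, in_low, in_high, out_low, out_high, levels=256):
    """Illustrative quantize-dequantize of a single scalar value.

    Values below the input range map to out_low, values above it to
    out_high; everything in between is rounded to one of `levels`
    evenly spaced steps and rescaled to the output range.
    """
    if x <= min(in_low, in_high):
        return out_low
    if x > max(in_low, in_high):
        return out_high
    # Snap to one of `levels` discrete steps within the input range.
    step = round((x - in_low) / (in_high - in_low) * (levels - 1))
    # Rescale the discrete step back to the output range.
    return step / (levels - 1) * (out_high - out_low) + out_low

# With a [0, 1] range and 256 levels, 0.4 lands exactly on a step
# (0.4 * 255 = 102), so it survives quantization unchanged.
print(fake_quantize(0.4, 0.0, 1.0, 0.0, 1.0))
# Out-of-range inputs are clipped to the output range boundaries.
print(fake_quantize(-5.0, 0.0, 1.0, 0.0, 1.0))
print(fake_quantize(5.0, 0.0, 1.0, 0.0, 1.0))
```

Because every quantized tensor can only take one of these 256 values, a plugin must implement this snapping arithmetic to execute the model; a plugin without such support (like MYRIAD here) rejects the layer at load time.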