Summary
Supported frameworks for OpenVINO™ Toolkit Inference Engine API
Description
- Ran inference with a Caffe model directly on Intel® Neural Compute Stick 2 (Intel® NCS2).
- Unable to determine whether a Caffe model can be used directly with the OpenVINO™ Toolkit Inference Engine API without converting it to Intermediate Representation (IR).
Resolution
Caffe models cannot be used directly with the OpenVINO™ Toolkit Inference Engine API. A Caffe model must first be converted to Intermediate Representation (IR) using the Model Optimizer.
Currently, the model formats supported by the OpenVINO™ Toolkit Inference Engine API are:
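As a sketch of the conversion step, the Model Optimizer that ships with the OpenVINO™ Toolkit can convert a Caffe model to IR. The file names below (`model.caffemodel`, `deploy.prototxt`) are placeholders for your own model files, and the exact entry point (`mo` vs. `mo.py`) depends on the OpenVINO version installed:

```shell
# Convert a Caffe model to Intermediate Representation (IR).
# model.caffemodel (weights) and deploy.prototxt (topology) are placeholders.
mo --input_model model.caffemodel \
   --input_proto deploy.prototxt \
   --output_dir ir_model
# On success this writes an .xml (topology) and .bin (weights) pair into ir_model/,
# which the Inference Engine API can then load directly.
```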
- Intermediate Representation (IR)
- ONNX
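To illustrate the two supported formats, a minimal sketch using the Inference Engine Python API (`IECore`, available in OpenVINO 2020.x–2021.x) is shown below. The file names are placeholders, and the device name `MYRIAD` is assumed for Intel® NCS2:

```python
from openvino.inference_engine import IECore

ie = IECore()

# IR format: an .xml/.bin pair produced by the Model Optimizer.
net = ie.read_network(model="model.xml", weights="model.bin")

# ONNX format: a single .onnx file can be read directly instead:
# net = ie.read_network(model="model.onnx")

# Load the network onto Intel® NCS2 ("MYRIAD" is the plugin name for NCS2).
exec_net = ie.load_network(network=net, device_name="MYRIAD")
```

A Caffe `.caffemodel`/`.prototxt` pair passed to `read_network` would fail; that is why conversion to IR is required first.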