OpenVINO™ inference time increases when running multiple processes
Content Type: Product Information & Documentation | Article ID: 000058227 | Last Reviewed: 06/06/2023
Inference time roughly doubles when two processes run inference on the same model at the same time on the CPU. When the CPU plugin binds inference threads to cores (the CPU_BIND_THREAD setting), both processes pin their threads to the same cores and compete for them. To avoid this, disable thread binding for the CPU plugin:
ie.SetConfig({ { CONFIG_KEY(CPU_BIND_THREAD), "NO" } }, "CPU");
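For context, here is a minimal sketch of how this call might fit into an application built on the classic InferenceEngine::Core API (OpenVINO 2021.x and earlier); the model path "model.xml" and the surrounding setup are placeholders, not part of the original article:

#include <inference_engine.hpp>   // classic Inference Engine API umbrella header
#include <ie_plugin_config.hpp>   // CONFIG_KEY macro and plugin configuration keys

int main() {
    InferenceEngine::Core ie;

    // Disable thread-to-core binding for the CPU plugin so that several
    // processes inferring at the same time do not pin their inference
    // threads to the same physical cores.
    ie.SetConfig({ { CONFIG_KEY(CPU_BIND_THREAD), "NO" } }, "CPU");

    // "model.xml" is a placeholder path; read, compile, and run the model as usual.
    auto network = ie.ReadNetwork("model.xml");
    auto exec_network = ie.LoadNetwork(network, "CPU");
    auto infer_request = exec_network.CreateInferRequest();
    infer_request.Infer();

    return 0;
}

In newer OpenVINO releases (2022.1 and later) the equivalent control is exposed through ov::Core::set_property (for example, the ov::affinity property); check the documentation for the release you are using for the exact property name.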
Refer to Supported Configuration Parameters for more information on the configuration parameters supported by the CPU plugin.