Executive Summary
Recent advances in AI research using camera images have spurred rapid development of AI solutions by device manufacturers. These applications demand significant computational power for AI inference, which is typically offloaded to external accelerators such as discrete GPU cards. That approach introduces several issues: size constraints, higher costs, increased power consumption, and concerns about long-term availability.
This is where the OpenVINO™ toolkit comes in—a solution for accelerating AI inference at the edge. The OpenVINO toolkit is a developer toolset provided by Intel at no cost that optimizes inference performance on Intel CPUs, integrated GPUs (iGPUs), and NPUs.
With the OpenVINO toolkit, high-speed AI inference becomes possible on both standard laptops and industrial PCs, without dedicated accelerator hardware.