AI PC Development
Advances in AI-focused hardware and software enable AI on the PC. Seamlessly transition projects from early AI development on the PC to cloud-based training and on to edge deployment. Learn what AI workloads require and what is available to get started today.
ONNX* (Open Neural Network Exchange) Model and ONNX Runtime
ONNX* is an open format for machine learning models, and ONNX Runtime is a cross-platform accelerator for machine learning inference and training. ONNX Runtime works with Intel® platforms and lets developers improve model performance while easily targeting multiple platforms. ONNX Runtime Execution Providers (EPs) enable specific hardware acceleration technologies to run AI models. Intel platforms have two optimized EPs: the OpenVINO™ Execution Provider and the DirectML EP.
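For example, here is a minimal sketch of running a model with ONNX Runtime from Python while requesting the OpenVINO Execution Provider. It assumes the onnxruntime-openvino package is installed and uses "model.onnx" as a placeholder for your own model file; provider availability depends on your hardware and drivers.

# Minimal sketch: request the Intel-optimized OpenVINO EP, with CPU fallback.
# "model.onnx" is a placeholder path, not a file shipped with this article.
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession(
    "model.onnx",
    providers=["OpenVINOExecutionProvider", "CPUExecutionProvider"],
)

# Build a dummy input matching the model's first input tensor,
# substituting 1 for any dynamic (symbolic) dimensions.
input_meta = session.get_inputs()[0]
shape = [d if isinstance(d, int) else 1 for d in input_meta.shape]
dummy = np.random.rand(*shape).astype(np.float32)

outputs = session.run(None, {input_meta.name: dummy})
print("Providers in use:", session.get_providers())
print("First output shape:", outputs[0].shape)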
Installation Guides
Use the following guides to get started with ONNX:
Accelerate AI on Windows* with Intel® NPUs
See how Microsoft* and Intel collaborate on NPU technology.
Support for Intel® AI Boost
Learn about the developer preview of DirectML with ONNX Runtime, which enables support for Intel® Core™ Ultra processors with Intel® AI Boost.
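As a companion sketch, selecting the DirectML EP from Python looks much the same. This assumes the onnxruntime-directml package is installed (typically with pip install onnxruntime-directml) on a supported Windows machine and again uses "model.onnx" as a placeholder model path.

# Minimal sketch: request the DirectML EP on Windows, with CPU fallback.
# "model.onnx" is a placeholder; EP availability depends on your setup.
import onnxruntime as ort

session = ort.InferenceSession(
    "model.onnx",
    providers=["DmlExecutionProvider", "CPUExecutionProvider"],
)
print("Providers in use:", session.get_providers())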
Sign Up for Exclusive News, Tips & Releases
Be among the first to learn about the latest development tools and resources for Intel® Core™ Ultra processors and your AI PC applications. Sign up now to get access to product updates and releases, exclusive invitations to webinars and events, valuable training and tutorial resources, exciting contest announcements, and other breaking news.