Get Started with Intel® Extension for PyTorch* on a GPU
Overview
Learn how to get started running PyTorch inference on an Intel® Data Center GPU Flex Series using Intel® Extension for PyTorch*. See how this extension brings the latest and greatest features for Intel hardware to open source PyTorch.
Intel contributes optimizations and features to open source PyTorch, and these new capabilities often debut in Intel Extension for PyTorch. Learn how to get started with this extension and how to use it to run inference on an Intel Data Center GPU Flex Series. The workflow is illustrated with a Transformers model, based on one of the published examples in the extension's documentation.
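For orientation, the sketch below shows what such an inference workflow can look like, assuming Intel Extension for PyTorch with GPU support and the Hugging Face Transformers library are installed. The bert-base-uncased checkpoint is a stand-in, since the specific model used in the example is not named here.

import torch
import intel_extension_for_pytorch as ipex  # registers the "xpu" device
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Stand-in model; the documentation example may use a different checkpoint
model_name = "bert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)
model.eval()

# Move the model to the Intel GPU and apply the extension's optimizations
model = model.to("xpu")
model = ipex.optimize(model, dtype=torch.float16)

# Tokenize a sample input and move the tensors to the same device
inputs = tokenizer(
    "Intel Extension for PyTorch runs on Flex Series GPUs.",
    return_tensors="pt",
).to("xpu")

# Run inference with mixed precision on the GPU
with torch.no_grad(), torch.xpu.amp.autocast(enabled=True, dtype=torch.float16):
    outputs = model(**inputs)

print(outputs.logits.to("cpu"))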
The video shows where to find more information and how to download this free extension for GPU support. For a direct link, see Intel® Optimization for PyTorch*. To explore other AI tools and frameworks from Intel, see AI and Machine Learning.
Featured Software
Get Intel Extension for PyTorch as part of the Intel® AI Analytics Toolkit.
Resources
Accelerate data science and AI pipelines, from preprocessing through machine learning, and provide interoperability for efficient model development.