AI Tools Selector Guide

ID 785444
Updated 10/9/2024
Version 2024.2.0
Public

Use the AI Tools Selector for a seamless installation of the Intel AI Tools, tailored to your preferred distribution channel and package selection. You can choose from the following options:

  • Presets
  • Custom selection of packages
  • Offline installer

Before proceeding with the installation of AI Tools, make sure your system meets the necessary System Requirements.

Set Up System Before Installation

Before installing AI tools via conda or pip, you need to set up your system as described below.

conda

Install conda

Set up conda with Miniforge:

  1. Download the appropriate Miniforge installer for Linux* OS:
    wget -q https://github.com/conda-forge/miniforge/releases/latest/download/Miniforge3-Linux-x86_64.sh
  2. In your terminal, run:
    sh Miniforge3-Linux-x86_64.sh
  3. Delete the downloaded file:
    rm Miniforge3-Linux-x86_64.sh
  4. (Optional) To speed up execution, use libmamba as the solver. It is the default in the latest conda distribution (Miniforge); for older conda installations, run these commands to update conda and set libmamba:
    conda update -n base conda && \
    conda config --set solver libmamba

    To verify that libmamba is set as the solver:

    conda config --show solver

To learn more about Miniforge installation, see the Miniforge Repository.

Create Environment

Create and activate a virtual environment:

conda create -n example_env -y
conda activate example_env

pip

Install Intel® oneAPI Base Toolkit (Ubuntu 22.04)

  1. Download the Intel® oneAPI Base Toolkit from the official Intel® repository:
    wget https://registrationcenter-download.intel.com/akdlm/IRC_NAS/e6ff8e9c-ee28-47fb-abd7-5c524c983e1c/l_BaseKit_p_2024.2.1.100_offline.sh
  2. Install the Intel® oneAPI Base Toolkit using the bash script:
    sh ./l_BaseKit_p_2024.2.1.100_offline.sh -a --silent --eula accept
  3. Depending on whether you want to use the Intel® Extension for PyTorch* or Intel® Extension for TensorFlow*, set the environment variables with one of these scripts:

    Intel® Extension for TensorFlow*

    source /opt/intel/oneapi/compiler/latest/env/vars.sh  
    source /opt/intel/oneapi/mkl/latest/env/vars.sh  
    source /opt/intel/oneapi/ccl/latest/env/vars.sh  
    source /opt/intel/oneapi/mpi/latest/env/vars.sh

    Intel® Extension for PyTorch*

    source /opt/intel/oneapi/compiler/latest/env/vars.sh  
    source /opt/intel/oneapi/mkl/latest/env/vars.sh  

Install Python (Ubuntu 22.04)

  1. Install Python from apt (if you are the root user, omit sudo):
    sudo apt-get update && sudo apt install python3.10
  2. Verify Python is installed:
    which python3
  3. Install venv to create a virtual environment:
    sudo apt install python3.10-venv
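
After these steps, you can confirm that both the interpreter and the venv module are usable (a quick sanity check; the exact patch version reported depends on your package mirror):

```shell
# Print the installed interpreter version and confirm the venv module loads
python3 --version
python3 -c 'import venv; print("venv module available")'
```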

Create Environment

Create and activate a virtual environment:

python3.10 -m venv example_env
source example_env/bin/activate
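
To confirm the environment is active, check that the interpreter's prefix points into the environment directory (a sketch using the example_env name from above):

```shell
# While example_env is active, sys.prefix should point at the environment
# directory rather than the system-wide Python installation
python3 -c 'import sys; print(sys.prefix)'
```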

Install GPU drivers (Ubuntu 22.04)

If you use an Intel GPU, you need to install the GPU drivers separately as described in the Ubuntu 22.04 section of the Intel® software for general purpose GPU capabilities document. Make sure you follow the instructions for the LTS version.
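
As a quick sanity check after the driver installation, you can confirm that the DRM device nodes exist (a sketch; the card*/renderD* numbering varies by system):

```shell
# An installed GPU driver typically exposes card*/renderD* nodes here;
# the fallback message is printed when no DRM devices are present
ls /dev/dri 2>/dev/null || echo "no DRM devices found"
```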

Install AI Tools

Presets

The presets represent combinations of AI tools for easy installation and use. The following presets are available for integration into your workflow:

  • Data Analytics: Uncover valuable insights about your business and customers using libraries and tools optimized for Intel architectures. Make informed, data-driven decisions with enhanced performance.
  • Classical ML: Accelerate your Machine Learning and Data Science pipelines with the power of open libraries optimized for Intel architectures. Enhance the efficiency and speed of your ML tasks.
  • Deep Learning: Boost the performance of your single node and distributed Deep Learning workloads on Intel hardware with Intel's optimizations for TensorFlow and PyTorch.
  • Inference Optimization: Reduce model size and improve the speed of your deep learning inference deployments on Intel hardware.

Install a Preset

To use one of the preset packages, follow these steps:

  1. Go to the AI Tools Selector and click the preferred preset package name from the left sidebar. The tools included in the package are displayed as checked options while all other tools are dimmed automatically and not available for selection.
  2. Choose the version of Intel® Optimized Python for installation via conda or a Docker container.
  3. Choose either conda, pip, or Docker as your preferred distribution type.
  4. Make sure you set up the environment as described on the right panel of the selector under the installation command.
  5. Copy and execute the provided installation command in your Terminal.

Click the Customize button to switch to a custom selection of packages.

Accessing Docker Hub

By accessing, downloading, or using this software and any required dependent software (the "Software Package"), you agree to the terms and conditions of the software license agreements for the Software Package, which may also include notices, disclaimers, or license terms for third-party software included with the Software Package. Refer to the licensing information for additional details.

Custom Selection of Packages

For more personalized package selection, you can install individual AI tools using conda or pip:

  1. Go to the AI Tools Selector and click the Customize button on the left sidebar.
  2. Select the appropriate Intel® Optimized Python version. Note that this selection is available only for installation via conda and will be automatically dimmed for pip.
  3. Choose either conda or pip as your preferred package type.
  4. Click the checkboxes of the required tools, frameworks, SDKs, and/or CLIs. Note that certain combinations may be unavailable and will be automatically dimmed.
  5. Make sure you set up the environment as described in Set Up System Before Installation.
  6. Copy and execute the provided installation command in your Terminal.

Offline Installer

The AI Tools offline installer is a set of pre-built conda environments with the Intel optimization packages for Python. It contains the following environments:

Environment     Main Component
base            Intel® Distribution for Python
modin           Modin*
pytorch         Intel® Extension for PyTorch*
pytorch-gpu     Intel® Extension for PyTorch*
tensorflow      Intel® Extension for TensorFlow*
tensorflow-gpu  Intel® Extension for TensorFlow*

NOTE

To use GPU environments, make sure the corresponding GPU drivers are installed as described in Install GPU Drivers.

To install the AI tools via the offline installer:

  1. Go to the AI Tools Selector and click the Offline installer button on the left sidebar.
  2. Follow the download and installation instructions on the right-hand side of the selector.

After installation, configure the system as described in Get Started with the AI Tools.

Verify Installation

Use the commands below to verify that the AI tools are properly installed:

  • Intel® Extension for PyTorch* (CPU):
    python -c "import torch; import intel_extension_for_pytorch as ipex; print(torch.__version__); print(ipex.__version__);"
  • Intel® Extension for PyTorch* (GPU):
    python -c "import torch; import intel_extension_for_pytorch as ipex; print(torch.__version__); print(ipex.__version__); [print(f'[{i}]: {torch.xpu.get_device_properties(i)}') for i in range(torch.xpu.device_count())];"
  • Intel® Extension for TensorFlow* (CPU):
    python -c "import intel_extension_for_tensorflow as itex; print(itex.__version__)"
  • Intel® Extension for TensorFlow* (GPU):
    python -c "from tensorflow.python.client import device_lib; print(device_lib.list_local_devices())"
  • Intel® Optimization for XGBoost*:
    python -c "import xgboost as xgb; print(xgb.__version__)"
  • Intel® Extension for Scikit-learn*:
    python -c "from sklearnex import patch_sklearn; patch_sklearn()"
  • Modin*:
    python -c "import modin; print(modin.__version__)"
  • Intel® Neural Compressor:
    python -c "import neural_compressor as inc; print(inc.__version__)"
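
If you installed several tools at once, a small loop can report which of them are importable in the current environment (a sketch; the module names mirror the commands above, and missing packages are reported rather than raising an error):

```shell
# Report import status for each optimization package; prints "OK" or
# "missing" per module instead of stopping at the first failure
for pkg in intel_extension_for_pytorch intel_extension_for_tensorflow \
           sklearnex modin neural_compressor xgboost; do
  if python3 -c "import $pkg" >/dev/null 2>&1; then
    echo "$pkg: OK"
  else
    echo "$pkg: missing"
  fi
done
```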

Next Steps

After successful installation, refer to Get Started with the AI Tools to start using the installed product(s).

Known Issues

2024.2

  • When installing packages via conda, you may get a "ClobberError" or "SafetyError", which occurs when conda detects that a file or directory will be overwritten. This error does not impact the functionality of the installed packages and can be safely ignored.

2024.1

  • For conda installations using version 23.9 or lower, you may get a JSON error:
    "json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)"

    This error occurs because, prior to conda version 23.10.0, the default solver was classic instead of conda-libmamba. To resolve this issue, update conda using one of the following commands:

    conda update conda -c conda-forge --override-channels

    or

    conda update conda -c conda-forge --update-deps --override-channels