Intel® FPGA AI Suite: SoC Design Example User Guide

ID 768979
Date 12/01/2023
Public


3.6.2. Preparing a Model

A model must be converted from its original framework format (such as TensorFlow, Caffe, or PyTorch) into a pair of .bin and .xml files before the Intel® FPGA AI Suite compiler (dla_compiler command) can ingest it.
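The commands in this section use the Open Model Zoo tools to perform this conversion for ResNet-50. If you are converting your own trained model instead, you can typically invoke OpenVINO Model Optimizer directly. The following is a minimal sketch only; the input model file and output directory are placeholder examples, not files provided by the design example:
source ~/build-openvino-dev/openvino_env/bin/activate
# Placeholder paths: substitute your own frozen model and output location
mo --input_model /path/to/my_frozen_model.pb \
   --output_dir $COREDLA_WORK/demo/models/my_model/FP32/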

The following commands download the ResNet-50 TensorFlow model and run Model Optimizer:
source ~/build-openvino-dev/openvino_env/bin/activate
omz_downloader --name resnet-50-tf \
    --output_dir $COREDLA_WORK/demo/models/
omz_converter --name resnet-50-tf \
    --download_dir $COREDLA_WORK/demo/models/ \
    --output_dir $COREDLA_WORK/demo/models/

The omz_downloader command downloads the trained model to the $COREDLA_WORK/demo/models/ folder. The omz_converter command runs Model Optimizer, which converts the trained model into intermediate representation (.bin and .xml) files in the $COREDLA_WORK/demo/models/public/resnet-50-tf/FP32/ directory.
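You can confirm that the conversion succeeded by listing the output directory. The exact file names shown in the comment below are assumptions based on the model name and may differ slightly:
ls $COREDLA_WORK/demo/models/public/resnet-50-tf/FP32/
# Expected (assumed) contents:
# resnet-50-tf.bin  resnet-50-tf.xml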

The directory $COREDLA_WORK/demo/open_model_zoo/models/public/resnet-50-tf/ contains two useful files that do not appear in the $COREDLA_WORK/demo/models/ directory tree:
  • The README.md file describes background information about the model.
  • The model.yml file shows the detailed command-line information given to Model Optimizer (mo.py) when it converts the model to a pair of .bin and .xml files. You can inspect it as shown in the example after this list.
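For example, to review the Model Optimizer arguments recorded for this model, print the model.yml file (a simple inspection step; any text viewer works):
cat $COREDLA_WORK/demo/open_model_zoo/models/public/resnet-50-tf/model.yml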

For a list of OpenVINO™ Model Zoo models that the Intel® FPGA AI Suite supports, refer to the Intel® FPGA AI Suite IP Reference Manual.