Intel® FPGA AI Suite: SoC Design Example User Guide

ID 768979
Date 2/12/2024
Public

6.2.2. RAM Considerations

An FPGA-based external memory interface is used to store all machine learning input, output, and intermediate data.

The Intel® FPGA AI Suite IP uses this DDR memory extensively during inference.

Typically, you dedicate a DDR memory to the Intel® FPGA AI Suite IP rather than sharing the host CPU DDR memory. Although a design can use the host memory, other services that access the shared DDR memory degrade Intel® FPGA AI Suite IP performance and increase non-determinism in inference durations. Consider this impact when you choose to use a shared DDR resource.

The Intel® FPGA AI Suite IP requires a depth of memory that prohibits storing its data entirely in on-chip RAM blocks such as M20Ks. Consider the RAM/DDR memory footprint when you design with the Intel® FPGA AI Suite IP.