Visible to Intel only — GUID: hex1554215727082
Ixiasoft
2.2.2. DDR4 Connection to PCIe Host
The OpenCL™ Memory Bank Divider component sits in the datapath between host and FPGA memory. It accepts input from the DMA engine or PCIe BAR4 and outputs to the FPGA memory. It is mainly useful in OpenCL™ BSPs with multiple memory banks, where it presents the banks as a single, larger memory space. Appropriate clock crossing and pipelining are implemented based on the design floorplan and the clock domains specific to the computing card. The OpenCL Memory Bank Divider section in the Intel® FPGA SDK for OpenCL™ Custom Platform Toolkit User Guide specifies the connection details of the acl_bsp_snoop and acl_bsp_memorg_host ports.
The DDR4 IP core has one bank whose width and address configurations match those of the DDR4 SDRAM. Intel® tunes the remaining parameters, such as burst size, pending reads, and pipelining; these parameters can be customized for an end application or board design.
- When designing a multi-bank OpenCL BSP (two DDR banks), you must set the number of banks to 2 at the instantiation of the OpenCL Memory Bank Divider, and you must connect the acl_bsp_memorg_host wire to the kernel_interface block. The Memory Bank Divider can operate in two modes. By default, it uses the banks in an interleaved manner to obtain higher aggregate bandwidth. It can also operate in a non-interleaved mode, which presents a contiguous address space; this mode is selected during offline compilation with the -no-interleave flag. The Address Span Extender (memwindow) component in the host-to-DDR path transfers the unaligned portion of the data through BAR4 during DMA transfers or when DMA is disabled.
- Instruct the host to verify the successful calibration of the memory controller.
The INTELFPGAOCLSDKROOT/board/s10_ref/hardware/s10gx/board.qsys Platform Designer system uses a custom UniPHY Status to AVS IP component to aggregate different UniPHY status conduits into a single Avalon® agent port named s. This agent port connects to the pipe_stage_host_ctrl component so that the PCIe host can access it.