Intel® FPGA AI Suite: IP Reference Manual

ID 768974
Date 7/03/2023
Public



2.4.2.2. Parameter group: activation

This parameter group configures the activation module. The activation functions listed here are common in deep learning; describing them in detail is beyond the scope of this document.

Different activation functions can be enabled or disabled to suit the graph to be run. Disabling unnecessary activation functions can reduce area.
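For illustration, the group might appear in an architecture description file as a block of boolean fields like the sketch below. This is only a hedged sketch of the shape of the group; consult the architecture file schema shipped with the IP for the exact syntax and defaults.

```
activation {
    enable_relu: true
    enable_leaky_relu: false
    enable_prelu: false
    enable_clamp: true
    enable_round_clamp: false
}
```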

Parameter: activation/enable_relu

This parameter enables or disables the Rectified Linear Unit (ReLU) activation function.

Legal values:
[true, false]

Parameter: activation/enable_leaky_relu

This parameter enables or disables the Leaky ReLU activation function. This activation function is a superset of the ReLU activation function.

Legal values:
[true, false]

Parameter: activation/enable_prelu

This parameter enables or disables the Parametric ReLU activation function. This activation function is a superset of the Leaky ReLU activation function.

Legal values:
[true, false]
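The superset relationship among ReLU, Leaky ReLU, and Parametric ReLU can be sketched in plain Python. This is an illustrative scalar-form sketch, not the hardware implementation; in particular, the PReLU alpha is shown as a plain argument, whereas in a graph it is a learned per-channel parameter.

```python
def relu(x):
    # ReLU: max(0, x).
    return max(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Leaky ReLU: x for x >= 0, alpha * x otherwise.
    # With alpha = 0 this reduces to ReLU, so Leaky ReLU
    # is a superset of ReLU.
    return x if x >= 0 else alpha * x

def prelu(x, alpha):
    # Parametric ReLU: same form as Leaky ReLU, but alpha is a
    # learned per-channel parameter rather than a fixed constant,
    # so PReLU is a superset of Leaky ReLU.
    return x if x >= 0 else alpha * x
```

This is why, for example, a module with only PReLU enabled can still realize ReLU by fixing alpha to zero.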

Parameter: activation/enable_clamp

This parameter enables or disables the clamp function. Enabling the clamp function also enables a ReLU6 activation function.

Legal values:
[true, false]
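The connection between clamp and ReLU6 can be sketched as follows (an illustrative scalar-form sketch): ReLU6 is simply a clamp with fixed bounds of 0 and 6, which is why enabling the clamp function also provides ReLU6.

```python
def clamp(x, lo, hi):
    # Clamp x to the closed interval [lo, hi].
    return max(lo, min(hi, x))

def relu6(x):
    # ReLU6 is a clamp with fixed bounds [0, 6].
    return clamp(x, 0.0, 6.0)
```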

Parameter: activation/enable_round_clamp

This parameter enables or disables the round_clamp function. Enabling the round_clamp function also enables ReLU.

If both enable_clamp and enable_round_clamp are set, enable_round_clamp takes priority over enable_clamp when implementing ReLU.

Legal values:
[true, false]