2.5.2.2. Parameter Group: activation

This parameter group configures the activation module. The activation functions controlled by this group are common in deep learning; describing them is beyond the scope of this document.

Different activation functions can be enabled or disabled to suit the graph to be run. Disabling unnecessary activation functions can reduce area.
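
For illustration only, the following sketch shows how the switches in this group might appear in the activation section of an architecture description file. The nested, protobuf-style text layout is an assumption based on the example .arch files shipped with the FPGA AI Suite; verify the exact syntax against one of those files.

    activation {
        # Illustrative only: enable just the functions the target graph needs;
        # leaving the rest disabled reduces area.
        enable_relu: true
        enable_clamp: true
        enable_leaky_relu: false
        enable_prelu: false
        enable_round_clamp: false
        enable_sigmoid: false
        enable_tanh: false
    }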

Parameter: activation/enable_relu

This parameter enables or disables the Rectified Linear Unit (ReLU) activation function.

Legal values:
[true, false]
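
For reference, ReLU passes non-negative values through unchanged and maps negative values to zero. A minimal Python sketch:

    def relu(x):
        # ReLU: identity for x >= 0, zero otherwise.
        return max(x, 0.0)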

Parameter: activation/enable_leaky_relu

This parameter enables or disables the Leaky ReLU activation function. This activation function is a superset of the ReLU activation function.

Legal values:
[true, false]
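
Leaky ReLU multiplies negative inputs by a small constant slope instead of zeroing them; with a slope of zero it reduces to ReLU, which is the sense in which it is a superset. A minimal Python sketch (the slope value 0.01 is only an illustrative choice):

    def leaky_relu(x, negative_slope=0.01):
        # Leaky ReLU: scaled pass-through for negative inputs.
        # With negative_slope == 0.0 this is exactly ReLU.
        return x if x > 0.0 else negative_slope * x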

Parameter: activation/enable_prelu

This parameter enables or disables the Parametric ReLU activation function. This activation function is a superset of the Leaky ReLU activation function.

Legal values:
[true, false]
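
PReLU has the same form as Leaky ReLU, except that the negative slope is a parameter supplied by the graph (typically one value per channel) rather than a fixed constant; if every channel shares one fixed slope, it reduces to Leaky ReLU. A minimal per-channel Python sketch:

    def prelu(values, slopes):
        # PReLU: one learned negative slope per channel, taken from the model.
        # A single shared, fixed slope makes this identical to Leaky ReLU.
        return [v if v > 0.0 else a * v for v, a in zip(values, slopes)]

For example, prelu([1.0, -2.0], [0.25, 0.1]) returns [1.0, -0.2].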

Parameter: activation/enable_clamp

This parameter enables or disables the clamp function. Enabling the clamp function also enables a ReLU6 activation function.

Legal values:
[true, false]
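
Clamp restricts a value to a configured [low, high] interval; ReLU6 is the special case with bounds 0 and 6, which is why enabling clamp also provides ReLU6. A minimal Python sketch:

    def clamp(x, low, high):
        # Clamp: restrict x to the closed interval [low, high].
        return min(max(x, low), high)

    def relu6(x):
        # ReLU6 is clamp with a lower bound of 0 and an upper bound of 6.
        return clamp(x, 0.0, 6.0)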

Parameter: activation/enable_round_clamp

This parameter enables or disables the round_clamp function. Enabling the round_clamp function also enables ReLU.

If both enable_clamp and enable_round_clamp are set, enable_round_clamp takes priority over enable_clamp when implementing ReLU.

Legal values:
[true, false]

Parameter: activation/enable_sigmoid

This parameter enables or disables the Sigmoid and Swish activation functions.

As a side-effect, enabling these activation functions also enables the Tanh and Reciprocal activation functions. This side-effect might change in a future release, so the best practice is to enable activation functions explicitly instead of depending on the side-effect.

Legal values:
[true, false]
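
Swish is the input multiplied by its own Sigmoid, which is presumably why the two functions share one switch. For reference, in Python:

    import math

    def sigmoid(x):
        # Logistic sigmoid: maps any real x into the interval (0, 1).
        return 1.0 / (1.0 + math.exp(-x))

    def swish(x):
        # Swish (SiLU with beta = 1): x scaled by its own sigmoid.
        return x * sigmoid(x)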

Parameter: activation/enable_tanh

This parameter enables or disables the Tanh activation function.

As a side-effect, enabling this activation function also enables the Sigmoid, Swish, and Reciprocal activation functions. This side-effect might change in a future release, so the best practice is to enable activation functions explicitly instead of depending on the side-effect.

Legal values:
[true, false]
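
For reference, Tanh can be written in terms of Sigmoid as tanh(x) = 2 * sigmoid(2x) - 1, which is consistent with the cross-enabling behavior noted above. A minimal Python sketch:

    import math

    def tanh(x):
        # Hyperbolic tangent: maps any real x into the interval (-1, 1).
        return math.tanh(x)

    def tanh_via_sigmoid(x):
        # Equivalent form: tanh(x) = 2 * sigmoid(2 * x) - 1.
        return 2.0 / (1.0 + math.exp(-2.0 * x)) - 1.0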