What is a Rectified Linear Unit?


When building deep learning models, the most commonly used activation function is the Rectified Linear Unit, or ReLU. If the input is negative, the function returns 0; for any positive input x, it returns x unchanged. It can therefore be written as f(x) = max(0, x).
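The definition f(x) = max(0, x) can be sketched in a couple of lines of plain Python (the function name `relu` is chosen here for illustration):

```python
def relu(x):
    # Return 0 for negative inputs; pass positive inputs through unchanged.
    return max(0.0, x)

print(relu(-3.2))  # 0.0 — negative input is clamped to zero
print(relu(5.0))   # 5.0 — positive input is returned as-is
```

In practice, deep learning frameworks apply the same element-wise rule across whole tensors rather than single numbers.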