Question
Rectified Linear Unit - What is a Rectified Linear Unit?
Answer
The Rectified Linear Unit, or ReLU, is the activation function most commonly used when building deep learning models. For a negative input, the function returns 0; for any non-negative input x, it returns x unchanged. It can therefore be written as f(x) = max(0, x).
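The definition above can be sketched in a few lines of Python; this is a minimal illustration using NumPy, not any particular framework's implementation.

```python
import numpy as np

def relu(x):
    # ReLU: returns 0 for negative inputs, x itself otherwise, i.e. f(x) = max(0, x)
    return np.maximum(0, x)

# Negative values are clipped to 0; non-negative values pass through unchanged
inputs = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
outputs = relu(inputs)
```

In practice, deep learning libraries ship a built-in ReLU (for example as a layer or activation option), but the underlying computation is exactly this element-wise maximum.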