Question

What is a Rectified Linear Unit (ReLU)?

Answer

The Rectified Linear Unit, or ReLU, is the activation function most often used when building deep learning models. If the input is negative, the function returns 0; for any positive input x, it returns x unchanged. It can therefore be written as f(x) = max(0, x).
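The definition above can be sketched in a few lines. This is a minimal illustration using NumPy (an assumed choice here, since the source names no framework):

```python
import numpy as np

def relu(x):
    """Rectified Linear Unit: element-wise max(0, x)."""
    return np.maximum(0, x)

# Negative inputs map to 0; positive inputs pass through unchanged.
inputs = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(inputs))  # → [0.  0.  0.  1.5 3. ]
```

Deep learning frameworks ship their own ReLU (e.g. `torch.nn.ReLU` in PyTorch or `tf.nn.relu` in TensorFlow), so in practice you would use the built-in rather than writing your own.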