Question

Rectified Linear Unit - What is the formula for the leaky Rectified Linear Unit?

Answer

The Leaky ReLU activation function is f(x) = max(0.01*x, x). For positive inputs it returns x unchanged; for negative inputs it returns 0.01*x, a small negative value. Unlike the standard ReLU, which outputs 0 for all negative inputs, Leaky ReLU therefore still produces a nonzero output (and a nonzero gradient) for negative inputs, which helps avoid "dead" neurons during training.
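As a minimal sketch, the formula can be written directly in Python (the function name and the `alpha` parameter for the 0.01 slope are illustrative choices, not from the original answer):

```python
def leaky_relu(x, alpha=0.01):
    # f(x) = max(alpha * x, x): returns x for positive inputs,
    # alpha * x (a small negative value) for negative inputs.
    # Valid for 0 < alpha < 1.
    return max(alpha * x, x)

print(leaky_relu(5.0))   # positive input: returns x itself
print(leaky_relu(-5.0))  # negative input: returns 0.01 * x
```

Deep learning libraries ship this as a built-in (e.g. `torch.nn.LeakyReLU` in PyTorch), usually with the negative slope exposed as a configurable parameter.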