Question
Rectified Linear Unit - What is the purpose of leaky ReLU?
Answer
Leaky ReLU is an activation function used to introduce nonlinearity into the layers of an artificial neural network. It was created to address the "dying ReLU" problem: with the standard ReLU function, any neuron whose pre-activation is negative outputs zero and receives zero gradient, so it can stop updating entirely during training. Leaky ReLU avoids this by assigning a small, nonzero slope to negative inputs, which keeps the gradient flowing and lets such neurons recover.
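The behavior described above can be sketched with a minimal NumPy implementation (the slope parameter, here called `alpha`, is a common but illustrative choice; 0.01 is a typical default):

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    # Standard ReLU would map every negative input to 0, zeroing the
    # gradient for those neurons. The small slope alpha keeps the
    # output (and gradient) nonzero for x < 0.
    return np.where(x > 0, x, alpha * x)

x = np.array([-2.0, -0.5, 0.0, 1.0, 3.0])
print(leaky_relu(x))  # negative inputs are scaled by alpha instead of zeroed
```

Positive inputs pass through unchanged, while negative inputs are multiplied by `alpha`, so the function is never completely flat and training signals can still reach "dead" neurons.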