Question
Activation Function - What is the ELU activation function?
Answer
Neural networks also use the Exponential Linear Unit (ELU) as an activation function. For positive inputs, ELU is the identity, f(x) = x; for negative inputs, it returns α(eˣ − 1), which saturates smoothly at −α. Because ELUs can take on negative values, unlike ReLUs, they push mean unit activations closer to zero. This effect is comparable to batch normalization, but at a lower computational cost.
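The piecewise definition above can be sketched in a few lines of NumPy; this is a minimal illustration of the formula, not the implementation from any particular framework:

```python
import numpy as np

def elu(x, alpha=1.0):
    """ELU: identity for x > 0, alpha * (exp(x) - 1) for x <= 0."""
    x = np.asarray(x, dtype=float)
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

# Negative inputs are squashed toward -alpha instead of being cut to 0 (as ReLU does),
# so the mean activation over a batch sits closer to zero.
print(elu(np.array([-3.0, -1.0, 0.0, 1.0, 3.0])))
```

Note that the output is bounded below by −α, so even very negative inputs contribute a finite negative activation rather than the hard zero a ReLU would produce.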