Question
Activation Function - Why use ReLU instead of sigmoid?
Answer
ReLU outperforms the sigmoid function in two important respects. First, its gradient is exactly 1 for all positive inputs, whereas the sigmoid's gradient never exceeds 0.25 and approaches zero for large-magnitude inputs; this largely avoids the vanishing gradient problem in deep networks. Second, its gradient is trivial to compute (either 0 or 1), which makes backpropagation cheaper and training faster.
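A minimal NumPy sketch of the gradient comparison, for illustration only (the function names and sample inputs are chosen here, not taken from any particular library):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):
    s = sigmoid(x)
    return s * (1.0 - s)          # peaks at 0.25 and shrinks toward 0 for large |x|

def relu_grad(x):
    return (x > 0).astype(float)  # exactly 1 for positive inputs, 0 otherwise

x = np.array([-4.0, -1.0, 0.5, 2.0, 6.0])
print("sigmoid'(x):", sigmoid_grad(x))  # all values <= 0.25, nearly 0 at the extremes
print("relu'(x):   ", relu_grad(x))     # 0 or 1, so gradients do not shrink layer by layer
```

Multiplying many sigmoid gradients (each at most 0.25) across layers drives the backpropagated signal toward zero, while ReLU passes gradients through unchanged wherever the unit is active.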