Question
Activation Function - Why use ReLU instead of sigmoid?
Answer
ReLU excels over sigmoid in two important respects. First, it mitigates the vanishing gradient problem: ReLU's gradient is exactly 1 for all positive inputs, whereas sigmoid's gradient never exceeds 0.25 and shrinks toward zero for large-magnitude inputs, so gradients multiplied through many sigmoid layers quickly vanish. Second, ReLU is cheaper to compute, both forward (a simple max with zero) and backward (a gradient of 0 or 1, with no exponentials), which makes backpropagation faster and training more efficient in deep networks.
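The gradient comparison can be made concrete with a small sketch (the function names here are illustrative, not from any particular library): sigmoid's derivative peaks at 0.25 and decays rapidly away from zero, while ReLU's derivative stays at 1 for any positive input.

```python
import math

def sigmoid(x):
    """Logistic sigmoid: 1 / (1 + e^-x)."""
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_grad(x):
    """Derivative of sigmoid: s * (1 - s); maximum value is 0.25 at x = 0."""
    s = sigmoid(x)
    return s * (1.0 - s)

def relu_grad(x):
    """Derivative of ReLU: 1 for positive inputs, 0 otherwise."""
    return 1.0 if x > 0 else 0.0

# Sigmoid's gradient collapses as |x| grows; ReLU's stays at 1.
for x in [0.5, 2.0, 5.0, 10.0]:
    print(f"x={x:5.1f}  sigmoid'={sigmoid_grad(x):.6f}  relu'={relu_grad(x):.1f}")

# Chained through n layers, the sigmoid gradient shrinks at least
# as fast as 0.25**n, which is the vanishing gradient problem.
print("10-layer worst-case sigmoid factor:", 0.25 ** 10)
```

Running this shows sigmoid's gradient already below 0.007 at x = 5, while ReLU's remains 1.0, illustrating why deep stacks of sigmoids struggle to propagate error signals.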