Question

Activation Function - Why use ReLU instead of sigmoid?

Answer

ReLU has two key advantages over the sigmoid function. First, it mitigates the vanishing gradient problem: its gradient is a constant 1 for all positive inputs, whereas the sigmoid's gradient peaks at 0.25 and approaches 0 for large-magnitude inputs, so gradients shrink multiplicatively as they propagate back through many layers. Second, ReLU makes backpropagation cheaper and faster: both the function, max(0, x), and its derivative are simple thresholds, while the sigmoid requires evaluating exponentials, so training with ReLU is typically more efficient.
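
To make the gradient comparison concrete, here is a minimal NumPy sketch (an illustrative addition, not part of the original answer). It prints both activation gradients at a few inputs, then shows how a gradient shrinks after passing through 20 layers, assuming each layer contributes its activation gradient at the sigmoid's peak (x = 0), which is the best case for sigmoid:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):
    s = sigmoid(x)
    return s * (1.0 - s)  # peaks at 0.25 when x == 0, near 0 for large |x|

def relu_grad(x):
    return (x > 0).astype(float)  # constant 1 for positive inputs, 0 otherwise

x = np.array([-4.0, 0.0, 4.0])
print("sigmoid grad:", sigmoid_grad(x))  # ~[0.018, 0.25, 0.018]
print("relu grad:   ", relu_grad(x))     # [0., 0., 1.]

# Gradient magnitude after backpropagating through 20 layers,
# multiplying the per-layer activation gradient each time:
print("sigmoid, 20 layers:", 0.25 ** 20)  # ~9.1e-13, effectively vanished
print("relu, 20 layers:   ", 1.0 ** 20)   # 1.0, preserved for active units
```

Note the trade-off the sketch also hints at: ReLU's gradient is 0 for negative inputs, so units that stay negative stop learning (the "dying ReLU" problem), which variants like Leaky ReLU address.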