Question

Activation Function - What does the ReLU function do?

Answer

The rectified linear unit (ReLU) is an activation function defined as f(x) = max(0, x): it passes positive inputs through unchanged and outputs zero for negative inputs. It introduces nonlinearity into deep learning models and mitigates the vanishing-gradient problem, since its gradient is a constant 1 for positive inputs rather than shrinking toward zero as sigmoid or tanh gradients do. These properties have made it one of the most widely used activation functions in deep networks.
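A minimal sketch of ReLU and its gradient using NumPy (the function and variable names here are illustrative, not from any particular library):

```python
import numpy as np

def relu(x):
    # ReLU: element-wise max(0, x)
    return np.maximum(0.0, x)

def relu_grad(x):
    # Gradient is 1 where x > 0 and 0 elsewhere; the constant
    # slope on positive inputs is what mitigates vanishing gradients
    return (x > 0).astype(float)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))       # [0.  0.  0.  1.5 3. ]
print(relu_grad(x))  # [0. 0. 0. 1. 1.]
```

By contrast, a sigmoid's gradient approaches zero for large-magnitude inputs, which is why stacking many sigmoid layers can make gradients vanish during backpropagation.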