Question

Activation Function - What does the ReLU function do?

Answer

The rectified linear unit (ReLU) is an activation function defined as f(x) = max(0, x): it passes positive inputs through unchanged and maps negative inputs to zero. This introduces nonlinearity into a deep learning model while mitigating the vanishing gradient problem, since its gradient is exactly 1 for every positive input and so does not shrink as it propagates backward through many layers. This combination of simplicity and stable gradients is the reason for its widespread popularity.
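
As a quick illustration, here is a minimal NumPy sketch of ReLU and its derivative; the function names relu and relu_grad are our own for this example, not taken from any particular library:

```python
import numpy as np

def relu(x: np.ndarray) -> np.ndarray:
    """Rectified linear unit: pass positive values through, zero out the rest."""
    return np.maximum(0, x)

def relu_grad(x: np.ndarray) -> np.ndarray:
    """Derivative of ReLU: 1 for positive inputs, 0 otherwise."""
    return (x > 0).astype(x.dtype)

# The gradient is 1 wherever the input is positive, which is why
# ReLU helps keep gradients from vanishing in deep networks.
x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))       # [0.  0.  0.  1.5 3. ]
print(relu_grad(x))  # [0. 0. 0. 1. 1.]
```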