Activation Function - What does the ReLU function do?
The rectified linear unit (ReLU) is an activation function defined as f(x) = max(0, x): it outputs the input unchanged when the input is positive and zero otherwise. It adds nonlinearity to a deep learning model while helping mitigate the vanishing gradient problem, since its gradient is 1 for all positive inputs rather than shrinking toward zero as sigmoid and tanh gradients do. This combination of simplicity and stable gradients is the reason for its widespread popularity.
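As a minimal sketch, ReLU and its gradient can be implemented in a few lines of NumPy (the function names here are illustrative, not part of any particular framework):

```python
import numpy as np

def relu(x):
    # Element-wise max(0, x): negative inputs become 0, positives pass through.
    return np.maximum(0, x)

def relu_grad(x):
    # The gradient is 1 where the input is positive and 0 elsewhere,
    # so positive activations pass gradients through backprop unchanged.
    return (x > 0).astype(float)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))       # [0.  0.  0.  1.5 3. ]
print(relu_grad(x))  # [0. 0. 0. 1. 1.]
```

Note that the gradient at exactly 0 is taken as 0 by convention; in practice this choice has no measurable effect on training.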