Activation Function - What is the ELU activation function?


The Exponential Linear Unit (ELU) is an activation function used in neural networks. Unlike ReLUs, ELUs can take on negative values, which pushes mean unit activations closer to zero. This effect is comparable to batch normalization but comes at a lower computational cost.
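Concretely, ELU returns the input unchanged when it is positive and a smoothly saturating negative value otherwise: f(x) = x for x > 0, and f(x) = α(eˣ − 1) for x ≤ 0. A minimal sketch in plain Python (the function name and the default α = 1.0, the common choice, are illustrative):

```python
import math

def elu(x, alpha=1.0):
    """ELU: identity for positive inputs, alpha * (exp(x) - 1) otherwise."""
    return x if x > 0 else alpha * (math.exp(x) - 1)

# Positive inputs pass through unchanged.
print(elu(2.0))   # -> 2.0

# Negative inputs saturate smoothly toward -alpha,
# which is what lets the mean activation drift below zero toward 0.
print(elu(-1.0))  # -> about -0.632 for alpha = 1.0
```

Because the negative branch flattens out at −α rather than growing without bound, large negative inputs contribute a bounded, near-constant activation, while small negative inputs still carry gradient information (unlike ReLU, which zeroes them out entirely).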