Question
Vanishing/Exploding Gradients - How is vanishing gradient solved?
Answer
Several approaches can mitigate the vanishing gradient problem. One of the simplest is the choice of activation function: ReLU (rectified linear unit) and its variants help keep gradients from shrinking toward zero. For positive inputs, ReLU has a constant derivative of 1, so gradients are not repeatedly scaled down as they are propagated backward through many layers.
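As a minimal sketch (plain NumPy, not tied to any particular framework), the snippet below compares the product of per-layer activation derivatives for sigmoid versus ReLU over a hypothetical stack of 20 layers with positive pre-activations. The sigmoid product collapses toward zero, while the ReLU product stays at 1:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):
    s = sigmoid(x)
    return s * (1.0 - s)          # at most 0.25, so repeated products shrink fast

def relu_grad(x):
    return (x > 0).astype(float)  # exactly 1 for positive inputs

# Hypothetical pre-activation value repeated across 20 stacked layers.
pre_activations = np.full(20, 1.5)

# The backpropagated gradient is (roughly) a product of per-layer derivatives.
sigmoid_product = np.prod(sigmoid_grad(pre_activations))
relu_product = np.prod(relu_grad(pre_activations))

print(f"Product of sigmoid derivatives over 20 layers: {sigmoid_product:.2e}")
print(f"Product of ReLU derivatives over 20 layers:    {relu_product:.2e}")
```

Running this shows the sigmoid product on the order of 1e-17 while the ReLU product remains 1.0, which is the effect the answer describes: with ReLU, gradients flowing through active units are not attenuated layer by layer.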