Question

Attention Mechanism - What is attention vs self-attention in Transformers?

Answer

Self-attention is the mechanism by which a Transformer relates positions within a single sequence to one another: each token's representation is updated by attending to the other tokens of the same sequence. Attention in the broader sense (often called cross-attention) lets the model attend to positions in a different sequence, for example when a Transformer decoder attends to the encoder's output while generating each output token.
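
To make the distinction concrete, here is a minimal NumPy sketch of scaled dot-product attention (it is not tied to any particular library, and the learned projection matrices W_Q, W_K, W_V of a real Transformer layer are omitted for brevity). Self-attention draws queries, keys, and values from the same sequence, while cross-attention takes its queries from a different sequence.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                      # similarity of each query to each key
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)       # softmax over the keys
    return weights @ V                                   # weighted sum of the values

rng = np.random.default_rng(0)
d_model = 8
X = rng.normal(size=(5, d_model))   # one sequence of 5 tokens (e.g., the encoder input)
Y = rng.normal(size=(3, d_model))   # a different sequence of 3 tokens (e.g., the decoder state)

# Self-attention: queries, keys, and values all come from the SAME sequence X
self_attn_out = scaled_dot_product_attention(X, X, X)    # shape (5, d_model)

# Cross-attention: queries come from Y, keys and values from the OTHER sequence X
cross_attn_out = scaled_dot_product_attention(Y, X, X)   # shape (3, d_model)

print(self_attn_out.shape, cross_attn_out.shape)
```

In the self-attention call, every token of X weighs every other token of X; in the cross-attention call, each token of Y produces a weighting over the tokens of X, which is how a decoder consults the encoded input sequence.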