Ali Ghodsi, Deep Learning, Attention mechanism, self-attention, S2S, Fall 2023, Lecture 9
Data Science Courses
20.6K subscribers
2,555 views

Published on Oct 18, 2023

Attention mechanism and self-attention
Sequence-to-sequence models

This video provides an in-depth exploration of the attention mechanism and self-attention, concepts that have revolutionized natural language processing (NLP). Transformers, the current workhorses of NLP, rely heavily on self-attention. The lecture covers the fundamentals of attention and self-attention in the context of NLP, with a brief look at their application in image processing.
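The core operation the lecture refers to can be summarized as scaled dot-product self-attention: each token forms a query, a key, and a value, and its output is a softmax-weighted mix of all values. The sketch below is illustrative only and not material from the video; the NumPy implementation, the projection matrices W_q, W_k, W_v, and the chosen dimensions are assumptions.

```python
# Minimal sketch of single-head scaled dot-product self-attention
# (illustrative only; not code from the lecture).
import numpy as np

def self_attention(X, W_q, W_k, W_v):
    """Self-attention over a token sequence X of shape (n_tokens, d_model)."""
    Q = X @ W_q              # queries, shape (n_tokens, d_k)
    K = X @ W_k              # keys,    shape (n_tokens, d_k)
    V = X @ W_v              # values,  shape (n_tokens, d_v)
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                           # pairwise compatibility scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)   # softmax over keys
    return weights @ V       # each output token is a weighted mix of the values

# Tiny usage example with random projection weights (hypothetical sizes).
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 8))          # 5 tokens, model dimension 8
W_q = rng.normal(size=(8, 4))
W_k = rng.normal(size=(8, 4))
W_v = rng.normal(size=(8, 4))
print(self_attention(X, W_q, W_k, W_v).shape)  # (5, 4)
```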
