Here is how Transformers ended the tradition of Inductive Bias in Neural Nets
Neural Breakdown with AVB

Published on Nov 30, 2023

In this episode, I discuss Transformers and the role of Attention in Deep Learning and everything one needs to know about them!

In particular, I cover how they broke from the usual AI research trajectory of building ever more inductive bias into neural nets (CNNs with their locality bias, RNNs with their recency bias) and instead embraced a more general, but data-hungry, architecture.
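To make the contrast concrete, here is a minimal, illustrative sketch of single-head self-attention (not code from the video; the function name and the no-learned-projections simplification are my own). The key point: every token attends to every other token, so nothing in the architecture hard-codes locality (as a CNN's kernel does) or recency (as an RNN's sequential state does).

```python
import numpy as np

def self_attention(X):
    """Simplified single-head self-attention without learned Q/K/V
    projections (an illustrative simplification). Each output row is a
    weighted mix over ALL positions -- no built-in locality or recency bias."""
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)  # all-pairs similarity scores
    # numerically stable softmax over every position in the sequence
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ X  # each token's output mixes the whole sequence

X = np.random.randn(5, 8)     # 5 tokens, 8-dim embeddings
out = self_attention(X)
print(out.shape)              # (5, 8)
```

Because the attention weights are computed from the data rather than fixed by the architecture, the model must learn any useful locality or recency patterns from scratch, which is exactly why Transformers need so much more training data.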

To support the channel and access the Word documents/slides/animations used in this video, consider JOINING the channel on Youtube or Patreon. Members get access to Code, project files, scripts, slides, animations, and illustrations for most of the videos on my channel! Learn more about perks below.

Join and support the channel - https://www.youtube.com/@avb_fj/join
Patreon - / neuralbreakdownwithavb

Check out the previous two videos on Attention and Self-Attention, which cover a lot of the groundwork behind this video:
   • Neural Attention - This simple exampl...  
   • What the “Self” in Self-Attention act...  

Follow on Twitter: @neural_avb
#ai #deeplearning #machinelearning #neuralnetworks

Useful papers:
https://arxiv.org/abs/1706.03762 (Attention Is All You Need)
https://arxiv.org/abs/1409.0473 (Neural Machine Translation by Jointly Learning to Align and Translate)

