MIT 6.S191 (2022): Recurrent Neural Networks and Transformers
Alexander Amini

Premiered Mar 18, 2022

MIT Introduction to Deep Learning 6.S191: Lecture 2
Recurrent Neural Networks
Lecturer: Ava Soleimany
January 2022

For all lectures, slides, and lab materials: http://introtodeeplearning.com

Lecture Outline
0:00 - Introduction
1:59 - Sequence modeling
4:16 - Neurons with recurrence
10:09 - Recurrent neural networks
11:42 - RNN intuition
14:44 - Unfolding RNNs
16:43 - RNNs from scratch (see the sketch after this outline)
19:49 - Design criteria for sequential modeling
21:00 - Word prediction example
27:49 - Backpropagation through time
30:02 - Gradient issues
33:53 - Long short-term memory (LSTM)
35:35 - RNN applications
40:22 - Attention fundamentals
43:12 - Intuition of attention
44:53 - Attention and search relationship
47:16 - Learning attention with neural networks
54:52 - Scaling attention and applications
56:09 - Summary
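
A rough companion to the "RNNs from scratch" segment (16:43): a single vanilla recurrent cell that carries a hidden state across timesteps. This NumPy sketch and its names (SimpleRNNCell, step) are illustrative assumptions, not the lecture's own code; the course labs at introtodeeplearning.com use TensorFlow.

# Minimal sketch of a vanilla RNN cell, assuming the standard update
# h_t = tanh(W_hh h_{t-1} + W_xh x_t) with output y_t = W_hy h_t.
# Illustrative only; not the lecture's implementation.
import numpy as np

class SimpleRNNCell:
    def __init__(self, input_dim, hidden_dim, output_dim, seed=0):
        rng = np.random.default_rng(seed)
        # Weight matrices: input-to-hidden, hidden-to-hidden, hidden-to-output
        self.W_xh = rng.normal(scale=0.1, size=(hidden_dim, input_dim))
        self.W_hh = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))
        self.W_hy = rng.normal(scale=0.1, size=(output_dim, hidden_dim))
        self.hidden_dim = hidden_dim

    def step(self, x, h_prev):
        # Update the hidden state from the previous state and the current input
        h = np.tanh(self.W_hh @ h_prev + self.W_xh @ x)
        # Produce an output from the new hidden state
        y = self.W_hy @ h
        return y, h

# Process a sequence one timestep at a time, carrying the hidden state forward
cell = SimpleRNNCell(input_dim=8, hidden_dim=16, output_dim=8)
h = np.zeros(cell.hidden_dim)
sequence = [np.random.randn(8) for _ in range(4)]  # e.g. four word embeddings
for x_t in sequence:
    y_t, h = cell.step(x_t, h)
print(y_t.shape)  # prediction after the final timestep: (8,)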
Subscribe to stay up to date with new deep learning lectures at MIT, or follow us @MITDeepLearning on Twitter and Instagram to stay fully-connected!!
