Word2Vec Papers Explained From Scratch: Skip-Gram with Negative Sampling

Published on Feb 20, 2021

Today we take you through the two Word2Vec papers!

PAPERS

Paper 1: Efficient Estimation of Word Representations in Vector Space. https://arxiv.org/abs/1301.3781
Paper 2: Distributed Representations of Words and Phrases and their Compositionality. https://arxiv.org/abs/1310.4546

OUTLINE

00:00 Intro & Overview
00:34 Word2Vec Paper 1 Abstract
02:32 Feed Forward Neural Network Language Model
04:42 New Proposed Architectures
05:57 Intro to CBOW and Skip-Gram Models
07:21 Word2Vec Paper 2 Abstract
09:15 Skip-Gram Model Objective Function
10:31 Skip-Gram Model with Negative Sampling
11:55 Skip-Gram Model with Subsampling Frequent Words
12:18 Conclusion
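As a companion to the outline above, here is a minimal sketch of the skip-gram negative-sampling objective from Paper 2 (the toy vectors, the function name `sgns_loss`, and the choice of k = 5 negatives are our illustrative assumptions, not code from the papers):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sgns_loss(center_vec, context_vec, negative_vecs):
    """Skip-gram negative-sampling loss for one (center, context) pair:
    -log sigma(v_c . u_o) - sum_k log sigma(-v_c . u_k),
    i.e. push the true context word's score up and the sampled
    noise words' scores down."""
    pos = np.log(sigmoid(center_vec @ context_vec))
    neg = np.sum(np.log(sigmoid(-negative_vecs @ center_vec)))
    return -(pos + neg)

# Toy example: random 8-dimensional vectors standing in for learned embeddings.
rng = np.random.default_rng(0)
dim = 8
center = rng.normal(size=dim)       # v_c: embedding of the center word
context = rng.normal(size=dim)      # u_o: embedding of the observed context word
negatives = rng.normal(size=(5, dim))  # u_k: k = 5 sampled "noise" words
print(sgns_loss(center, context, negatives))
```

Each term of the loss is a negated log-sigmoid, so the loss is always positive; training nudges the vectors so the positive dot product grows and the negative ones shrink.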

RESOURCES

Understanding Neural Networks: • But what is a neural network? | Chapt...
Hierarchical Binary-Tree Softmax: • Neural networks [10.7] : Natural lang...
Demystifying Neural Network in Skip-Gram Modelling: https://aegis4048.github.io/demystify...

