Word2Vec
A popular implementation of word embeddings used for pre-training word representations.
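As a minimal sketch of the idea (the words and 3-dimensional vectors below are hand-made for illustration, not taken from any trained model): word2vec maps each word to a dense vector, and semantically related words end up with similar vectors, which can be compared with cosine similarity.

```python
import math

# Toy embeddings, invented for illustration; a real word2vec model
# learns vectors with hundreds of dimensions from a large corpus.
embeddings = {
    "king":  [0.90, 0.80, 0.10],
    "queen": [0.85, 0.82, 0.15],
    "apple": [0.10, 0.20, 0.90],
}

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Related words score higher than unrelated ones.
print(cosine(embeddings["king"], embeddings["queen"]))  # close to 1.0
print(cosine(embeddings["king"], embeddings["apple"]))  # noticeably lower
```

With pre-trained word2vec vectors, the same similarity computation is what lets downstream models start from representations that already encode word relatedness.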
Videos Mentioning Word2Vec

Breaking down the OG GPT Paper by Alec Radford
Latent Space
A popular implementation of word embeddings used for pre-training word representations.

Deep Learning Basics: Introduction and Overview
Lex Fridman
A technique for generating word embeddings, representing words as vectors in a way that captures semantic relationships.

Eugenia Kuyda: Friendship with an AI Companion | Lex Fridman Podcast #121
Lex Fridman
A group of models that produce word embeddings, discussed for their role in turning language into vector representations and in advances in language translation and understanding.

Deep Learning State of the Art (2019)
Lex Fridman
A technique for mapping words into a compressed, meaningful representation (embedding) using unsupervised learning.

Foundations and Challenges of Deep Learning (Yoshua Bengio)
Lex Fridman
A method for training word embeddings, cited as an example of successful unsupervised learning in Natural Language Processing.

Foundations of Unsupervised Deep Learning (Ruslan Salakhutdinov, CMU)
Lex Fridman
A technique mentioned in the context of text representation, used to initialize models or to sum word representations as input to simpler networks.

Deep Learning for Natural Language Processing (Richard Socher, Salesforce)
Lex Fridman
A model introduced by Tomas Mikolov in 2013 that trains word vectors by predicting words within a context window, offering faster training and easier vocabulary expansion than older methods.
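The "predicting words in a window" setup above can be sketched as follows. This shows only how the skip-gram variant turns a sentence into (center, context) training pairs; the sentence and window size are illustrative assumptions, and the actual vector training step is omitted.

```python
def skipgram_pairs(tokens, window=2):
    """Generate (center, context) training pairs as in skip-gram
    word2vec: each word is paired with its neighbors inside the window."""
    pairs = []
    for i, center in enumerate(tokens):
        lo = max(0, i - window)
        hi = min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:  # skip the center word itself
                pairs.append((center, tokens[j]))
    return pairs

sentence = "the quick brown fox".split()
for center, context in skipgram_pairs(sentence, window=1):
    print(center, "->", context)
```

A model trained on such pairs adjusts each word's vector so that it predicts its observed context words, which is what makes the resulting embeddings capture co-occurrence patterns.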