Word2Vec

A group of related models used to produce word embeddings.

Mentioned in 8 videos
Developer
Google

Videos Mentioning Word2Vec

Breaking down the OG GPT Paper by Alec Radford

Latent Space

A popular technique for pre-training word representations (embeddings).

Deep Learning Basics: Introduction and Overview

Lex Fridman

A technique for generating word embeddings, representing words as vectors in a way that captures semantic relationships.
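The "vectors that capture semantic relationships" idea above can be sketched with hand-picked toy vectors (not real trained embeddings; real word2vec models learn hundreds of dimensions from text). The classic trained-embedding demonstration is the analogy king − man + woman ≈ queen:

```python
import numpy as np

# Toy 3-d "embeddings" chosen by hand to mimic the structure that
# word2vec training discovers; real models learn 100-300 dimensions.
vectors = {
    "king":  np.array([0.9, 0.8, 0.1]),
    "queen": np.array([0.9, 0.1, 0.8]),
    "man":   np.array([0.1, 0.9, 0.1]),
    "woman": np.array([0.1, 0.1, 0.9]),
    "apple": np.array([0.5, 0.5, 0.5]),  # unrelated distractor word
}

def cosine(a, b):
    """Cosine similarity: 1.0 for parallel vectors, 0.0 for orthogonal."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Analogy by vector arithmetic: king - man + woman should land near queen.
target = vectors["king"] - vectors["man"] + vectors["woman"]
best = max((w for w in vectors if w not in {"king", "man", "woman"}),
           key=lambda w: cosine(target, vectors[w]))
print(best)  # queen
```

With real embeddings the same arithmetic works over a vocabulary of hundreds of thousands of words, which is what makes the "semantic relationships" claim concrete.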

Eugenia Kuyda: Friendship with an AI Companion | Lex Fridman Podcast #121

Lex Fridman

A group of models used to produce word embeddings, discussed for its role in transforming language into vector representations and for advancing language translation and understanding.

Deep Learning State of the Art (2019)

Lex Fridman

A technique for mapping words into a compressed, meaningful representation (embedding) using unsupervised learning.

Foundations and Challenges of Deep Learning (Yoshua Bengio)

Lex Fridman

A method for training word embeddings, cited as an example of successful unsupervised learning in Natural Language Processing.

Foundations of Unsupervised Deep Learning (Ruslan Salakhutdinov, CMU)

Lex Fridman

A technique mentioned in the context of text representation, potentially used to initialize models or sum word representations for input into simpler networks.
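The "sum word representations for input into simpler networks" trick mentioned above (a bag-of-embeddings) can be sketched as follows; the embedding table here is random for illustration, whereas in practice its rows would come from a pre-trained word2vec model:

```python
import numpy as np

# Hypothetical tiny embedding table; in practice these rows would be
# loaded from a pre-trained word2vec model.
rng = np.random.default_rng(0)
vocab = {"deep": 0, "learning": 1, "is": 2, "fun": 3}
embeddings = rng.normal(size=(len(vocab), 4))  # 4-d for the sketch

def sentence_vector(tokens):
    """Sum the word vectors to produce one fixed-size input vector for a
    simple downstream network, regardless of sentence length."""
    return sum(embeddings[vocab[t]] for t in tokens)

v = sentence_vector("deep learning is fun".split())
print(v.shape)  # (4,)
```

The fixed-size output is the point: a plain feed-forward classifier can consume it directly, with no recurrence or attention needed.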

Deep Learning for Natural Language Processing (Richard Socher, Salesforce)

Lex Fridman

A model introduced by Tomas Mikolov in 2013 that trains word vectors by predicting words within a context window, offering faster training and easier vocabulary expansion than older methods.
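"Predicting words in a window" (the skip-gram objective) means that for each center word, the model is trained to predict the words up to `window` positions on either side. A minimal sketch of the training-pair extraction step (a hypothetical helper, not Mikolov's original implementation):

```python
def skipgram_pairs(tokens, window=2):
    """Yield (center, context) training pairs as used by skip-gram:
    the model learns to predict each context word from its center word."""
    pairs = []
    for i, center in enumerate(tokens):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:  # a word is not its own context
                pairs.append((center, tokens[j]))
    return pairs

sentence = "the quick brown fox jumps".split()
print(skipgram_pairs(sentence, window=1))
# [('the', 'quick'), ('quick', 'the'), ('quick', 'brown'),
#  ('brown', 'quick'), ('brown', 'fox'), ('fox', 'brown'),
#  ('fox', 'jumps'), ('jumps', 'fox')]
```

Training then adjusts the embedding of each center word so that a softmax (or negative-sampling) objective assigns high probability to its observed context words, which is what makes words in similar contexts end up with similar vectors.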

Stanford CS25: Transformers United V6 I Overview of Transformers

Stanford Online

A typical method for creating word embeddings, representing words as dense vectors in a continuous, relatively low-dimensional space.