RoPE
Concept
Rotary Position Embedding (RoPE) is a positional encoding method for Transformer architectures that encodes position by rotating query and key vectors by position-dependent angles; it is valued for its extrapolation properties in long-context models.
Mentioned in 3 videos
Videos Mentioning RoPE

The 10,000x Yolo Researcher Metagame — with Yi Tay of Reka
Latent Space

A Comprehensive Overview of Large Language Models - Latent Space Paper Club
Latent Space
A positional encoding method that injects position information by rotating query and key vectors, presented as a more advanced alternative to absolute position embeddings.

Stanford CME296 Diffusion & Large Vision Models | Spring 2026 | Lecture 5 - Architectures
Stanford Online
A leading method for encoding positions in attention layers, introduced in the 2021 RoFormer paper, which rotates queries and keys by angles determined by their positions.
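
The rotation described above can be sketched in a few lines of numpy. This is a minimal illustration of the standard RoPE formulation, not any particular library's implementation: each adjacent pair of dimensions is treated as a 2-D plane and rotated by an angle that grows with the token's position, with one geometrically decaying frequency per pair. The base of 10000 follows the original paper; the vector sizes are arbitrary.

```python
import numpy as np

def rope(x: np.ndarray, base: float = 10000.0) -> np.ndarray:
    """Apply rotary position embedding to x of shape (seq_len, dim)."""
    seq_len, dim = x.shape
    assert dim % 2 == 0, "dim must be even so dimensions pair up"
    # One rotation frequency per dimension pair, decaying geometrically.
    freqs = base ** (-np.arange(0, dim, 2) / dim)     # (dim/2,)
    angles = np.outer(np.arange(seq_len), freqs)      # (seq_len, dim/2)
    cos, sin = np.cos(angles), np.sin(angles)
    x1, x2 = x[:, 0::2], x[:, 1::2]                   # paired dimensions
    out = np.empty_like(x)
    out[:, 0::2] = x1 * cos - x2 * sin                # standard 2-D rotation
    out[:, 1::2] = x1 * sin + x2 * cos
    return out

# The key property: attention scores depend only on relative position.
# Place the same query/key vectors at every position, then compare
# dot products at equal offsets.
rng = np.random.default_rng(0)
q_vec = rng.standard_normal(16)
k_vec = rng.standard_normal(16)
Q = rope(np.tile(q_vec, (8, 1)))
K = rope(np.tile(k_vec, (8, 1)))
# Offset 3 in both cases, so the scores match.
assert np.allclose(Q[2] @ K[5], Q[0] @ K[3])
```

Because a rotation by angle m composed with the inverse of a rotation by angle n is a rotation by m - n, the query-key dot product sees only the positional offset, which is one reason RoPE is discussed in the context of long-context extrapolation.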