
Mixture of Experts (MoE)

Concept · Mentioned in 1 video

A sparse model architecture in which a router activates only a small subset of expert subnetworks for each input, so compute grows with the number of active experts rather than total model size. Cursor Tab uses an MoE model to process large inputs while producing small outputs efficiently, improving performance at longer context lengths.
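A minimal sketch of the routing idea, assuming a top-k gating scheme with random illustrative weights (the layer sizes, expert count, and router here are hypothetical, not Cursor's actual model):

```python
import numpy as np

rng = np.random.default_rng(0)

d_model, n_experts, top_k = 8, 4, 2

# Router: scores each token against every expert (illustrative weights).
W_router = rng.normal(size=(d_model, n_experts))
# Each "expert" is a small feed-forward map; only a few run per token.
W_experts = rng.normal(size=(n_experts, d_model, d_model))

def moe_forward(x):
    """Sparse MoE layer: each token is processed by its top-k experts only."""
    logits = x @ W_router                          # (n_tokens, n_experts)
    chosen = np.argsort(logits, axis=-1)[:, -top_k:]  # indices of selected experts
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        # Softmax over just the selected experts' scores to get gate weights.
        sel = logits[t, chosen[t]]
        gates = np.exp(sel - sel.max())
        gates /= gates.sum()
        for gate, e in zip(gates, chosen[t]):
            out[t] += gate * (x[t] @ W_experts[e])  # only k of n_experts execute
    return out

tokens = rng.normal(size=(5, d_model))
y = moe_forward(tokens)
print(y.shape)  # (5, 8)
```

Because only `top_k` of the `n_experts` experts run per token, total parameter count can grow without a proportional increase in per-token compute, which is what makes the approach attractive at long context lengths.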