Mixture of Experts (MoE)
Concept
A sparse model architecture in which a router sends each token to only a few "expert" subnetworks, so most of the model's parameters stay idle for any given input. Cursor Tab uses an MoE model, which lets it process large inputs and produce small outputs efficiently, improving performance at longer context lengths.
Mentioned in 1 video
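The sparsity comes from the router: each token activates only its top-k experts, and the rest of the experts do no work for that token. Below is a minimal numpy sketch of this top-k routing; the layer sizes, weight names, and the single-matrix "experts" are illustrative assumptions, not Cursor's actual model.

```python
import numpy as np

rng = np.random.default_rng(0)
d_model, n_experts, top_k = 8, 4, 2

# Hypothetical weights: one router matrix plus a small linear "expert" each.
router_w = rng.normal(size=(d_model, n_experts))
expert_w = rng.normal(size=(n_experts, d_model, d_model))

def moe_forward(x):
    """Route each token to its top-k experts; the others stay idle (sparsity)."""
    logits = x @ router_w                            # (tokens, n_experts)
    top = np.argsort(logits, axis=-1)[:, -top_k:]    # chosen expert indices
    sel = np.take_along_axis(logits, top, axis=-1)   # logits of chosen experts
    # Softmax over only the selected experts' logits to get mixing weights.
    gates = np.exp(sel - sel.max(-1, keepdims=True))
    gates /= gates.sum(-1, keepdims=True)
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        for g, e in zip(gates[t], top[t]):
            # Only top_k of n_experts ever run for this token.
            out[t] += g * (x[t] @ expert_w[e])
    return out

tokens = rng.normal(size=(3, d_model))
y = moe_forward(tokens)
print(y.shape)  # (3, 8)
```

With top_k=2 of 4 experts, each token pays the compute cost of two experts while the model holds the capacity of four, which is the efficiency trade-off MoE is built around.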