Mixtral 8x7B
Software / App
An open-weight sparse mixture-of-experts model from Mistral AI that shifted the inference landscape by drawing attention to serverless inference providers and the speed and cost trade-offs they compete on.
Mentioned in 2 videos
Videos Mentioning Mixtral 8x7B
![[Paper Club] Upcycling Large Language Models into Mixture of Experts](https://i.ytimg.com/vi/e_mkhFkKPEk/maxresdefault.jpg)
[Paper Club] Upcycling Large Language Models into Mixture of Experts
Latent Space
Discussed as a reference MoE architecture in the context of upcycling and learning-rate experiments.

Artificial Analysis: The Independent LLM Analysis House — with George Cameron and Micah Hill-Smith
Latent Space
Discussed as an open model that reshaped the landscape by spotlighting serverless inference providers and their speed and cost trade-offs.