DBRX
Software / App
Databricks' open mixture-of-experts (MoE) language model, with 132 billion total parameters of which 36 billion are active on any given input; pre-trained on 12 trillion tokens of text and code.
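The total-vs-active parameter split comes from top-k expert routing: DBRX uses 16 experts and activates 4 of them per token, so only a fraction of the expert weights are touched on any input. A minimal sketch of that routing pattern, using toy dimensions (the matrix sizes below are illustrative, not DBRX's real configuration):

```python
import numpy as np

rng = np.random.default_rng(0)

def moe_layer(x, experts_w, router_w, k=4):
    """Toy top-k mixture-of-experts layer: route input x to k of the
    n experts, weighted by a softmax over the selected router scores."""
    scores = x @ router_w                        # one score per expert
    topk = np.argsort(scores)[-k:]               # indices of the k best experts
    gates = np.exp(scores[topk] - scores[topk].max())
    gates /= gates.sum()                         # softmax over selected experts
    # Only the k selected experts' weights are used for this token.
    return sum(g * (x @ experts_w[i]) for g, i in zip(gates, topk))

d, n_experts, k = 8, 16, 4                       # 16 experts, 4 active (as in DBRX)
experts_w = rng.standard_normal((n_experts, d, d))
router_w = rng.standard_normal((d, n_experts))
x = rng.standard_normal(d)

y = moe_layer(x, experts_w, router_w, k)
total_params = experts_w.size                    # all expert parameters: 1024
active_params = k * d * d                        # parameters touched per token: 256
```

Here 4 of 16 experts are active, so only a quarter of the expert parameters run per token; DBRX's 36B-active / 132B-total ratio arises the same way (the active count also includes shared components such as attention, so it is not exactly 4/16 of the total).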