DBRX

Software / App · Mentioned in 1 video

Databricks' Mixture-of-Experts (MoE) large language model, with 132 billion total parameters of which 36 billion are active for any given input, pre-trained on 12 trillion tokens.
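The gap between total and active parameters comes from MoE routing: each token is sent to only a few experts, so only their weights participate in the forward pass. A minimal sketch of top-k expert routing, assuming DBRX's reported configuration of 16 experts with 4 active per token (the routing function here is illustrative, not the actual implementation):

```python
import math
import random

def top_k_routing(logits, k):
    """Select the k experts with the highest router logits and return
    their indices plus softmax mixing weights over the selected logits."""
    indices = sorted(range(len(logits)), key=lambda i: logits[i], reverse=True)[:k]
    exps = [math.exp(logits[i]) for i in indices]
    total = sum(exps)
    weights = [e / total for e in exps]
    return indices, weights

random.seed(0)
# One router score per expert; DBRX reportedly uses 16 experts, 4 active.
logits = [random.gauss(0, 1) for _ in range(16)]
experts, weights = top_k_routing(logits, k=4)
print(len(experts), round(sum(weights), 6))  # 4 experts; weights sum to 1
```

With 4 of 16 experts active, only the selected experts' feed-forward weights (plus shared attention and embedding weights) are used per token, which is how a 132B-parameter model can run with 36B active parameters.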