FP8

Concept

An 8-bit floating-point precision format used for DeepSeek V3's weights. It requires specific kernel support for inference and reflects a broader trend toward native low-precision quantization during training.
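To make the format concrete, here is a minimal, illustrative sketch of rounding a value to the common FP8 E4M3 layout (1 sign bit, 4 exponent bits, 3 mantissa bits, bias 7). The function name is ours and the logic is simplified (it saturates at the format's maximum finite value and skips NaN encoding); real inference kernels operate on hardware FP8, not a software simulation like this.

```python
import math

def quantize_e4m3(x: float) -> float:
    """Round x to the nearest FP8 E4M3-representable value (simplified).

    E4M3: 1 sign bit, 4 exponent bits, 3 mantissa bits, exponent bias 7.
    Saturates at the max finite value (448); NaN encoding is omitted.
    """
    if x == 0.0:
        return 0.0
    sign = -1.0 if x < 0 else 1.0
    mag = abs(x)
    max_normal = 448.0        # largest finite E4M3 value
    min_normal = 2.0 ** -6    # smallest normal: 2^(1 - bias)
    if mag >= max_normal:
        return sign * max_normal
    if mag < min_normal:
        # Subnormal range: fixed step of 2^-9
        step = 2.0 ** -9
        return sign * round(mag / step) * step
    # Normal range: 3 mantissa bits give step 2^(e-3) for mag in [2^e, 2^(e+1))
    e = math.floor(math.log2(mag))
    step = 2.0 ** (e - 3)
    return sign * round(mag / step) * step  # ties round to even

print(quantize_e4m3(0.1))    # 0.1 is not representable; snaps to 0.1015625
print(quantize_e4m3(500.0))  # saturates to 448.0
```

The coarse spacing between representable values is why FP8 training and inference pipelines pair the format with per-tensor or per-block scaling factors rather than storing raw values directly.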

Mentioned in 6 videos
