Distillation

Concept

The process of using a larger AI model as a 'teacher' to train a smaller, more efficient 'student' model. Rather than learning only from hard labels, the student is typically trained to match the teacher's output probability distribution (soft labels), which carries more information about how the teacher ranks the alternatives.
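The core training signal can be sketched as a divergence between the teacher's and student's softened output distributions. A minimal, dependency-free sketch (function names, the temperature value, and the example logits are illustrative, not from the source):

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature > 1 softens the distribution, exposing the teacher's
    # relative confidences across wrong answers ("dark knowledge").
    z = [x / temperature for x in logits]
    m = max(z)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in z]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    # KL divergence from the teacher's soft labels to the student's
    # predictions; the student is trained to minimize this quantity.
    p = softmax(teacher_logits, temperature)  # teacher targets
    q = softmax(student_logits, temperature)  # student predictions
    return sum(pi * (math.log(pi) - math.log(qi)) for pi, qi in zip(p, q))

teacher = [4.0, 1.0, -2.0]
# A student that reproduces the teacher's logits exactly incurs zero loss;
# a uniform (untrained) student incurs a positive loss.
print(distillation_loss(teacher, teacher))            # 0.0
print(distillation_loss([1.0, 1.0, 1.0], teacher) > 0)  # True
```

In practice this term is usually combined with the ordinary cross-entropy loss on the true labels, weighted by a mixing coefficient.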

Mentioned in 2 videos