Kullback-Leibler (KL) divergence

Concept

A measure of how much one probability distribution differs from another. In a VAE, it appears as a term in the loss function that regularizes the encoder's latent distribution to stay close to a fixed prior (e.g., a standard normal), which helps structure the latent space.
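As a sketch of how this term is typically computed: for a diagonal Gaussian encoder output against a standard normal prior, the KL divergence has a closed form, 0.5 * sum(sigma^2 + mu^2 - 1 - log sigma^2). The function and variable names below (`kl_to_standard_normal`, `mu`, `logvar`) are illustrative, not from the source.

```python
import numpy as np

def kl_to_standard_normal(mu, logvar):
    """Closed-form KL(N(mu, sigma^2) || N(0, I)) for a diagonal Gaussian.

    mu and logvar are per-dimension encoder outputs; sigma^2 = exp(logvar).
    """
    return 0.5 * np.sum(np.exp(logvar) + mu**2 - 1.0 - logvar)

# A latent distribution that exactly matches the prior has zero divergence.
mu = np.zeros(4)
logvar = np.zeros(4)
print(kl_to_standard_normal(mu, logvar))  # → 0.0
```

During training this value is added (often with a weighting factor) to the reconstruction loss, pulling the encoder's outputs toward the prior.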
