ELI5: Explain Like I'm 5

Kullback–Leibler divergence

Kullback–Leibler divergence (KL divergence) is a way to measure how different two probability distributions are. For example, imagine two jars of candy that each hold a mix of sour and sweet pieces. You can use KL divergence to compare the mix in one jar with the mix in the other. It looks at each type of candy and checks how much more or less common it is in the first jar than in the second. If the two jars have almost the same mix, the divergence is small; if one jar is mostly sour while the other is mostly sweet, the divergence is large.
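
To make the candy picture concrete, here is a minimal Python sketch (not part of the original explanation, and using made-up jar mixes) that applies the standard formula D_KL(P‖Q) = Σ P(x) log(P(x)/Q(x)) to two "jars" of candy.

```python
import math

def kl_divergence(p, q):
    """KL divergence D_KL(P || Q) between two discrete distributions,
    given as dicts mapping candy type -> probability.
    Assumes q[x] > 0 wherever p[x] > 0."""
    return sum(p[x] * math.log(p[x] / q[x]) for x in p if p[x] > 0)

# Illustrative numbers: jar P is mostly sour, jar Q is an even mix.
jar_p = {"sour": 0.9, "sweet": 0.1}
jar_q = {"sour": 0.5, "sweet": 0.5}

print(kl_divergence(jar_p, jar_q))  # ~0.368 nats: the mixes differ a lot
print(kl_divergence(jar_p, jar_p))  # 0.0: identical mixes, no divergence
```

Note that the result depends on the order of the two jars: D_KL(P‖Q) is generally not the same as D_KL(Q‖P).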