KL Divergence

Published: Aug. 7, 2017, 3:07 a.m.

Kullback-Leibler divergence, or KL divergence, is a measure of the information lost when you approximate one distribution with another. It comes to us originally from information theory, but today it underpins other, more machine-learning-focused algorithms like t-SNE. And boy oh boy can it be tough to explain. But we're trying our hardest in this episode!
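To make "information loss" concrete, here's a minimal sketch of KL divergence for discrete distributions (the function name and example distributions are illustrative, not from the episode):

```python
import math

def kl_divergence(p, q):
    """D_KL(P || Q): the expected extra information (in nats) incurred
    by encoding samples from P with a code optimized for Q instead.
    Terms where p_i == 0 contribute nothing, so they are skipped."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Approximating a fair coin with a heavily biased one:
p = [0.5, 0.5]   # true distribution
q = [0.9, 0.1]   # approximating distribution
print(kl_divergence(p, q))  # positive: information is lost
print(kl_divergence(p, p))  # 0.0: a perfect approximation loses nothing
```

Note that KL divergence is not symmetric: `kl_divergence(p, q)` generally differs from `kl_divergence(q, p)`, which is one reason it's a "divergence" rather than a distance.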