Abstract:
An overview of relative entropy (also known as the Kullback–Leibler divergence, among other names) and its many appearances in information theory, probability, and statistics, including recent results by the speaker.
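For reference, the standard definition of the quantity the talk is about (this definition is not part of the original abstract): the relative entropy between probability distributions $P$ and $Q$ on a common discrete alphabet is

    D(P \,\|\, Q) \;=\; \sum_{x} P(x) \log \frac{P(x)}{Q(x)},

with the convention that the sum is $+\infty$ if $P$ is not absolutely continuous with respect to $Q$; in the general case the sum is replaced by $\int \log \frac{dP}{dQ}\, dP$.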