
distinction of cross entropy and KL divergence #520

Open
wants to merge 1 commit into base: master
Commits on Nov 11, 2020

  1. distinction of cross entropy and KL divergence

    I'm not entirely sure of the inner workings of the algorithm, but while reading this documentation and comparing it with other sources I found that the expression named 'cross entropy' did not seem correct. Instead, there are two separate terms describing two separate KL divergences (one for the change in entropy in the probability that the simplex exists, and one for the probability that it does not). It is not clear from the text (even with my suggestions) why one needs two divergences; the standard relation between cross entropy and KL divergence is sketched after the commit details below. I make no claims about the workings of the algorithm; I only suggest changes to the descriptions of the mathematics in the documentation.
    
    I hope it makes sense, and thanks to everyone for putting this great resource together!
    schwabPhysics authored Nov 11, 2020
    Commit SHA: 92176f5
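For reference on the distinction raised in the commit message, the standard relation between cross entropy and KL divergence for Bernoulli probabilities $p$ (target) and $q$ (model) is sketched below. Mapping these generic symbols onto the documentation's per-simplex "exists"/"does not exist" probabilities is an assumption here, not a statement about the diff itself.

$$
\begin{aligned}
\mathrm{CE}(p, q) &= -\,p \log q - (1 - p)\log(1 - q) \\
\mathrm{KL}(p \,\|\, q) &= p \log\frac{p}{q} + (1 - p)\log\frac{1 - p}{1 - q} \\
\mathrm{CE}(p, q) &= H(p) + \mathrm{KL}(p \,\|\, q), \qquad H(p) = -\,p \log p - (1 - p)\log(1 - p)
\end{aligned}
$$

The two summands in each line correspond to the two cases (simplex present, simplex absent), which matches the two-term structure the commit message describes. Because $H(p)$ depends only on the fixed target probabilities, minimizing the cross entropy and minimizing the KL divergence are equivalent as objectives, even though the two quantities themselves differ.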