We would like to use these issues to gauge user interest.
Sparse cross entropy allows the cross entropy loss to be computed directly from integer class indices, without one-hot encoding the target class. This is useful for language modeling, where the target classes span the entire vocabulary; one-hot encoding such a large space is not memory efficient.
It is possible to make a custom implementation of sparse cross entropy computation with dlarray.
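To illustrate the idea behind such a custom implementation, here is a minimal language-agnostic sketch in Python/NumPy (not MATLAB/dlarray code): instead of multiplying by a one-hot matrix, gather one log-probability per example using the integer target indices. The function name and array shapes are illustrative assumptions, not part of any toolbox API.

```python
import numpy as np

def sparse_cross_entropy(log_probs, targets):
    """Mean negative log-likelihood from integer class indices.

    log_probs: (N, C) array of log-softmax outputs
    targets:   (N,) array of integer class indices

    Mathematically equivalent to one-hot cross entropy, but gathers
    a single entry per row instead of materialising an N-by-C
    one-hot matrix.
    """
    n = log_probs.shape[0]
    return -log_probs[np.arange(n), targets].mean()

# Demo: agrees with the explicit one-hot formulation.
rng = np.random.default_rng(0)
logits = rng.normal(size=(4, 10))
log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
targets = np.array([3, 1, 7, 0])

one_hot = np.eye(10)[targets]                       # (4, 10) dense encoding
dense = -(one_hot * log_probs).sum(axis=1).mean()   # standard cross entropy
sparse = sparse_cross_entropy(log_probs, targets)   # index-based version
assert np.isclose(dense, sparse)
```

A dlarray version would follow the same pattern, using subscripted indexing into the network's softmax output so that automatic differentiation flows through only the gathered entries.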