Negative discrepancy #5
This is expected behavior. The log-sum-exp smoothing of the min operator can produce negative values. In addition, like the original DTW, soft-DTW doesn't satisfy the triangle inequality.
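The undershoot is easy to see on a toy example. Below is a minimal sketch in plain NumPy/SciPy (not this repository's code) of the soft-min that soft-DTW uses in place of the hard min:

```python
import numpy as np
from scipy.special import logsumexp

def soft_min(values, gamma=1.0):
    """Smoothed minimum: -gamma * log(sum(exp(-v / gamma)))."""
    return -gamma * logsumexp(-np.asarray(values) / gamma)

costs = [0.0, 0.1, 0.2]            # all costs non-negative
print(min(costs))                  # 0.0
print(soft_min(costs, gamma=1.0))  # ~ -1.00: below the true min, and negative
```

Because every term contributes to the sum inside the log, the soft-min always lies at or below the true min, so non-negative costs can still yield a negative value.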
Ok, thanks for the confirmation. On a similar note, if I set the two inputs to be identical, soft-DTW(x, x) is still not zero. Is that expected?
Indeed, soft-DTW(x, x) is not zero because soft-DTW considers all paths, not just the diagonal one. soft-DTW is most useful when you want to differentiate it.
I see, thanks for the clarification. One last question if you don't mind: is soft-DTW always symmetric?
It is.
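Both points above (soft-DTW(x, x) ≠ 0, and symmetry) can be checked numerically with a minimal implementation of the recursion. This is a sketch using a squared Euclidean cost, not the optimized code in this repository:

```python
import numpy as np
from scipy.special import logsumexp

def soft_dtw(x, y, gamma=1.0):
    """Minimal soft-DTW between two 1-D sequences, squared Euclidean cost.
    A sketch of the standard recursion, for illustration only."""
    D = (np.asarray(x)[:, None] - np.asarray(y)[None, :]) ** 2
    n, m = D.shape
    R = np.full((n + 1, m + 1), np.inf)
    R[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            prev = np.array([R[i - 1, j - 1], R[i - 1, j], R[i, j - 1]])
            # soft-min over the three predecessors instead of a hard min
            R[i, j] = D[i - 1, j - 1] - gamma * logsumexp(-prev / gamma)
    return R[n, m]

x, y = [0.0, 1.0, 2.0], [0.5, 1.5, 2.0]
print(soft_dtw(x, x))                   # not 0: off-diagonal paths also contribute
print(soft_dtw(x, y) - soft_dtw(y, x))  # 0: the cost matrix is symmetric
```

Since the diagonal path has cost 0 for identical inputs but the soft-min also aggregates the nonzero off-diagonal paths, soft_dtw(x, x) comes out nonzero (and typically negative).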
Thanks!
I know this issue was closed, but could you kindly clarify something, @mblondel? Imagine that my goal is to reduce the soft-DTW dissimilarity between two sequences. Thanks in advance,
Essentially, soft-DTW returns as value the log partition function of a probabilistic model (more precisely, a conditional random field over alignments). This is why the value can be negative, just as the log likelihood can be negative. If you care about non-negativity, a debiasing/normalization is proposed by @marcocuturi in #10.
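For reference, the normalization discussed in #10 subtracts the two self-comparison terms, so identical inputs give exactly zero. A hedged sketch, re-implementing a minimal soft-DTW recursion so the snippet is self-contained (the function names are illustrative, not this repository's API):

```python
import numpy as np
from scipy.special import logsumexp

def soft_dtw(x, y, gamma=1.0):
    # Minimal soft-DTW recursion, squared Euclidean cost (illustration only).
    D = (np.asarray(x)[:, None] - np.asarray(y)[None, :]) ** 2
    n, m = D.shape
    R = np.full((n + 1, m + 1), np.inf)
    R[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            prev = np.array([R[i - 1, j - 1], R[i - 1, j], R[i, j - 1]])
            R[i, j] = D[i - 1, j - 1] - gamma * logsumexp(-prev / gamma)
    return R[n, m]

def sdtw_divergence(x, y, gamma=1.0):
    # Debiasing: subtract the self-comparison terms so that
    # identical inputs score exactly 0.
    return soft_dtw(x, y, gamma) - 0.5 * (soft_dtw(x, x, gamma)
                                          + soft_dtw(y, y, gamma))

x, y = [0.0, 1.0, 2.0], [0.5, 1.5, 2.5]
print(sdtw_divergence(x, x))  # 0.0 by construction
print(sdtw_divergence(x, y))
```

See the discussion in #10 for the properties of this normalized quantity; the sketch above only guarantees that it vanishes when the two inputs coincide.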
Thanks a lot for the clarification and the prompt response, @mblondel. I had seen @marcocuturi's formula and the blog post, but I think I need to study it more to understand what it does and how it's applicable to my problem.
Is a negative loss still usable? That is, as the loss drops further below zero, are the signals closer together than they were? Or is it something that needs correcting for in gradient descent?
Hello, I'm testing this Python version of soft-DTW and I am seeing the following:
It seems counterintuitive to obtain negative values for a discrepancy, but maybe it is normal in this case? I would like to confirm.