issues on dst #5
Comments
Hi, we are only removing none. It is fixed now.
@smartyfh Empty belief issue: in the generate_dialogue.py file, add text = text.strip() before tokenizer.encode(text); otherwise there is always a space at the end. However, I didn't get the joint accuracy reported in the paper. Did you?
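The fix above can be sketched as follows. This is a minimal illustration, not the actual generate_dialogue.py code: the stand-in encode() mimics how a trailing space can yield an extra (empty or space-only) token, which a GPT-2-style BPE tokenizer would also produce.

```python
# Illustrative sketch only; encode() is a stand-in for tokenizer.encode().
def encode(text):
    # A real subword tokenizer would emit a separate token for the
    # trailing space; splitting on spaces shows the same effect.
    return text.split(" ")

text = "restaurant area centre "   # note the trailing space
tokens_raw = encode(text)          # ['restaurant', 'area', 'centre', '']

text = text.strip()                # the suggested fix
tokens_fixed = encode(text)        # ['restaurant', 'area', 'centre']
```

Stripping before encoding keeps the generated belief-state strings free of the stray trailing token.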
@fasterbuild Have you checked all the checkpoints or just one? If both none and dontcare slots are ignored, the results should be reproducible. However, if the dontcare slots are kept, the accuracy goes down by several points. But this needs the author to confirm.
@smartyfh I am struggling to reproduce the result. Could you please share the hyper-parameters you used for training?
If keeping the dontcare slots?
The JGA is 50.32% if keeping the dontcare slots.
I got 50.46 joint accuracy, keeping dontcare and doing default cleaning. |
Hi,
when evaluating the JGA for DST, did you remove both the none slots and the dontcare slots?
When I ran dialogue_generation.py, the generated belief states seem to always be empty in the MODEL_OUTPUT file, so could you please provide more details about how the model is trained for DST?
Thanks!
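The two evaluation choices discussed in this thread can be made concrete with a small sketch of joint goal accuracy (JGA). The belief-state format below is hypothetical (one {slot: value} dict per turn); the point is only that "none" values are dropped and "dontcare" values are optionally dropped before comparison.

```python
# Sketch of JGA under the evaluation variants discussed above.
# Belief states are hypothetical {slot: value} dicts, one per turn.
def joint_goal_accuracy(preds, golds, ignore_dontcare=False):
    drop = {"none"} | ({"dontcare"} if ignore_dontcare else set())
    correct = 0
    for pred, gold in zip(preds, golds):
        p = {s: v for s, v in pred.items() if v not in drop}
        g = {s: v for s, v in gold.items() if v not in drop}
        correct += (p == g)   # a turn counts only if all slots match exactly
    return correct / len(golds)

golds = [{"hotel-area": "centre", "hotel-parking": "dontcare"}]
preds = [{"hotel-area": "centre", "hotel-parking": "none"}]
print(joint_goal_accuracy(preds, golds))                        # 0.0
print(joint_goal_accuracy(preds, golds, ignore_dontcare=True))  # 1.0
```

This makes the several-point gap mentioned above plausible: keeping dontcare slots means every dontcare prediction error now breaks the whole turn.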