pre-train accuracy for fbwq_half #116
Comments
I used the pre-trained embeddings for KGQA, but the accuracy is very low. How did you learn the embeddings using kge? @apoorvumang
Can you please elaborate on what experiment you ran and what commands you used?
I tried to use kge to obtain pre-trained embeddings for fbwq_half, and then used those embeddings for multi-hop question answering on fbwq_half. @apoorvumang
But the pre-training accuracy is very low when using kge, and with those embeddings the KGQA accuracy is also much lower than the result in the paper. So I want to know how you used kge to pre-train the model. The config file I use is listed below. complex:
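(The config itself did not survive extraction. For reference, a minimal LibKGE-style training config for ComplEx might look like the sketch below; the dataset name `fbwq_half` assumes the dataset has been prepared under kge's `data/` folder, and the hyperparameter values shown are placeholders, not the ones used in the paper.)

```yaml
# Hypothetical minimal LibKGE config for ComplEx on fbwq_half
job.type: train
dataset.name: fbwq_half   # assumes data/fbwq_half has been preprocessed for kge
model: complex
train:
  optimizer: Adagrad
  max_epochs: 200         # placeholder value
lookup_embedder.dim: 200  # placeholder embedding dimension
valid:
  every: 5
  metric: mean_reciprocal_rank_filtered
```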
Thanks for the details, let me try it with this config and I'll get back to you.
Could you share your config file with me, so that I can reproduce the performance reported in the paper? @apoorvumang
May I ask what accuracy you get when training on fbwq_half with kge?
Also, when you used kge to train on fbwq_half, did you only use train.txt?
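As a sanity check on the pre-trained embeddings themselves (independent of the KGQA pipeline), ComplEx triple scores can be computed directly from the saved entity and relation vectors. The sketch below assumes each embedding is stored as a single vector with the real half followed by the imaginary half, which is a common but not universal layout; higher scores on train.txt triples than on corrupted ones is a quick plausibility check.

```python
import numpy as np

def complex_score(h: np.ndarray, r: np.ndarray, t: np.ndarray) -> float:
    """ComplEx triple score: Re(<h, r, conj(t)>).

    h, r, t are 1-D arrays of length 2d, with real parts in the
    first half and imaginary parts in the second half (assumed layout).
    """
    d = h.shape[0] // 2
    h_re, h_im = h[:d], h[d:]
    r_re, r_im = r[:d], r[d:]
    t_re, t_im = t[:d], t[d:]
    # Expansion of Re((h_re + i*h_im)(r_re + i*r_im)(t_re - i*t_im))
    return float(np.sum(
        h_re * r_re * t_re
        + h_im * r_re * t_im
        + h_re * r_im * t_im
        - h_im * r_im * t_re
    ))

# Example: purely real embeddings h = r = t = 1 give score 1.0
print(complex_score(np.array([1., 0.]),
                    np.array([1., 0.]),
                    np.array([1., 0.])))  # -> 1.0
```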