NN dependency #122
Already done:
I think the issue comes from #106, since there is a
Very nice tests, but I was thinking more about something at the Platypus level (one that would just check that we don't get a "no answer" result). But since you now have integration tests, the priority is much lower.
I don't know if it is related, but questions like "Who are the daughters of Louis XIV?" don't work anymore.
Yes, same thing:
The big problem is that these questions are exactly the same as "Who is the France president?". Thus, with a grammatical approach, we will merge "Louis XIV" if and only if we merge "France president". This is again our problem of "named entity recognition" (NER).
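The dilemma described above can be sketched in a few lines of Python. This is only an illustration, not the project's code: `KNOWN_ENTITIES` is a hypothetical stand-in for a real named-entity recognizer, and the point is that a purely grammatical rule cannot distinguish the two `nn` pairs, so the merge decision has to come from NER.

```python
# A grammatical rule that merges every noun-noun (nn) dependency treats
# "Louis XIV" and "France president" identically. The merge decision must
# therefore come from named-entity recognition, not from the grammar.
# KNOWN_ENTITIES is a hypothetical stand-in for a real NER component.

KNOWN_ENTITIES = {("Louis", "XIV"), ("Panama", "canal"), ("United", "States")}

def should_merge(head: str, modifier: str) -> bool:
    """Merge an nn pair only if NER says it forms a single entity."""
    return (modifier, head) in KNOWN_ENTITIES

assert should_merge("XIV", "Louis")             # "Louis XIV" is one entity
assert not should_merge("president", "France")  # keep "France" a separate node
```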
Is “Who is the France president?” valid English?
Good question, I don't know. I asked on StackExchange.
If you use the latest version of the Stanford parser, both
It works if you put an uppercase letter for
Not so bad, we obtain:
All these questions are equivalent:
Actually we do not merge (Panama <> canal, US <> president, United States <> president).
yes
Yes, see #64 and #85 (and I propose to close this issue, since two other ones are open on the same topic).
How to update your version of the Stanford parser:
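The exact commands from this comment were lost in export; a hedged sketch of what updating the Stanford Parser typically looks like is below. The release name and date are hypothetical placeholders; the current archive is listed on https://nlp.stanford.edu/software/lex-parser.shtml.

```shell
# Hedged sketch: fetch and unpack a Stanford Parser release.
# RELEASE is a hypothetical placeholder; check the download page for the
# actual archive name of the version you want.
RELEASE="stanford-parser-full-2015-01-30"
wget "https://nlp.stanford.edu/software/${RELEASE}.zip"
unzip "${RELEASE}.zip"
# Put the new jars on the CLASSPATH so whatever wrapper you use picks
# them up instead of an older installation.
export CLASSPATH="$PWD/${RELEASE}/*:$CLASSPATH"
```

As the later comments show, keeping an old installation around alongside the new one is exactly how the two can end up in conflict.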
See #123
I have the latest version of the Stanford Parser. According to StackExchange, the only correct form without using "of" is "Who is France's president?".
It's strange. I've added it to deep_tests; let's let Travis decide: https://travis-ci.org/ProjetPP/PPP-QuestionParsing-Grammatical/builds/51077581
Travis is with me :) Are you sure you are running the latest version:
My bad, I had the two installations in conflict...
According to StackExchange, "Who is the France president" is incorrect. I think we should use our previous heuristic for
What about the following ones:
We need to be sure that these questions are incorrect and will not be used in practice by users.
It seems to be unused except for
Same thing, you just replaced "France" with "US" and "United States"... We do not have to handle incorrect sentences (for the same reason, we do not have any spell-checker within our module: we assume the input sentence is correct). Moreover, these sentences seem very odd to native speakers, so they should not be asked very often.
This is not the subject of this issue... |
Fixed in 3189e90. Now we produce
Original post:
"Where is the Panama canal" is broken
Link: http://askplatyp.us/?lang=en&q=Where+is+the+Panama+canal%3F
It would be very nice to create some kind of automated tests (maybe using log data) in order to avoid such regressions. @Ezibenroc Could you do it?
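The log-driven regression test suggested here could look roughly like the sketch below. It is only an illustration: `parse_question` is a hypothetical stand-in for the module's real entry point (which this issue does not name), and the question list stands in for questions replayed from the logs.

```python
# Hedged sketch of a log-driven regression test: replay questions that
# previously worked and fail on any "no answer" result.
# `parse_question` is a placeholder for the real parsing entry point.

QUESTIONS_FROM_LOGS = [
    "Where is the Panama canal?",
    "Who are the daughters of Louis XIV?",
]

def parse_question(question: str):
    # Placeholder: the real implementation would call the grammatical
    # question-parsing module and return its tree, or None on failure.
    return {"question": question}

def test_no_regressions():
    for q in QUESTIONS_FROM_LOGS:
        assert parse_question(q) is not None, f"regression on: {q}"

test_no_regressions()
```

Hooking such a test into the existing Travis build (as was done with deep_tests later in this thread) would catch this class of breakage before release.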
EDIT (by Ezibenroc)
The `nn` dependency heuristic does not work well on simple questions.