filehashes: fix for unique filehash filenames #343
Conversation
Commit 8725e56 gave each downloaded file a unique name so that dataset files from different sources wouldn't clobber each other. However, this was applied to all files, breaking file hash lists, as that code wasn't updated for the new filename scheme. Update the file hashing code to find the files based on the filename prefix of the rule referencing the file.
Bug: #6854
Test ruleset: https://github.com/jasonish/suricata-test-rules/archive/refs/heads/main.zip
Perhaps in combination with https://rules.pawpatrules.fr/suricata/paw-patrules.tar.gz, which has a dataset.
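As a rough sketch of the approach (the function and variable names here are hypothetical, not the actual suricata-update code):

```python
import os

def find_hash_file(rule_filename, hash_filename, directory):
    """Hypothetical sketch of the prefix-based lookup described
    above; names and layout are illustrative, not the actual
    suricata-update implementation."""
    # Downloaded rule files now carry a unique per-source prefix,
    # e.g. "3fc1ab2d/emerging-malware.rules", so a hash list is
    # looked up under the same prefix as the rule that references it.
    prefix = os.path.dirname(rule_filename)
    candidate = os.path.join(directory, prefix, hash_filename)
    return candidate if os.path.exists(candidate) else None
```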
Should we have suricata-verify (SV) test support and tests for suricata-update (SU)?
This does seem to work. Was it always broken with --local?
Suricata-update has its own test suite in the repository.
I think one thing we could do is ensure that if all free sources are enabled, then suricata-update does not error out.
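A rough sketch of what such a check could look like (the source names are examples only, and this would need network access, which the integration tests avoid, as noted below):

```python
import subprocess

def smoke_test_free_sources():
    # update-sources and enable-source are real suricata-update
    # subcommands; the source names below are just examples.
    subprocess.run(["suricata-update", "update-sources"], check=True)
    for source in ["et/open", "oisf/trafficid"]:
        subprocess.run(["suricata-update", "enable-source", source],
                       check=True)
    # check=True makes the test fail if suricata-update errors out.
    subprocess.run(["suricata-update", "--no-test"], check=True)
```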
Something's broken with --local.
The integration tests avoid hitting the network now, so I'm not sure how that will work unless they go and hit all the rule sources, which I'm not that keen to do in automated tests.
Can this issue be properly tested in the framework? If so, that should be sufficient.
I see. If you take my test ruleset and extract it to disk and point to it with --local, the file hash lists are not picked up. However, datasets are picked up with --local.
Yeah, I get the same error as without this patch. Haven't tried to check why, though.
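For context, the two keywords being compared here: a file hash list is referenced with filemd5, while a dataset uses the dataset keyword. Two illustrative rules (not taken from the test ruleset; file names and sids are made up):

```
alert http any any -> any any (msg:"known bad file"; filemd5:bad-md5s.txt; sid:1000001;)
alert dns any any -> any any (msg:"seen domain"; dns.query; dataset:isset,seen-domains,type string,load seen-domains.lst; sid:1000002;)
```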
Yes, the integration test was updated to use my test ruleset and verify the extraction of the file hash lists to the proper location.
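Roughly, a check along those lines (the paths and file names are placeholders, not the actual integration test):

```python
import os

def check_hash_list_extracted(output_dir):
    # After running suricata-update against the test ruleset, the
    # extracted hash list should sit alongside the generated rules.
    # "bad-md5s.txt" is a placeholder name.
    path = os.path.join(output_dir, "bad-md5s.txt")
    assert os.path.exists(path), "hash list not extracted to %s" % path
```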
Comments were addressed. LGTM 🚀
Could we keep a copy that we update at certain intervals? Ref: https://github.com/OISF/suricata-verify/tree/master/tests/test-ruleparse-etopen-01
We have a copy of Emerging Threats. I'm not sure how keeping a copy of all of them would be helpful. Now, something else that could be interesting is an external tool that validates all public rulesets: make sure the links are all there, and check which versions of Suricata they parse with. An interesting project, I think, but outside of Suricata-Update.
Merged.