Stuck on big input file #88
Comments
Wow, 90k is a lot of hosts. I don't think I've run anything near that large. My suspicion is that you're either running into a memory issue or it's still chugging through creating all of the fingerprinting permutations and loading them into the queue. Here are a few things to check:
- Send a screenshot of top and the last few lines of the debug output, and I'll see if I can track it down further.
- If you're doing a scan that large, it may be advantageous to break it up into smaller chunks, like 5k hosts, and even spread those chunks across multiple servers if that's an option, to get more parallelism (a rough sketch of the chunking idea is below).
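A rough sketch of the chunking idea — the file names and the 5k chunk size here are just placeholders, not anything built into changeme:

```python
# Hypothetical helper (not part of changeme): split a large host list
# (e.g. hosts.txt with ~90k lines) into files of at most 5,000 hosts each,
# so each chunk can be fed to a separate scan or a separate server.
import os

def split_hosts(path="hosts.txt", chunk_size=5000, out_dir="chunks"):
    os.makedirs(out_dir, exist_ok=True)
    with open(path) as f:
        hosts = [line.strip() for line in f if line.strip()]
    for i in range(0, len(hosts), chunk_size):
        chunk_path = os.path.join(out_dir, f"hosts_{i // chunk_size:03d}.txt")
        with open(chunk_path, "w") as out:
            out.write("\n".join(hosts[i:i + chunk_size]) + "\n")

if __name__ == "__main__":
    split_hosts()
```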
Hi @ztgrace, this is also my problem (see lines 326 to 335 here). I'm not using nmap XML scan output; instead I'm using masscan output to gather all the IP addresses and ports. The other problem is that I've noticed a lot of changeme script processes still open even after the scan is already done (maybe a conflict with the Redis side of things).
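A rough sketch of what I mean by gathering targets from masscan output. This assumes masscan's grep-able list format (`-oL`), where open-port lines look roughly like `open tcp 80 10.0.0.1 1600000000` — the filename and helper are just illustrative, not part of changeme:

```python
# Sketch: collect (ip, port) pairs from a masscan list-output file,
# e.g. produced with: masscan ... -oL masscan.txt
def parse_masscan_list(path="masscan.txt"):
    targets = []
    with open(path) as f:
        for line in f:
            parts = line.split()
            # Data lines are assumed to start with "open"; comment lines start with "#".
            if len(parts) >= 4 and parts[0] == "open":
                port, ip = parts[2], parts[3]
                targets.append((ip, int(port)))
    return targets

if __name__ == "__main__":
    # Example: print a flat "ip:port" target list that could be chunked further.
    for ip, port in parse_masscan_list():
        print(f"{ip}:{port}")
```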
So the processes that you see are the Python subprocesses waiting for work to do; the queues are built using Redis. Also, I am looking into two design changes to changeme that should improve this situation. One is converting the target generation process to a generator pattern to reduce overhead, and the second is moving the scanners to an event-driven framework to get away from the process management hell.
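To illustrate the generator idea (simplified, with illustrative names only — not changeme's actual internals): instead of materializing every host × fingerprint permutation up front and loading it all into the queue, a generator yields one pair at a time, so memory use stays flat regardless of how many hosts are in the input file.

```python
from itertools import product
from typing import Iterable, Iterator, Tuple

def all_targets_eager(hosts: Iterable[str], fingerprints: Iterable[str]) -> list:
    # Builds the whole permutation list in memory: ~90k hosts * N fingerprints.
    return [(h, fp) for h, fp in product(hosts, fingerprints)]

def all_targets_lazy(hosts: Iterable[str], fingerprints: Iterable[str]) -> Iterator[Tuple[str, str]]:
    # Generator version: only one permutation exists at a time; workers pull as needed.
    for host in hosts:
        for fp in fingerprints:
            yield host, fp
```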
There are a lot of error messages in my logs. I'll send you the logs after the scan. Looking forward to this tool.
I have an input file with 90,000 hosts and the scanner just gets stuck, doing nothing, even when I allow 100 threads with the "-t" flag.