
Stuck on big input file #88

Open
Celestial-intelligence opened this issue Aug 30, 2019 · 4 comments
Comments

@Celestial-intelligence

I have an input file with 90,000 hosts, and the scanner just gets stuck and does nothing, even when I allow 100 threads with the "-t" flag.

@ztgrace
Owner

ztgrace commented Aug 30, 2019

Wow, 90k is a lot of hosts. I don't think I've run anything near that large. My suspicion is that you're either running into a memory issue or it's still chugging through creating all of the fingerprinting permutations and loading them into the queue.

Here are a few things to check:

  1. Is Redis up and running? Redis performs much better than the Python in-memory queue.
  2. Open top to monitor the system's memory and CPU usage.
  3. Start changeme with the --debug flag.

Please send a screenshot of top and the last few lines of the debug output and I can see if I can track it down further.

Also, if you're doing that large of a scan, it may be advantageous to break it up into smaller chunks of around 5k hosts each, and even to spread those chunks across multiple servers, if that's an option, to get more parallelism.
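That chunking step can be sketched in Python (the file names, chunk size handling, and sample host list here are illustrative, not part of changeme):

```python
# Carve a large host list into 5,000-host chunk files. "hosts.txt" and
# the "hosts_chunk_NN" naming are placeholders, not from this thread.
with open("hosts.txt", "w") as f:  # stand-in for a real, much larger list
    f.write("".join(f"10.0.{i // 256}.{i % 256}\n" for i in range(12000)))

CHUNK = 5000
with open("hosts.txt") as f:
    hosts = f.read().splitlines()

chunks = [hosts[i:i + CHUNK] for i in range(0, len(hosts), CHUNK)]
for n, chunk in enumerate(chunks):
    with open(f"hosts_chunk_{n:02d}", "w") as out:
        out.write("\n".join(chunk) + "\n")

print(len(chunks))   # -> 3 (5000 + 5000 + 2000 hosts)
```

Each chunk file can then be passed to a separate changeme run, or copied to a different server.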

@0xspade

0xspade commented Nov 24, 2019

Hi @ztgrace,

This is also my problem (check lines 326 to 335 here). Since I'm not using nmap's XML scan output, I use masscan output instead: I gather all the IP addresses and ports (e.g. 123.123.123.123:1337) and loop over them in a bash script. However, it always gets stuck, and I have to manually kill the changeme process before it continues scanning the next IP address and port.
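A hedged sketch of that loop (the target list and the scan command are placeholders, not from this thread): running each scan with a timeout would kill a wedged run automatically instead of by hand.

```python
# Iterate over masscan-style ip:port targets and run each scan with a
# timeout. The target data and the echo command are placeholders; the
# real changeme invocation is only suggested in the comment below.
import subprocess

targets = ["198.51.100.10:8080", "198.51.100.11:443"]  # placeholder data

results = []
for target in targets:
    ip, port = target.rsplit(":", 1)
    try:
        # Stand-in command; in practice something like
        #   subprocess.run(["./changeme.py", f"{ip}:{port}"], timeout=300)
        # would kill a stuck run after 5 minutes instead of hanging.
        subprocess.run(["echo", f"scanning {ip} on port {port}"],
                       check=True, timeout=30)
        results.append((ip, port))
    except subprocess.TimeoutExpired:
        results.append((ip, "timed out"))

print(results)
```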

The other problem is that after the scan, I noticed a lot of changeme processes are still running even though the scan is already done (maybe a conflict with the Redis setup).

@ztgrace
Owner

ztgrace commented Nov 27, 2019

So the processes that you see are the Python subprocesses waiting for work to do. The queues are built using ScanEngine._build_targets, and "poison pills" are added that terminate the subprocesses. Something is happening that prevents the subprocesses from receiving the poison pills and exiting gracefully. Are you seeing any errors or timeout messages?

Also, I am looking into two design changes to changeme that should improve this situation: one is converting the target generation process to a generator pattern to reduce overhead, and the second is moving the scanners to an event-driven framework to get away from process-management hell.
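The generator idea can be sketched like this (illustrative names, not changeme's actual code): targets are yielded lazily instead of building the whole host-by-fingerprint cross-product up front.

```python
# Lazy target generation: yield one (host, fingerprint) pair at a time
# instead of materializing the full cross-product in a list or queue.
from itertools import product

def generate_targets(hosts, fingerprints):
    """Yield (host, fingerprint) pairs on demand; consumers pull
    targets as they are ready, keeping memory use flat."""
    for host, fp in product(hosts, fingerprints):
        yield host, fp

gen = generate_targets(["10.0.0.1", "10.0.0.2"], ["http", "ssh"])
print(next(gen))   # -> ('10.0.0.1', 'http')
```

With 90k hosts, this avoids holding every permutation in memory before the first scan even starts.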

@0xspade

0xspade commented Nov 28, 2019

Are you seeing any errors or timeout messages?

There are a lot of error messages in my logs. I'll send you the logs after the scan.

Looking forward to future improvements to this tool.
