
"WebCrawler" is a recursive web enum tool scripted in python


yaxhx/WebCrawler


Usage

Run WebCrawler with the following command:

python3 scraper.py -u <url> -d <depth>

Example

python3 scraper.py -u "https://google.com" -d 2

Features

  • Crawls recursively to a user-specified depth

  • Returns subdomains, links, and JavaScript files for basic recon

  • Flexible depth parameter for customizing the level of recursion
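The depth-limited recursion above can be sketched in Python. This is a minimal illustration, not the actual contents of scraper.py: it assumes a caller-supplied `fetch` function that returns a page's HTML, and uses only the standard library to extract `<a href>` links and `<script src>` files, following them until the depth budget is exhausted.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collect hrefs from <a> tags and srcs from <script> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and attrs.get("href"):
            self.links.append(attrs["href"])
        elif tag == "script" and attrs.get("src"):
            self.links.append(attrs["src"])

def crawl(url, depth, fetch, seen=None):
    """Recursively collect absolute URLs, descending at most `depth` levels.

    `fetch` is any callable mapping a URL to its HTML text; in a real
    crawler it would perform an HTTP GET (e.g. with urllib or requests).
    """
    seen = set() if seen is None else seen
    if depth < 0 or url in seen:
        return seen                      # depth exhausted or already visited
    seen.add(url)
    parser = LinkExtractor()
    parser.feed(fetch(url))
    for href in parser.links:
        # Resolve relative links against the current page before recursing
        crawl(urljoin(url, href), depth - 1, fetch, seen)
    return seen
```

With `-d 2`-style usage, `depth=2` means the start page plus two levels of followed links; visited URLs are tracked in `seen` so cyclic links do not recurse forever.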

