A minimalistic web crawler written in C
To crawl any URL using this crawler:
- Clone the repository.
- Run the `Makefile` to build the crawler.
- Crawl a URL by running `a.out` as `./a.out <URL to crawl>` (an example run is shown below).
  See the list of URL(s) present on the site!
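
For example, assuming the `Makefile`'s default target builds `a.out` in the repository root, a run might look like the following (the URL here is just a placeholder):

```sh
make
./a.out https://example.com
```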
- All the log information regarding the crawler is stored in `.logfile` at your specified location.
- Update the `log_file_path` in `logger.c` to point to the location (relative to the home directory) where you wish to see the logs. A sketch of this kind of configuration is shown after this list.
- If you don't wish to see the logs, leave the `logger.c` file as it is and ignore the warning issued regarding creation of the log file.
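
As a rough illustration of that configuration step, here is a minimal sketch of a logger that writes to a path relative to the home directory. Only the `log_file_path` variable name and the `.logfile` name come from this repository; the `write_log` helper, the timestamp format, and the warning text are hypothetical and will differ from the actual `logger.c`.

```c
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

/* Path of the log file, relative to the user's home directory.
 * Edit this to point at the location where you wish to see the logs. */
static const char *log_file_path = ".logfile";

/* Append one timestamped message to <HOME>/<log_file_path>. */
static void write_log(const char *message)
{
    const char *home = getenv("HOME");
    if (home == NULL)
        home = ".";

    char full_path[4096];
    snprintf(full_path, sizeof(full_path), "%s/%s", home, log_file_path);

    FILE *fp = fopen(full_path, "a");
    if (fp == NULL) {
        /* Warn but keep crawling, as described above. */
        fprintf(stderr, "warning: could not create log file %s\n", full_path);
        return;
    }

    time_t now = time(NULL);
    char stamp[32];
    strftime(stamp, sizeof(stamp), "%Y-%m-%d %H:%M:%S", localtime(&now));
    fprintf(fp, "[%s] %s\n", stamp, message);
    fclose(fp);
}

int main(void)
{
    write_log("crawler started"); /* example usage */
    return 0;
}
```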