# crawley

Crawls web pages and prints every link it can find. Scan depth (default: 0) is configurable.
## Features
- fast SAX-style parser (powered by golang.org/x/net/html)
- small (<1000 SLOC), idiomatic, 100% test-covered codebase
- grabs most useful resource links (images, videos, audio, etc.)
- found links are streamed to stdout and guaranteed to be unique
## Usage

```
crawley [flags] url
```

Possible flags:
```
-delay duration
      per-request delay
-depth int
      scan depth
-skip-ssl
      skip SSL verification
-user-agent string
      user-agent string
-version
      show version
-workers int
      number of workers
```