a simple bash web directory scanner. give it a url and a wordlist, and it requests every path, reporting the ones that return HTTP 200.
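the core idea can be sketched in a few lines of bash. this is a minimal illustration of the loop described above, not the actual contents of `main.sh` — the function names (`check_path`, `scan`) are made up for the example:

```shell
#!/usr/bin/env bash
# check_path URL WORD: request URL/WORD and print it if the server says 200.
check_path() {
  local code
  code=$(curl -s -o /dev/null -w '%{http_code}' "$1/$2")
  if [ "$code" = "200" ]; then
    echo "[200] $1/$2"
  fi
}

# scan URL WORDLIST: try every non-empty line of the wordlist as a path.
scan() {
  local url=$1 wordlist=$2 word
  while IFS= read -r word; do
    if [ -n "$word" ]; then
      check_path "$url" "$word"
    fi
  done < "$wordlist"
}
```

`curl -s -o /dev/null -w '%{http_code}'` discards the body and prints only the status code, which is all the scanner needs.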
```
./main.sh -u https://target.com -w wordlist.txt
./main.sh -u https://target.com -w wordlist.txt -o results.txt
```

| flag | description |
|---|---|
| `-u` | target url |
| `-w` | wordlist file |
| `-o` | (optional) save found results to a file |
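flags like these are typically handled with bash's built-in `getopts`. a sketch of what that parsing could look like — the variable names are assumptions, not necessarily what `main.sh` uses:

```shell
#!/usr/bin/env bash
# hypothetical flag parsing for -u/-w/-o; -u and -w are required.
parse_args() {
  url="" wordlist="" outfile=""
  while getopts "u:w:o:" opt; do
    case "$opt" in
      u) url=$OPTARG ;;
      w) wordlist=$OPTARG ;;
      o) outfile=$OPTARG ;;
      *) echo "usage: $0 -u url -w wordlist [-o outfile]" >&2; return 1 ;;
    esac
  done
  # fail if either required flag is missing
  [ -n "$url" ] && [ -n "$wordlist" ]
}
```

`getopts "u:w:o:"` means each of the three flags takes an argument (the trailing `:`), which getopts delivers in `$OPTARG`.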
wordlists included in /wordlists
```
# arch
sudo pacman -S curl
# debian/ubuntu
sudo apt install curl
```

- only detects status 200, no 403/301 filtering
- no wildcard detection → possible false positives
- not recursive
- slower than gobuster/dirsearch on large wordlists
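the wildcard gap above could be narrowed with a single probe before scanning: request a path that almost certainly doesn't exist, and if it still comes back 200, the server answers 200 for everything and every hit is suspect. a hedged sketch, not part of `main.sh`:

```shell
#!/usr/bin/env bash
# hypothetical wildcard check: returns 0 if the target looks scannable,
# nonzero if a random nonexistent path already returns 200.
wildcard_check() {
  local probe code
  probe="definitely-not-here-$RANDOM$RANDOM"
  code=$(curl -s -o /dev/null -w '%{http_code}' "$1/$probe")
  [ "$code" != "200" ]
}
```

a caller would run `wildcard_check "$url" || echo "warning: wildcard responses detected"` before starting the scan.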
