DorkScout - Golang tool to automate Google dork scans against the entire internet or specific targets

dorkscout

dorkscout is a tool that automates the search for vulnerable applications or secret files exposed around the internet through Google searches. dorkscout first fetches the dork lists from https://www.exploit-db.com/google-hacking-database and then scans a given target, or everything it finds.

Installation

dorkscout can be installed in different ways:

Go Packages

through Go packages (the Golang package manager):

go get github.com/R4yGM/dorkscout

This works on every platform.
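Note: on Go 1.17 and later, installing executables with go get is deprecated in favour of go install; assuming the repository builds as a normal Go module, the equivalent command is:

go install github.com/R4yGM/dorkscout@latest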

Docker

If you don't have Docker installed, you can follow their guide.

First of all you have to pull the Docker image (only 17.21 MB) from the Docker registry, which you can see here:

docker pull r4yan/dorkscout:latest

If you don't want to pull the image, you can instead clone the repository (or download or copy the dorkscout Dockerfile that can be found here) and build the image from the Dockerfile yourself.
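For example, something along these lines should work (a sketch assuming the Dockerfile sits at the repository root; the tag is arbitrary, but reusing r4yan/dorkscout:latest lets the docker run commands below work unchanged):

git clone https://github.com/R4yGM/dorkscout
cd dorkscout
docker build -t r4yan/dorkscout:latest .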

Then, before launching the container, you first have to create a volume to share your files with it:

docker volume create --name dorkscout_data

When you launch the container, Docker will automatically install the dork lists inside a directory called "dorkscout":

-rw-r--r-- 1 r4yan r4yan   110 Jul 31 14:56  .dorkscout
-rw-r--r-- 1 r4yan r4yan 79312 Aug 10 20:30 'Advisories and Vulnerabilities.dorkscout'
-rw-r--r-- 1 r4yan r4yan  6352 Jul 31 14:56 'Error Messages.dorkscout'
-rw-r--r-- 1 r4yan r4yan 38448 Jul 31 14:56 'Files Containing Juicy Info.dorkscout'
-rw-r--r-- 1 r4yan r4yan 17110 Jul 31 14:56 'Files Containing Passwords.dorkscout'
-rw-r--r-- 1 r4yan r4yan  1879 Jul 31 14:56 'Files Containing Usernames.dorkscout'
-rw-r--r-- 1 r4yan r4yan  5398 Jul 31 14:56  Footholds.dorkscout
-rw-r--r-- 1 r4yan r4yan  5568 Jul 31 14:56 'Network or Vulnerability Data.dorkscout'
-rw-r--r-- 1 r4yan r4yan 49048 Jul 31 14:56 'Pages Containing Login Portals.dorkscout'
-rw-r--r-- 1 r4yan r4yan 16112 Jul 31 14:56 'Sensitive Directories.dorkscout'
-rw-r--r-- 1 r4yan r4yan   451 Jul 31 14:56 'Sensitive Online Shopping Info.dorkscout'
-rw-r--r-- 1 r4yan r4yan 29938 Jul 31 14:56 'Various Online Devices.dorkscout'
-rw-r--r-- 1 r4yan r4yan  2802 Jul 31 14:56 'Vulnerable Files.dorkscout'
-rw-r--r-- 1 r4yan r4yan  4925 Jul 31 14:56 'Vulnerable Servers.dorkscout'
-rw-r--r-- 1 r4yan r4yan  8145 Jul 31 14:56 'Web Server Detection.dorkscout'

so you don't have to install them yourself. You can then start scanning with:

docker run -v dorkscout_data:/dorkscout r4yan/dorkscout scan <options>

Replace <options> with the options/arguments you want to pass to dorkscout, for example:

docker run -v dorkscout_data:/dorkscout r4yan/dorkscout scan -d="/dorkscout/Sensitive Online Shopping Info.dorkscout" -H="/dorkscout/a.html"

If you want to scan through a proxy using the Docker container, you also have to add the --net host option so that the container can reach a proxy listening on the host's loopback interface, for example:

docker run --net host -v dorkscout_data:/dorkscout r4yan/dorkscout scan -d="/dorkscout/Sensitive Online Shopping Info.dorkscout" -H="/dorkscout/a.html" -x socks5://127.0.0.1:9050

Always save your results inside the volume, not elsewhere in the container, because otherwise they will be deleted together with the container! You can do this by writing your output files to a path inside the mounted volume directory.

If you did everything correctly, at the end of every scan you will find the results inside the folder /var/lib/docker/volumes/dorkscout_data/_data
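If Docker keeps its data somewhere else on your system, the actual location of the volume can be looked up with the standard Docker CLI (nothing dorkscout-specific):

docker volume inspect dorkscout_data --format '{{ .Mountpoint }}'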

This also works on every platform.

Executable

You can also download the precompiled binaries here and execute them directly.

Usage

dorkscout -h
Usage:
  dorkscout [command]

Available Commands:
  completion  generate the autocompletion script for the specified shell
  delete      deletes all the .dorkscout files inside a given directory
  help        Help about any command
  install     installs a list of dorks from exploit-db.com
  scan        scans a specific website or all the websites it founds for a list of dorks

Flags:
  -h, --help   help for dorkscout

Use "dorkscout [command] --help" for more information about a command.

To start scanning with a wordlist and a proxy, returning the results in HTML format:

dorkscout scan -d="/dorkscout/Sensitive Online Shopping Info.dorkscout" -H="/dorkscout/a.html" -x socks5://127.0.0.1:9050
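To scan one specific website instead of everything the searches return, the scan command also takes a target flag; the long-form flag names below are taken verbatim from the issue reports in the comments section, with example.com as a placeholder target:

dorkscout scan --dorklist="/dorkscout/Sensitive Online Shopping Info.dorkscout" --OutputHTML="/dorkscout/a.html" --target example.com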

Results: (screenshot of the generated HTML report)

Install wordlists

To start scanning you'll need some dork lists, and you can install these lists through the install command:

dorkscout install --output-dir /dorks

This will fetch all the available dorks from exploit-db.com:

[+] ./Advisories and Vulnerabilities.dorkscout
[+] ./Vulnerable Files.dorkscout
[+] ./Files Containing Juicy Info.dorkscout
[+] ./Sensitive Online Shopping Info.dorkscout
[+] ./Files Containing Passwords.dorkscout
[+] ./Vulnerable Servers.dorkscout
[+] ./Various Online Devices.dorkscout
[+] ./Pages Containing Login Portals.dorkscout
[+] ./Footholds.dorkscout
[+] ./Error Messages.dorkscout
[+] ./Files Containing Usernames.dorkscout
[+] ./Network or Vulnerability Data.dorkscout
[+] ./.dorkscout
[+] ./Sensitive Directories.dorkscout
[+] ./Web Server Detection.dorkscout
2021/08/11 19:02:45 Installation finished in 2.007928 seconds on /dorks
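You can also point scan at your own list. Judging from the issue reports in the comments further down, a dork list is a plain-text file with one dork per line; the entries below are made-up examples:

site:example.com filetype:pdf
intitle:"index of" "backup"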
Similar Resources

A crawler/scraper based on golang + colly, configurable via JSON

Super-Simple Scraper This a very thin layer on top of Colly which allows configuration from a JSON file. The output is JSONL which is ready to be impo

Aug 21, 2022

New World Auction House Crawler In Golang

New-World-Auction-House-Crawler Goal of this library is to have a process which grabs New World auction house data in the background while playing the

Sep 7, 2022

A PCPartPicker crawler for Golang.

gopartpicker A scraper for pcpartpicker.com for Go. It is implemented using Colly. Features Extract data from part list URLs Search for parts Extract

Nov 9, 2021

Golang based web site opengraph data scraper with caching

Snapper A Web microservice for capturing a website's OpenGraph data built in Golang Building Snapper building the binary git clone https://github.com/

Oct 5, 2022

Youtube tutorial about web scraping using golang and Gocolly

This is an example project I wrote for a youtube tutorial about webscraping using golang and gocolly It extracts data from a tracking differences webs

Mar 26, 2022

Go spider: A crawler of vertical communities achieved by GOLANG

go_spider A crawler of vertical communities achieved by GOLANG. Latest stable Release: Version 1.2 (Sep 23, 2014). QQ group: 337344607 Features Concurrent

Dec 9, 2021

A simple crawler with golang

Nov 8, 2021

Ratemyprof scraper - Ratemyprof scraper with golang

ratemyprof scraper visit https://ratemyprof-api.vercel.app/api/getProf to try ou

Jan 18, 2022

A Golang library to scrape lyrics from musixmatch.com (WIP)

Aug 5, 2022
Comments
  • golang panic: runtime error

    Hi, I'm using go version go1.16.8 linux/amd64.

    When trying to run a scan (installing wordlists worked fine) I'm running into several runtime errors almost instantly.

    ./dorkscout scan --OutputHTML "dorks" --dorklist "dorks/Advisories and Vulnerabilities.dorkscout" --target www.#####.###

    Started scanning ##### with dorks/Advisories and Vulnerabilities.dorkscout 88
    panic: runtime error: invalid memory address or nil pointer dereference [recovered]
            panic: runtime error: invalid memory address or nil pointer dereference
    [signal SIGSEGV: segmentation violation code=0x1 addr=0x10 pc=0x59f1c0]

    goroutine 1 [running]:
    text/template.errRecover(0xc0001c95d8)
            /usr/local/go/src/text/template/exec.go:163 +0x1b2
    panic(0x99c2c0, 0xebf790)
            /usr/local/go/src/runtime/panic.go:965 +0x1b9
    text/template.(*Template).execute(0x0, 0xaf5a80, 0x0, 0x996920, 0xc000233b60, 0x0, 0x0)
            /usr/local/go/src/text/template/exec.go:217 +0x180
    text/template.(*Template).Execute(...)
            /usr/local/go/src/text/template/exec.go:203
    github.com/R4yGM/dorkscout/results.HTMLInject(0x7fff0527616c, 0x2e, 0xc0001c9b00, 0xa46451, 0xa, 0x7fff0527615b, 0x5, 0x0)
            /usr/local/go/src/github.com/R4yGM/dorkscout/results/results.go:257 +0x10ef
    github.com/R4yGM/dorkscout/cmd.scan(0x0)
            /usr/local/go/src/github.com/R4yGM/dorkscout/cmd/scan.go:102 +0x1378
    github.com/R4yGM/dorkscout/cmd.google_scan()
            /usr/local/go/src/github.com/R4yGM/dorkscout/cmd/scan.go:244 +0x2f4
    github.com/R4yGM/dorkscout/cmd.glob..func3(0xecc3c0, 0xc000070ae0, 0x0, 0x6)
            /usr/local/go/src/github.com/R4yGM/dorkscout/cmd/scan.go:51 +0x53
    github.com/spf13/cobra.(*Command).execute(0xecc3c0, 0xc000070a80, 0x6, 0x6, 0xecc3c0, 0xc000070a80)
            /mnt/c/Users/rayan/documents/GOLANG/pkg/mod/github.com/spf13/[email protected]/command.go:860 +0x2c2
    github.com/spf13/cobra.(*Command).ExecuteC(0xecc140, 0x0, 0xffffffff, 0xc00007c058)
            /mnt/c/Users/rayan/documents/GOLANG/pkg/mod/github.com/spf13/[email protected]/command.go:974 +0x375
    github.com/spf13/cobra.(*Command).Execute(...)
            /mnt/c/Users/rayan/documents/GOLANG/pkg/mod/github.com/spf13/[email protected]/command.go:902
    github.com/R4yGM/dorkscout/cmd.Execute(...)
            /usr/local/go/src/github.com/R4yGM/dorkscout/cmd/root.go:14
    main.main()
            /usr/local/go/src/github.com/R4yGM/dorkscout/main.go:6 +0x2d

  • Segfault when attempting to generate HTML report

    Hi. Here is my error message:

    $ ./dorkscout scan  -d ../osint/dork_documents_pdf.txt -H results           
    ../osint/dork_documents_pdf.txt  nothing found
    =====================================
    Results for :  site:domainedelacadene.fr filetype:pdf
    panic: runtime error: invalid memory address or nil pointer dereference [recovered]
            panic: runtime error: invalid memory address or nil pointer dereference
    [signal SIGSEGV: segmentation violation code=0x1 addr=0x10 pc=0x577dd9]
    
    goroutine 1 [running]:
    text/template.errRecover(0xc0001c9378)
            /usr/lib/go-1.17/src/text/template/exec.go:163 +0x15b
    panic({0x8eee00, 0xe6d360})
            /usr/lib/go-1.17/src/runtime/panic.go:1038 +0x215
    text/template.(*Template).execute(0x0, {0xa34380, 0x0}, {0x8e98a0, 0xc000306b70})
            /usr/lib/go-1.17/src/text/template/exec.go:214 +0x239
    text/template.(*Template).Execute(...)
            /usr/lib/go-1.17/src/text/template/exec.go:200
    github.com/R4yGM/dorkscout/results.HTMLInject({0xc0000283c0, 0xc000010018}, 0xf8, {0x9835ba, 0x2}, {0x7ffface643e1, 0x7}, 0x0)
            /home/scassi/cadene/dorkscout/results/results.go:239 +0x9b5
    github.com/R4yGM/dorkscout/cmd.scan(0x1)
            /home/scassi/cadene/dorkscout/cmd/scan.go:129 +0x596
    github.com/R4yGM/dorkscout/cmd.scan(0x0)
            /home/scassi/cadene/dorkscout/cmd/scan.go:193 +0xa8d
    github.com/R4yGM/dorkscout/cmd.google_scan()
            /home/scassi/cadene/dorkscout/cmd/scan.go:242 +0x66
    github.com/R4yGM/dorkscout/cmd.glob..func3(0xe7a120, {0x97db15, 0x4, 0x4})
            /home/scassi/cadene/dorkscout/cmd/scan.go:51 +0x91
    github.com/spf13/cobra.(*Command).execute(0xe7a120, {0xc0002d40c0, 0x4, 0x4})
            /home/scassi/go/pkg/mod/github.com/spf13/[email protected]/command.go:860 +0x5f8
    github.com/spf13/cobra.(*Command).ExecuteC(0xe79ea0)
            /home/scassi/go/pkg/mod/github.com/spf13/[email protected]/command.go:974 +0x3bc
    github.com/spf13/cobra.(*Command).Execute(...)
            /home/scassi/go/pkg/mod/github.com/spf13/[email protected]/command.go:902
    github.com/R4yGM/dorkscout/cmd.Execute(...)
            /home/scassi/cadene/dorkscout/cmd/root.go:14
    main.main()
            /home/scassi/cadene/dorkscout/main.go:6 +0x25
    
    

    The content of my dork file is the following: site:[domain] filetype:pdf

    With [domain] being an actual domain obviously ;)

Go-based search engine URL collector; supports Google and Bing, and can batch-collect URLs based on Google search syntax

Nov 9, 2022
Cirno-go A tool for downloading books from hbooker in Go.

Cirno-go A tool for downloading books from hbooker in Go. Features Login your own account Search books by book name Download books as txt and epub fil

Oct 25, 2022
DataHen Till is a standalone tool that instantly makes your existing web scraper scalable, maintainable, and more unblockable, with minimal code changes on your scraper.

Dec 14, 2022
🦙 acao(阿草), the tool man for data scraping of https://asoul.video/.

🦙 acao acao(阿草), the tool man for data scraping of https://asoul.video/. Deploy to Aliyun serverless function with Raika update_member Update A-SOUL

Jul 25, 2022
This is a small tool designed to scrape one or more URLs given as command arguments.

HTTP-FETCH This is a small tool designed to scrape one or more URLs given as command arguments. Usage http-fetch [--metadata] ...URLs The output files

Nov 23, 2021
View reddit memes and posts from your terminal with golang and webscraping

goddit View reddit memes and posts from your terminal with golang and webscraping Installation run the following commands on your terminal to install

Feb 22, 2021
Elegant Scraper and Crawler Framework for Golang

Colly Lightning Fast and Elegant Scraping Framework for Gophers Colly provides a clean interface to write any kind of crawler/scraper/spider. With Col

Jan 9, 2023
Pholcus is a distributed high-concurrency crawler software written in pure golang

Pholcus Pholcus (幽灵蛛) is a distributed, high-concurrency crawler written in pure Go, intended only for programming study and research. It supports three run modes (standalone, server, and client) and three user interfaces (web, GUI, and command line); its rules are simple and flexible, batch tasks run concurrently, and it offers rich output options (mysql/mongodb/kafka/csv/excel, etc.

Dec 30, 2022
[爬虫框架 (golang)] An awesome Go concurrent Crawler(spider) framework. The crawler is flexible and modular. It can be expanded to an Individualized crawler easily or you can use the default crawl components only.

go_spider A crawler of vertical communities achieved by GOLANG. Latest stable Release: Version 1.2 (Sep 23, 2014). QQ group: 337344607 Features Concurrent

Jan 6, 2023