
# Nuke-Net

  ______        __                 ______        __   
 |   _  \.--.--|  |--.-----.______|   _  \.-----|  |_ 
 |.  |   |  |  |    <|  -__|______|.  |   |  -__|   _|
 |.  |   |_____|__|__|_____|      |.  |   |_____|____|
 |:  |   |                        |:  |   |           
 |::.|   |                        |::.|   |           
 `--- ---'                        `--- ---'           
Nuke-Net     Special-OP-Ajax-Spider   Scare_Sec_Hackers
-------------------------------------------------------  

Nuke-Net is a VERY VERY overpowered and ridiculous web crawler that is, well... very very noisy XD. Read on for more.

# Why?

Well, simple: I was bored. I am currently working on version 5 of the Red-Rabbit project (see version 4 here: https://github.com/ArkAngeL43/Red-Rabbit-V4), and quite frankly I just figured, why not test the knowledge I currently have of Golang? So, to wrap that up, let's talk about what this tool does.

# Brief description

For simplicity:

1 => Crawls URLs
2 => Downloads HTML files
3 => Runs admin-panel tests against your target
4 => Tests SQLi against the target
5 => Grabs the response headers
6 => Parses complex URLs
7 => Writes server and domain IPs to a file
8 => Writes test output to a file
9 => Writes SQLi tests to a file
10 => Port scans domains and your target

For a more advanced breakdown:

When this crawler (or spider) starts off, it attempts to download your target's HTML file (this is a customizable option), grabs the request headers, parses the IP, the domain, the URL and its strings, and the requests, then saves the requests and writes all of your data to a file.
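
A minimal Go sketch of what that first stage could look like (the function name and output path here are illustrative, not the tool's actual code):

```go
package main

import (
	"fmt"
	"io"
	"net/http"
	"os"
)

// fetchAndSave sketches the first stage: GET the target, echo its
// response headers, and optionally write the HTML body to a file.
// (Hypothetical helper; not Nuke-Net's actual code.)
func fetchAndSave(target string, downloadHTML bool) error {
	resp, err := http.Get(target)
	if err != nil {
		return err
	}
	defer resp.Body.Close()

	// Echo response headers in the "[  INFO  ]  Key -> [value]" style.
	for key, values := range resp.Header {
		fmt.Printf("[  INFO  ]  %s -> %v\n", key, values)
	}

	if !downloadHTML {
		fmt.Println("[  INFO  ]  -> SKIPPED HTML DOWNLOAD, FLAG NOT PARSED")
		return nil
	}

	// Write the raw HTML into the output folder.
	out, err := os.Create("output/index.html")
	if err != nil {
		return err
	}
	defer out.Close()

	n, err := io.Copy(out, resp.Body)
	fmt.Printf("%d (BYTES WRITTEN TO FILE)\n", n)
	return err
}

func main() {
	if err := fetchAndSave("https://example.com", false); err != nil {
		fmt.Println("[ ERROR ]", err)
	}
}
```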

When it is done testing its 100+ admin-panel-finding payloads (this goes based off the requests), it will start scraping. First it grabs the URL's headers, like this:

	[  INFO  ]  Etag          -> ["3147526947"]
	[  INFO  ]  Expires       -> [Mon, 27 Dec 2021 05:31:04 GMT]
	[  INFO  ]  Server        -> [ECS (mic/9A9C)]
	[  INFO  ]  Accept-Ranges -> [bytes]
	[  INFO  ]  Cache-Control -> [max-age=604800]
	[  INFO  ]  Content-Type  -> [text/html; charset=UTF-8]
	[  INFO  ]  Date          -> [Mon, 20 Dec 2021 05:31:04 GMT]
	[  INFO  ]  Age           -> [441049]
	[  INFO  ]  Last-Modified -> [Thu, 17 Oct 2019 07:18:26 GMT]
	[  INFO  ]  Vary          -> [Accept-Encoding]
	[  INFO  ]  X-Cache       -> [HIT]
	[  INFO  ]  -> SKIPPED HTML DOWNLOAD, FLAG NOT PARSED
	46 (BYTES WRITTEN TO FILE)

Then it will retrieve the domain name, as seen below:

	[ INFO ]  FOUND DOMAIN NAME => example.com
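
Pulling the hostname out of a discovered URL is a one-liner with Go's `net/url` package; a tiny sketch, assuming the URL parses cleanly:

```go
package main

import (
	"fmt"
	"net/url"
)

func main() {
	// Parse a discovered URL and pull out just the hostname.
	u, err := url.Parse("https://www.iana.org/domains/example")
	if err != nil {
		fmt.Println("[ ERROR ]", err)
		return
	}
	fmt.Printf("[ INFO ]  FOUND DOMAIN NAME => %s\n", u.Hostname())
}
```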

It will find the IP of the domain and port scan it:

	[*] Scan Results for   ├ example.com (0.0.0.0)
	[+]			┡ 443	https
	[+]			┡ 80	http
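
The port scan can be approximated with plain TCP dials; a rough sketch (the port list and timeout are assumptions, not Nuke-Net's actual values):

```go
package main

import (
	"fmt"
	"net"
	"time"
)

// scanPorts tries a plain TCP dial against a few common ports.
// The port list and 2-second timeout are assumptions for this sketch.
func scanPorts(host string) {
	common := map[int]string{21: "ftp", 22: "ssh", 80: "http", 443: "https"}
	for port, service := range common {
		addr := fmt.Sprintf("%s:%d", host, port)
		conn, err := net.DialTimeout("tcp", addr, 2*time.Second)
		if err != nil {
			continue // closed, filtered, or unreachable
		}
		conn.Close()
		fmt.Printf("[+] %d\t%s\n", port, service)
	}
}

func main() {
	fmt.Println("[*] Scan Results for example.com")
	scanPorts("example.com")
}
```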


Then it grabs the server, status, and the domain's IP range:

	[ INFO ]  FOUND DOMAIN IP => [0000:00000:000:0:000:0000:0000:0000 0.0.0.0]
	[ INFO ]  FOUND SERVER => ECS (mic/9ABC)
	[ INFO ]  RESPO STATUS => 200OK
	[ INFO ]  FOUND URL => https://www.iana.org/domains/example
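
A sketch of how those lines could be produced with the standard library (`net.LookupIP` plus the response's `Server` header; `example.com` stands in for the real target):

```go
package main

import (
	"fmt"
	"net"
	"net/http"
)

func main() {
	// Resolve every address (IPv6 and IPv4) the domain points at.
	ips, err := net.LookupIP("example.com")
	if err == nil {
		fmt.Printf("[ INFO ]  FOUND DOMAIN IP => %v\n", ips)
	}

	// A plain GET gives us the Server header and the status line.
	resp, err := http.Get("https://example.com")
	if err != nil {
		fmt.Println("[ ERROR ]", err)
		return
	}
	defer resp.Body.Close()
	fmt.Printf("[ INFO ]  FOUND SERVER => %s\n", resp.Header.Get("Server"))
	fmt.Printf("[ INFO ]  RESPO STATUS => %s\n", resp.Status)
}
```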

Then it continues by testing SQLi against the URL:

	[  INFO  ] Server HAS PASSED ALL INJECTIONS, NOT VULNERABLE
	[  INFO  ] Server HAS PASSED ALL INJECTIONS, NOT VULNERABLE
	[  INFO  ] Server HAS PASSED ALL INJECTIONS, NOT VULNERABLE
	... (repeated for every injection payload -- this is where the "very noisy" part comes in)
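
A hedged sketch of how such an injection pass might work: append a payload to the query string and scan the body for database error text. The payloads, parameter name, and error signatures below are illustrative only; the real tool's lists are not shown here:

```go
package main

import (
	"fmt"
	"io"
	"net/http"
	"net/url"
	"strings"
)

// testSQLi sketches the injection pass: append each payload to the
// URL's query string and look for database error text in the body.
func testSQLi(target string) {
	payloads := []string{`'`, `"`, `' OR '1'='1`}
	signatures := []string{"SQL syntax", "mysql_fetch", "ORA-01756"}

	for _, p := range payloads {
		resp, err := http.Get(target + "?id=" + url.QueryEscape(p))
		if err != nil {
			continue
		}
		body, _ := io.ReadAll(resp.Body)
		resp.Body.Close()

		vulnerable := false
		for _, sig := range signatures {
			if strings.Contains(string(body), sig) {
				vulnerable = true
				break
			}
		}
		if vulnerable {
			fmt.Println("[  INFO  ] Server MAY BE VULNERABLE TO INJECTION")
		} else {
			fmt.Println("[  INFO  ] Server HAS PASSED ALL INJECTIONS, NOT VULNERABLE")
		}
	}
}

func main() {
	testSQLi("https://example.com/page")
}
```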



For every URL it finds, it will crawl that URL while still continuing the search for the other URLs, and it will keep crawling until it runs out of URLs or you come across either this error:

	Get "https://www.scanme.org/admin/": dial tcp [2600:3c01::f03c:91ff:fe18:bb2f]:443: connect: connection refused

or this one:

	panic: runtime error: invalid memory address or nil pointer dereference [signal SIGSEGV: segmentation violation code=0x1 addr=0x40 pc=0x2293]

This is not due to the script; it is due to its endpoint. When the script hits a dead end or a domain doesn't respond correctly, it will log or panic with the error. On some domains you can even get a GET or POST error for the URL, which is probably due to a dial error, with the URL not accepting connections at the time.
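
The crawl loop itself might look roughly like this sketch: a queue of discovered URLs, with dial errors logged instead of being allowed to kill the run (the regexp-based link extraction is an assumption for brevity):

```go
package main

import (
	"fmt"
	"io"
	"net/http"
	"regexp"
)

var linkRe = regexp.MustCompile(`href="(https?://[^"]+)"`)

// crawl sketches the breadth-first loop: every URL found gets queued,
// and fetch errors are logged rather than panicking the whole run.
func crawl(start string) {
	queue := []string{start}
	seen := map[string]bool{start: true}

	for len(queue) > 0 {
		target := queue[0]
		queue = queue[1:]

		resp, err := http.Get(target)
		if err != nil {
			// e.g. dial tcp ...: connect: connection refused
			fmt.Println("[ ERROR ]", err)
			continue
		}
		body, _ := io.ReadAll(resp.Body)
		resp.Body.Close()

		// Queue every absolute link we haven't seen yet.
		for _, m := range linkRe.FindAllStringSubmatch(string(body), -1) {
			if !seen[m[1]] {
				seen[m[1]] = true
				fmt.Println("[ INFO ]  FOUND URL =>", m[1])
				queue = append(queue, m[1])
			}
		}
	}
}

func main() {
	crawl("https://www.scanme.org")
}
```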


# Usage

Basic usage:

                |=> HTTPS URL                 |=> Domain name         |=> Base URL or HTTP URL     |=> for every URL, download every HTML index
                |                             |                       |                            |
go run main.go -target https://www.scanme.org -domain www.example.org -base http://www.example.com -filedfel

NOTE => -filedfel IS OPTIONAL AND IS NOT NEEDED (it is suggested though, because, well-- you are downloading 800+ HTML files (cries, why))
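
For reference, here is a sketch of how these flags could be wired up with Go's standard `flag` package (the actual parsing in main.go may differ):

```go
package main

import (
	"flag"
	"fmt"
)

func main() {
	// Flags mirror the usage line above; -filedfel stays optional.
	target := flag.String("target", "", "HTTPS URL to crawl")
	domain := flag.String("domain", "", "domain name of the target")
	base := flag.String("base", "", "base or HTTP URL")
	filedfel := flag.Bool("filedfel", false, "download every HTML index found")
	flag.Parse()

	fmt.Println(*target, *domain, *base, *filedfel)
}
```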

# Where is the output stored?

ALL OUTPUT IS STORED IN THE `OUTPUT` FOLDER

# Demo

# Owner

RE43P3R