Gryffin (beta)

Gryffin is a large scale web security scanning platform. It is not yet another scanner. It was written to solve two specific problems with existing scanners: coverage and scale.

Better coverage translates to fewer false negatives. Inherent scalability translates to the capability of scanning, and supporting, a large, elastic application infrastructure: simply put, the ability to go from scanning 1,000 applications today to 100,000 applications tomorrow through straightforward horizontal scaling.

Coverage

Coverage has two dimensions: one during crawl and the other during fuzzing. In the crawl phase, coverage means finding as much of the application footprint as possible. In the scan phase, or while fuzzing, it means testing each part of the application in depth for the applied set of vulnerabilities.

Crawl Coverage

Today, a large number of web applications are template-driven, meaning the same code or path generates millions of URLs. A security scanner only needs to test one of the millions of URLs generated by the same code or path. Gryffin's crawler does just that.

Page Deduplication

At the heart of Gryffin is a deduplication engine that compares a new page with already seen pages. If the HTML structure of the new page is similar to those already seen, it is classified as a duplicate and not crawled further.
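
The comparison can be sketched as follows. This is an illustrative, self-contained example of the general idea (a simhash over HTML tag names, compared by Hamming distance, assuming the golang.org/x/net/html tokenizer and a small bit-difference threshold); it is not the actual html-distance code used by Gryffin.

    // dedup_sketch.go - illustrative only; not Gryffin's html-distance package.
    package main

    import (
        "fmt"
        "hash/fnv"
        "math/bits"
        "strings"

        "golang.org/x/net/html"
    )

    // fingerprint computes a 64-bit simhash over the tag names of an HTML
    // document, so pages built from the same template end up with nearby values.
    func fingerprint(doc string) uint64 {
        var weights [64]int
        z := html.NewTokenizer(strings.NewReader(doc))
        for {
            tt := z.Next()
            if tt == html.ErrorToken {
                break // end of input (or malformed HTML): stop tokenizing
            }
            if tt != html.StartTagToken && tt != html.SelfClosingTagToken {
                continue
            }
            name, _ := z.TagName()
            h := fnv.New64a()
            h.Write(name)
            v := h.Sum64()
            for i := 0; i < 64; i++ {
                if v&(1<<uint(i)) != 0 {
                    weights[i]++
                } else {
                    weights[i]--
                }
            }
        }
        var fp uint64
        for i := 0; i < 64; i++ {
            if weights[i] > 0 {
                fp |= 1 << uint(i)
            }
        }
        return fp
    }

    // isDuplicate treats two pages as the same template when their fingerprints
    // differ in at most threshold bits.
    func isDuplicate(a, b uint64, threshold int) bool {
        return bits.OnesCount64(a^b) <= threshold
    }

    func main() {
        p1 := "<html><body><div><a href='/item/1'>one</a></div></body></html>"
        p2 := "<html><body><div><a href='/item/2'>two</a></div></body></html>"
        f1, f2 := fingerprint(p1), fingerprint(p2)
        fmt.Println("distance:", bits.OnesCount64(f1^f2), "duplicate:", isDuplicate(f1, f2, 3))
    }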

DOM Rendering and Navigation

A large number of applications today are rich applications. They are heavily driven by client-side JavaScript. In order to discover links and code paths in such applications, Gryffin's crawler uses PhantomJS for DOM rendering and navigation.
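
Conceptually, the renderer launches render.js against a target URL and decodes the JSON it prints back. The sketch below is illustrative only: the render helper is made up, and the struct is trimmed to the link-discovery fields visible in the render.js output quoted in the comments further down this page.

    // render_sketch.go - illustrative driver for PhantomJS's render.js,
    // reading back discovered links; not Gryffin's actual renderer code.
    package main

    import (
        "encoding/json"
        "fmt"
        "log"
        "os/exec"
    )

    // renderResult mirrors a subset of the JSON emitted by render.js
    // (see the sample output in the comments below).
    type renderResult struct {
        Response struct {
            Status  int    `json:"status"`
            URL     string `json:"url"`
            Details struct {
                Links []struct {
                    Text string `json:"text"`
                    URL  string `json:"url"`
                } `json:"links"`
            } `json:"details"`
        } `json:"response"`
        MsgType   string `json:"msgType"`
        Signature string `json:"signature"`
    }

    func render(target string) (*renderResult, error) {
        // render.js lives under renderer/resource in the Gryffin source tree.
        cmd := exec.Command("phantomjs", "renderer/resource/render.js", target)
        stdout, err := cmd.StdoutPipe()
        if err != nil {
            return nil, err
        }
        if err := cmd.Start(); err != nil {
            return nil, err
        }
        var res renderResult
        // render.js streams JSON messages; this sketch decodes only the first.
        if err := json.NewDecoder(stdout).Decode(&res); err != nil {
            return nil, err
        }
        cmd.Wait() // PhantomJS exits once the page settles or times out
        return &res, nil
    }

    func main() {
        res, err := render("http://example.com/")
        if err != nil {
            log.Fatal(err)
        }
        for _, l := range res.Response.Details.Links {
            fmt.Println("found link:", l.URL)
        }
    }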

Scan Coverage

As Gryffin is a scanning platform, not a scanner, it does not have its own fuzzer modules, even for fuzzing common web vulnerabilities like XSS and SQL Injection.

It's not wise to reinvent the wheel where you do not have to. Gryffin at production scale at Yahoo uses open source and custom fuzzers. Some of these custom fuzzers might be open sourced in the future, and might or might not be part of the Gryffin repository.

For demonstration purposes, Gryffin comes integrated with sqlmap and arachni. It does not endorse them or any other scanner in particular.

The philosophy is to improve scan coverage by being able to fuzz for just what you need.
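
As a hedged illustration of that pluggability, the sketch below defines a hypothetical Fuzzer interface with a sqlmap-backed implementation. The interface, type names, and the naive output check are assumptions for this example; the sqlmap flags are the ones visible in Gryffin's own log output later on this page.

    // fuzzer_sketch.go - hypothetical plumbing for pluggable fuzzers; the
    // interface and helper names are illustrative, not Gryffin's API.
    package main

    import (
        "fmt"
        "os/exec"
        "strings"
    )

    // Fuzzer is a hypothetical contract: given a target URL, run a scanner
    // and return its raw output.
    type Fuzzer interface {
        Fuzz(target string) (output string, err error)
    }

    // sqlmapFuzzer shells out to sqlmap with the same flags Gryffin's logs show.
    type sqlmapFuzzer struct{}

    func (sqlmapFuzzer) Fuzz(target string) (string, error) {
        args := []string{
            "--batch", "--timeout=2", "--retries=3", "--crawl=0",
            "--disable-coloring", "-o", "--text-only", "-v", "0",
            "--level=1", "--risk=1", "--smart", "--fresh-queries",
            "--purge-output", "--os=Linux", "--dbms=MySQL",
            "--delay=0.1", "--time-sec=1", "-u", target,
        }
        out, err := exec.Command("sqlmap", args...).CombinedOutput()
        return string(out), err
    }

    func main() {
        fuzzers := []Fuzzer{sqlmapFuzzer{}}
        for _, f := range fuzzers {
            out, err := f.Fuzz("http://example.com/?id=1") // placeholder target
            if err != nil {
                fmt.Println("fuzzer failed:", err)
                continue
            }
            // Crude check for the sketch; a real integration should parse a
            // generated report instead (see the comments below).
            fmt.Println("injectable:", strings.Contains(out, "is vulnerable"))
        }
    }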

Scale

While Gryffin is available as a standalone package, it's primarily built for scale.

Gryffin is built on the publisher-subscriber model. Each component is either a publisher, or a subscriber, or both. This allows Gryffin to scale horizontally by simply adding more subscriber or publisher nodes.
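
A minimal sketch of that wiring with the go-nsq client is shown below. The "seed" topic and "primary" channel mirror the seed/primary pair visible in the logs further down this page; the addresses, message body, and handler are illustrative rather than Gryffin's actual code.

    // pubsub_sketch.go - minimal publisher/subscriber wiring over NSQ,
    // illustrative of the pattern rather than Gryffin's implementation.
    package main

    import (
        "log"

        nsq "github.com/nsqio/go-nsq"
    )

    func main() {
        cfg := nsq.NewConfig()

        // Publisher: inject a seed URL into the "seed" topic via nsqd (port 4150).
        producer, err := nsq.NewProducer("127.0.0.1:4150", cfg)
        if err != nil {
            log.Fatal(err)
        }
        if err := producer.Publish("seed", []byte("http://example.com/")); err != nil {
            log.Fatal(err)
        }

        // Subscriber: a crawler node consumes the same topic on the "primary"
        // channel, discovering nsqd instances through nsqlookupd (port 4161).
        consumer, err := nsq.NewConsumer("seed", "primary", cfg)
        if err != nil {
            log.Fatal(err)
        }
        consumer.AddHandler(nsq.HandlerFunc(func(m *nsq.Message) error {
            log.Printf("crawl %s", m.Body)
            return nil // nil marks the message finished; an error requeues it
        }))
        if err := consumer.ConnectToNSQLookupd("127.0.0.1:4161"); err != nil {
            log.Fatal(err)
        }

        // Starting more consumer processes on other machines scales the crawl
        // horizontally; NSQ distributes the messages across them.
        <-consumer.StopChan
    }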

Operating Gryffin

Pre-requisites

  1. Go - go1.13 or later
  2. PhantomJS v2
  3. sqlmap (for fuzzing SQLi)
  4. Arachni (for fuzzing XSS and other web vulnerabilities)
  5. NSQ, with:
    • nsqlookupd running on ports 4160/4161
    • nsqd running on ports 4150/4151
    • --max-msg-size=5000000
  6. Kibana and Elasticsearch, for dashboarding

Installation

go get -u github.com/yahoo/gryffin/...

Run

(WIP)
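
Until full instructions land here, the invocations below, taken from the issue reports further down this page, show the two modes (the target URL is a placeholder):

    # Standalone
    gryffin-standalone http://example.com/

    # Distributed: seed a URL, then start a crawl worker (NSQ must be running as above)
    go run cmd/gryffin-distributed/main.go --storage=memory seed http://example.com/
    go run cmd/gryffin-distributed/main.go --storage=memory crawl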

TODO

  1. Mobile browser user agent
  2. Preconfigured Docker images
  3. Redis for sharing states across machines
  4. Instructions for running Gryffin (distributed or standalone)
  5. Documentation for html-distance
  6. Implement a JSON-serializable cookiejar (see the sketch below)
  7. Identify duplicate URL patterns based on simhash result
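
As a sketch of item 6, here is a minimal host-keyed cookie jar that satisfies Go's http.CookieJar interface and round-trips through JSON. It is illustrative only (no domain, path, or expiry handling) and not Gryffin's implementation.

    // cookiejar_sketch.go - a minimal host-keyed cookie jar that can round-trip
    // through JSON; a sketch for the TODO item, not Gryffin's code.
    package main

    import (
        "encoding/json"
        "fmt"
        "net/http"
        "net/url"
        "sync"
    )

    // JSONJar implements http.CookieJar and marshals to plain JSON, so crawl
    // sessions could be shared across worker processes.
    type JSONJar struct {
        mu     sync.Mutex
        ByHost map[string][]*http.Cookie `json:"by_host"`
    }

    func NewJSONJar() *JSONJar {
        return &JSONJar{ByHost: make(map[string][]*http.Cookie)}
    }

    // SetCookies stores cookies under the URL's host.
    func (j *JSONJar) SetCookies(u *url.URL, cookies []*http.Cookie) {
        j.mu.Lock()
        defer j.mu.Unlock()
        j.ByHost[u.Host] = append(j.ByHost[u.Host], cookies...)
    }

    // Cookies returns the cookies previously stored for the URL's host.
    func (j *JSONJar) Cookies(u *url.URL) []*http.Cookie {
        j.mu.Lock()
        defer j.mu.Unlock()
        return j.ByHost[u.Host]
    }

    func main() {
        jar := NewJSONJar()
        u, _ := url.Parse("http://example.com/login")
        jar.SetCookies(u, []*http.Cookie{{Name: "session", Value: "abc123"}})

        // Serialize and restore; the restored jar plugs straight into an http.Client.
        blob, _ := json.Marshal(jar)
        restored := NewJSONJar()
        _ = json.Unmarshal(blob, restored)
        _ = &http.Client{Jar: restored}

        fmt.Println(string(blob))
    }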

Talks and Slides

Credits

Licence

Code licensed under the BSD-style license. See LICENSE file for terms.

Owner

Yahoo (a Verizon Media brand). This organization is home to many of the active open source projects published by engineers at Yahoo and Verizon Media.
Comments
  • gryffin-standalone: Timeout when rendering js

    gryffin-standalone: Timeout when rendering js

    Steps I do to reproduce:

    1. nsqlookupd -verbose=true
    2. nsqd --max-msg-size=2313820682 --lookupd-tcp-address=127.0.0.1:4160
    3. go run cmd/gryffin-distributed/main.go --storage=memory seed http://mysite.com

    Output:

    {"Service":"Poke","Msg":"Poking","Method":"GET","Url":"http://mysite.com"}
    2015/09/30 13:27:54 INF    1 (127.0.0.1:4150) connecting to nsqd
    Seed http://mysite.com injected.
    2015/09/30 13:27:54 INF    1 stopping
    2015/09/30 13:27:54 INF    1 exiting router
    
    1. go run cmd/gryffin-distributed/main.go --storage=memory crawl

    Output:

    2015/09/30 13:29:12 INF    2 [seed/primary] querying nsqlookupd http://127.0.0.1:4161/lookup?topic=seed
    2015/09/30 13:29:12 INF    2 [seed/primary] (ano:4150) connecting to nsqd
    {"Service":"CrawlAsync","Msg":"Started","Method":"GET","Url":"http://mysite.com"}
    {"Service":"PhantomjsRenderer.Do","Msg":"Running: render.js","Method":"GET","Url":"http://mysite.com"}
    {"Service":"PhantomjsRenderer.Do","Msg":"[Timeout] Terminating the crawl process.","Method":"GET","Url":"http://mysite.com"}
    2015/09/30 13:30:19 INF    2 [seed/primary] querying nsqlookupd http://127.0.0.1:4161/lookup?topic=seed
    

    Am I missing something?

  • Page deduplication drawbacks?

    Page deduplication drawbacks?

    Hey guys,

    The page deduplication approach is clever, but it sets a hard limit on the scanner's issue detection capabilities.

    Since you're comparing HTML code to determine uniqueness, it guarantees that certain issues exposed via text nodes cannot be identified, as those pages will never make it to the underlying scanner. For example, disclosure of sensitive information such as credit card numbers, SSNs, etc.

    In addition, it also prevents you from identifying server-side resources, like sensitive files and directories (backup, backdoors, etc.); the URL of a template-generated page whose path can be manipulated into returning a sensitive file will never make it to the scanner.

    I can see that right now you're only interested in SQL and XSS issues, but the architecture seems limited to only identifying active issues.

    Was that a conscious decision?

    Cheers, Tasos L.

  • Fix form elements resolution

    Fix form elements resolution

    May resolve #29 :) Original results:

     $ phantomjs ./renderer/resource/render.js https://www.buglloc.com/static/gryffin/forms-unencoded-fields.html | head -1 | python -mjson.tool
    
    {
        "response": {
            "headers": {...},
            "contentType": "text/html; charset=utf-8",
            "status": 200,
            "url": "https://www.buglloc.com/static/gryffin/forms-unencoded-fields.html",
            "body": "<html><head></head><body>\n<a href=\"/foo\">foo</a>\n<form name=\"with-query\" action=\"/\" method=\"post\">\n\t<input type=\"text\" name=\"foo[]\" value=\"some\">\n\t<select name=\"bar[]\">\n\t\t<option value=\"one\">One</option>\n\t\t<option value=\"two\">Two</option>\n\t</select>\n\t<input type=\"submit\" name=\"save\" value=\"Save\">\n</form>\n\n\n</body></html>",
            "details": {
                "links": [],
                "forms": []
            }
        },
        "elasped": 586,
        "ok": 1,
        "msgType": "domSteady",
        "signature": "==lXlKfYWch7H9VdJgPCmJ=="
    }
    

    After applying this PR:

     $ phantomjs ./renderer/resource/render.js https://www.buglloc.com/static/gryffin/forms-unencoded-fields.html | head -1 | python -mjson.tool
    {
        "response": {
            "headers": {...},
            "contentType": "text/html; charset=utf-8",
            "status": 200,
            "url": "https://www.buglloc.com/static/gryffin/forms-unencoded-fields.html",
            "body": "<html><head></head><body>\n<a href=\"/foo\">foo</a>\n<form name=\"with-query\" action=\"/\" method=\"post\">\n\t<input type=\"text\" name=\"foo[]\" value=\"some\">\n\t<select name=\"bar[]\">\n\t\t<option value=\"one\">One</option>\n\t\t<option value=\"two\">Two</option>\n\t</select>\n\t<input type=\"submit\" name=\"save\" value=\"Save\">\n</form>\n\n\n</body></html>",
            "details": {
                "links": [
                    {
                        "text": "foo",
                        "url": "https://www.buglloc.com/foo"
                    }
                ],
                "forms": [
                    {
                        "data": "bar%5B%5D=one&foo%5B%5D=some&save=Save",
                        "dataType": {
                            "bar%5B%5D": "select-one",
                            "foo%5B%5D": "text",
                            "save": "submit"
                        },
                        "method": "post",
                        "url": "https://www.buglloc.com/"
                    },
                    {
                        "data": "bar%5B%5D=two&foo%5B%5D=some&save=Save",
                        "dataType": null,
                        "method": "post",
                        "url": "https://www.buglloc.com/"
                    }
                ],
                "jsLinkFeedback": true
            }
        },
        "elasped": 580,
        "ok": 1,
        "msgType": "domSteady",
        "signature": "==lXlKfYWch7H9VdJgPCmJ=="
    }
    
  • Efficiency of redundant fuzzers

    Efficiency of redundant fuzzers

    Both sqlmap and arachni have SQL injection capabilities. Why run both? It doesn't look like there's an attempt to pare down either to a specific configuration or avoid running identical payloads.

  • Results

    Results

    So, at last, I managed to make gryffin-standalone do a full (successful?) scan. After the run is finished I get this:

    gryffin-standalone http://192.168.1.117:1912/login
    === Running Gryffin ===
    {"Service":"Main","Msg":"Started","Method":"GET","Url":"http://192.168.1.117:1912/login"}
    {"Service":"Poke","Msg":"Poking","Method":"GET","Url":"http://192.168.1.117:1912/login"}
    {"Service":"CrawlAsync","Msg":"Started","Method":"GET","Url":"http://192.168.1.117:1912/login"}
    {"Service":"PhantomjsRenderer.Do","Msg":"Running: render.js","Method":"GET","Url":"http://192.168.1.117:1912/login"}
    {"Service":"Fingerprint","Msg":"Computed","Method":"GET","Url":"http://192.168.1.117:1912/login"}
    {"Service":"IsDuplicatedPage","Msg":"Unique Page","Method":"GET","Url":"http://192.168.1.117:1912/login"}
    {"Service":"ShouldCrawl","Msg":"Unique Link","Method":"POST","Url":"http://192.168.1.117:1912/afterLogin"}
    {"Service":"Arachni.Scan","Msg":"Run as [arachni --checks xss* --output-only-positives --http-request-concurrency 1 --http-request-timeout 10000 --timeout 00:03:00 --scope-dom-depth-limit 0 --scope-directory-depth-limit 0 --scope-page-limit 1 --audit-with-both-methods --report-save-path /dev/null --snapshot-save-path /dev/null http://192.168.1.117:1912/login]","Method":"GET","Url":"http://192.168.1.117:1912/login"}
    {"Service":"SQLMap.Scan","Msg":"Run as [sqlmap --batch --timeout=2 --retries=3 --crawl=0 --disable-coloring -o --text-only -v 0 --level=1 --risk=1 --smart --fresh-queries --purge-output --os=Linux --dbms=MySQL --delay=0.1 --time-sec=1 -u http://192.168.1.117:1912/login]","Method":"GET","Url":"http://192.168.1.117:1912/login"}
    {"Service":"CrawlAsync","Msg":"Started","Method":"POST","Url":"http://192.168.1.117:1912/afterLogin"}
    {"Service":"PhantomjsRenderer.Do","Msg":"Running: render.js","Method":"POST","Url":"http://192.168.1.117:1912/afterLogin"}
    {"Service":"IsDuplicatedPage","Msg":"Duplicate Page","Method":"POST","Url":"http://192.168.1.117:1912/afterLogin"}
    {"Service":"PhantomjsRenderer.Do","Msg":"[Cleanup] Terminating the crawl process.","Method":"POST","Url":"http://192.168.1.117:1912/afterLogin"}
    {"Service":"Get Links","Msg":"Finished","Method":"POST","Url":"http://192.168.1.117:1912/afterLogin"}
    {"Service":"SQLMap.Scan","Msg":"SQLMap return true","Method":"GET","Url":"http://192.168.1.117:1912/login"}
    {"Service":"Arachni.Scan","Msg":"Arachni return true","Method":"GET","Url":"http://192.168.1.117:1912/login"}
    {"Service":"ShouldCrawl","Msg":"Duplicate Link","Method":"GET","Url":"http://192.168.1.117:1912/afterLogin"}
    {"Service":"Get Links","Msg":"Finished","Method":"POST","Url":"http://192.168.1.117:1912/afterLogin"}
    === End Running Gryffin ===
    

    Is the output saved somewhere else? Is this all of the results? What info can I get from this? Also, it was really too fast to be a full sqlmap run on the login form.

    P.S. If Gryffin detects a form, it should maybe try to run sqlmap with the --forms flag.

  • Installation error

    Installation error

    Whenever I try to install gryffin, I get this error:

    github.com/yahoo/gryffin

    src/github.com/yahoo/gryffin/gryffin.go:226:24: error: reference to undefined field or method ‘Timeout’ client.(*http.Client).Timeout = time.Duration(3) * time.Second

    Please assist, Thank you.

  • runtime error: invalid memory address or nil pointer dereference

    runtime error: invalid memory address or nil pointer dereference

    Hi all, @yukinying @bararchy @harisec, sorry for my delayed response. As the "gryffin-standalone: Timeout when rendering js" topic is now closed, I've opened this new one, because for me at least, after running "go get -v -u github.com/yahoo/gryffin/..." I am still facing this nil pointer issue. Here is the stack trace:

    $GOPATH/bin/gryffin-standalone http://zero.webappsecurity.com/
    === Running Gryffin ===
    {"Service":"Main","Msg":"Started","Method":"GET","Url":"http://zero.webappsecurity.com/"}
    {"Service":"Poke","Msg":"Poking","Method":"GET","Url":"http://zero.webappsecurity.com/"}
    {"Service":"CrawlAsync","Msg":"Started","Method":"GET","Url":"http://zero.webappsecurity.com/"}
    {"Service":"PhantomjsRenderer.Do","Msg":"Running: render.js","Method":"GET","Url":"http://zero.webappsecurity.com/"}
    {"Service":"Fingerprint","Msg":"Computed","Method":"GET","Url":"http://zero.webappsecurity.com/"}
    {"Service":"IsDuplicatedPage","Msg":"Unique Page","Method":"GET","Url":"http://zero.webappsecurity.com/"}
    {"Service":"SQLMap.Scan","Msg":"Run as [sqlmap --batch --timeout=2 --retries=3 --crawl=0 --disable-coloring -o --text-only -v 0 --level=1 --risk=1 --smart --fresh-queries --purge-output --os=Linux --dbms=MySQL --delay=0.1 --time-sec=1 -u http://zero.webappsecurity.com/]","Method":"GET","Url":"http://zero.webappsecurity.com/"}
    panic: runtime error: invalid memory address or nil pointer dereference
    [signal 0xb code=0x1 addr=0x8 pc=0x529b5f]

    goroutine 26 [running]:
    github.com/yahoo/gryffin/fuzzer/sqlmap.(*Fuzzer).Fuzz(0xc820047f88, 0xc8200d3290, 0x0, 0x7f8023f973a0, 0xc820155520)
        /usr/local/go/src/src/github.com/yahoo/gryffin/fuzzer/sqlmap/sqlmap.go:84 +0x98f
    main.linkChannels.func2.2(0xc82002a118, 0xc82000f590)
        /usr/local/go/src/src/github.com/yahoo/gryffin/cmd/gryffin-standalone/main.go:102 +0x30
    created by main.linkChannels.func2
        /usr/local/go/src/src/github.com/yahoo/gryffin/cmd/gryffin-standalone/main.go:104 +0xfd

    goroutine 1 [semacquire]:
    sync.runtime_Semacquire(0xc82000f59c)
        /usr/local/go/src/runtime/sema.go:43 +0x26
    sync.(*WaitGroup).Wait(0xc82000f590)
        /usr/local/go/src/sync/waitgroup.go:126 +0xb4
    main.linkChannels(0xc8200d3290)
        /usr/local/go/src/src/github.com/yahoo/gryffin/cmd/gryffin-standalone/main.go:144 +0x263
    main.main()
        /usr/local/go/src/src/github.com/yahoo/gryffin/cmd/gryffin-standalone/main.go:176 +0x40a

    goroutine 17 [syscall, locked to thread]:
    runtime.goexit()
        /usr/local/go/src/runtime/asm_amd64.s:1696 +0x1

    goroutine 5 [chan receive (nil chan)]:
    github.com/yahoo/gryffin.NewGryffinStore.func1(0x0)
        /usr/local/go/src/src/github.com/yahoo/gryffin/session.go:41 +0x8f
    created by github.com/yahoo/gryffin.NewGryffinStore
        /usr/local/go/src/src/github.com/yahoo/gryffin/session.go:44 +0x157

    goroutine 6 [chan receive]:
    main.linkChannels.func1(0xc82001c360, 0xc82001c3c0, 0xc82000f590, 0xc82001c300)
        /usr/local/go/src/src/github.com/yahoo/gryffin/cmd/gryffin-standalone/main.go:55 +0x68
    created by main.linkChannels
        /usr/local/go/src/src/github.com/yahoo/gryffin/cmd/gryffin-standalone/main.go:89 +0x17e

    goroutine 7 [chan receive]:
    main.linkChannels.func2(0xc82001c3c0, 0xc82000f590)
        /usr/local/go/src/src/github.com/yahoo/gryffin/cmd/gryffin-standalone/main.go:92 +0x68
    created by main.linkChannels
        /usr/local/go/src/src/github.com/yahoo/gryffin/cmd/gryffin-standalone/main.go:107 +0x1aa

    goroutine 8 [chan receive]:
    main.linkChannels.func3(0xc82001c300, 0xc82001c360)
        /usr/local/go/src/src/github.com/yahoo/gryffin/cmd/gryffin-standalone/main.go:111 +0x68
    created by main.linkChannels
        /usr/local/go/src/src/github.com/yahoo/gryffin/cmd/gryffin-standalone/main.go:122 +0x1d6

    goroutine 18 [syscall]:
    syscall.Syscall6(0x3d, 0x4cac, 0xc82004db84, 0x0, 0xc8200605a0, 0x0, 0x0, 0x728620, 0xc82001cc60, 0x3)
        /usr/local/go/src/syscall/asm_linux_amd64.s:44 +0x5
    syscall.wait4(0x4cac, 0xc82004db84, 0x0, 0xc8200605a0, 0x90, 0x0, 0x0)
        /usr/local/go/src/syscall/zsyscall_linux_amd64.go:172 +0x72
    syscall.Wait4(0x4cac, 0xc82004dbcc, 0x0, 0xc8200605a0, 0xc82002a1c8, 0x0, 0x0)
        /usr/local/go/src/syscall/syscall_linux.go:256 +0x55
    os.(*Process).wait(0xc82000b820, 0x15, 0x0, 0x0)
        /usr/local/go/src/os/exec_unix.go:22 +0x105
    os.(*Process).Wait(0xc82000b820, 0x0, 0x0, 0x0)
        /usr/local/go/src/os/doc.go:45 +0x2d
    os/exec.(*Cmd).Wait(0xc8200868c0, 0x0, 0x0)
        /usr/local/go/src/os/exec/exec.go:380 +0x211
    github.com/yahoo/gryffin/renderer.(*PhantomJSRenderer).Do(0xc820118060, 0xc8200d3290)
        /usr/local/go/src/src/github.com/yahoo/gryffin/renderer/phantomjs.go:234 +0x76e
    created by github.com/yahoo/gryffin.(*Scan).CrawlAsync
        /usr/local/go/src/src/github.com/yahoo/gryffin/gryffin.go:340 +0x9b

    goroutine 25 [runnable]:
    main.linkChannels.func2.1(0xc82002a118, 0xc82000f590)
        /usr/local/go/src/src/github.com/yahoo/gryffin/cmd/gryffin-standalone/main.go:95
    created by main.linkChannels.func2
        /usr/local/go/src/src/github.com/yahoo/gryffin/cmd/gryffin-standalone/main.go:99 +0xd1

    goroutine 15 [IO wait]:
    net.runtime_pollWait(0x7f8023f96640, 0x72, 0xc82000e1a0)
        /usr/local/go/src/runtime/netpoll.go:157 +0x60
    net.(*pollDesc).Wait(0xc82004ea70, 0x72, 0x0, 0x0)
        /usr/local/go/src/net/fd_poll_runtime.go:73 +0x3a
    net.(*pollDesc).WaitRead(0xc82004ea70, 0x0, 0x0)
        /usr/local/go/src/net/fd_poll_runtime.go:78 +0x36
    net.(*netFD).Read(0xc82004ea10, 0xc8200f3000, 0x1000, 0x1000, 0x0, 0x7f8023f91050, 0xc82000e1a0)
        /usr/local/go/src/net/fd_unix.go:232 +0x23a
    net.(*conn).Read(0xc82002a130, 0xc8200f3000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
        /usr/local/go/src/net/net.go:172 +0xe4
    net/http.noteEOFReader.Read(0x7f8023f96d10, 0xc82002a130, 0xc8200d34f8, 0xc8200f3000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
        /usr/local/go/src/net/http/transport.go:1370 +0x67
    net/http.(*noteEOFReader).Read(0xc82000b280, 0xc8200f3000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
        <autogenerated>:126 +0xd0
    bufio.(*Reader).fill(0xc82001c900)
        /usr/local/go/src/bufio/bufio.go:97 +0x1e9
    bufio.(*Reader).Peek(0xc82001c900, 0x1, 0x0, 0x0, 0x0, 0x0, 0x0)
        /usr/local/go/src/bufio/bufio.go:132 +0xcc
    net/http.(*persistConn).readLoop(0xc8200d34a0)
        /usr/local/go/src/net/http/transport.go:876 +0xf7
    created by net/http.(*Transport).dialConn
        /usr/local/go/src/net/http/transport.go:685 +0xc78

    goroutine 20 [chan receive (nil chan)]:
    main.linkChannels.func1.2(0xc820118060, 0xc82000f590, 0xc82001c300, 0xc82002a110)
        /usr/local/go/src/src/github.com/yahoo/gryffin/cmd/gryffin-standalone/main.go:76 +0x5c
    created by main.linkChannels.func1
        /usr/local/go/src/src/github.com/yahoo/gryffin/cmd/gryffin-standalone/main.go:85 +0x189

    goroutine 16 [select]:
    net/http.(*persistConn).writeLoop(0xc8200d34a0)
        /usr/local/go/src/net/http/transport.go:1009 +0x40c
    created by net/http.(*Transport).dialConn
        /usr/local/go/src/net/http/transport.go:686 +0xc9d

    goroutine 21 [syscall]:
    syscall.Syscall(0x0, 0x6, 0xc820132001, 0x3dff, 0x20, 0x20, 0x6cf120)
        /usr/local/go/src/syscall/asm_linux_amd64.s:18 +0x5
    syscall.read(0x6, 0xc820132001, 0x3dff, 0x3dff, 0x0, 0x0, 0x0)
        /usr/local/go/src/syscall/zsyscall_linux_amd64.go:783 +0x5f
    syscall.Read(0x6, 0xc820132001, 0x3dff, 0x3dff, 0xc820150e70, 0x0, 0x0)
        /usr/local/go/src/syscall/syscall_unix.go:160 +0x4d
    os.(*File).read(0xc82002a1a8, 0xc820132001, 0x3dff, 0x3dff, 0x411069, 0x0, 0x0)
        /usr/local/go/src/os/file_unix.go:211 +0x53
    os.(*File).Read(0xc82002a1a8, 0xc820132001, 0x3dff, 0x3dff, 0xc8201552c0, 0x0, 0x0)
        /usr/local/go/src/os/file.go:95 +0x8a
    encoding/json.(*Decoder).refill(0xc82005b860, 0x0, 0x0)
        /usr/local/go/src/encoding/json/stream.go:152 +0x287
    encoding/json.(*Decoder).readValue(0xc82005b860, 0x1, 0x0, 0x0)
        /usr/local/go/src/encoding/json/stream.go:128 +0x41b
    encoding/json.(*Decoder).Decode(0xc82005b860, 0x6b3240, 0xc820150f30, 0x0, 0x0)
        /usr/local/go/src/encoding/json/stream.go:57 +0x159
    github.com/yahoo/gryffin/renderer.(*PhantomJSRenderer).extract(0xc820118060, 0x7f8023f97200, 0xc82002a1a8, 0xc8200d3290)
        /usr/local/go/src/src/github.com/yahoo/gryffin/renderer/phantomjs.go:129 +0x14b
    created by github.com/yahoo/gryffin/renderer.(*PhantomJSRenderer).Do
        /usr/local/go/src/src/github.com/yahoo/gryffin/renderer/phantomjs.go:231 +0x72e

    goroutine 22 [select]:
    github.com/yahoo/gryffin/renderer.(*PhantomJSRenderer).wait(0xc820118060, 0xc8200d3290)
        /usr/local/go/src/src/github.com/yahoo/gryffin/renderer/phantomjs.go:165 +0x17d
    created by github.com/yahoo/gryffin/renderer.(*PhantomJSRenderer).Do
        /usr/local/go/src/src/github.com/yahoo/gryffin/renderer/phantomjs.go:232 +0x760

    I hope this sheds some light, because I really want to see my install working ...:)

  • Cannot establish tcp connection to log listener

    Cannot establish tcp connection to log listener

    I'm pretty new to Go, so maybe I missed something, but I keep getting this error:

    make test-mono 
    go run cmd/gryffin-standalone/main.go "http://127.0.0.1:8081"
    Cannot establish tcp connection to log listener.
    {"Service":"Main","Msg":"Started","Method":"GET","Url":"http://127.0.0.1:8081"}
    panic: runtime error: invalid memory address or nil pointer dereference
    [signal 0xb code=0x1 addr=0x20 pc=0x47b4e4]
    
    goroutine 1 [running]:
    io.(*multiWriter).Write(0xc82000af40, 0xc82008eb40, 0x50, 0x95, 0x50, 0x0, 0x0)
        /usr/lib/go/src/io/multi.go:43 +0xd4
    encoding/json.(*Encoder).Encode(0xc820051d48, 0x6b8240, 0xc820014680, 0x0, 0x0)
        /usr/lib/go/src/encoding/json/stream.go:201 +0x16b
    github.com/yahoo/gryffin.(*Scan).Log(0xc8200e0bb0, 0x6b8240, 0xc820014680)
        /home/unshadow/go/src/github.com/yahoo/gryffin/gryffin.go:508 +0x96
    github.com/yahoo/gryffin.(*Scan).Logm(0xc8200e0bb0, 0x7ca7e0, 0x4, 0x7cb3b8, 0x7)
        /home/unshadow/go/src/github.com/yahoo/gryffin/gryffin.go:495 +0x121
    main.main()
        /home/unshadow/Desktop/git-projects/gryffin/cmd/gryffin-standalone/main.go:173 +0x4e4
    
    goroutine 17 [syscall, locked to thread]:
    runtime.goexit()
        /usr/lib/go/src/runtime/asm_amd64.s:1745 +0x1
    exit status 2
    Makefile:31: recipe for target 'test-mono' failed
    make: *** [test-mono] Error 1
    
    
  • Fix function comments based on best practices from Effective Go

    Fix function comments based on best practices from Effective Go

    Every exported function in a program should have a doc comment. The first sentence should be a summary that starts with the name being declared. (From Effective Go.)
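
    For illustration, a doc comment following that convention might look like this (hypothetical package and function, used only as an example):

    // Package fuzzer is an illustrative package for this example.
    package fuzzer

    // Scan describes a single target to fuzz (illustrative type for this example).
    type Scan struct {
        Method string
        URL    string
    }

    // Fuzz runs the underlying scanner against the target described by s and
    // reports how many findings it produced. Note that the comment begins with
    // the name being declared, as Effective Go recommends for exported identifiers.
    func Fuzz(s *Scan) (findings int, err error) {
        return 0, nil
    }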

    PR generated by CodeLingo. Install here to drive Continuous Higher Standards.

  • Efficiency of page deduplication

    Efficiency of page deduplication

    (Issue #6 focuses on drawbacks for the test and audit phase. This issue focuses on applicability to crawling.)

    This approach doesn't look very robust for improving crawling against real-world web apps. It requires a link to be requested before the crawler can decide whether the requested link was redundant.

    The comparison also seems overly sensitive to tags that do not affect structure or navigation. For example, false positives can come from text nodes with large variations in formatting tags like <b> and <i>, pages that display tables with different numbers of rows, or articles with varying numbers of comments (where each comment may be in its own <div>).

    For the flickr examples, the distance doesn't seem to consistently reflect duplicate content. For example, it appears to produce a 100% match for a user's /albums/ and /groups/ content, even though the /groups/ page clearly points to additional navigation links that would be important to crawl.

    How does the dedupe work for pages that dynamically create content? Is the HTML taken from the HTTP response, or from the version rendered in a browser? If it renders from a browser, at what point is the page considered settled versus an "infinite scroll" or dynamically refreshing page?

    Are link+page combinations labeled with an authentication state? The content for a link can change significantly depending on whether the user is logged in.

  • Fuzzer integration issues

    Fuzzer integration issues

    Hey guys,

    I've noticed a few issues with the way that the fuzzers are currently integrated:

    1. Spawning and monitoring is taking place via an STDOUT channel, which really isn't meant to be a programming interface but a user one[1]. UI output can change at any time and it really shouldn't be something to depend on for communicating data.
      1. Keeping the entire output buffer in memory, instead of processing it as it arrives and then releasing it, can be a problem too, depending on the amount of output (see the sketch after this list).
      2. Output seems to be the sole source of data, instead of, for example, also parsing a generated report at the end; this severely limits the amount of information that can be reported to the user.
    2. Fuzzers are being cold-started for every resource, which will introduce significant latency.
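
    For reference, here is a minimal, generic sketch of the streaming alternative mentioned in point 1.1, reading a fuzzer's output line by line as it arrives instead of buffering all of it. This is plain os/exec plumbing, not Gryffin's code, and the sqlmap invocation is just an example.

    // stream_sketch.go - read a child process's output incrementally instead of
    // holding the whole buffer in memory; generic example, not Gryffin's code.
    package main

    import (
        "bufio"
        "fmt"
        "log"
        "os/exec"
    )

    func main() {
        cmd := exec.Command("sqlmap", "--batch", "-u", "http://example.com/?id=1")
        stdout, err := cmd.StdoutPipe()
        if err != nil {
            log.Fatal(err)
        }
        if err := cmd.Start(); err != nil {
            log.Fatal(err)
        }

        // Each line is inspected and then discarded, so memory stays bounded
        // no matter how much output the fuzzer produces.
        scanner := bufio.NewScanner(stdout)
        for scanner.Scan() {
            fmt.Println("fuzzer:", scanner.Text())
        }
        if err := scanner.Err(); err != nil {
            log.Println("read error:", err)
        }
        if err := cmd.Wait(); err != nil {
            log.Println("fuzzer exited with error:", err)
        }
    }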

    I can't speak for the rest but Arachni is built to be a distributed/integrable system, so you can solve all of these issues by requesting warmed up scanner processes from a Dispatcher (maintains a pool of initialized scanners, no spawn latency) and all integration (scanner dispatch, scan management and monitoring) happens via an RPC API.

    [1] UNIX philosophy excluded.
