crawlalyzer

command module
v0.0.0-...-d8649fa
Published: Mar 1, 2021 License: MIT Imports: 7 Imported by: 0

README

Concurrent web crawler that tries to detect the technologies used by the crawled websites

The crawler uses Wappalyzer's technology fingerprints to detect the technologies a website uses. After the crawl finishes, all detected technologies are aggregated per root URL.
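For illustration, here is a minimal sketch of that per-root aggregation. The pageResult type and the aggregate function are assumptions for this example, not the module's actual code:

```go
package main

import "fmt"

// pageResult is a hypothetical per-page crawl result; the module's real
// types may differ.
type pageResult struct {
	Root         string   // root URL the crawl started from
	Technologies []string // technology names detected on this page
}

// aggregate collects the union of detected technologies for each root URL.
func aggregate(pages []pageResult) map[string][]string {
	sets := make(map[string]map[string]bool)
	for _, p := range pages {
		if sets[p.Root] == nil {
			sets[p.Root] = make(map[string]bool)
		}
		for _, t := range p.Technologies {
			sets[p.Root][t] = true
		}
	}
	agg := make(map[string][]string)
	for root, set := range sets {
		for tech := range set {
			agg[root] = append(agg[root], tech)
		}
	}
	return agg
}

func main() {
	pages := []pageResult{
		{Root: "https://google.com", Technologies: []string{"HTTP/3"}},
		{Root: "https://google.com", Technologies: []string{"HTTP/3", "Google Web Server"}},
	}
	// e.g. map[https://google.com:[HTTP/3 Google Web Server]] (order may vary)
	fmt.Println(aggregate(pages))
}
```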

Usage:

There are two command-line flags:

  • urls - Comma-separated URLs to crawl.
  • follow-external - Specifies whether to follow external links.

Example usage: go run main.go -urls=https://google.com -follow-external=true
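As a rough sketch, these flags could be parsed with Go's standard flag package; the variable names and defaults below are assumptions, not the module's code:

```go
package main

import (
	"flag"
	"fmt"
	"strings"
)

func main() {
	// Flag names match the README; defaults are assumptions.
	urls := flag.String("urls", "", "Comma-separated URLs to crawl")
	followExternal := flag.Bool("follow-external", false, "Specifies whether to follow external links")
	flag.Parse()

	// Split the comma-separated list into individual root URLs.
	for _, u := range strings.Split(*urls, ",") {
		fmt.Println("root URL:", strings.TrimSpace(u), "| follow external:", *followExternal)
	}
}
```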

When you press a key, crawling stops and the aggregated technologies for the root URLs and their links are saved to a fingerprints.json file.
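A simplified sketch of that stop-and-save step, assuming a hypothetical result map and waiting for a line on standard input (the real crawler may capture a single keypress differently):

```go
package main

import (
	"bufio"
	"encoding/json"
	"fmt"
	"os"
)

func main() {
	// Hypothetical stand-in for the crawler's aggregated results:
	// root URL -> detected technology names.
	fingerprints := map[string][]string{
		"https://google.com": {"HTTP/3", "Google Web Server"},
	}

	fmt.Println("Press Enter to stop crawling and save results...")
	// Block until the user presses Enter (a simplification of "press a key").
	bufio.NewReader(os.Stdin).ReadString('\n')

	data, err := json.MarshalIndent(fingerprints, "", "  ")
	if err != nil {
		panic(err)
	}
	if err := os.WriteFile("fingerprints.json", data, 0o644); err != nil {
		panic(err)
	}
	fmt.Println("Saved fingerprints.json")
}
```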

Documentation

There is no documentation for this package.
