httpsyet

package
v0.2.2
Published: Jun 30, 2018 License: MIT Imports: 10 Imported by: 0

README

v5 - Just for the fun of it

Overview

As you can see in genny.go, pipe/m is now used - and m generates methods on traffic (instead of package functions).

Further, using anyThing=Site (note the change to a public type!), the generated code becomes public (where intended) and can be seen in godoc.
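
For reference, a genny directive of this shape performs such a substitution; the file names below are assumptions for illustration, not taken from the repository:

//go:generate genny -in=traffic.go -out=gen-sitetraffic.go gen "anyThing=Site"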

Last, but not least, the idea of keeping Results in a separate package has been abandoned, as it adds more complications than benefit.

Now result lives inside the httpsyet package again - in crawling.go.


Some remarks regarding changes to source files compared with the previous version:

traffic.go

Functions become methods.
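
As a minimal sketch of the shape of this change (the identifiers and the layout of the traffic type here are assumptions, not the actual code in traffic.go):

package traffic

import "net/url"

type traffic struct{ in chan *url.URL }

// Before: a package-level function operating on a traffic value.
func feed(t traffic, u *url.URL) { t.in <- u }

// After: the same operation expressed as a method on traffic.
func (t traffic) Feed(u *url.URL) { t.in <- u }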

genny.go in traffic/

Now takes from `m` instead of `s`.

site.go

No change.

crawling.go

Adjust for result becoming a local type again.
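
A minimal sketch of what a package-local result might look like; the field set is an assumption for illustration, not the actual definition in crawling.go:

package httpsyet

import "net/url"

// result collects what the crawler reports for a single link.
// The fields below are assumed for illustration only.
type result struct {
	site *url.URL // page on which the link was found
	url  *url.URL // link that could be switched to HTTPS
	err  error    // non-nil if fetching the link failed
}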

crawler_test.go

Just the import path.

crawler.go

No changes needed.


Back to Overview

Documentation

Overview

Package httpsyet provides the configuration and execution for crawling a list of sites for links that can be updated to HTTPS.

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type Crawler

type Crawler struct {
	Sites    []string                             // At least one URL.
	Out      io.Writer                            // Required. Writes one detected site per line.
	Log      *log.Logger                          // Required. Errors are reported here.
	Depth    int                                  // Optional. Limit depth. Set to >= 1.
	Parallel int                                  // Optional. Set how many sites to crawl in parallel.
	Delay    time.Duration                        // Optional. Set delay between crawls.
	Get      func(string) (*http.Response, error) // Optional. Defaults to http.Get.
	Verbose  bool                                 // Optional. If set, status updates are written to logger.
}

Crawler is used as configuration for Run. It is validated in Run().

func (Crawler) Run

func (c Crawler) Run() error

Run the crawler. It can return validation errors. All crawling errors are reported via the logger, and output is written to the writer. Run crawls sites recursively and reports all external links that can be changed to HTTPS. Broken links are also reported via the error logger.
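
A minimal usage sketch, assuming an import path of example.com/httpsyet (the actual module path is not shown on this page); the field values follow the comments on the Crawler struct above:

package main

import (
	"log"
	"net/http"
	"os"
	"time"

	"example.com/httpsyet" // assumed import path - substitute the real module path
)

func main() {
	// A client with a timeout, used to override the http.Get default.
	client := &http.Client{Timeout: 10 * time.Second}

	c := httpsyet.Crawler{
		Sites:    []string{"http://example.com"}, // at least one URL
		Out:      os.Stdout,                      // required: one detected site per line
		Log:      log.New(os.Stderr, "", log.LstdFlags), // required: errors go here
		Depth:    2,           // optional: limit crawl depth
		Parallel: 4,           // optional: crawl up to four sites in parallel
		Delay:    time.Second, // optional: delay between crawls
		Get:      client.Get,  // optional: defaults to http.Get
		Verbose:  true,        // optional: log status updates
	}
	if err := c.Run(); err != nil {
		log.Fatal(err) // Run returns validation errors only
	}
}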

