crawler

package v1.0.1
Published: Dec 21, 2022 License: MIT Imports: 11 Imported by: 0

Documentation

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type CrawlerPage

type CrawlerPage struct {
	// contains filtered or unexported fields
}

CrawlerPage is an implementation that handles web crawling.

func NewCrawlerPage

func NewCrawlerPage(pager pager.PagerService, database crawler.CrawlerDatabase) CrawlerPage

NewCrawlerPage creates a new CrawlerPage backed by the given pager service and crawler database.

func (CrawlerPage) Craw

func (p CrawlerPage) Craw(ctx context.Context, uri string, depth uint) ([]string, error)

Craw crawls pages concurrently, starting from uri, and respects the depth parameter. It returns the list of discovered links.
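
A minimal usage sketch, assuming hypothetical import paths (the module's real paths are not shown on this page) and assuming concrete pager.PagerService and crawler.CrawlerDatabase implementations are supplied elsewhere in the module:

package main

import (
	"context"
	"fmt"
	"log"

	"example.com/project/crawler"                       // hypothetical path: defines crawler.CrawlerDatabase
	crawlerpage "example.com/project/internal/crawler"  // hypothetical path for this package, aliased to avoid the name clash
	"example.com/project/pager"                         // hypothetical path: defines pager.PagerService
)

func main() {
	// Placeholder dependencies: in real use these must be concrete
	// implementations of pager.PagerService and crawler.CrawlerDatabase
	// provided by the rest of the module.
	var pagerSvc pager.PagerService
	var db crawler.CrawlerDatabase

	p := crawlerpage.NewCrawlerPage(pagerSvc, db)

	// Crawl the start URI concurrently, following links at most 2 levels deep.
	links, err := p.Craw(context.Background(), "https://example.com", 2)
	if err != nil {
		log.Fatal(err)
	}
	for _, link := range links {
		fmt.Println(link)
	}
}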
