crawler

package
v1.0.1
Published: Dec 21, 2022 License: MIT Imports: 10 Imported by: 0

Documentation

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type CrawlerRepository

type CrawlerRepository struct {
	// contains filtered or unexported fields
}

CrawlerRepository is a repository that handles the persistence layer.

func NewCrawlerRepository

func NewCrawlerRepository(ctx context.Context) CrawlerRepository

NewCrawlerRepository creates a new instance of CrawlerRepository.
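
A minimal construction sketch; the import path shown is a placeholder and should be replaced with the module's actual path. The constructor takes only a context, so any database configuration is assumed to happen inside the package. The Find and Insert snippets below reuse ctx and repo from here.

	package main

	import (
		"context"

		"example.com/crawler" // placeholder import path; use the real module path
	)

	func main() {
		ctx := context.Background()

		// NewCrawlerRepository needs only a context; the persistence backend
		// is assumed to be wired up inside the constructor.
		repo := crawler.NewCrawlerRepository(ctx)
		_ = repo // used in the Find and Insert snippets below
	}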

func (CrawlerRepository) Find

func (c CrawlerRepository) Find(ctx context.Context, uri string, depth uint) ([]string, error)

Find fetches previously crawled links for the given URI and depth from the database.
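
A usage sketch for Find, continuing from the construction example above; the URI and depth values are illustrative, and the fmt package is assumed to be imported.

	// Fetch the links previously stored for this URI at crawl depth 2.
	links, err := repo.Find(ctx, "https://example.com", 2)
	if err != nil {
		// handle the lookup error
		return
	}
	for _, link := range links {
		fmt.Println(link)
	}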

func (CrawlerRepository) Insert

func (c CrawlerRepository) Insert(ctx context.Context, uri string, depth uint, uris []string) error

Insert stores a newly crawled page and its discovered links in the database.
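
A matching sketch for Insert, storing a crawled page together with the links found on it; the values are illustrative.

	// Record that https://example.com was crawled at depth 2 and that these
	// URIs were discovered on it.
	uris := []string{
		"https://example.com/about",
		"https://example.com/blog",
	}
	if err := repo.Insert(ctx, "https://example.com", 2, uris); err != nil {
		// handle the insert error
	}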
