package repository

v1.0.0 Latest
Published: Jun 16, 2023 License: MIT Imports: 4 Imported by: 0

Documentation

Index

Constants

const CrawlersCollection = "categories"

Variables

This section is empty.

Functions

This section is empty.

Types

type CRepository

type CRepository struct {
	// contains filtered or unexported fields
}

CRepository wraps the MongoDB collection used for crawler records.

func (*CRepository) DeleteAll

func (r *CRepository) DeleteAll() error

DeleteAll drops the crawlers collection.

func (*CRepository) GetById

func (r *CRepository) GetById(id string) (url *models.Crawler, err error)

GetById returns the crawler record with the given id.

func (*CRepository) GetDataByURL

func (r *CRepository) GetDataByURL(url string) (data *models.Crawler, err error)

GetDataByURL returns the crawler data stored for the given url.

func (*CRepository) Save

func (r *CRepository) Save(url *models.Crawler) error

Save adds the url record to the database.

type CrawlersRepository

type CrawlersRepository interface {
	Save(url *models.Crawler) error
	GetById(id string) (url *models.Crawler, err error)
	GetDataByURL(url string) (data *models.Crawler, err error)
}

CrawlersRepository is the storage interface for the crawler backend.

func NewCrawlersRepository

func NewCrawlersRepository(conn db.Connection) CrawlersRepository

NewCrawlersRepository creates a new CrawlersRepository instance.
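The interface above can be satisfied by any storage backend, which makes callers easy to test without a live MongoDB connection. The following sketch shows the pattern with a minimal in-memory implementation; the `Crawler` struct fields and the `memRepo` type are assumptions for illustration, not part of this package.

```go
package main

import (
	"errors"
	"fmt"
)

// Crawler is a minimal stand-in for models.Crawler (assumed shape).
type Crawler struct {
	Id  string
	URL string
}

// CrawlersRepository mirrors the interface documented above.
type CrawlersRepository interface {
	Save(url *Crawler) error
	GetById(id string) (url *Crawler, err error)
	GetDataByURL(url string) (data *Crawler, err error)
}

// memRepo is a hypothetical in-memory implementation for tests and examples.
type memRepo struct {
	byId  map[string]*Crawler
	byURL map[string]*Crawler
}

func newMemRepo() *memRepo {
	return &memRepo{byId: map[string]*Crawler{}, byURL: map[string]*Crawler{}}
}

// Save indexes the record by both id and url.
func (m *memRepo) Save(c *Crawler) error {
	m.byId[c.Id] = c
	m.byURL[c.URL] = c
	return nil
}

// GetById looks up a record by its id.
func (m *memRepo) GetById(id string) (*Crawler, error) {
	c, ok := m.byId[id]
	if !ok {
		return nil, errors.New("not found")
	}
	return c, nil
}

// GetDataByURL looks up a record by its url.
func (m *memRepo) GetDataByURL(url string) (*Crawler, error) {
	c, ok := m.byURL[url]
	if !ok {
		return nil, errors.New("not found")
	}
	return c, nil
}

func main() {
	// Any value satisfying the interface can stand in for the Mongo-backed repo.
	var repo CrawlersRepository = newMemRepo()
	_ = repo.Save(&Crawler{Id: "1", URL: "https://example.com"})
	got, err := repo.GetDataByURL("https://example.com")
	fmt.Println(got.Id, err) // 1 <nil>
}
```

In production code the same variable would instead be assigned from `NewCrawlersRepository(conn)` with a real `db.Connection`.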
