Documentation ¶
Index ¶
Constants ¶
const CrawlersCollection = "categories"
Variables ¶
This section is empty.
Functions ¶
This section is empty.
Types ¶
type CRepository ¶
type CRepository struct {
// contains filtered or unexported fields
}
CRepository wraps a Mongo collection for crawler database jobs.
func (*CRepository) DeleteAll ¶
func (r *CRepository) DeleteAll() error
DeleteAll drops crawlers collection.
func (*CRepository) GetById ¶
func (r *CRepository) GetById(id string) (url *models.Crawler, err error)
GetById returns the url based on id.
func (*CRepository) GetDataByURL ¶
func (r *CRepository) GetDataByURL(url string) (data *models.Crawler, err error)
GetDataByURL returns the data based on url.
type CrawlersRepository ¶
type CrawlersRepository interface { Save(url *models.Crawler) error GetById(id string) (url *models.Crawler, err error) GetDataByURL(url string) (data *models.Crawler, err error) }
CrawlersRepository is the interface of the crawler backend.
func NewCrawlersRepository ¶
func NewCrawlersRepository(conn db.Connection) CrawlersRepository
NewCrawlersRepository creates a new CrawlersRepository instance.