scraper

package
v0.0.0-...-c84f994
This package is not in the latest version of its module.
Published: Sep 26, 2021 License: GPL-3.0 Imports: 9 Imported by: 0

Documentation

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type Config

type Config struct {
	QueryInterval    time.Duration      // interval between query attempts
	MaxFailed        int                // maximum number of failed query attempts before removing address
	QueryFunction    QueryFunction      // function for querying servers
	OnRequestArchive func(string)       // called to archive an address
	OnRequestRemove  func(string)       // called to remove an address
	OnRequestUpdate  func(types.Server) // called to update an address
}

Config contains parameters for tuning the scraper's behaviour.

type QueryFunction

type QueryFunction func(context.Context, string, bool) (sampquery.Server, error)

QueryFunction represents a function capable of retrieving server information via the server API.

type Scraper

type Scraper struct {
	// contains filtered or unexported fields
}

Scraper crawls through a list of server addresses and gathers information about them via the legacy query API, then stores the results as standard Server objects, accessible via the API.

func New

func New(ctx context.Context, initial []string, config Config) (daemon *Scraper, err error)

New sets up the query daemon and starts its background processes.

func (*Scraper) Add

func (daemon *Scraper) Add(address string)

Add will add a new address to the TickerPool and query it periodically.

func (*Scraper) Remove

func (daemon *Scraper) Remove(address string)

Remove will remove an address from the query rotation.
