scraper

package
v0.1.0-alpha.3
Published: Aug 24, 2016 License: Apache-2.0 Imports: 12 Imported by: 4

README

scraper

The scraper is responsible for polling Prometheus exporters and writing metrics onto the metric bus. Jobs are defined in a configuration store, and a pool of scrapers shares the load to fulfill them. Scrapers react to other scrapers joining or leaving the pool, and to jobs changing.

scraper architecture

 ╔═════════════════════════════════════════════════════════════════════════════╗
 ║                       configuration store (zookeeper)                       ║
 ║  ┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┓ ┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┓  ║
 ║  ┃               jobs                ┃ ┃         alive scrapers          ┃  ║
 ║  ┃┌─────────────────────────────────┐┃ ┃┌───────────────────────────────┐┃  ║
 ║  ┃│job_name: mysql                  │┃ ┃│scraper-ff55531                │┃  ║
 ║  ┃│static_configs:                  │┃ ┃│scraper-e85aae3                │┃  ║
 ║  ┃│  - targets:                     │┃ ┃│scraper-9c2ff7b                │┃  ║
 ║  ┃│    - mysql1.example.com:9104    │┃ ┃└───────────────────────────────┘┃  ║
 ║  ┃│    - mysql2.example.com:9104    │┃ ┃                                 ┃  ║
 ║  ┃└─────────────────────────────────┘┃ ┃                                 ┃  ║
 ║  ┃┌─────────────────────────────────┐┃ ┗━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┛  ║
 ║  ┃│job_name: memcached              │┃                  △                   ║
 ║  ┃│dns_sd_configs:                  │┃                  │                   ║
 ║  ┃│  - names:                       │┃                  │                   ║
 ║  ┃│    - _memcached._tcp.example.com│┃                  │                   ║
 ║  ┃└─────────────────────────────────┘┃                  │                   ║
 ║  ┃┌─────────────────────────────────┐┃                  │                   ║
 ║  ┃│job_name: nginx                  │┃                  │                   ║
 ║  ┃│consul_sd_configs:               │┃                  │                   ║
 ║  ┃│  - server: 'consul:1234'        │┃                  │                   ║
 ║  ┃│    services: ['nginx']          │┃                ┌─┘                   ║
 ║  ┃└─────────────────────────────────┘┃                │                     ║
 ║  ┗━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┛                │                     ║
 ║                    △                                  │                     ║
 ║                    │                                  │                     ║
 ╚════════════════════╬══════════════════════════════════╬═════════════════════╝
                    ┌─┘                                  │
 ╔══════════════════╬════════════════════════════════════╬═════════════════════╗
 ║                  │           each scraper node        │                     ║
 ║                  │                                    │                     ║
 ║                  ▼                                    ▼                     ║
 ║  ┌───────────────────────────────┐    ┌───────────────────────────────┐     ║
 ║  │        resolve targets        │    │        consistent hash        │     ║
 ║  │                               │ ┌─▶│                               │     ║
 ║  │┌─────────────────────────────┐│ │  └───────────────────────────────┘     ║
 ║  ││                             ││ │                  │                     ║
 ║  ││                             ││ │                  ▼                     ║
 ║  ││mysql1.example.com:9104      ││ │  ┌───────────────────────────────┐     ║
 ║  ││mysql2.example.com:9104      ││ │  │          my targets           │     ║
 ║  ││memcached1.example.com:9106  ││ │  │                               │     ║
 ║  ││memcached2.example.com:9106  ││─┘  │┌─────────────────────────────┐│     ║
 ║  ││memcached3.example.com:9106  ││    ││                             ││     ║
 ║  ││nginx1.example.com:9113      ││    ││                             ││     ║
 ║  ││nginx2.example.com:9113      ││    ││mysql1.example.com:9108      ││     ║
 ║  ││                             ││    ││memcached3.example.com:9109  ││     ║
 ║  ││                             ││    ││                             ││     ║
 ║  │└─────────────────────────────┘│    ││                             ││     ║
 ║  │                               │    │└─────────────────────────────┘│     ║
 ║  └───────────────────────────────┘    └───────────────────────────────┘     ║
 ║                                                                             ║
 ╚═════════════════════════════════════════════════════════════════════════════╝
 ▲                                                                             │
 │        ┌───────────────────────────┐                        ┌───────────────┘
 ├───────▷│  mysql1.example.com:9108  │                        │
 │        └───────────────────────────┘                        ▼
 │        ┌───────────────────────────┐        ┌───────────────────────────────┐
 └───────▷│memcached3.example.com:9109│        │      metric bus (kafka)       │
          └───────────────────────────┘        └───────────────────────────────┘
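The page does not show how "consistent hash" splits targets across the pool, but the diagram's shape (every node sees all resolved targets plus the live-scraper list, then independently derives "my targets") can be illustrated with rendezvous hashing. This is a sketch, not the package's actual algorithm; the scraper IDs and targets are taken from the diagram.

```go
package main

import (
	"fmt"
	"hash/fnv"
)

// owner picks, for one target, the scraper whose hash of
// (scraperID + targetKey) is highest. Every node computes the same
// answer from the same inputs, so no coordination is needed
// (rendezvous hashing; the real package may hash differently).
func owner(scrapers []string, target string) string {
	var best string
	var bestScore uint64
	for _, s := range scrapers {
		h := fnv.New64a()
		h.Write([]byte(s + "|" + target))
		if score := h.Sum64(); score >= bestScore {
			best, bestScore = s, score
		}
	}
	return best
}

func main() {
	scrapers := []string{"scraper-ff55531", "scraper-e85aae3", "scraper-9c2ff7b"}
	targets := []string{
		"mysql1.example.com:9104",
		"mysql2.example.com:9104",
		"memcached1.example.com:9106",
	}
	for _, t := range targets {
		fmt.Printf("%s -> %s\n", t, owner(scrapers, t))
	}
}
```

When a scraper joins or leaves, only the targets whose winning hash changes move to a new node, which keeps rebalancing cheap.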

Documentation

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type Config

type Config struct {
	Targeter TargetWatcher
	Writer   Writer
}

Config represents an instance of a scraper configuration. It contains a Targeter interface and a Writer interface.

type ConsistentHashTargeter

type ConsistentHashTargeter struct {
	// contains filtered or unexported fields
}

ConsistentHashTargeter represents an object that orchestrates work between ZooKeeper targets and available worker nodes in the pool.

func NewConsistentHashTargeter

func NewConsistentHashTargeter(config *ConsistentHashTargeterConfig) *ConsistentHashTargeter

NewConsistentHashTargeter returns a new instance of a ConsistentHashTargeter object.

func (*ConsistentHashTargeter) Targets

func (cht *ConsistentHashTargeter) Targets() <-chan []Targeter

Targets returns a channel that feeds current available jobs.

type ConsistentHashTargeterConfig

type ConsistentHashTargeterConfig struct {
	Targeter TargetWatcher
	ID       string
	Pool     Pool
}

ConsistentHashTargeterConfig represents a configuration for a ConsistentHashTargeter object.

type HTTPTarget

type HTTPTarget struct {
	// contains filtered or unexported fields
}

HTTPTarget represents an instance of an HTTP scraper target.

func NewHTTPTarget

func NewHTTPTarget(config *HTTPTargetConfig) *HTTPTarget

NewHTTPTarget creates an instance of HTTPTarget.

func (*HTTPTarget) Equals

func (ht *HTTPTarget) Equals(other Targeter) bool

Equals checks whether the instance's current target is the same as the parameter other.

func (*HTTPTarget) Fetch

func (ht *HTTPTarget) Fetch() ([]*dto.MetricFamily, error)

Fetch polls the target's metric endpoint for data and transforms it into the Prometheus MetricFamily type.

func (*HTTPTarget) Interval

func (ht *HTTPTarget) Interval() time.Duration

Interval returns the current target's interval.

func (*HTTPTarget) Key

func (ht *HTTPTarget) Key() string

Key implements Targeter.

type HTTPTargetConfig

type HTTPTargetConfig struct {
	Interval time.Duration
	URL      *url.URL
	JobName  JobName
}

HTTPTargetConfig represents the configuration of an HTTPTarget.

type Job

type Job interface {
	// TargetWatcher
	// Returns the unique name of a job.
	Name() JobName
	AddTargets(...Targeter)
	GetTargets() []Targeter
}

Job represents a set of discoverable targets to be processed.

type JobName

type JobName string

JobName is the name of an exporter/job.

type JobWatcher

type JobWatcher interface {
	Jobs() <-chan []Job
}

JobWatcher is an interface that wraps the Jobs method.

type Pool

type Pool interface {
	Scrapers() <-chan []string
}

Pool is an interface that wraps the Scrapers method.
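Pool's channel-of-snapshots shape means a consumer sees a fresh membership slice on every join or leave. Here is a toy implementation under that interface; `staticPool` is an invented name, and the real pool would watch ZooKeeper ephemeral nodes rather than emit one fixed snapshot.

```go
package main

import "fmt"

// Pool, as in the package: anything that can report the set of
// live scraper IDs over a channel.
type Pool interface {
	Scrapers() <-chan []string
}

// staticPool is a toy Pool emitting a single fixed membership
// snapshot and then closing the channel.
type staticPool struct{ members []string }

func (p staticPool) Scrapers() <-chan []string {
	ch := make(chan []string, 1)
	ch <- p.members
	close(ch)
	return ch
}

func main() {
	var pool Pool = staticPool{members: []string{"scraper-ff55531", "scraper-e85aae3"}}
	// A consumer such as ConsistentHashTargeter would rehash its
	// targets each time a new snapshot arrives.
	for members := range pool.Scrapers() {
		fmt.Println(len(members)) // 2
	}
}
```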

type Scraper

type Scraper struct {
	Targeter TargetWatcher
	Writer   Writer
	// contains filtered or unexported fields
}

Scraper tracks which targets are running and groups them by job.

func NewScraper

func NewScraper(config *Config) *Scraper

NewScraper returns a new Scraper instance from the provided Config.

func (*Scraper) Run

func (s *Scraper) Run()

Run ranges over the Scraper's targets and does work on them.

func (*Scraper) Stop

func (s *Scraper) Stop()

Stop signals the Scraper instance to stop running.

type StaticJob

type StaticJob struct {
	// contains filtered or unexported fields
}

StaticJob represents a Vulcan job.

func NewStaticJob

func NewStaticJob(config *StaticJobConfig) *StaticJob

NewStaticJob returns a new instance of the StaticJob configuration.

func (*StaticJob) AddTargets

func (j *StaticJob) AddTargets(t ...Targeter)

AddTargets appends the provided Targeters to the list of targets.

func (*StaticJob) GetTargets

func (j *StaticJob) GetTargets() []Targeter

GetTargets implements Job.

func (*StaticJob) Name

func (j *StaticJob) Name() JobName

Name implements Job.

type StaticJobConfig

type StaticJobConfig struct {
	JobName   JobName
	Targeters []Targeter
}

StaticJobConfig represents a StaticJob configuration.
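A StaticJob built from a StaticJobConfig is essentially a named, growable list of targets. The sketch below mirrors that shape with simplified local types (`staticJob`, `staticTarget` are invented names; Targeter is reduced to Key, and Name returns a plain string rather than JobName).

```go
package main

import "fmt"

// Targeter is simplified here to just a key; the real interface
// also carries Fetch, Equals, and Interval.
type Targeter interface{ Key() string }

type staticTarget string

func (s staticTarget) Key() string { return string(s) }

// staticJob mirrors the Job interface: a named list of targets,
// as a StaticJob built from static_configs would hold.
type staticJob struct {
	name    string
	targets []Targeter
}

func (j *staticJob) Name() string             { return j.name }
func (j *staticJob) AddTargets(t ...Targeter) { j.targets = append(j.targets, t...) }
func (j *staticJob) GetTargets() []Targeter   { return j.targets }

func main() {
	j := &staticJob{name: "mysql"}
	j.AddTargets(
		staticTarget("mysql1.example.com:9104"),
		staticTarget("mysql2.example.com:9104"),
	)
	fmt.Println(j.Name(), len(j.GetTargets())) // mysql 2
}
```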

type Target

type Target struct {
	Job      string
	URL      string
	Instance string
	Interval time.Duration
}

Target represents a scrape target.

func (*Target) Key

func (t *Target) Key() string

Key returns a unique key for the target. TODO: consider a more robust method of generating a unique key.
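The actual key format is not documented here, but a composite over the exported fields illustrates why a single field is not enough: two jobs can scrape the same instance at the same URL. The `key` helper below is only an illustration, not the package's Key implementation.

```go
package main

import "fmt"

// Target mirrors the exported fields of the package's Target.
type Target struct {
	Job      string
	URL      string
	Instance string
}

// key joins the fields that together identify a scrape target.
// Using Instance alone would collide when two jobs share a host.
func key(t Target) string {
	return fmt.Sprintf("%s/%s/%s", t.Job, t.Instance, t.URL)
}

func main() {
	a := Target{Job: "mysql", Instance: "mysql1", URL: "http://mysql1.example.com:9104/metrics"}
	b := Target{Job: "backup", Instance: "mysql1", URL: "http://mysql1.example.com:9104/metrics"}
	fmt.Println(key(a) != key(b)) // true
}
```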

type TargetWatcher

type TargetWatcher interface {
	Targets() <-chan []Targeter
}

TargetWatcher is an interface that represents something that can return a slice of active Targeters.

type Targeter

type Targeter interface {
	Equals(other Targeter) bool
	// Fetch polls for metrics of the target.
	// It can pass off errors to a configurable endpoint.
	Fetch() ([]*dto.MetricFamily, error)
	Interval() time.Duration
	// Key returns unique key for target.
	Key() string
}

Targeter is an interface that wraps the methods of an exporter/job target.

type Worker

type Worker struct {
	Target Targeter
	// contains filtered or unexported fields
}

Worker represents an instance of a scraper worker.

func NewWorker

func NewWorker(config *WorkerConfig) *Worker

NewWorker creates a new instance of a Worker.

func (*Worker) Retarget

func (w *Worker) Retarget(t Targeter)

Retarget sets the current Worker's target to the parameter t.

func (*Worker) Stop

func (w *Worker) Stop()

Stop signals the current Worker instance to stop running.

type WorkerConfig

type WorkerConfig struct {
	// JobName JobName
	Key    string
	Target Targeter
	Writer Writer
}

WorkerConfig represents an instance of a Worker's configuration.

type Writer

type Writer interface {
	Write(string, []*dto.MetricFamily) error
}

Writer is an interface that wraps the Write method for publishing to a message bus.
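Any sink satisfying Write can stand in for the Kafka metric bus, which is useful in tests. The sketch below uses a placeholder MetricFamily struct instead of the real dto.MetricFamily from the Prometheus client model, and `memWriter` is an invented in-memory implementation, not part of the package.

```go
package main

import "fmt"

// MetricFamily stands in for dto.MetricFamily, which the real
// Writer signature uses.
type MetricFamily struct {
	Name string
}

// Writer mirrors the package interface with the placeholder type.
type Writer interface {
	Write(key string, fams []*MetricFamily) error
}

// memWriter records writes in memory; the package's production
// implementation would publish to Kafka instead.
type memWriter struct {
	written map[string]int
}

func (w *memWriter) Write(key string, fams []*MetricFamily) error {
	w.written[key] += len(fams)
	return nil
}

func main() {
	w := &memWriter{written: map[string]int{}}
	fams := []*MetricFamily{{Name: "mysql_up"}}
	if err := w.Write("mysql/mysql1.example.com:9104", fams); err != nil {
		panic(err)
	}
	fmt.Println(w.written["mysql/mysql1.example.com:9104"]) // 1
}
```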
