core

package module
v1.2.3
Published: Mar 3, 2022 License: MIT Imports: 21 Imported by: 11

README


HomeDashboard Datasource - Core

Contains interfaces and core components for HomeDashboard datasources.

Documentation

Index

Constants

const (
	// ORIGIN_QUEUE is used to add the name of a source queue to the message attributes of archive events.
	ORIGIN_QUEUE string = "origin_queue"
)

Variables

This section is empty.

Functions

This section is empty.

Types

type Collector added in v1.0.1

type Collector interface {

	// Run executes the collector's data processing logic.
	Run(context.Context) error
}

A Collector calls the Fetch method of a datasource and processes the returned event.
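
For illustration, a minimal sketch of driving any Collector with a cancellable context; the import path github.com/tommzn/hdb-datasource-core and the helper runWithTimeout are assumptions for this sketch, not part of the documented API:

package sketch

import (
	"context"
	"time"

	core "github.com/tommzn/hdb-datasource-core" // assumed import path of this package
)

// runWithTimeout drives any Collector, however it was constructed, by calling
// Run with a cancellable context. Illustrative helper, not part of the package.
func runWithTimeout(collector core.Collector, timeout time.Duration) error {
	ctx, cancel := context.WithTimeout(context.Background(), timeout)
	defer cancel()
	return collector.Run(ctx)
}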

func NewContinuousCollector added in v1.1.1

func NewContinuousCollector(datasource Collector, logger log.Logger) Collector

NewContinuousCollector returns a new collector for continuous processing with the given datasource.

func NewScheduledCollector added in v1.0.1

func NewScheduledCollector(queue string, datasource DataSource, conf config.Config, logger log.Logger) Collector

NewScheduledCollector returns a new scheduled collector for the given config.
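
A hedged usage sketch for the scheduled constructor; the import paths for the core, config and logger packages are assumptions, and config and logger are passed in so nothing about their creation is assumed:

package sketch

import (
	"context"

	config "github.com/tommzn/go-config"         // assumed import path of the config package
	log "github.com/tommzn/go-log"               // assumed import path of the logger package
	core "github.com/tommzn/hdb-datasource-core" // assumed import path of this package
)

// collectOnce wires a datasource into a scheduled collector that fetches once
// and publishes the result to the given SQS queue.
func collectOnce(ctx context.Context, queue string, datasource core.DataSource, conf config.Config, logger log.Logger) error {
	collector := core.NewScheduledCollector(queue, datasource, conf, logger)
	return collector.Run(ctx)
}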

type ContinuousCollector added in v1.1.1

type ContinuousCollector struct {
	// contains filtered or unexported fields
}

ContinuousCollector is used as a daemon to permanently collect data from a source. It mainly takes care of observing OS signals to handle graceful shutdowns. The actual logic to process data is encapsulated in the datasource member.

func (*ContinuousCollector) Run added in v1.1.1

func (collector *ContinuousCollector) Run(ctx context.Context) error

Run executes the datasource member in a separate goroutine and observes OS signals for a graceful shutdown.
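
A sketch of running the continuous collector as a daemon; the import paths and the assembleDaemon helper are assumptions, and since Run observes OS signals itself, a plain background context suffices:

package main

import (
	"context"

	log "github.com/tommzn/go-log"               // assumed import path of the logger package
	core "github.com/tommzn/hdb-datasource-core" // assumed import path of this package
)

// assembleDaemon is a hypothetical helper that builds the wrapped collector
// (the datasource member) and a logger; wiring is left out of the sketch.
func assembleDaemon() (core.Collector, log.Logger) {
	panic("wire up the wrapped collector and logger here")
}

func main() {
	datasource, logger := assembleDaemon()
	collector := core.NewContinuousCollector(datasource, logger)

	// Run blocks until an OS signal triggers a graceful shutdown.
	if err := collector.Run(context.Background()); err != nil {
		panic(err) // error handling kept trivial for the sketch
	}
}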

type DataSource

type DataSource interface {

	// Fetch retrieves new data and returns it as an event from the central event lib.
	// For more details about events see https://github.com/tommzn/hdb-events-go
	Fetch() (proto.Message, error)
}

DataSource retrieves data from a specific source.
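
A toy implementation satisfying DataSource, as a sketch only: a real datasource would return one of the event types from https://github.com/tommzn/hdb-events-go, while a wrapped string is used here just to keep the example self-contained.

package sketch

import (
	"time"

	"google.golang.org/protobuf/proto"
	"google.golang.org/protobuf/types/known/wrapperspb"
)

// clockSource is a toy DataSource implementation that returns the current
// time as a wrapped string message.
type clockSource struct{}

// Fetch retrieves new data and returns it as a proto message.
func (s *clockSource) Fetch() (proto.Message, error) {
	return wrapperspb.String(time.Now().Format(time.RFC3339)), nil
}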

type EventHandlerS3 added in v1.0.4

type EventHandlerS3 struct {
	// contains filtered or unexported fields
}

EventHandlerS3 is used to process an S3 event sent from CloudWatch to a Lambda function on AWS.

func (*EventHandlerS3) Handle added in v1.0.4

func (handler *EventHandlerS3) Handle(ctx context.Context, event awsevents.S3Event) error

Handle processes the passed S3 event.

type Publisher added in v1.2.0

type Publisher interface {

	// Send will publish the passed message to the given queues.
	Send(message proto.Message) error
}

Publisher is used to send messages to one or multiple queues.

func NewPublisher added in v1.2.0

func NewPublisher(conf config.Config, logger log.Logger) Publisher

NewPublisher creates a new SQS message publisher.
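
A short usage sketch; import paths are assumptions, and the target queues are expected to come from the passed config:

package sketch

import (
	config "github.com/tommzn/go-config"         // assumed import path of the config package
	log "github.com/tommzn/go-log"               // assumed import path of the logger package
	core "github.com/tommzn/hdb-datasource-core" // assumed import path of this package

	"google.golang.org/protobuf/proto"
)

// publish creates an SQS publisher and sends a single message.
func publish(conf config.Config, logger log.Logger, message proto.Message) error {
	publisher := core.NewPublisher(conf, logger)
	return publisher.Send(message)
}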

type S3EventHandler added in v1.0.4

type S3EventHandler interface {

	// Handle processes the passed S3 event.
	Handle(ctx context.Context, event awsevents.S3Event) error
}

S3EventHandler is used to process an event published for S3 actions.

func NewS3EventHandler added in v1.0.4

func NewS3EventHandler(queue string, processor S3EventProcessor, conf config.Config, logger log.Logger) S3EventHandler

NewS3EventHandler returns a new handler to process S3 events sent from CloudWatch.
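
A sketch of wiring the handler into an AWS Lambda entry point; the import paths, the loadDependencies helper and the "archive-queue" name are assumptions for illustration only:

package main

import (
	"github.com/aws/aws-lambda-go/lambda"

	config "github.com/tommzn/go-config"         // assumed import path of the config package
	log "github.com/tommzn/go-log"               // assumed import path of the logger package
	core "github.com/tommzn/hdb-datasource-core" // assumed import path of this package
)

// loadDependencies is a hypothetical helper that provides config, logger and
// an S3EventProcessor implementation; wiring is left out of the sketch.
func loadDependencies() (config.Config, log.Logger, core.S3EventProcessor) {
	panic("wire up config, logger and processor here")
}

func main() {
	conf, logger, processor := loadDependencies()

	// "archive-queue" is a placeholder queue name, not a documented default.
	handler := core.NewS3EventHandler("archive-queue", processor, conf, logger)

	// Handle has a signature the AWS Lambda runtime accepts directly.
	lambda.Start(handler.Handle)
}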

type S3EventProcessor added in v1.0.4

type S3EventProcessor interface {

	// ProcessEvent is called to process the given event for an S3 object.
	// If the download option is enabled via config, the S3 object content is passed as well.
	ProcessEvent(entity awsevents.S3Entity, content []byte) (proto.Message, error)
}

S3EventProcessor processes an event for a specific S3 object.
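
A toy processor implementation, as a sketch only: a real processor would turn the (optionally downloaded) object content into an event from https://github.com/tommzn/hdb-events-go, while a wrapped string keeps this self-contained.

package sketch

import (
	awsevents "github.com/aws/aws-lambda-go/events"
	"google.golang.org/protobuf/proto"
	"google.golang.org/protobuf/types/known/wrapperspb"
)

// objectKeyProcessor is a toy S3EventProcessor implementation.
type objectKeyProcessor struct{}

// ProcessEvent returns the bucket and key of the affected S3 object as a message.
func (p *objectKeyProcessor) ProcessEvent(entity awsevents.S3Entity, content []byte) (proto.Message, error) {
	return wrapperspb.String(entity.Bucket.Name + "/" + entity.Object.Key), nil
}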

type ScheduledCollector added in v1.0.1

type ScheduledCollector struct {
	// contains filtered or unexported fields
}

A ScheduledCollector calls the Fetch method of a datasource once and publishes the returned event to a given AWS SQS queue. It contains a logger to provide insight into all processing steps, and it requires a datasource and a publisher for AWS SQS.

func (*ScheduledCollector) Run added in v1.0.1

func (collector *ScheduledCollector) Run(ctx context.Context) error

Run calls Fetch of the current datasource once and publishes the returned event to a given AWS SQS queue. In case of errors, they'll be logged and returned.

type SqsEventProcessor added in v1.0.3

type SqsEventProcessor interface {

	// Handle processes the given SQS events.
	Handle(ctx context.Context, sqsEvent events.SQSEvent) error
}

SqsEventProcessor is used to handle events forwarded from AWS SQS to a Lambda function.
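
A sketch of an SqsEventProcessor implementation wired directly into a Lambda entry point; the sqsProcessor type is illustrative only and simply prints the received record bodies:

package main

import (
	"context"
	"fmt"

	"github.com/aws/aws-lambda-go/events"
	"github.com/aws/aws-lambda-go/lambda"
)

// sqsProcessor is an illustrative SqsEventProcessor implementation.
type sqsProcessor struct{}

// Handle processes the given SQS events by printing each record body.
func (p *sqsProcessor) Handle(ctx context.Context, sqsEvent events.SQSEvent) error {
	for _, record := range sqsEvent.Records {
		fmt.Println(record.Body)
	}
	return nil
}

func main() {
	lambda.Start((&sqsProcessor{}).Handle)
}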

type SqsPublisher added in v1.2.0

type SqsPublisher struct {
	// contains filtered or unexported fields
}

SqsPublisher is used to publish messages on AWS SQS.

func (*SqsPublisher) Send added in v1.2.0

func (publisher *SqsPublisher) Send(message proto.Message) error

Send will publish the passed message to the given queues.
