batch

package
v1.1.1

This package is not in the latest version of its module.
Published: Nov 2, 2020 License: MIT Imports: 15 Imported by: 0

Documentation

Index

Constants

View Source
const (
	// BulkQueued represents the status when the batch operation is queued
	BulkQueued = "QUEUED"
	// BulkUploading represents the uploading status
	BulkUploading = "UPLOADING"
	// BulkSucceeded represents the succeeded status
	BulkSucceeded = "SUCCEEDED"
	// BulkPartialUpload represents the partial upload status
	BulkPartialUpload = "PARTIAL UPLOAD"
	// BulkFailed represents the failed status
	BulkFailed = "FAILED"
	// TableBulkStatus is the name of the table for storing all the batchIDs
	TableBulkStatus = "bulkStatus"
	// TableBulkErrors is the name of the table for storing all the errors of a specific batch
	TableBulkErrors = "bulkErrors"
)

Variables

View Source
var (
	// MaxNumberOfWorkers is the max number of concurrent goroutines for uploading data
	MaxNumberOfWorkers = runtime.NumCPU()
	// FlushIntervalInSec is the number of seconds to wait before executing the Pipeline in case the buffer is not full
	FlushIntervalInSec = 10
	// MaxNumberOfCommandsInPipeline is the maximum number of commands that the Pipeline can execute in one step
	MaxNumberOfCommandsInPipeline = 10000
)

Functions

This section is empty.

Types

type BulkStatus

type BulkStatus string

BulkStatus defines the status of the batch upload from S3

type Data

type Data map[string][]models.ItemScore

Data is the object representing the content of the data parameter in the batch request

type DataUploadedError

type DataUploadedError struct {
	NumberOfLinesFailed string             `json:"numberoflinesfailed" description:"total count of lines that were not uploaded"`
	Errors              []models.LineError `json:"error" description:"errors found"`
}

DataUploadedError is the response payload when the batch upload failed

type Operator

type Operator struct {
	DBClient    db.DB
	CacheClient cache.Cache
	Model       models.Model
}

Operator is the object responsible for uploading data in batch to Database

func NewOperator

func NewOperator(dbc db.DB, m models.Model) *Operator

NewOperator returns the object responsible for uploading the data in batch to Database

func (*Operator) IterateFile

func (o *Operator) IterateFile(rd *bufio.Reader, setName string, rs chan<- *models.RecordQueue, le chan<- models.LineError)

IterateFile iterates over each line in the reader and pushes messages to the channels

func (*Operator) SetStatus

func (o *Operator) SetStatus(batchID, status string) error

SetStatus sets the status in the DB. Any error is only logged

func (*Operator) StoreErrors

func (o *Operator) StoreErrors(batchID string, le chan models.LineError) int

StoreErrors stores in the Database the errors read from the input channel

func (*Operator) UploadDataDirectly

func (o *Operator) UploadDataDirectly(bd []Data) (string, DataUploadedError, error)

UploadDataDirectly inserts the data directly into the Database

func (*Operator) UploadDataFromFile

func (o *Operator) UploadDataFromFile(file *io.ReadCloser, batchID string) error

UploadDataFromFile reads a file and uploads it line by line to the Database under a particular batchID

func (*Operator) UploadRecord

func (o *Operator) UploadRecord(batchID string, rs chan *models.RecordQueue)

UploadRecord stores each message from the channel in the DB
