datasource

package
v0.0.0-...-77bcb31
Published: Jun 14, 2019 License: MIT Imports: 22 Imported by: 0

Documentation

Overview

Copyright 2011 The Go Authors. All rights reserved. Use of this source code is governed by a BSD-style license that can be found in the LICENSE file.

Package datasource contains database/source related types. A few datasources are implemented here (test, csv). This package also includes schema base services (the datasource registry).

Index

Constants

const (
	// Default Max Allowed packets for connections
	MaxAllowedPacket = 4194304
)
const (
	// SchemaDbSourceType is schemadb source type name
	SchemaDbSourceType = "schemadb"
)

Variables

var (

	// DialectWriterCols list of columns for dialectwriter.
	DialectWriterCols = []string{"mysql"}
	// DialectWriters list of different writers.
	DialectWriters = []schema.DialectWriter{&mysqlWriter{}}
)
var (
	// ErrNotDate an error for trying to coerce/convert to Time a field that is not a time.
	ErrNotDate = errors.New("Unable to conver to time value")
)
var (
	// IntrospectCount is default number of rows to evaluate for introspection
	// based schema discovery.
	IntrospectCount = 20
)

Functions

func IntrospectSchema

func IntrospectSchema(s *schema.Schema, name string, iter schema.Iterator) error

IntrospectSchema discovers a schema by introspecting the contents of rows.

func IntrospectTable

func IntrospectTable(tbl *schema.Table, iter schema.Iterator) error

IntrospectTable accepts a table and a schema Iterator, reads a representative sample of rows, and introspects the results to create a schema. Generally used for CSV and JSON files to create strongly typed schemas.
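
Example (an illustrative sketch, not from the package docs; it assumes the import path github.com/araddon/qlbridge/datasource, that *CsvDataSource satisfies schema.Iterator through its Next method, and that obtaining the table via the source's own Table call before introspection is valid even though the CSV reader is forward-only):

package main

import (
	"fmt"
	"strings"

	"github.com/araddon/qlbridge/datasource"
)

func main() {
	raw := "user_id,name,age\n1,aaron,22\n2,bella,30\n"
	exit := make(chan bool)
	src, err := datasource.NewCsvSource("users", 0, strings.NewReader(raw), exit)
	if err != nil {
		panic(err)
	}
	defer src.Close()

	// Obtain the *schema.Table for this source, then sample up to
	// IntrospectCount rows from the iterator to infer column types.
	tbl, err := src.Table("users")
	if err != nil {
		panic(err)
	}
	if err := datasource.IntrospectTable(tbl, src); err != nil {
		panic(err)
	}
	fmt.Println("introspected columns:", src.Columns())
}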

func KeyFromWhere

func KeyFromWhere(wh interface{}) schema.Key

Given a Where expression, try to create a key; requires the form `identity = "value"`.

func MessageConversion

func MessageConversion(vals []interface{}) []schema.Message

MessageConversion converts a slice of values into a slice of schema.Message.

func MysqlValueString

func MysqlValueString(t value.ValueType) string

func NewMySqlSessionVars

func NewMySqlSessionVars() expr.ContextReadWriter

func NewNamespacedContextReader

func NewNamespacedContextReader(basereader expr.ContextReader, namespace string) expr.ContextReader

NewNamespacedContextReader provides a context reader that prefixes all keys with a namespace. This is useful if you have overlapping field names between ContextReaders within a NestedContextReader.

msg.Get("foo.key")
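
Example (an illustrative sketch, not from the package docs; assumes the import path github.com/araddon/qlbridge/datasource; the namespace and field names are illustrative only):

package main

import (
	"fmt"

	"github.com/araddon/qlbridge/datasource"
)

func main() {
	// ContextSimple implements expr.ContextReader, so it can serve as the base reader.
	base := datasource.NewContextSimpleNative(map[string]interface{}{
		"name": "aaron",
		"age":  22,
	})

	// Every key in the base reader is now addressed as "user.<field>".
	user := datasource.NewNamespacedContextReader(base, "user")

	if v, ok := user.Get("user.name"); ok {
		fmt.Println(v)
	}
}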

func NewNestedContextReadWriter

func NewNestedContextReadWriter(readers []expr.ContextReader, writer expr.ContextWriter, ts time.Time) expr.ContextReadWriter

NewNestedContextReadWriter provides a context read/writer that is a composite of ordered child readers; the first reader containing a key is used.

func NewNestedContextReader

func NewNestedContextReader(readers []expr.ContextReader, ts time.Time) expr.ContextReader

NewNestedContextReader provides a context reader that is a composite of ordered child readers; the first reader containing a key is used.
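
Example (an illustrative sketch, not from the package docs; assumes the import paths github.com/araddon/qlbridge/datasource and github.com/araddon/qlbridge/expr):

package main

import (
	"fmt"
	"time"

	"github.com/araddon/qlbridge/datasource"
	"github.com/araddon/qlbridge/expr"
)

func main() {
	row := datasource.NewContextSimpleNative(map[string]interface{}{"city": "Portland"})
	defaults := datasource.NewContextSimpleNative(map[string]interface{}{"city": "Unknown", "country": "US"})

	// Ordered composite: "row" is consulted before "defaults".
	nested := datasource.NewNestedContextReader([]expr.ContextReader{row, defaults}, time.Now())

	if v, ok := nested.Get("city"); ok {
		fmt.Println(v) // from the first reader
	}
	if v, ok := nested.Get("country"); ok {
		fmt.Println(v) // falls through to the second reader
	}
}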

func RowsForSession

func RowsForSession(ctx *plan.Context) [][]driver.Value

func SchemaDBStoreProvider

func SchemaDBStoreProvider(s *schema.Schema) schema.Source

SchemaDBStoreProvider creates a source for schemadb.

Types

type ContextSimple

type ContextSimple struct {
	Data map[string]value.Value
	// contains filtered or unexported fields
}

func NewContextMap

func NewContextMap(data map[string]interface{}, namespacing bool) *ContextSimple

func NewContextMapTs

func NewContextMapTs(data map[string]interface{}, namespacing bool, ts time.Time) *ContextSimple

func NewContextSimple

func NewContextSimple() *ContextSimple

func NewContextSimpleData

func NewContextSimpleData(data map[string]value.Value) *ContextSimple

func NewContextSimpleNative

func NewContextSimpleNative(data map[string]interface{}) *ContextSimple

func NewContextSimpleTs

func NewContextSimpleTs(data map[string]value.Value, ts time.Time) *ContextSimple

func NewMySqlGlobalVars

func NewMySqlGlobalVars() *ContextSimple

func (*ContextSimple) All

func (m *ContextSimple) All() map[string]value.Value

func (*ContextSimple) Body

func (m *ContextSimple) Body() interface{}

func (*ContextSimple) Commit

func (m *ContextSimple) Commit(rowInfo []expr.SchemaInfo, row expr.RowWriter) error

func (*ContextSimple) Delete

func (m *ContextSimple) Delete(row map[string]value.Value) error

func (ContextSimple) Get

func (m ContextSimple) Get(key string) (value.Value, bool)

func (*ContextSimple) Id

func (m *ContextSimple) Id() uint64

func (*ContextSimple) Put

func (*ContextSimple) Row

func (m *ContextSimple) Row() map[string]value.Value

func (*ContextSimple) SupportNamespacing

func (m *ContextSimple) SupportNamespacing()

func (*ContextSimple) Ts

func (m *ContextSimple) Ts() time.Time

type ContextWrapper

type ContextWrapper struct {
	// contains filtered or unexported fields
}

func NewContextWrapper

func NewContextWrapper(val interface{}) *ContextWrapper

func (*ContextWrapper) Get

func (m *ContextWrapper) Get(key string) (value.Value, bool)

func (*ContextWrapper) Row

func (m *ContextWrapper) Row() map[string]value.Value

func (*ContextWrapper) Ts

func (m *ContextWrapper) Ts() time.Time

type CsvDataSource

type CsvDataSource struct {
	// contains filtered or unexported fields
}

CsvDataSource implements the qlbridge schema DataSource, SourceConn, and Scanner interfaces to allow csv files to be full featured databases.
- very, very naive scanner, forward only single pass
- can open a file with .Open()
- assumes comma delimited
- not thread-safe
- does not implement write operations

func NewCsvSource

func NewCsvSource(table string, indexCol int, ior io.Reader, exit <-chan bool) (*CsvDataSource, error)

NewCsvSource creates a reader that assumes the first row contains the headers; the input may optionally be gzipped.
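
Example (an illustrative sketch, not from the package docs; assumes the import path github.com/araddon/qlbridge/datasource and that a nil message from Next signals end of data):

package main

import (
	"fmt"
	"strings"

	"github.com/araddon/qlbridge/datasource"
)

func main() {
	raw := "user_id,name,age\n1,aaron,22\n2,bella,30\n"
	exit := make(chan bool)

	// The first row is read as the header row.
	src, err := datasource.NewCsvSource("users", 0, strings.NewReader(raw), exit)
	if err != nil {
		panic(err)
	}
	defer src.Close()

	fmt.Println(src.Columns())

	// Forward-only, single-pass scan.
	for msg := src.Next(); msg != nil; msg = src.Next() {
		fmt.Println(msg.Id(), msg.Body())
	}
}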

func (*CsvDataSource) Close

func (m *CsvDataSource) Close() error

func (*CsvDataSource) Columns

func (m *CsvDataSource) Columns() []string

func (*CsvDataSource) Init

func (m *CsvDataSource) Init()

func (*CsvDataSource) Next

func (m *CsvDataSource) Next() schema.Message

func (*CsvDataSource) Open

func (m *CsvDataSource) Open(connInfo string) (schema.Conn, error)

func (*CsvDataSource) Setup

func (m *CsvDataSource) Setup(*schema.Schema) error

func (*CsvDataSource) Table

func (m *CsvDataSource) Table(tableName string) (*schema.Table, error)

func (*CsvDataSource) Tables

func (m *CsvDataSource) Tables() []string

type FileLineHandler

type FileLineHandler func(line []byte) (schema.Message, error)

type JsonHelperScannable

type JsonHelperScannable u.JsonHelper

JsonHelperScannable expects a json map (not an array), i.e. map[string]interface{}.

func (*JsonHelperScannable) MarshalJSON

func (m *JsonHelperScannable) MarshalJSON() ([]byte, error)

func (*JsonHelperScannable) Scan

func (m *JsonHelperScannable) Scan(src interface{}) error

Scan implements the database/sql Scanner interface, scanning sql byte vals into this typed structure.

func (*JsonHelperScannable) UnmarshalJSON

func (m *JsonHelperScannable) UnmarshalJSON(data []byte) error

UnmarshalJSON bytes into this typed struct

func (JsonHelperScannable) Value

func (m JsonHelperScannable) Value() (driver.Value, error)

Value implements the go sql/driver Valuer interface needed to allow conversion back and forth.

type JsonSource

type JsonSource struct {
	// contains filtered or unexported fields
}

JsonSource implements the qlbridge schema DataSource, SourceConn, and Scanner interfaces to allow newline-delimited json files to be full featured databases.
- very, very naive scanner, forward only single pass
- can open a file with .Open()
- not thread-safe
- does not implement write operations

func NewJsonSource

func NewJsonSource(table string, rc io.ReadCloser, exit <-chan bool, lh FileLineHandler) (*JsonSource, error)

NewJsonSource creates a reader that assumes a newline-delimited json file; the input may optionally be gzipped.
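
Example (an illustrative sketch, not from the package docs; assumes the import path github.com/araddon/qlbridge/datasource, that passing a nil FileLineHandler falls back to a default newline-delimited json handler, and that Next returns nil at end of data):

package main

import (
	"fmt"
	"io"
	"strings"

	"github.com/araddon/qlbridge/datasource"
)

func main() {
	raw := `{"user_id":1,"name":"aaron"}
{"user_id":2,"name":"bella"}
`
	exit := make(chan bool)

	// nil line handler: assumed to use the default newline-delimited json handling.
	src, err := datasource.NewJsonSource("users", io.NopCloser(strings.NewReader(raw)), exit, nil)
	if err != nil {
		panic(err)
	}
	defer src.Close()

	for msg := src.Next(); msg != nil; msg = src.Next() {
		fmt.Println(msg.Body())
	}
}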

func (*JsonSource) Close

func (m *JsonSource) Close() error

func (*JsonSource) Columns

func (m *JsonSource) Columns() []string

func (*JsonSource) CreateIterator

func (m *JsonSource) CreateIterator() schema.Iterator

func (*JsonSource) Init

func (m *JsonSource) Init()

func (*JsonSource) Next

func (m *JsonSource) Next() schema.Message

func (*JsonSource) Open

func (m *JsonSource) Open(connInfo string) (schema.Conn, error)

func (*JsonSource) Setup

func (m *JsonSource) Setup(*schema.Schema) error

func (*JsonSource) Table

func (m *JsonSource) Table(tableName string) (*schema.Table, error)

func (*JsonSource) Tables

func (m *JsonSource) Tables() []string

type JsonWrapper

type JsonWrapper json.RawMessage

JsonWrapper json data

func (*JsonWrapper) MarshalJSON

func (m *JsonWrapper) MarshalJSON() ([]byte, error)

func (*JsonWrapper) Scan

func (m *JsonWrapper) Scan(src interface{}) error

func (*JsonWrapper) Unmarshal

func (m *JsonWrapper) Unmarshal(v interface{}) error

func (*JsonWrapper) UnmarshalJSON

func (m *JsonWrapper) UnmarshalJSON(data []byte) error

UnmarshalJSON bytes into this typed struct

func (JsonWrapper) Value

func (m JsonWrapper) Value() (driver.Value, error)

Value implements the go sql/driver Valuer interface needed to allow conversion back and forth.

type KeyCol

type KeyCol struct {
	Name string
	Val  driver.Value
}

Variety of Key Types

func NewKeyCol

func NewKeyCol(name string, val driver.Value) KeyCol

func (KeyCol) Key

func (m KeyCol) Key() driver.Value

type KeyInt

type KeyInt struct {
	Id int
}

Variety of Key Types

func NewKeyInt

func NewKeyInt(key int) KeyInt

func (*KeyInt) Key

func (m *KeyInt) Key() driver.Value

type KeyInt64

type KeyInt64 struct {
	Id int64
}

Variety of Key Types

func NewKeyInt64

func NewKeyInt64(key int64) KeyInt64

func (*KeyInt64) Key

func (m *KeyInt64) Key() driver.Value

type NamespacedContextReader

type NamespacedContextReader struct {
	// contains filtered or unexported fields
}

func (*NamespacedContextReader) Get

func (*NamespacedContextReader) Row

func (*NamespacedContextReader) Ts

type NestedContextReader

type NestedContextReader struct {
	// contains filtered or unexported fields
}

func (*NestedContextReader) Delete

func (n *NestedContextReader) Delete(delRow map[string]value.Value) error

func (*NestedContextReader) Get

func (n *NestedContextReader) Get(key string) (value.Value, bool)

func (*NestedContextReader) Put

func (*NestedContextReader) Row

func (n *NestedContextReader) Row() map[string]value.Value

func (*NestedContextReader) Ts

func (n *NestedContextReader) Ts() time.Time

type SchemaDb

type SchemaDb struct {
	// contains filtered or unexported fields
}

SchemaDb is a static schema source; it implements the qlbridge DataSource interfaces to allow in-memory native go data to have a Schema and be operated on by SQL operations.

func NewSchemaDb

func NewSchemaDb(s *schema.Schema) *SchemaDb

NewSchemaDb creates a new db for storing schema.

func (*SchemaDb) Close

func (m *SchemaDb) Close() error

Close down everything.

func (*SchemaDb) DropTable

func (m *SchemaDb) DropTable(t string) error

func (*SchemaDb) Init

func (m *SchemaDb) Init()

Init initializes.

func (*SchemaDb) Open

func (m *SchemaDb) Open(schemaObjectName string) (schema.Conn, error)

Open creates a SchemaSource specific to a schema object (table, database).

func (*SchemaDb) Setup

func (m *SchemaDb) Setup(*schema.Schema) error

Setup the schemadb

func (*SchemaDb) Table

func (m *SchemaDb) Table(table string) (*schema.Table, error)

Table gets the schema Table.

func (*SchemaDb) Tables

func (m *SchemaDb) Tables() []string

Tables list of table names.

type SchemaSource

type SchemaSource struct {
	// contains filtered or unexported fields
}

SchemaSource type for the schemadb connection (thread-safe).

func (*SchemaSource) Close

func (m *SchemaSource) Close() error

func (*SchemaSource) Columns

func (m *SchemaSource) Columns() []string

func (*SchemaSource) Get

func (m *SchemaSource) Get(key driver.Value) (schema.Message, error)

func (*SchemaSource) Next

func (m *SchemaSource) Next() schema.Message

func (*SchemaSource) SetContext

func (m *SchemaSource) SetContext(ctx *plan.Context)

SetContext sets the plan context.

func (*SchemaSource) SetRows

func (m *SchemaSource) SetRows(rows [][]driver.Value)

type SqlDriverMessage

type SqlDriverMessage struct {
	Vals  []driver.Value
	IdVal uint64
}

SqlDriverMessage context message of values.

func NewSqlDriverMessage

func NewSqlDriverMessage(id uint64, row []driver.Value) *SqlDriverMessage

func (*SqlDriverMessage) Body

func (m *SqlDriverMessage) Body() interface{}

func (*SqlDriverMessage) Id

func (m *SqlDriverMessage) Id() uint64

func (*SqlDriverMessage) ToMsgMap

func (m *SqlDriverMessage) ToMsgMap(colidx map[string]int) *SqlDriverMessageMap

type SqlDriverMessageMap

type SqlDriverMessageMap struct {
	Vals     []driver.Value // Values
	ColIndex map[string]int // Map of column names to ordinal position in vals
	IdVal    uint64         // id()
	// contains filtered or unexported fields
}

SqlDriverMessageMap Context message with column/position info.

func NewSqlDriverMessageMap

func NewSqlDriverMessageMap(id uint64, row []driver.Value, colindex map[string]int) *SqlDriverMessageMap

func NewSqlDriverMessageMapCtx

func NewSqlDriverMessageMapCtx(id uint64, ctx expr.ContextReader, colindex map[string]int) *SqlDriverMessageMap

func NewSqlDriverMessageMapEmpty

func NewSqlDriverMessageMapEmpty() *SqlDriverMessageMap

func NewSqlDriverMessageMapVals

func NewSqlDriverMessageMapVals(id uint64, row []driver.Value, cols []string) *SqlDriverMessageMap
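
Example (an illustrative sketch, not from the package docs; assumes the import path github.com/araddon/qlbridge/datasource):

package main

import (
	"database/sql/driver"
	"fmt"

	"github.com/araddon/qlbridge/datasource"
)

func main() {
	cols := []string{"user_id", "name"}
	row := []driver.Value{int64(1), "aaron"}

	// Builds the column-name -> ordinal index from cols.
	msg := datasource.NewSqlDriverMessageMapVals(1, row, cols)

	fmt.Println(msg.Id())     // 1
	fmt.Println(msg.Values()) // the raw []driver.Value
	if v, ok := msg.Get("name"); ok {
		fmt.Println(v) // positional lookup through the column index
	}
}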

func (*SqlDriverMessageMap) Body

func (m *SqlDriverMessageMap) Body() interface{}

func (*SqlDriverMessageMap) Copy

func (*SqlDriverMessageMap) Get

func (m *SqlDriverMessageMap) Get(key string) (value.Value, bool)

func (*SqlDriverMessageMap) Id

func (m *SqlDriverMessageMap) Id() uint64

func (*SqlDriverMessageMap) Key

func (m *SqlDriverMessageMap) Key() driver.Value

func (*SqlDriverMessageMap) Row

func (m *SqlDriverMessageMap) Row() map[string]value.Value

func (*SqlDriverMessageMap) SetKey

func (m *SqlDriverMessageMap) SetKey(key string)

func (*SqlDriverMessageMap) SetKeyHashed

func (m *SqlDriverMessageMap) SetKeyHashed(key string)

func (*SqlDriverMessageMap) SetRow

func (m *SqlDriverMessageMap) SetRow(row []driver.Value)

func (*SqlDriverMessageMap) Ts

func (m *SqlDriverMessageMap) Ts() time.Time

func (*SqlDriverMessageMap) Values

func (m *SqlDriverMessageMap) Values() []driver.Value

type StringArray

type StringArray []string

StringArray converts json to an array of strings.

func (*StringArray) MarshalJSON

func (m *StringArray) MarshalJSON() ([]byte, error)

func (*StringArray) Scan

func (m *StringArray) Scan(src interface{}) error

Scan implements the database/sql Scanner interface, scanning sql byte vals into this typed structure.
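
Example (an illustrative sketch, not from the package docs; assumes the import path github.com/araddon/qlbridge/datasource and that Value returns the json-encoded bytes):

package main

import (
	"fmt"

	"github.com/araddon/qlbridge/datasource"
)

func main() {
	var tags datasource.StringArray

	// Scan accepts the raw bytes a sql driver hands back for a json column.
	if err := tags.Scan([]byte(`["red","green","blue"]`)); err != nil {
		panic(err)
	}
	fmt.Println(tags)

	// Value re-encodes the slice for writing back through database/sql.
	v, err := tags.Value()
	if err != nil {
		panic(err)
	}
	fmt.Printf("%s\n", v)
}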

func (*StringArray) UnmarshalJSON

func (m *StringArray) UnmarshalJSON(data []byte) error

func (StringArray) Value

func (m StringArray) Value() (driver.Value, error)

Value converts strings to json values.

type TimeValue

type TimeValue time.Time

TimeValue converts a string/bytes to time.Time by parsing the string with the wide variety of date formats supported in http://godoc.org/github.com/araddon/dateparse
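
Example (an illustrative sketch, not from the package docs; assumes the import path github.com/araddon/qlbridge/datasource and that Scan accepts a plain string as well as []byte):

package main

import (
	"fmt"

	"github.com/araddon/qlbridge/datasource"
)

func main() {
	var ts datasource.TimeValue

	// Any format dateparse understands should work here.
	if err := ts.Scan("2019/06/14 10:30:00"); err != nil {
		panic(err)
	}
	fmt.Println(ts.Time().Year()) // 2019
}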

func (*TimeValue) MarshalJSON

func (m *TimeValue) MarshalJSON() ([]byte, error)

func (*TimeValue) Scan

func (m *TimeValue) Scan(src interface{}) error

func (*TimeValue) Time

func (m *TimeValue) Time() time.Time

func (*TimeValue) UnmarshalJSON

func (m *TimeValue) UnmarshalJSON(data []byte) error

func (TimeValue) Value

func (m TimeValue) Value() (driver.Value, error)

Directories

Path Synopsis
Package files is a cloud (gcs, s3) and local file datasource that translates json, csv files into the appropriate interface for qlbridge DataSource so we can run queries.
Membtree implements a Datasource in-memory implementation using the google btree.
Memdb package implements a Qlbridge Datasource in-memory implementation using the hashicorp go-memdb (immutable radix trees).
Package mockcsv implements an in-memory csv data source for testing usage, implemented by wrapping the mem-b-tree and loading csv data into it.
Package mockcsvtestdata is csv test data only used for tests.
Package sqlite implements a Qlbridge Datasource interface around sqlite that translates mysql syntax to sqlite.
