encoding

package
v0.8.0
Published: Sep 11, 2023 License: Apache-2.0 Imports: 13 Imported by: 0

Documentation

Index

Constants

This section is empty.

Variables

var (
	// ErrInvalidData is an error that can be returned by an Encoder
	// in case the given input data was invalid within the context of that encoder.
	ErrInvalidData = errors.New("invalid data")
)

Functions

This section is empty.

Types

type Encoder

type Encoder interface {
	// EncodeRows encodes the data, as understood by the encoder,
	// into proto-encoded binary rows. ErrInvalidData is to be returned
	// in case the data was of an unexpected type or format. Any other error
	// can be returned for all other possible error cases.
	EncodeRows(data interface{}) (rows [][]byte, err error)
}

Encoder is the interface required by the BigQuery storage client in order to encode the data into the protobuf format it expects.
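
As an illustration, a minimal sketch of a custom Encoder implementation; rawRowEncoder is hypothetical and not part of this package, and it simply passes through rows that are already proto-encoded binary:

package example

import (
	"errors"
	"fmt"
)

// ErrInvalidData mirrors the package-level error that an Encoder
// is expected to return for data of an unexpected type or format.
var ErrInvalidData = errors.New("invalid data")

// rawRowEncoder is a hypothetical Encoder that accepts rows which
// are already proto-encoded binary and returns them unchanged.
type rawRowEncoder struct{}

func (rawRowEncoder) EncodeRows(data interface{}) ([][]byte, error) {
	switch v := data.(type) {
	case []byte:
		return [][]byte{v}, nil
	case [][]byte:
		return v, nil
	default:
		return nil, fmt.Errorf("%w: unsupported type %T", ErrInvalidData, data)
	}
}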

type ProtobufEncoder

type ProtobufEncoder struct{}

ProtobufEncoder is the preferred encoder shipped with the bqwriter package. It encodes any valid proto Message; the schema itself is to be defined upon creating the actual storage client.

In case using a proto message is not an option for you, you can use the SchemaEncoder instead, but do note that you will most likely pay a performance penalty for doing so.

func NewProtobufEncoder

func NewProtobufEncoder() *ProtobufEncoder

NewProtobufEncoder creates a new ProtobufEncoder. See the documentation on ProtobufEncoder to know what values can be encoded using it.

func (*ProtobufEncoder) EncodeRows

func (pbe *ProtobufEncoder) EncodeRows(data interface{}) ([][]byte, error)

EncodeRows implements Encoder.EncodeRows

Data passed in as input and to be encoded is expected to be a single row of data only.
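
A usage sketch, under assumptions: the import path is hypothetical (adjust it to wherever this package lives in your module), and timestamppb.Timestamp is only a stand-in for whatever proto Message matches your storage client's schema:

package example

import (
	"fmt"

	"google.golang.org/protobuf/types/known/timestamppb"

	"path/to/bqwriter/encoding" // hypothetical import path for this package
)

func encodeProtoRow() error {
	enc := encoding.NewProtobufEncoder()

	// Any valid proto Message works; timestamppb.Timestamp is only a
	// stand-in for whatever message type matches your table's schema.
	rows, err := enc.EncodeRows(timestamppb.Now())
	if err != nil {
		return err
	}
	fmt.Printf("encoded %d row(s), first is %d bytes\n", len(rows), len(rows[0]))
	return nil
}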

type SchemaEncoder

type SchemaEncoder struct {
	// contains filtered or unexported fields
}

SchemaEncoder is an encoder that encodes the data into a single row, based on a dynamically defined BigQuery schema. If possible you should however use the ProtobufEncoder, as it is much more efficient. In contrast, this Encoder requires a lot of reflection as well as possibly some extra trial-and-error.

The following values can be encoded:

  • []byte, expected to be a JSON-encoded message, which will be proto-encoded using a JSON-driven decoder (see the official protobuf protojson package);
  • JsonMarshaler, which will be JSON-encoded to []byte and from there follow the same path as the previous option (see the sketch below);
  • string, expected to be a text-encoded (human-friendly) message, which will be proto-encoded using a text-driven decoder (see the official protobuf prototext package);
  • Stringer, which will be stringified to string and from there follow the same path as the previous option.

Any value of a type other than the ones listed above will be encoded using the bigquery.StructSaver, taking that long road in order to JSON-encode it to a []byte value and then use the JSON-driven decoder, just as if you had passed in a []byte value yourself. It's a long road and all the more inefficient; you have been warned.
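
A sketch of the JsonMarshaler path, assuming that interface corresponds to a MarshalJSON-style contract such as encoding/json.Marshaler; tempReading is a hypothetical row type, not part of this package:

package example

import "encoding/json"

// tempReading is a hypothetical row type; its MarshalJSON output must
// line up with the column names of the BigQuery schema, since the
// SchemaEncoder proto-encodes it via a JSON-driven decoder.
type tempReading struct {
	Name  string
	Value float64
}

func (r tempReading) MarshalJSON() ([]byte, error) {
	return json.Marshal(map[string]interface{}{
		"name":  r.Name,
		"value": r.Value,
	})
}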

func NewSchemaEncoder

func NewSchemaEncoder(schema bigquery.Schema) (*SchemaEncoder, error)

NewSchemaEncoder creates a new SchemaEncoder. It can fail in case the given bigquery.Schema cannot be converted to a Protobuf Descriptor. See the SchemaEncoder documentation to learn more about what kind of values can be encoded using it.

func (*SchemaEncoder) EncodeRows

func (se *SchemaEncoder) EncodeRows(data interface{}) ([][]byte, error)

EncodeRows implements Encoder.EncodeRows

Data passed in as input and to be encoded is expected to be a single row of data only.
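
Putting it together, a hedged end-to-end sketch that builds a SchemaEncoder and encodes a single JSON-encoded row; the import path and the two-column schema are illustrative assumptions, not taken from this package:

package example

import (
	"fmt"

	"cloud.google.com/go/bigquery"

	"path/to/bqwriter/encoding" // hypothetical import path for this package
)

func encodeJSONRow() error {
	// A minimal example schema: one STRING column and one FLOAT column.
	schema := bigquery.Schema{
		{Name: "name", Type: bigquery.StringFieldType},
		{Name: "value", Type: bigquery.FloatFieldType},
	}

	se, err := encoding.NewSchemaEncoder(schema)
	if err != nil {
		return err // e.g. the schema could not be converted to a proto descriptor
	}

	// A single JSON-encoded row, with field names matching the schema.
	rows, err := se.EncodeRows([]byte(`{"name": "sensor-1", "value": 21.5}`))
	if err != nil {
		return err
	}
	fmt.Printf("encoded %d row(s)\n", len(rows))
	return nil
}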
