dataset: github.com/qri-io/dataset

package dataset

import "github.com/qri-io/dataset"

Package dataset contains the qri ("query") dataset document definition. This package contains the base definition, as well as a number of subpackages that build from this base to add functionality as necessary. Datasets take inspiration from HTML documents, delineating semantic purpose via predefined tags of the document, but instead of orienting around presentational markup, dataset documents emphasize interoperability and composition. The principal encoding format for a dataset document is JSON.

Alpha-Keys: Dataset documents are designed to produce consistent checksums when encoded for storage & transmission. To keep hashing consistent, map keys are sorted lexicographically for encoding. This applies to all fields of a dataset document except the body of a dataset, where users may need to dictate the ordering of map keys.

Pod ("Plain Old Data") Pattern: To maintain high interoperability, dataset documents must support encoding & decoding ("coding", or "serialization") to and from many formats. Fields of dataset documents that leverage "exotic" custom types are accompanied by a "Plain Old Data" variant, denoted by a "Pod" suffix in their name. Plain-Old-Data variants use only basic go types: string, bool, int, float64, []interface{}, etc., and have methods for clean encoding and decoding to and from their exotic forms.

Index

Package Files

commit.go compare.go data_format.go data_format_config.go dataset.go hash.go kind.go meta.go structure.go transform.go viz.go

Constants

const (
    // KindDataset is the current kind for datasets
    KindDataset = Kind("ds:" + CurrentSpecVersion)
    // KindMeta is the current kind for metadata
    KindMeta = Kind("md:" + CurrentSpecVersion)
    // KindStructure is the current kind for dataset structures
    KindStructure = Kind("st:" + CurrentSpecVersion)
    // KindTransform is the current kind for dataset transforms
    KindTransform = Kind("tf:" + CurrentSpecVersion)
    // KindCommit is the current kind for dataset commits
    KindCommit = Kind("cm:" + CurrentSpecVersion)
    // KindViz is the current kind for dataset visualizations
    KindViz = Kind("vz:" + CurrentSpecVersion)
)
const CurrentSpecVersion = "0"

CurrentSpecVersion is the current version of the dataset spec

Variables

var (
    // ErrInlineBody is the error for attempting to generate a body file when
    // body data is stored as native go types
    ErrInlineBody = fmt.Errorf("dataset body is inlined")
    // ErrNoResolver is an error for missing-but-needed resolvers
    ErrNoResolver = fmt.Errorf("no resolver available to fetch path")
)
var (
    // BaseSchemaArray is a minimum schema to constitute a dataset, specifying
    // the top level of the document is an array
    BaseSchemaArray = map[string]interface{}{"type": "array"}
    // BaseSchemaObject is a minimum schema to constitute a dataset, specifying
    // the top level of the document is an object
    BaseSchemaObject = map[string]interface{}{"type": "object"}
)
var ErrUnknownDataFormat = fmt.Errorf("Unknown Data Format")

ErrUnknownDataFormat is the expected error for when a data format is missing or unknown

func AbstractColumnName

func AbstractColumnName(i int) string

AbstractColumnName returns the "base26" value of a column index, used to make short, sql-valid, deterministic column names
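The package does not document the exact scheme, but a plausible sketch of spreadsheet-style base26 naming (a hypothetical implementation, which may differ from AbstractColumnName's actual output) looks like:

```go
package main

import "fmt"

// abstractColumnName sketches one plausible "base26" scheme:
// 0 → "a", 25 → "z", 26 → "aa", like lowercase spreadsheet columns.
func abstractColumnName(i int) string {
	name := ""
	for {
		// take the least-significant base26 "digit"
		name = string(rune('a'+i%26)) + name
		// shift right, adjusting because "a" doubles as both
		// digit zero and the first one-letter name
		i = i/26 - 1
		if i < 0 {
			break
		}
	}
	return name
}

func main() {
	for _, i := range []int{0, 25, 26, 27} {
		fmt.Println(i, abstractColumnName(i))
	}
	// prints: 0 a / 25 z / 26 aa / 27 ab
}
```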

func AccuralDuration

func AccuralDuration(p string) time.Duration

AccuralDuration takes an ISO 8601 periodicity measure & returns a time.Duration. Invalid periodicities return time.Duration(0)

func CompareCommits

func CompareCommits(a, b *Commit) error

CompareCommits checks if all fields of a Commit are equal, returning an error on the first mismatch, nil if equal. Note that comparison does not examine the internal path property

func CompareDatasets

func CompareDatasets(a, b *Dataset) error

CompareDatasets checks if all fields of a dataset are equal, returning an error on the first mismatch, nil if equal. Note that comparison does not examine the internal path property

func CompareLicenses

func CompareLicenses(a, b *License) error

CompareLicenses checks if all fields in two License pointers are equal, returning an error if unequal

func CompareMetas

func CompareMetas(a, b *Meta) error

CompareMetas checks if all fields of a metadata struct are equal, returning an error on the first mismatch, nil if equal. Note that comparison does not examine the internal path property

func CompareSchemas

func CompareSchemas(a, b map[string]interface{}) error

CompareSchemas checks if all fields of two Schema pointers are equal, returning an error on the first mismatch, nil if equal. Note that comparison does not examine the internal path property

func CompareStringSlices

func CompareStringSlices(a, b []string) error

CompareStringSlices confirms two string slices are the same size and contain the same values in the same order

func CompareStructures

func CompareStructures(a, b *Structure) error

CompareStructures checks if all fields of two structure pointers are equal, returning an error on the first mismatch, nil if equal. Note that comparison does not examine the internal path property

func CompareTransformResources

func CompareTransformResources(a, b *TransformResource) error

CompareTransformResources checks if all fields are equal in both resources

func CompareTransforms

func CompareTransforms(a, b *Transform) error

CompareTransforms checks if all fields of two transform pointers are equal, returning an error on the first mismatch, nil if equal. Note that comparison does not examine the internal path property

func CompareVizs

func CompareVizs(a, b *Viz) error

CompareVizs checks if all fields of two Viz pointers are equal, returning an error on the first mismatch, nil if equal. Note that comparison does not examine the internal path property

func HashBytes

func HashBytes(data []byte) (hash string, err error)

HashBytes generates the base-58 encoded SHA-256 hash of a byte slice. It's important to note that this is *NOT* the same as an IPFS hash. These hash functions should be used for other things like checksumming, in-memory content-addressing, etc.
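A sketch of the described digest-then-encode flow, with a hand-rolled base58 encoder using the Bitcoin/IPFS alphabet (the real HashBytes may wrap the digest as a multihash first, so outputs here are illustrative only):

```go
package main

import (
	"crypto/sha256"
	"fmt"
	"math/big"
)

const b58Alphabet = "123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz"

// base58Encode converts bytes to a base-58 string, preserving
// leading zero bytes as '1' characters.
func base58Encode(b []byte) string {
	x := new(big.Int).SetBytes(b)
	radix, mod := big.NewInt(58), new(big.Int)
	var out []byte
	// repeatedly divide by 58, collecting remainders as digits
	for x.Sign() > 0 {
		x.DivMod(x, radix, mod)
		out = append([]byte{b58Alphabet[mod.Int64()]}, out...)
	}
	// each leading zero byte maps to the alphabet's zero digit '1'
	for _, v := range b {
		if v != 0 {
			break
		}
		out = append([]byte{'1'}, out...)
	}
	return string(out)
}

func main() {
	sum := sha256.Sum256([]byte(`["a","b","c"]`))
	fmt.Println(base58Encode(sum[:]))
}
```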

func JSONHash

func JSONHash(m json.Marshaler) (hash string, err error)

JSONHash calculates the hash of a json.Marshaler. It's important to note that this is *NOT* the same as an IPFS hash. These hash functions should be used for other things like checksumming, in-memory content-addressing, etc.

type CSVOptions

type CSVOptions struct {
    // HeaderRow specifies whether this csv file has a header row or not
    HeaderRow bool `json:"headerRow"`
    // If LazyQuotes is true, a quote may appear in an unquoted field and a
    // non-doubled quote may appear in a quoted field.
    LazyQuotes bool `json:"lazyQuotes"`
    // Separator is the field delimiter.
    // It is set to comma (',') by NewReader.
    // Comma must be a valid rune and must not be \r, \n,
    // or the Unicode replacement character (0xFFFD).
    Separator rune `json:"separator,omitempty"`
    // VariadicFields permits records to have a variable number of fields.
    // Avoid using this
    VariadicFields bool `json:"variadicFields"`
}

CSVOptions specifies configuration details for csv files. This will expand in the future to interoperate with the okfn csv spec

func NewCSVOptions

func NewCSVOptions(opts map[string]interface{}) (*CSVOptions, error)

NewCSVOptions creates a CSVOptions pointer from a map

func (*CSVOptions) Format

func (*CSVOptions) Format() DataFormat

Format announces the CSV Data Format for the FormatConfig interface

func (*CSVOptions) Map

func (o *CSVOptions) Map() map[string]interface{}

Map returns a map[string]interface representation of the configuration

type Citation

type Citation struct {
    Name  string `json:"name,omitempty"`
    URL   string `json:"url,omitempty"`
    Email string `json:"email,omitempty"`
}

Citation is a place that this dataset drew its information from

func (*Citation) Decode

func (c *Citation) Decode(val interface{}) (err error)

Decode reads json.Unmarshal-style data into a Citation

type Commit

type Commit struct {
    // Author of this commit
    Author *User `json:"author,omitempty"`
    // Message is an optional commit message
    Message string `json:"message,omitempty"`
    // Path is the location of this commit, transient
    Path string `json:"path,omitempty"`
    // Qri is this commit's qri kind
    Qri string `json:"qri,omitempty"`
    // Signature is a base58 encoded privateKey signing of Title
    Signature string `json:"signature,omitempty"`
    // Time this dataset was created. Required.
    Timestamp time.Time `json:"timestamp"`
    // Title of the commit. Required.
    Title string `json:"title"`
}

Commit encapsulates information about changes to a dataset in relation to other entries in a given history. Commit is directly analogous to the concept of a Commit Message in the git version control system. A full commit defines the administrative metadata of a dataset, answering "who made this dataset, when, and why"

func NewCommitRef

func NewCommitRef(path string) *Commit

NewCommitRef creates an empty Commit with its internal path set

func UnmarshalCommit

func UnmarshalCommit(v interface{}) (*Commit, error)

UnmarshalCommit tries to extract a dataset type from an empty interface. Pairs nicely with datastore.Get() from github.com/ipfs/go-datastore

func (*Commit) Assign

func (cm *Commit) Assign(msgs ...*Commit)

Assign collapses all properties of a set of Commits onto one. This is directly inspired by Javascript's Object.assign
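The Object.assign-style merge can be sketched with a simplified stand-in struct (a two-field toy, not the real Commit):

```go
package main

import "fmt"

// commit is a simplified stand-in for dataset.Commit, used only
// to illustrate the Object.assign-style merge.
type commit struct {
	Title   string
	Message string
}

// assign copies every non-zero field of each argument onto cm,
// later arguments overwriting earlier ones; nil entries are skipped.
func (cm *commit) assign(msgs ...*commit) {
	for _, m := range msgs {
		if m == nil {
			continue
		}
		if m.Title != "" {
			cm.Title = m.Title
		}
		if m.Message != "" {
			cm.Message = m.Message
		}
	}
}

func main() {
	base := &commit{Title: "initial"}
	base.assign(nil, &commit{Message: "add schema"}, &commit{Title: "update"})
	fmt.Printf("%+v\n", *base)
	// prints: {Title:update Message:add schema}
}
```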

func (*Commit) DropTransientValues

func (cm *Commit) DropTransientValues()

DropTransientValues removes values that cannot be recorded when the dataset is rendered immutable, usually by storing it in a cafs

func (*Commit) IsEmpty

func (cm *Commit) IsEmpty() bool

IsEmpty checks to see if any fields are filled out other than Path and Qri

func (*Commit) MarshalJSON

func (cm *Commit) MarshalJSON() ([]byte, error)

MarshalJSON implements the json.Marshaler interface for Commit. Empty Commit instances with a non-empty path marshal to their path value; otherwise, Commit marshals to an object

func (*Commit) MarshalJSONObject

func (cm *Commit) MarshalJSONObject() ([]byte, error)

MarshalJSONObject always marshals to a json Object, even if the commit is empty or a reference

func (*Commit) UnmarshalJSON

func (cm *Commit) UnmarshalJSON(data []byte) error

UnmarshalJSON implements json.Unmarshaller for Commit

type DataFormat

type DataFormat int

DataFormat represents different types of data formats. Formats specified here have some degree of support within the dataset packages. TODO - consider placing this in a subpackage: dataformats

const (
    // UnknownDataFormat is the default dataformat, meaning
    // that a data format should always be specified when
    // using the DataFormat type
    UnknownDataFormat DataFormat = iota
    // CSVDataFormat specifies comma separated value-formatted data
    CSVDataFormat
    // JSONDataFormat specifies Javascript Object Notation-formatted data
    JSONDataFormat
    // CBORDataFormat specifies RFC 7049 Concise Binary Object Representation
    // read more at cbor.io
    CBORDataFormat
    // XMLDataFormat specifies eXtensible Markup Language-formatted data
    // currently not supported.
    XMLDataFormat
    // XLSXDataFormat specifies microsoft excel formatted data
    XLSXDataFormat
)

func ParseDataFormatString

func ParseDataFormatString(s string) (df DataFormat, err error)

ParseDataFormatString takes a string representation of a data format and returns the corresponding DataFormat. TODO (b5): trim "." prefix, remove prefixed map keys

func SupportedDataFormats

func SupportedDataFormats() []DataFormat

SupportedDataFormats gives a slice of data formats that are expected to work with this dataset package. As we work through support for different formats, the last step of providing full support to a format will be an addition to this slice

func (DataFormat) MarshalJSON

func (f DataFormat) MarshalJSON() ([]byte, error)

MarshalJSON satisfies the json.Marshaler interface

func (DataFormat) String

func (f DataFormat) String() string

String implements stringer interface for DataFormat

func (*DataFormat) UnmarshalJSON

func (f *DataFormat) UnmarshalJSON(data []byte) error

UnmarshalJSON satisfies the json.Unmarshaler interface

type Dataset

type Dataset struct {

    // Body represents dataset data with native go types.
    // Datasets have at most one body. Body, BodyBytes, and BodyPath
    // work together, often with only one field used at a time
    Body interface{} `json:"body,omitempty"`
    // BodyBytes is for representing dataset data as a slice of bytes
    BodyBytes []byte `json:"bodyBytes,omitempty"`
    // BodyPath is the path to the hash of raw data as it resolves on the network
    BodyPath string `json:"bodyPath,omitempty"`

    // Commit contains author & change message information that describes this
    // version of a dataset
    Commit *Commit `json:"commit,omitempty"`
    // Meta contains all human-readable meta about this dataset intended to aid
    // in discovery and organization of this document
    Meta *Meta `json:"meta,omitempty"`

    // name reference for this dataset, transient
    Name string `json:"name,omitempty"`
    // Location of this dataset, transient
    Path string `json:"path,omitempty"`
    // Peername of dataset owner, transient
    Peername string `json:"peername,omitempty"`
    // PreviousPath connects datasets to form a historical merkle-DAG of snapshots
    // of this document, creating a version history
    PreviousPath string `json:"previousPath,omitempty"`
    // ProfileID of dataset owner, transient
    ProfileID string `json:"profileID,omitempty"`
    // Qri is a key for both identifying this document type, and versioning the
    // dataset document definition itself.
    Qri string `json:"qri"`
    // Structure of this dataset
    Structure *Structure `json:"structure,omitempty"`
    // Transform is a path to the transformation that generated this resource
    Transform *Transform `json:"transform,omitempty"`
    // Viz stores configuration data related to representing a dataset as
    // a visualization
    Viz *Viz `json:"viz,omitempty"`
    // contains filtered or unexported fields
}

Dataset is a document for describing & storing structured data. Dataset documents are designed to satisfy the FAIR principle of being Findable, Accessible, Interoperable, and Reproducible, in relation to other dataset documents and related-but-separate technologies such as data catalogs, HTTP APIs, and data package formats. Datasets are designed to be stored and distributed on content-addressed (identify-by-hash) systems. The dataset document definition is built from a research-first principle, valuing direct interoperability with existing standards over novel definitions or specifications

func NewDatasetRef

func NewDatasetRef(path string) *Dataset

NewDatasetRef creates a Dataset pointer with the internal path property specified, and no other fields.

func UnmarshalDataset

func UnmarshalDataset(v interface{}) (*Dataset, error)

UnmarshalDataset tries to extract a dataset type from an empty interface. Pairs nicely with datastore.Get() from github.com/ipfs/go-datastore

func (*Dataset) Assign

func (ds *Dataset) Assign(datasets ...*Dataset)

Assign collapses all properties of a group of datasets onto one. This is directly inspired by Javascript's Object.assign

func (*Dataset) BodyFile

func (ds *Dataset) BodyFile() qfs.File

BodyFile exposes bodyFile if one is set. Callers that use the file in any way (eg. by calling Read) should consume the entire file and call Close

func (*Dataset) DropTransientValues

func (ds *Dataset) DropTransientValues()

DropTransientValues removes values that cannot be recorded when the dataset is rendered immutable, usually by storing it in a cafs. Note that DropTransientValues does *not* drop the transient values of child components of a dataset; each component's DropTransientValues method must be called separately

func (*Dataset) IsEmpty

func (ds *Dataset) IsEmpty() bool

IsEmpty checks to see if dataset has any fields other than the Path & Qri fields

func (*Dataset) MarshalJSON

func (ds *Dataset) MarshalJSON() ([]byte, error)

MarshalJSON uses a map to combine meta & standard fields. Marshalling a map[string]interface{} automatically alpha-sorts the keys.

func (*Dataset) OpenBodyFile

func (ds *Dataset) OpenBodyFile(resolver qfs.PathResolver) (err error)

OpenBodyFile sets the byte stream of file data, prioritizing: erroring when the body is inline, creating an in-place file from bytes, and passing BodyPath to the resolver otherwise. Once resolved, the file is set to an internal field, which is accessible via the BodyFile method. Separating into two steps decouples loading from access

func (*Dataset) SetBodyFile

func (ds *Dataset) SetBodyFile(file qfs.File)

SetBodyFile assigns the bodyFile.

func (*Dataset) SignableBytes

func (ds *Dataset) SignableBytes() ([]byte, error)

SignableBytes produces the portion of a commit message used for signing. The format for signable bytes is: the commit timestamp in RFC3339 format, UTC timezone; a newline character; the dataset structure checksum string. The checksum string should be a base58-encoded multihash of the dataset data

func (*Dataset) UnmarshalJSON

func (ds *Dataset) UnmarshalJSON(data []byte) error

UnmarshalJSON implements json.Unmarshaller

type FormatConfig

type FormatConfig interface {
    // Format gives the data format being configured
    Format() DataFormat
    // Map gives an object of configuration details
    Map() map[string]interface{}
}

FormatConfig is the interface for data format configurations

func NewXLSXOptions

func NewXLSXOptions(opts map[string]interface{}) (FormatConfig, error)

NewXLSXOptions creates a XLSXOptions pointer from a map

func ParseFormatConfigMap

func ParseFormatConfigMap(f DataFormat, opts map[string]interface{}) (FormatConfig, error)

ParseFormatConfigMap returns a FormatConfig implementation for a given data format and options map, often used in decoding from recorded formats like, say, JSON

type JSONOptions

type JSONOptions struct {
}

JSONOptions specifies configuration details for json file format

func NewJSONOptions

func NewJSONOptions(opts map[string]interface{}) (*JSONOptions, error)

NewJSONOptions creates a JSONOptions pointer from a map

func (*JSONOptions) Format

func (*JSONOptions) Format() DataFormat

Format announces the JSON Data Format for the FormatConfig interface

func (*JSONOptions) Map

func (o *JSONOptions) Map() map[string]interface{}

Map returns a map[string]interface representation of the configuration

type Kind

type Kind string

Kind is a short identifier for all types of qri dataset objects. Kind does three things: 1. Distinguish qri datasets from other formats 2. Distinguish different types (Dataset/Structure/Transform/etc.) 3. Distinguish between versions of the dataset spec. Kind is a string in the format [type]:[version], eg: ds:0
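Splitting such a kind string into its parts can be sketched as follows (a hypothetical helper; the package's own Type, Version, and Valid methods may apply stricter rules):

```go
package main

import (
	"fmt"
	"strings"
)

// splitKind sketches how a kind string like "ds:0" decomposes
// into its two-letter type and its version.
func splitKind(k string) (typ, version string, err error) {
	parts := strings.SplitN(k, ":", 2)
	if len(parts) != 2 || len(parts[0]) != 2 {
		return "", "", fmt.Errorf("invalid kind: %q", k)
	}
	return parts[0], parts[1], nil
}

func main() {
	typ, ver, err := splitKind("ds:0")
	if err != nil {
		panic(err)
	}
	fmt.Println(typ, ver) // prints "ds 0"
}
```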

func (Kind) String

func (k Kind) String() string

String implements the stringer interface

func (Kind) Type

func (k Kind) Type() string

Type returns the type identifier

func (*Kind) UnmarshalJSON

func (k *Kind) UnmarshalJSON(data []byte) error

UnmarshalJSON implements the JSON.Unmarshaler interface, rejecting any strings that are not a valid kind

func (Kind) Valid

func (k Kind) Valid() error

Valid checks to see if a kind string is valid

func (Kind) Version

func (k Kind) Version() string

Version returns the version portion of the kind identifier

type License

type License struct {
    Type string `json:"type,omitempty"`
    URL  string `json:"url,omitempty"`
}

License represents a legal licensing agreement

func (*License) Decode

func (l *License) Decode(val interface{}) (err error)

Decode reads json.Unmarshal-style data into a License

type Meta

type Meta struct {

    // URL to access the dataset
    AccessURL string `json:"accessURL,omitempty"`
    // The frequency with which dataset changes. Must be an ISO 8601 repeating
    // duration
    AccrualPeriodicity string `json:"accrualPeriodicity,omitempty"`
    // Citations is a slice of assets used to build this dataset
    Citations []*Citation `json:"citations"`
    // Contributors is a list of contributors
    Contributors []*User `json:"contributors,omitempty"`
    // Description follows the DCAT sense of the word, it should be around a
    // paragraph of human-readable text
    Description string `json:"description,omitempty"`
    // URL that should / must lead directly to the data itself
    DownloadURL string `json:"downloadURL,omitempty"`
    // HomeURL is a path to a "home" resource
    HomeURL string `json:"homeURL,omitempty"`
    // Identifier is for *other* data catalog specifications. Identifier should
    // not be used or relied on to be unique, because this package does not
    // enforce any of these rules.
    Identifier string `json:"identifier,omitempty"`
    // Keywords is a list of keywords for this dataset
    Keywords []string `json:"keywords,omitempty"`
    // Languages this dataset is written in
    Language []string `json:"language,omitempty"`
    // License will automatically parse to & from a string value if provided as a
    // raw string
    License *License `json:"license,omitempty"`
    // path is the location of meta, transient
    Path string `json:"path,omitempty"`
    // Qri is required, must be md:[version]
    Qri string `json:"qri,omitempty"`
    // path to dataset readme file, not part of the DCAT spec, but a common
    // convention in software dev
    ReadmeURL string `json:"readmeURL,omitempty"`
    // Title of this dataset
    Title string `json:"title,omitempty"`
    // Theme is a "category" classification for this dataset
    Theme []string `json:"theme,omitempty"`
    // Version is the version identifier for this dataset
    Version string `json:"version,omitempty"`
    // contains filtered or unexported fields
}

Meta contains human-readable descriptive metadata that qualifies and distinguishes a dataset. Well-defined Meta should aid in making datasets Findable by describing a dataset in generalizable taxonomies that can aggregate across other dataset documents. Because dataset documents are intended to interoperate with many other data storage and cataloging systems, meta fields and conventions are derived from existing metadata formats whenever possible

func NewMetaRef

func NewMetaRef(path string) *Meta

NewMetaRef creates a Meta pointer with the internal path property specified, and no other fields.

func UnmarshalMeta

func UnmarshalMeta(v interface{}) (*Meta, error)

UnmarshalMeta tries to extract a metadata type from an empty interface. Pairs nicely with datastore.Get() from github.com/ipfs/go-datastore

func (*Meta) Assign

func (md *Meta) Assign(metas ...*Meta)

Assign collapses all properties of a group of metadata structs onto one. This is directly inspired by Javascript's Object.assign

func (*Meta) DropTransientValues

func (md *Meta) DropTransientValues()

DropTransientValues removes values that cannot be recorded when the dataset is rendered immutable, usually by storing it in a cafs

func (*Meta) IsEmpty

func (md *Meta) IsEmpty() bool

IsEmpty checks to see if dataset has any fields other than the internal path

func (*Meta) MarshalJSON

func (md *Meta) MarshalJSON() ([]byte, error)

MarshalJSON uses a map to combine meta & standard fields. Marshalling a map[string]interface{} automatically alpha-sorts the keys.

func (*Meta) MarshalJSONObject

func (md *Meta) MarshalJSONObject() ([]byte, error)

MarshalJSONObject always marshals to a json Object, even if meta is empty or a reference

func (*Meta) Meta

func (md *Meta) Meta() map[string]interface{}

Meta gives access to additional metadata not covered by dataset metadata

func (*Meta) Set

func (md *Meta) Set(key string, val interface{}) (err error)

Set writes value to key in metadata, erroring if the type is invalid. Input values are expected to be json.Unmarshal types

func (*Meta) UnmarshalJSON

func (md *Meta) UnmarshalJSON(data []byte) error

UnmarshalJSON implements json.Unmarshaller

type Structure

type Structure struct {
    // Checksum is a base58-encoded multihash checksum of the entire data
    // file this structure points to. This is different from IPFS
    // hashes, which are calculated after breaking the file into blocks
    Checksum string `json:"checksum,omitempty"`
    // Compression specifies any compression on the source data,
    // if empty assume no compression
    Compression string `json:"compression,omitempty"`
    // Maximum nesting level of composite types in the dataset. eg: depth 1 == [], depth 2 == [[]]
    Depth int `json:"depth,omitempty"`
    // Encoding specifies character encoding, assume utf-8 if not specified
    Encoding string `json:"encoding,omitempty"`
    // ErrCount is the number of errors returned by validating data
    // against this schema. required
    ErrCount int `json:"errCount"`
    // Entries is the number of top-level entries in the dataset. With tabular data
    // this is the same as the number of "rows"
    Entries int `json:"entries,omitempty"`
    // Format specifies the format of the raw data MIME type
    Format string `json:"format"`
    // FormatConfig removes as much ambiguity as possible about how
    // to interpret the specified format.
    // FormatConfig FormatConfig `json:"formatConfig,omitempty"`
    FormatConfig map[string]interface{} `json:"formatConfig,omitempty"`

    // Length is the length of the data object in bytes.
    // must always match & be present
    Length int `json:"length,omitempty"`
    // location of this structure, transient
    Path string `json:"path,omitempty"`
    // Qri should always be KindStructure
    Qri string `json:"qri"`
    // Schema contains the schema definition for the underlying data, schemas
    // are defined using the IETF json-schema specification. for more info
    // on json-schema see: https://json-schema.org
    Schema map[string]interface{} `json:"schema,omitempty"`
}

Structure defines the characteristics of a dataset document necessary for a machine to interpret the dataset body. Structure fields are things like the encoding data format (JSON, CSV, etc.) and the length of the dataset body in bytes, stored in a rigid form intended for machine use. A well-defined structure & accompanying software should allow the end user to spend more time focusing on the data itself. Two dataset documents that both have a defined structure will have some degree of natural interoperability, depending first on the amount of detail provided in a dataset's structure, and then by the natural comparability of the datasets

func NewStructureRef

func NewStructureRef(path string) *Structure

NewStructureRef creates an empty Structure with its internal path set

func UnmarshalStructure

func UnmarshalStructure(v interface{}) (*Structure, error)

UnmarshalStructure tries to extract a structure type from an empty interface. Pairs nicely with datastore.Get() from github.com/ipfs/go-datastore

func (*Structure) Abstract

func (s *Structure) Abstract() *Structure

Abstract returns this structure instance in its "Abstract" form, stripping all nonessential values & renaming all schema field names to standard variable names

func (*Structure) Assign

func (s *Structure) Assign(structures ...*Structure)

Assign collapses all properties of a group of structures onto one. This is directly inspired by Javascript's Object.assign

func (*Structure) DataFormat

func (st *Structure) DataFormat() DataFormat

DataFormat gives format as a DataFormat type, returning UnknownDataFormat in any case where st.Format is an invalid string

func (*Structure) DropTransientValues

func (st *Structure) DropTransientValues()

DropTransientValues removes values that cannot be recorded when the dataset is rendered immutable, usually by storing it in a cafs

func (*Structure) Hash

func (s *Structure) Hash() (string, error)

Hash gives the hash of this structure

func (*Structure) IsEmpty

func (s *Structure) IsEmpty() bool

IsEmpty checks to see if structure has any fields other than the internal path

func (*Structure) JSONSchema

func (st *Structure) JSONSchema() (*jsonschema.RootSchema, error)

JSONSchema parses the Schema field into a json-schema

func (Structure) MarshalJSON

func (s Structure) MarshalJSON() (data []byte, err error)

MarshalJSON satisfies the json.Marshaler interface

func (Structure) MarshalJSONObject

func (s Structure) MarshalJSONObject() ([]byte, error)

MarshalJSONObject always marshals to a json Object, even if the structure is empty or a reference

func (*Structure) UnmarshalJSON

func (s *Structure) UnmarshalJSON(data []byte) (err error)

UnmarshalJSON satisfies the json.Unmarshaler interface

type Theme

type Theme struct {
    Description     string `json:"description,omitempty"`
    DisplayName     string `json:"display_name,omitempty"`
    ImageDisplayURL string `json:"image_display_url,omitempty"`
    ID              string `json:"id,omitempty"`
    Name            string `json:"name,omitempty"`
    Title           string `json:"title,omitempty"`
}

Theme is pulled from the Project Open Data Schema version 1.1

type Transform

type Transform struct {
    // Config outlines any configuration that would affect the resulting hash
    Config map[string]interface{} `json:"config,omitempty"`
    // location of the transform object, transient
    Path string `json:"path,omitempty"`
    // Kind should always equal KindTransform
    Qri string `json:"qri,omitempty"`
    // Resources is a map of all datasets referenced in this transform, with
    // alphabetical keys generated by datasets in order of appearance within the
    // transform
    Resources map[string]*TransformResource `json:"resources,omitempty"`

    // ScriptBytes is for representing a script as a slice of bytes, transient
    ScriptBytes []byte `json:"scriptBytes,omitempty"`
    // ScriptPath is the path to the script that produced this transformation.
    ScriptPath string `json:"scriptPath,omitempty"`
    // Secrets is a map of secret values used in the transformation, transient.
    // TODO (b5): make this not-transient by censoring the values used, but not keys
    Secrets map[string]string `json:"secrets,omitempty"`
    // Syntax this transform was written in
    Syntax string `json:"syntax,omitempty"`
    // SyntaxVersion is an identifier for the application and version number that
    // produced the result
    SyntaxVersion string `json:"syntaxVersion,omitempty"`
    // contains filtered or unexported fields
}

Transform is a record of executing a transformation on data. Transforms can theoretically be anything from an SQL query, a jupyter notebook, the state of an ETL pipeline, etc., so long as the input is zero or more datasets and the output is a single dataset. Ideally, transforms should contain all the machine-necessary bits to deterministically execute the algorithm referenced in "ScriptPath".

func NewTransformRef

func NewTransformRef(path string) *Transform

NewTransformRef creates a Transform pointer with the internal path property specified, and no other fields.

func UnmarshalTransform

func UnmarshalTransform(v interface{}) (*Transform, error)

UnmarshalTransform tries to extract a resource type from an empty interface. Pairs nicely with datastore.Get() from github.com/ipfs/go-datastore

func (*Transform) Assign

func (q *Transform) Assign(qs ...*Transform)

Assign collapses all properties of a group of transforms onto one. This is directly inspired by Javascript's Object.assign

func (*Transform) DropTransientValues Uses

func (q *Transform) DropTransientValues()

DropTransientValues removes values that cannot be recorded when the dataset is rendered immutable, usually by storing it in a cafs

func (*Transform) IsEmpty Uses

func (q *Transform) IsEmpty() bool

IsEmpty checks whether the transform has any fields other than the internal path

func (Transform) MarshalJSON Uses

func (q Transform) MarshalJSON() ([]byte, error)

MarshalJSON satisfies the json.Marshaler interface

func (Transform) MarshalJSONObject Uses

func (q Transform) MarshalJSONObject() ([]byte, error)

MarshalJSONObject always marshals to a JSON object, even if the transform is empty or a reference

func (*Transform) OpenScriptFile Uses

func (q *Transform) OpenScriptFile(resolver qfs.PathResolver) (err error)

OpenScriptFile generates a byte stream of script data, preferring an in-place file created from ScriptBytes when defined, and fetching from the passed-in resolver otherwise

func (*Transform) ScriptFile Uses

func (q *Transform) ScriptFile() qfs.File

ScriptFile gives the internal file, if any. Callers that use the file in any way (eg. by calling Read) should consume the entire file and call Close

func (*Transform) SetScriptFile Uses

func (q *Transform) SetScriptFile(file qfs.File)

SetScriptFile assigns the scriptFile

func (*Transform) UnmarshalJSON Uses

func (q *Transform) UnmarshalJSON(data []byte) error

UnmarshalJSON satisfies the json.Unmarshaler interface

type TransformResource Uses

type TransformResource struct {
    Path string `json:"path"`
}

TransformResource describes an external data dependency. The primary use case is importing other datasets, but in the future this may be expanded to include resources other than datasets (URLs?) and details for interpreting the resource (eg. a selector specifying that only a subset of a resource is required)

func (*TransformResource) UnmarshalJSON Uses

func (r *TransformResource) UnmarshalJSON(data []byte) error

UnmarshalJSON implements json.Unmarshaler, allowing both string and object representations

type User Uses

type User struct {
    ID       string `json:"id,omitempty"`
    Fullname string `json:"name,omitempty"`
    Email    string `json:"email,omitempty"`
}

User is a placeholder for talking about people, groups, and organizations

func (*User) Decode Uses

func (u *User) Decode(val interface{}) (err error)

Decode reads json.Unmarshal-style data into a User
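One plausible way to implement such a Decode is a JSON round-trip from an arbitrary value into the typed struct. A sketch, not the package's actual implementation; the sample values are invented:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// user mirrors the User struct above (illustrative only).
type user struct {
	ID       string `json:"id,omitempty"`
	Fullname string `json:"name,omitempty"`
	Email    string `json:"email,omitempty"`
}

// decode marshals an arbitrary value to JSON, then unmarshals it into u.
func (u *user) decode(val interface{}) error {
	data, err := json.Marshal(val)
	if err != nil {
		return err
	}
	return json.Unmarshal(data, u)
}

func main() {
	var u user
	if err := u.decode(map[string]interface{}{
		"id":    "a1",
		"name":  "Jane Doe",
		"email": "jane@example.com",
	}); err != nil {
		panic(err)
	}
	fmt.Println(u.Fullname)
}
```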

type Viz Uses

type Viz struct {
    // Format designates the visualization configuration syntax. currently the
    // only supported syntax is "html"
    Format string `json:"format,omitempty"`
    // path is the location of a viz, transient
    Path string `json:"path,omitempty"`
    // Qri should always be "vc:0"
    Qri string `json:"qri,omitempty"`

    // ScriptBytes is for representing a script as a slice of bytes, transient
    ScriptBytes []byte `json:"scriptBytes,omitempty"`
    // ScriptPath is the path to the script that created this
    ScriptPath string `json:"scriptPath,omitempty"`
    // contains filtered or unexported fields
}

Viz stores configuration data related to representing a dataset as a visualization

func NewVizRef Uses

func NewVizRef(path string) *Viz

NewVizRef creates an empty struct with its internal path set

func UnmarshalViz Uses

func UnmarshalViz(v interface{}) (*Viz, error)

UnmarshalViz tries to extract a resource type from an empty interface. Pairs nicely with datastore.Get() from github.com/ipfs/go-datastore

func (*Viz) Assign Uses

func (v *Viz) Assign(visConfigs ...*Viz)

Assign collapses all properties of a group of Viz structures onto one. This is directly inspired by JavaScript's Object.assign.

func (*Viz) DropTransientValues Uses

func (v *Viz) DropTransientValues()

DropTransientValues removes values that cannot be recorded when the dataset is rendered immutable, usually by storing it in a cafs

func (*Viz) IsEmpty Uses

func (v *Viz) IsEmpty() bool

IsEmpty checks whether Viz has any fields other than the internal path

func (*Viz) MarshalJSON Uses

func (v *Viz) MarshalJSON() ([]byte, error)

MarshalJSON satisfies the json.Marshaler interface

func (*Viz) MarshalJSONObject Uses

func (v *Viz) MarshalJSONObject() ([]byte, error)

MarshalJSONObject always marshals to a json Object, even if Viz is empty or a reference

func (*Viz) OpenScriptFile Uses

func (v *Viz) OpenScriptFile(resolver qfs.PathResolver) (err error)

OpenScriptFile generates a byte stream of script data, preferring an in-place file created from ScriptBytes when defined, and fetching from the passed-in resolver otherwise

func (*Viz) ScriptFile Uses

func (v *Viz) ScriptFile() qfs.File

ScriptFile exposes scriptFile if one is set. Callers that use the file in any way (eg. by calling Read) should consume the entire file and call Close

func (*Viz) SetScriptFile Uses

func (v *Viz) SetScriptFile(file qfs.File)

SetScriptFile assigns the unexported scriptFile

func (*Viz) UnmarshalJSON Uses

func (v *Viz) UnmarshalJSON(data []byte) error

UnmarshalJSON satisfies the json.Unmarshaler interface

type XLSXOptions Uses

type XLSXOptions struct {
    SheetName string `json:"sheetName,omitempty"`
}

XLSXOptions specifies configuration details for the xlsx file format

func (*XLSXOptions) Format Uses

func (*XLSXOptions) Format() DataFormat

Format announces the XLSX data format for the FormatConfig interface

func (*XLSXOptions) Map Uses

func (o *XLSXOptions) Map() map[string]interface{}

Map structures XLSXOptions as a map of string keys to values
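The Map pattern can be sketched as follows. This is illustrative, not the package's actual code; the map keys mirror the struct's JSON tags, and zero-valued fields are omitted, matching ",omitempty" behavior:

```go
package main

import "fmt"

// xlsxOptions mirrors XLSXOptions (illustrative only).
type xlsxOptions struct {
	SheetName string
}

// toMap structures the options as string keys to values, skipping zero values.
func (o *xlsxOptions) toMap() map[string]interface{} {
	m := map[string]interface{}{}
	if o.SheetName != "" {
		m["sheetName"] = o.SheetName
	}
	return m
}

func main() {
	fmt.Println((&xlsxOptions{SheetName: "sheet1"}).toMap())
}
```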

Directories

Path            Synopsis
compression     Package compression is a horrible hack & should be replaced as soon as humanly possible
detect
dsfs            Package dsfs glues datasets to cafs (content-addressed-file-system)
dsgraph         Package dsgraph is a placeholder package for linking queries, resources, and metadata until proper packaging & architectural decisions can be made
dsio            Package dsio defines writers & readers for operating on "container" data structures (objects and arrays)
dsio/replacecr  Package replacecr defines a wrapper for replacing solo carriage return characters (\r) with carriage-return + line feed (\r\n)
dstest          Package dstest defines an interface for reading test cases from static files, leveraging directories of test dataset input files & expected output files
dsutil          Package dsutil includes dataset util funcs, placed here to avoid dataset package bloat. TODO - consider merging this package with the dsfs package, as most of the functions here rely on a Filestore argument
generate        Package generate is for generating random data from given structures
subset          Package subset provides methods for extracting defined abbreviations of a dataset document
use_generate
validate
vals

Package dataset imports 11 packages and is imported by 25 packages. Updated 2019-02-12.