tokenizer

package
v0.900.9
Published: Feb 27, 2024 License: MIT Imports: 5 Imported by: 0

Documentation

Overview

Package tokenizer implements a rudimentary token parser over a buffered io.Reader that respects quote and parenthesis boundaries.

Example

tk := tokenizer.NewFromString("a, b, (c, d)")
result, _ := tk.ScanAll() // ["a", "b", "(c, d)"]

Index

Constants

This section is empty.

Variables

var DefaultSeparators = []rune{','}

DefaultSeparators is a list with the default token separator characters.

Functions

This section is empty.

Types

type Tokenizer

type Tokenizer struct {
	// contains filtered or unexported fields
}

Tokenizer defines a struct that parses a reader into tokens while respecting quotes and parenthesis boundaries.

func New

func New(r io.Reader) *Tokenizer

New creates a new Tokenizer from the provided reader with DefaultSeparators.

func NewFromBytes

func NewFromBytes(b []byte) *Tokenizer

NewFromBytes creates a new Tokenizer from the provided byte slice.

func NewFromString

func NewFromString(str string) *Tokenizer

NewFromString creates a new Tokenizer from the provided string.

func (*Tokenizer) IgnoreParenthesis

func (t *Tokenizer) IgnoreParenthesis(state bool)

IgnoreParenthesis defines whether to ignore parenthesis boundaries and treat '(' and ')' as regular characters.

func (*Tokenizer) KeepEmptyTokens

func (t *Tokenizer) KeepEmptyTokens(state bool)

KeepEmptyTokens defines whether to keep empty tokens on Scan() (defaults to false).

func (*Tokenizer) KeepSeparator

func (t *Tokenizer) KeepSeparator(state bool)

KeepSeparator defines whether to keep the separator rune as part of the token (defaults to false).

func (*Tokenizer) Scan

func (t *Tokenizer) Scan() (string, error)

Scan reads and returns the next available token from the Tokenizer's buffer (the returned token is whitespace-trimmed).

Empty tokens are skipped if t.keepEmptyTokens is not set (which is the default).

Returns io.EOF error when there are no more tokens to scan.

func (*Tokenizer) ScanAll

func (t *Tokenizer) ScanAll() ([]string, error)

ScanAll reads the entire Tokenizer's buffer and returns all found tokens.

func (*Tokenizer) Separators

func (t *Tokenizer) Separators(separators ...rune)

Separators sets the separator characters of the current Tokenizer.
