tokenizer

package
v0.0.0-...-c87e9c3
Published: Jul 28, 2019 License: GPL-3.0 Imports: 4 Imported by: 0

Documentation

Overview

Package tokenizer provides functionality for splitting Cookie source code into *tokens*.

Index

Constants

const (
	// IdentToken is a cookie language identifier: [a-z_][a-z0-9_]*
	IdentToken = iota

	// IntToken is a cookie language integer: [-]?[1-9][0-9]*|0
	IntToken = iota

	// AssignToken is a cookie language assignment token: =
	AssignToken = iota

	// EolToken is a cookie language end-of-line token: ;
	EolToken = iota

	// CurlyOpenToken is a cookie language function creation token: {
	CurlyOpenToken = iota

	// CurlyCloseToken is a cookie language function end token: }
	CurlyCloseToken = iota

	// FunctionToken is a cookie language function token: ()
	FunctionToken = iota
)

Variables

This section is empty.

Functions

This section is empty.

Types

type TokenAction

type TokenAction struct {
	Token  int
	Action func([]byte) error
}

TokenAction is a simple mapping of a token type (IdentToken, IntToken, etc.) to a function that will be invoked when that type of token is found. The []byte passed to the function is the text of the token that was matched.
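
For example, a TokenAction that prints every identifier it encounters could be built as in the sketch below (the handler body is illustrative, and the fmt import and the tokenizer import name are assumptions, not part of the package):

	identAction := tokenizer.TokenAction{
		Token: tokenizer.IdentToken,
		Action: func(tok []byte) error {
			// tok holds the bytes of the identifier that was matched.
			fmt.Printf("identifier: %s\n", tok)
			return nil
		},
	}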

type Tokenizer

type Tokenizer struct {
	// contains filtered or unexported fields
}

Tokenizer is the basic handle that stores state for tokenizing a given IO stream.

func CreateTokenizer

func CreateTokenizer(rd io.Reader) *Tokenizer

CreateTokenizer initializes and returns a Tokenizer object that can be used to split the IO stream into a series of tokens.
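
For instance, a Tokenizer can be wrapped around any io.Reader; the minimal sketch below uses a strings.Reader over an illustrative Cookie snippet (the strings import and the tokenizer import name are assumptions):

	src := strings.NewReader("answer = 42;")
	t := tokenizer.CreateTokenizer(src)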

func (*Tokenizer) NextToken

func (t *Tokenizer) NextToken(ra []TokenAction) error

NextToken takes a set of TokenActions and, based on which token appears next in the stream, invokes the action associated with it. An error is returned if the next token matches none of the given token types.
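
A sketch of a complete tokenizing loop over an illustrative Cookie snippet is shown below, assuming the package is imported as tokenizer, the fmt and strings imports are available, and NextToken reports an error once the stream is exhausted (the exact error value is not documented here); the handler bodies are illustrative:

	t := tokenizer.CreateTokenizer(strings.NewReader("answer = 42;"))
	actions := []tokenizer.TokenAction{
		{Token: tokenizer.IdentToken, Action: func(b []byte) error { fmt.Printf("ident  %s\n", b); return nil }},
		{Token: tokenizer.AssignToken, Action: func(b []byte) error { fmt.Println("assign ="); return nil }},
		{Token: tokenizer.IntToken, Action: func(b []byte) error { fmt.Printf("int    %s\n", b); return nil }},
		{Token: tokenizer.EolToken, Action: func(b []byte) error { fmt.Println("eol    ;"); return nil }},
	}
	for {
		if err := t.NextToken(actions); err != nil {
			// End of input, or the next token matched none of the given actions.
			break
		}
	}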
