Documentation ¶
Overview ¶
Package tokenizer provides functionality for splitting Cookie source code into *tokens*.
Index ¶
Constants ¶
const (
	// IdentToken is a cookie language identifier: [a-z_][a-z0-9_]*
	IdentToken = iota
	// IntToken is a cookie language integer: [-]?[1-9][0-9]*|0
	IntToken = iota
	// AssignToken is a cookie language assignment token: =
	AssignToken = iota
	// EolToken is a cookie language end-of-line token: ;
	EolToken = iota
	// CurlyOpenToken is a cookie language function creation token: {
	CurlyOpenToken = iota
	// CurlyCloseToken is a cookie language function end token: }
	CurlyCloseToken = iota
	// FunctionToken is a cookie language function token: ()
	FunctionToken = iota
)
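Because the constants are declared with `iota`, each token type receives a distinct integer value, counting up from 0 (IdentToken) to 6 (FunctionToken). A minimal sketch, redeclaring the constants only for illustration:

```go
package main

import "fmt"

// Mirrors the package's const block; repeating `= iota` on each line
// is equivalent to letting it carry over implicitly.
const (
	IdentToken = iota // 0
	IntToken          // 1
	AssignToken       // 2
	EolToken          // 3
	CurlyOpenToken    // 4
	CurlyCloseToken   // 5
	FunctionToken     // 6
)

func main() {
	fmt.Println(IdentToken, FunctionToken) // 0 6
}
```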
Variables ¶
This section is empty.
Functions ¶
This section is empty.
Types ¶
type TokenAction ¶
TokenAction is a simple mapping of a token type (IdentToken, IntToken, etc.) to a function that will be invoked if that type of token is found. The []byte argument will contain the text of the token that was found.
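The concrete declaration is not shown above. One plausible shape, assuming a struct that pairs a token type with a callback (the field names here are guesses for illustration, not the package's actual API):

```go
package main

import "fmt"

// TokenAction is a hypothetical sketch of the type described above:
// a token type paired with the function to invoke when it is found.
type TokenAction struct {
	TokenType int              // e.g. IdentToken, IntToken, ...
	Action    func(tok []byte) // receives the matched token text
}

func main() {
	const IdentToken = 0 // stand-in for the package constant

	actions := []TokenAction{
		{TokenType: IdentToken, Action: func(tok []byte) {
			fmt.Printf("identifier: %s\n", tok)
		}},
	}

	// Simulate the dispatch the tokenizer would perform on a match.
	for _, a := range actions {
		if a.TokenType == IdentToken {
			a.Action([]byte("foo"))
		}
	}
}
```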
type Tokenizer ¶
type Tokenizer struct {
// contains filtered or unexported fields
}
Tokenizer is the basic handle that stores state for tokenizing a given IO stream.
func CreateTokenizer ¶
CreateTokenizer initializes and returns a Tokenizer object that can be used to split the IO stream into a series of tokens.
func (*Tokenizer) NextToken ¶
func (t *Tokenizer) NextToken(ra []TokenAction) error
NextToken takes a set of TokenActions and, based on which token is found next, invokes the action associated with it. An error is returned if the next token matches none of the given token types.