Documentation ¶
Overview ¶
Package lexer contains types and methods to transform source text into tokens.
Index ¶
Constants ¶
const EOF rune = -1
EOF indicates the end of file.
Variables ¶
This section is empty.
Functions ¶
This section is empty.
Types ¶
type Tokenizer ¶
type Tokenizer struct {
// contains filtered or unexported fields
}
Tokenizer is used to transform the source text into lexical tokens.
func NewTokenizer ¶
func NewTokenizer(src string, opts ...TokenizerOption) *Tokenizer
NewTokenizer creates a Tokenizer that produces tokens from the given source text.
func (*Tokenizer) SetRawMode ¶
func (t *Tokenizer) SetRawMode()
SetRawMode puts the tokenizer into raw mode, in which it ignores the context of tokenizing.
func (*Tokenizer) UnsetRawMode ¶
func (t *Tokenizer) UnsetRawMode()
UnsetRawMode takes the tokenizer out of raw mode, so that it again pays attention to the context of tokenizing.
type TokenizerOption ¶
type TokenizerOption func(*tokenizerOpts)
func WithUnsafe ¶
func WithUnsafe() TokenizerOption
WithUnsafe makes the tokenizer convert strings to byte/rune slices via the unsafe package (when possible).