lexer

package v0.0.0-...-70fd0a4
Published: Sep 24, 2023 License: Apache-2.0 Imports: 6 Imported by: 0

Documentation

Overview

Package lexer contains types and methods to transform source text into tokens.

Index

Constants

const EOF rune = -1

EOF indicates the end of file.
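
The sentinel is typically consumed inside a lexer's read loop. Below is a minimal sketch of that pattern; the scanner type and the import path are hypothetical illustrations, not this package's internals:

package main

import (
	"fmt"
	"unicode/utf8"

	"example.com/lexer" // hypothetical import path; use the real module path
)

// scanner is a hypothetical type showing how an EOF sentinel like this
// is commonly used; it is not this package's implementation.
type scanner struct {
	src string
	pos int
}

// next returns the next rune, or lexer.EOF once the input is exhausted.
func (s *scanner) next() rune {
	if s.pos >= len(s.src) {
		return lexer.EOF // -1: no more input
	}
	r, size := utf8.DecodeRuneInString(s.src[s.pos:])
	s.pos += size
	return r
}

func main() {
	sc := &scanner{src: "ab"}
	for r := sc.next(); r != lexer.EOF; r = sc.next() {
		fmt.Printf("%q\n", r)
	}
}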

Variables

This section is empty.

Functions

This section is empty.

Types

type Tokenizer

type Tokenizer struct {
	// contains filtered or unexported fields
}

Tokenizer is used to transform the source text into lexical tokens.

func NewTokenizer

func NewTokenizer(src string, opts ...TokenizerOption) *Tokenizer

NewTokenizer creates a Tokenizer that produces tokens from the given source text.

func (*Tokenizer) Next

func (t *Tokenizer) Next() token.Token

Next returns the next token from the source text.
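
A minimal usage sketch combining NewTokenizer and Next. The import path is hypothetical (the module path is elided above), and the loop is bounded because the token package's end-of-input marker is not documented on this page; real code would break on that token instead:

package main

import (
	"fmt"

	"example.com/lexer" // hypothetical import path; use the real module path
)

func main() {
	t := lexer.NewTokenizer("a + b")
	// Bounded loop for illustration; real code would stop once the
	// token package's end-of-input token is returned.
	for i := 0; i < 8; i++ {
		fmt.Println(t.Next())
	}
}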

func (*Tokenizer) SetRawMode

func (t *Tokenizer) SetRawMode()

SetRawMode puts the tokenizer into raw mode, in which it ignores the tokenizing context.

func (*Tokenizer) UnsetRawMode

func (t *Tokenizer) UnsetRawMode()

UnsetRawMode takes the tokenizer out of raw mode, so that it again takes the tokenizing context into account.
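
A sketch of toggling raw mode around a span that should be read without regard to context; whether this matches the package's intended use of raw mode is an assumption:

package main

import (
	"fmt"

	"example.com/lexer" // hypothetical import path; use the real module path
)

func main() {
	t := lexer.NewTokenizer("before `verbatim` after")

	fmt.Println(t.Next()) // context-aware tokenizing

	t.SetRawMode()
	fmt.Println(t.Next()) // context is ignored while raw mode is set

	t.UnsetRawMode()
	fmt.Println(t.Next()) // context-aware tokenizing again
}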

type TokenizerOption

type TokenizerOption func(*tokenizerOpts)
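
TokenizerOption follows Go's functional-options pattern. The sketch below shows how such options are typically applied over defaults at construction time; the tokenizerOpts field and the applyOpts helper are hypothetical, since the real fields are unexported:

package lexer

type tokenizerOpts struct {
	unsafe bool // hypothetical field; the real fields are not shown here
}

type TokenizerOption func(*tokenizerOpts)

// applyOpts is a hypothetical helper: each option mutates the zero-value
// defaults in order, yielding the final configuration.
func applyOpts(opts []TokenizerOption) tokenizerOpts {
	var o tokenizerOpts
	for _, opt := range opts {
		opt(&o)
	}
	return o
}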

func WithUnsafe

func WithUnsafe() TokenizerOption

WithUnsafe makes the tokenizer convert strings to byte/rune slices using the unsafe package, where possible.
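
Below, passing the option, plus a sketch of the zero-copy conversion idiom (Go 1.20+) that such an option typically enables; whether this package uses exactly this conversion is an assumption. A byte slice obtained this way aliases the string's memory and must never be modified:

package main

import (
	"fmt"
	"unsafe"

	"example.com/lexer" // hypothetical import path; use the real module path
)

// stringToBytes shows the usual zero-copy string-to-bytes idiom.
// The returned slice aliases the string and must not be mutated.
func stringToBytes(s string) []byte {
	return unsafe.Slice(unsafe.StringData(s), len(s))
}

func main() {
	t := lexer.NewTokenizer("a + b", lexer.WithUnsafe())
	fmt.Println(t.Next())
	fmt.Println(stringToBytes("abc"))
}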
