tokenizer

package v0.0.6
Published: Apr 12, 2021 License: MIT Imports: 2 Imported by: 0

Documentation

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type StopIteration

type StopIteration struct{}

func (*StopIteration) Error

func (s *StopIteration) Error() string

type Token

type Token struct {
	Type   TokenType
	Value  string
	Line   int
	Column int
}

Token represents a token recognized by the lexer.
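A Token pairs its TokenType with the matched text and its source position. A minimal sketch of constructing and inspecting one (the types are restated locally here for illustration, not imported from the package):

```go
package main

import "fmt"

// Local restatements of the package's types, for illustration only.
type TokenType string

type Token struct {
	Type   TokenType
	Value  string
	Line   int
	Column int
}

func main() {
	// A Label token as the lexer might emit it for input beginning with "foo".
	tok := Token{Type: "Label", Value: "foo", Line: 1, Column: 1}
	fmt.Printf("%s %q at %d:%d\n", tok.Type, tok.Value, tok.Line, tok.Column)
}
```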

type TokenError

type TokenError struct {
	Line   int
	Column int
	Value  rune
}

func (*TokenError) Error

func (t *TokenError) Error() string

type TokenIterator

type TokenIterator interface {
	Next() (token Token, err error)
}

TokenIterator is the interface for Tokenizers.

Next returns the next token in the input string and any error encountered during lexing.
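Any type with a `Next() (Token, error)` method satisfies TokenIterator. A sketch of a slice-backed iterator that replays a fixed token stream and ends with a *StopIteration, with the package's types restated locally (this is not the package's own implementation):

```go
package main

import "fmt"

// Local restatements of the package's types, for illustration only.
type TokenType string

type Token struct {
	Type   TokenType
	Value  string
	Line   int
	Column int
}

// StopIteration mirrors the package's end-of-input sentinel error.
type StopIteration struct{}

func (s *StopIteration) Error() string { return "stop iteration" }

type TokenIterator interface {
	Next() (token Token, err error)
}

// sliceIterator replays a fixed token stream; illustrative only.
type sliceIterator struct {
	tokens []Token
	pos    int
}

func (it *sliceIterator) Next() (Token, error) {
	if it.pos >= len(it.tokens) {
		return Token{}, &StopIteration{}
	}
	tok := it.tokens[it.pos]
	it.pos++
	return tok, nil
}

func main() {
	var it TokenIterator = &sliceIterator{tokens: []Token{
		{Type: "LeftBracket", Value: "[", Line: 1, Column: 1},
		{Type: "Label", Value: "a", Line: 1, Column: 2},
		{Type: "RightBracket", Value: "]", Line: 1, Column: 3},
	}}
	for {
		tok, err := it.Next()
		if err != nil {
			break // a *StopIteration marks end of input
		}
		fmt.Println(tok.Type, tok.Value)
	}
}
```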

type TokenType

type TokenType string
const (
	LeftBracket  TokenType = "LeftBracket"
	RightBracket TokenType = "RightBracket"
	Dash         TokenType = "Dash"
	Pipe         TokenType = "Pipe"
	Label        TokenType = "Label"
)
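The constant names suggest single-rune punctuation tokens plus free-form labels. A hypothetical classifier for the punctuation runes; the rune-to-type mapping here is inferred from the constant names and is not part of the package's documented API:

```go
package main

import "fmt"

type TokenType string

const (
	LeftBracket  TokenType = "LeftBracket"
	RightBracket TokenType = "RightBracket"
	Dash         TokenType = "Dash"
	Pipe         TokenType = "Pipe"
	Label        TokenType = "Label"
)

// punctType guesses the token type for a punctuation rune. The mapping
// is an assumption based on the constant names, not the package source.
func punctType(r rune) (TokenType, bool) {
	switch r {
	case '[':
		return LeftBracket, true
	case ']':
		return RightBracket, true
	case '-':
		return Dash, true
	case '|':
		return Pipe, true
	}
	return "", false
}

func main() {
	for _, r := range "[]-|" {
		t, _ := punctType(r)
		fmt.Printf("%q -> %s\n", r, t)
	}
}
```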

type Tokenizer

type Tokenizer struct {
	Input         string
	CurrentRune   rune
	NextRune      rune
	CurrentLine   int
	CurrentColumn int
	// contains filtered or unexported fields
}

func New

func New(input string) Tokenizer

func (*Tokenizer) Next

func (tokenizer *Tokenizer) Next() (Token, error)

Next implements the TokenIterator interface: it returns the next token in the input string and any error encountered during lexing.
