lexer

package v0.0.0-...-7240e86
Published: Jan 31, 2023 License: Apache-2.0 Imports: 3 Imported by: 0

Documentation

Index

Constants

const (
	EOFRune    rune      = -1
	EmptyToken TokenType = 0
)

Variables

This section is empty.

Functions

This section is empty.

Types

type L

type L struct {
	Err error

	ErrorHandler func(e string)
	// contains filtered or unexported fields
}

func New

func New(src string, start StateFunc) *L

New creates and returns a lexer ready to parse the given source code.
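
For illustration, a sketch of constructing a lexer and wiring the exported ErrorHandler field before starting it. The import path, the lexStart state, and the newLexer helper are placeholders, not part of this package; log is the standard library logger.

// newLexer builds a lexer over src and installs an error handler
// before the state machine is started. Sketch only.
func newLexer(src string) *lexer.L {
	l := lexer.New(src, lexStart) // lexStart is a hypothetical starting state
	l.ErrorHandler = func(e string) {
		log.Println("lex error:", e)
	}
	return l
}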

func (*L) Current

func (l *L) Current() string

Current returns the value being analyzed at this moment.

func (*L) Emit

func (l *L) Emit(t TokenType)

Emit will receive a token type and push a new token with the current analyzed value into the tokens channel.
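
As a sketch of how a state function typically pairs Next and Rewind with Emit, the following consumes a run of letters and emits it as one token. WordToken and lexStart are hypothetical caller-defined names, and unicode is from the Go standard library.

// lexWord consumes a run of letters and emits it as a single token.
func lexWord(l *lexer.L) lexer.StateFunc {
	r := l.Next()
	for r != lexer.EOFRune && unicode.IsLetter(r) {
		r = l.Next()
	}
	if r != lexer.EOFRune {
		l.Rewind() // leave the terminating rune for the next state
	}
	l.Emit(WordToken) // pushes Token{Type: WordToken, Value: l.Current()}
	return lexStart
}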

func (*L) Error

func (l *L) Error(e string)

func (*L) Ignore

func (l *L) Ignore()

Ignore clears the rewind stack and then sets the current beginning position to the current position in the source which effectively ignores the section of the source being analyzed.

func (*L) Next

func (l *L) Next() rune

Next pulls the next rune from the Lexer and returns it, moving the position forward in the source.

func (*L) NextToken

func (l *L) NextToken() (*Token, bool)

NextToken returns the next token from the lexer and a value denoting whether or not the lexer is finished.
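
A sketch of the usual consumption loop, assuming the boolean reports that lexing is complete and that lexStart is a caller-defined starting state; the import path is a placeholder, not the real one.

package main

import (
	"fmt"

	lexer "example.com/your/lexer" // placeholder import path
)

func main() {
	l := lexer.New("hello world 42", lexStart) // lexStart is a hypothetical StateFunc
	l.Start() // documented to run the state machine in its own goroutine

	for {
		tok, done := l.NextToken()
		if done {
			break
		}
		fmt.Printf("%v %q\n", tok.Type, tok.Value)
	}
}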

func (*L) Peek

func (l *L) Peek() rune

Peek performs a Next operation immediately followed by a Rewind returning the peeked rune.

func (*L) Rewind

func (l *L) Rewind()

Rewind will take the last rune read (if any) and rewind back. Rewinds can occur more than once per call to Next but you can never rewind past the last point a token was emitted.
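
A sketch of reading ahead with Next and backing off with Rewind, here to skip leading whitespace; the skipSpaces helper is illustrative only, not part of this package.

// skipSpaces consumes leading spaces and tabs and discards them with Ignore;
// the rune that ends the run is rewound so the next state still sees it.
func skipSpaces(l *lexer.L) {
	r := l.Next()
	for r == ' ' || r == '\t' {
		r = l.Next()
	}
	if r != lexer.EOFRune {
		l.Rewind() // back off the non-space rune read ahead
	}
	l.Ignore() // drop the whitespace so it never becomes part of a token
}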

func (*L) Start

func (l *L) Start()

Start begins executing the Lexer in an asynchronous manner (using a goroutine).

func (*L) StartSync

func (l *L) StartSync()

func (*L) Take

func (l *L) Take(chars string)

Take receives a string containing all acceptable characters and will continue over each consecutive character in the source until a character not in the given string is encountered. This should be used to quickly pull token parts.
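
A sketch of a state function that pulls a run of digits with a single Take call; NumberToken and lexStart are hypothetical caller-defined names.

// lexNumber consumes consecutive digits and emits them as one token.
func lexNumber(l *lexer.L) lexer.StateFunc {
	l.Take("0123456789")
	l.Emit(NumberToken)
	return lexStart
}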

type StateFunc

type StateFunc func(*L) StateFunc
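
A StateFunc performs one step of the state machine and returns the state to run next. Returning nil is the usual way for this style of lexer to stop, though the package does not document that here, so treat it as an assumption. A sketch of a dispatching state built on the hypothetical lexNumber and lexWord states above:

// lexStart inspects the next rune and picks the state that should handle it.
func lexStart(l *lexer.L) lexer.StateFunc {
	switch r := l.Peek(); {
	case r == lexer.EOFRune:
		return nil // assumed to stop the lexer
	case r >= '0' && r <= '9':
		return lexNumber
	case r == ' ' || r == '\t':
		l.Next()
		l.Ignore() // throw away the whitespace
		return lexStart
	default:
		return lexWord
	}
}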

type Token

type Token struct {
	Type  TokenType
	Value string
}

type TokenType

type TokenType int
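
Token types are plain integers, so callers normally declare their own constants. A sketch in which only EmptyToken comes from the package and the other names are illustrative:

// Caller-defined token types; EmptyToken (0) is reserved by the package,
// so user constants start at 1.
const (
	NumberToken lexer.TokenType = iota + 1
	WordToken
	SymbolToken
)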
