lexer

package v0.0.0-...-b83f9b5
Published: Feb 22, 2023 License: MIT Imports: 5 Imported by: 0

Documentation

Index

Constants

This section is empty.

Variables

var (
	ErrForceStopped = errors.New("force stopped")
)

Error messages.
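
A hedged sketch of checking for this error (assuming Scan returns ErrForceStopped when scanning is interrupted via Stop; lx stands for a previously constructed *Lexer):

if err := lx.Scan(); errors.Is(err, lexer.ErrForceStopped) {
	// Scanning was cancelled deliberately via Stop rather than failing.
}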

Functions

This section is empty.

Types

type Lexer

type Lexer struct {
	// contains filtered or unexported fields
}

Lexer represents a lexical analyzer

func New

func New(r io.Reader) *Lexer

New initializes a Lexer object

func (*Lexer) Next

func (lx *Lexer) Next() bool

Next sends a signal to the Scan method for it to continue scanning

func (*Lexer) Scan

func (lx *Lexer) Scan() error

Scan starts scanning the reader for tokens.

func (*Lexer) Stop

func (lx *Lexer) Stop()

Stop requests the Scan method to stop scanning

func (*Lexer) Token

func (lx *Lexer) Token() *Token

Token returns the most recent scanned token
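
A minimal usage sketch, assuming the intended pattern is to run Scan in its own goroutine and drive it with Next and Token; the input string, imports and error handling below are illustrative rather than taken from the package:

func ExampleLexer() {
	// Assumes standard library imports errors, fmt, log and strings,
	// plus this package imported as "lexer".
	lx := lexer.New(strings.NewReader("[print 42]"))

	// Scan blocks while it tokenizes the reader, so run it concurrently.
	go func() {
		if err := lx.Scan(); err != nil && !errors.Is(err, lexer.ErrForceStopped) {
			log.Println(err)
		}
	}()

	// Next signals Scan to produce the next token; Token returns it.
	for lx.Next() {
		tok := lx.Token()
		if tok.Type() == lexer.TokenEOF {
			break
		}
		fmt.Println(tok.Type(), tok.Text())
	}
}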

type Token

type Token struct {
	// contains filtered or unexported fields
}

Token represents a known sequence of characters (lexical unit)

func NewToken

func NewToken(tt TokenType, lexeme string, pos *scanner.Position) *Token

NewToken creates a lexical unit

func Tokenize

func Tokenize(in []byte) ([]Token, error)

Tokenize takes a slice of bytes and returns all the tokens within it, or an error if a token can't be identified.
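
For whole-input tokenization without managing the Scan goroutine yourself, a hedged sketch (the input and output formatting are illustrative):

toks, err := lexer.Tokenize([]byte("{count: 3}"))
if err != nil {
	log.Fatal(err)
}
for _, tok := range toks {
	fmt.Printf("%s %q at %s\n", tok.Type(), tok.Text(), tok.Pos())
}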

func (Token) Pos

func (t Token) Pos() scanner.Position

Pos returns the line and column of the lexical unit

func (Token) String

func (t Token) String() string

func (Token) Text

func (t Token) Text() string

Text returns the raw text of the lexical unit

func (Token) Type

func (t Token) Type() TokenType

Type returns the type of the lexical unit

type TokenType

type TokenType uint8

TokenType represents all the possible types of a lexical unit

const (
	TokenInvalid         TokenType = iota
	TokenOpenExpression            // Open square bracket: "["
	TokenCloseExpression           // Close square bracket: "]"
	TokenOpenList                  // Open parenthesis: "("
	TokenCloseList                 // Close parenthesis: ")"
	TokenOpenMap                   // Open curly bracket: "{"
	TokenCloseMap                  // Close curly bracket: "}"
	TokenNewLine                   // Newline: "\n"
	TokenDoubleQuote               // Double quote: '"'
	TokenHash                      // Hash: "#"
	TokenWhitespace                // Space, form feed, tab or carriage return: " ", "\f", "\t", "\r"
	TokenWord                      // Letters ([a-zA-Z]) and underscore
	TokenInteger                   // Integers
	TokenSequence                  // Extended sequence
	TokenColon                     // Colon: ":"
	TokenDot                       // Dot: "."
	TokenBackslash                 // Backslash: "\"
	TokenEOF                       // End of file
)

List of types of lexical units
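
A sketch of dispatching on a scanned token's type; tok is a Token obtained via Token or Tokenize as in the examples above, and the cases shown are only a subset:

switch tok.Type() {
case lexer.TokenWord:
	// identifier-like run of letters and underscores
case lexer.TokenInteger:
	// integer literal
case lexer.TokenEOF:
	// end of input
default:
	// punctuation, whitespace and the other token types
}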

func (TokenType) String

func (tt TokenType) String() string
