lexer

package
v0.2.0
Published: Apr 16, 2019 License: MIT Imports: 2 Imported by: 0

Documentation

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type Lexer

type Lexer struct {
	// contains filtered or unexported fields
}

Lexer is a tokenizer that returns individual runes along with their associated types.

func New

func New(source string) *Lexer

New returns a new Lexer that wraps the given source string.

func (*Lexer) Next

func (l *Lexer) Next() bool

Next advances the lexer to the next token, and discards the current one. It must be called before any calls to Scan.

func (*Lexer) Peek

func (l *Lexer) Peek() (token *Token)

Peek returns the next token without consuming the current one. If the current token is the last token, Peek returns nil. It can be called before the first call to Next.

func (*Lexer) Scan

func (l *Lexer) Scan() *Token

Scan returns the current token.

type Token

type Token struct {
	Value string
	Type  TokenType
}

type TokenType

type TokenType int

TokenType enumerates the possible token types returned by the Lexer. Unexported token types are never returned outside the package.

const (
	Asterisk TokenType
	Text
	Bracket
	Backslash
	Dash
	Caret
	Plus
)
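The documented API can be illustrated with a minimal self-contained sketch. The classification rules below (which runes map to which TokenType, and that each token is a single rune) are assumptions for illustration, not documented behavior of this package; only the exported names and signatures come from the documentation above.

```go
package main

import "fmt"

// TokenType enumerates token kinds, mirroring the documented constants.
type TokenType int

const (
	Asterisk TokenType = iota
	Text
	Bracket
	Backslash
	Dash
	Caret
	Plus
)

// Token pairs a rune's string value with its type.
type Token struct {
	Value string
	Type  TokenType
}

// Lexer is a sketch of the documented tokenizer: it walks the source
// one rune at a time and classifies each rune.
type Lexer struct {
	runes []rune
	pos   int
}

// New returns a new Lexer that wraps the given source string.
func New(source string) *Lexer {
	return &Lexer{runes: []rune(source), pos: -1}
}

// classify maps a rune to a TokenType; anything unrecognized is Text.
// These mappings are assumed, not taken from the package docs.
func classify(r rune) TokenType {
	switch r {
	case '*':
		return Asterisk
	case '[', ']':
		return Bracket
	case '\\':
		return Backslash
	case '-':
		return Dash
	case '^':
		return Caret
	case '+':
		return Plus
	default:
		return Text
	}
}

// Next advances the lexer to the next token, discarding the current one.
// It must be called before any calls to Scan.
func (l *Lexer) Next() bool {
	l.pos++
	return l.pos < len(l.runes)
}

// Scan returns the current token.
func (l *Lexer) Scan() *Token {
	r := l.runes[l.pos]
	return &Token{Value: string(r), Type: classify(r)}
}

// Peek returns the next token without consuming the current one, or nil
// when the current token is the last. It may be called before Next.
func (l *Lexer) Peek() (token *Token) {
	if l.pos+1 >= len(l.runes) {
		return nil
	}
	r := l.runes[l.pos+1]
	return &Token{Value: string(r), Type: classify(r)}
}

func main() {
	// Typical driver loop: Next advances, Scan reads, Peek looks ahead.
	l := New("a*-")
	for l.Next() {
		tok := l.Scan()
		fmt.Printf("%q type=%d\n", tok.Value, tok.Type)
	}
}
```

Note the Next/Scan split: Next reports whether a token remains, so the idiomatic driver is a `for l.Next() { tok := l.Scan(); … }` loop, with Peek available for one-token lookahead (e.g. to detect an escape sequence after a Backslash).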
