lexer

package module
v0.0.0-...-25135da
Published: Dec 11, 2015 License: MIT Imports: 3 Imported by: 0

Documentation

Index

Constants

const EOF rune = 0

End of file

const NEWLINE string = "\n"

Newline character

Variables

This section is empty.

Functions

This section is empty.

Types

type LexFn

type LexFn func(*Lexer) LexFn

LexFn defines the function type that lexer state functions must implement. Each state function parses a portion of the input and returns the next state function to run.
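The signature `func(*Lexer) LexFn` is the state-function pattern from Rob Pike's lexer design: the lexer runs by repeatedly calling the current state function until one returns nil. The sketch below is a minimal, self-contained illustration of that pattern; the names are illustrative and are not this package's API.

```go
package main

import "fmt"

// lexer holds just enough state to demonstrate the pattern.
type lexer struct {
	input string
	pos   int
	out   []string
}

// stateFn mirrors the shape of LexFn: it consumes some input
// and returns the next state function, or nil to stop.
type stateFn func(*lexer) stateFn

// lexWord scans one space-delimited word, records it, and
// returns itself as the next state (or nil at end of input).
func lexWord(l *lexer) stateFn {
	start := l.pos
	for l.pos < len(l.input) && l.input[l.pos] != ' ' {
		l.pos++
	}
	l.out = append(l.out, l.input[start:l.pos])
	if l.pos >= len(l.input) {
		return nil // end of input: stop the state machine
	}
	l.pos++ // skip the space separator
	return lexWord
}

func main() {
	l := &lexer{input: "state function pattern"}
	// The driver loop: run the current state until it returns nil.
	for state := lexWord; state != nil; {
		state = state(l)
	}
	fmt.Println(l.out) // [state function pattern]
}
```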

type Lexer

type Lexer struct {
	Name   string
	Input  string
	Tokens chan Token
	State  LexFn

	Start int
	Pos   int
	Width int
}

Lexer object contains the state of our parser and provides a stream for accepting tokens.

Based on work by Rob Pike http://cuddle.googlecode.com/hg/talk/lex.html#landing-slide

func NewLexer

func NewLexer(name string, input string, startFn LexFn) *Lexer

NewLexer starts a new lexer with a given input string. This returns the instance of the lexer and a channel of tokens. Reading this stream is the way to parse a given input and perform processing.
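A channel-backed lexer like this is typically consumed by starting the state machine in a goroutine and draining tokens until an EOF token arrives. The stand-in types below sketch that flow under that assumption; they are not this package's API.

```go
package main

import "fmt"

// token mirrors the shape of this package's Token: a type and a value.
type token struct {
	typ   int // -1 plays the role of TOKEN_EOF here
	value string
}

type miniLexer struct {
	tokens chan token
}

// newMiniLexer starts the "state machine" (here just a comma splitter)
// in a goroutine that feeds the token channel, mirroring how NewLexer
// plus Run hand tokens to the caller.
func newMiniLexer(input string) *miniLexer {
	l := &miniLexer{tokens: make(chan token)}
	go func() {
		start := 0
		for i := 0; i <= len(input); i++ {
			if i == len(input) || input[i] == ',' {
				l.tokens <- token{typ: 0, value: input[start:i]}
				start = i + 1
			}
		}
		l.tokens <- token{typ: -1} // signal end of input
		close(l.tokens)
	}()
	return l
}

// nextToken mirrors Lexer.NextToken: it reads one token from the channel.
func (l *miniLexer) nextToken() token { return <-l.tokens }

func main() {
	l := newMiniLexer("a,b,c")
	for {
		tok := l.nextToken()
		if tok.typ == -1 {
			break // stop on the EOF token
		}
		fmt.Println(tok.value)
	}
}
```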

func (*Lexer) Backup

func (lexer *Lexer) Backup()

Backup moves the position tracker back to the beginning of the last rune read, using the stored Width.

func (*Lexer) CurrentCharacter

func (lexer *Lexer) CurrentCharacter() string

CurrentCharacter returns the character at the position tracker's current position.

func (*Lexer) CurrentInput

func (lexer *Lexer) CurrentInput() string

CurrentInput returns a slice of the current input from the current lexer start position to the current position.

func (*Lexer) Dec

func (lexer *Lexer) Dec()

Dec decrements the position tracker back a single character.

func (*Lexer) Discard

func (lexer *Lexer) Discard(count int)

Discard throws away count characters by skipping right over them.

func (*Lexer) Emit

func (lexer *Lexer) Emit(tokenType TokenType)

Emit puts a token onto the token channel. The value of this token is read from the input based on the current lexer position.

func (*Lexer) EmitWithTransform

func (lexer *Lexer) EmitWithTransform(tokenType TokenType, transformFn TokenValueTransformer)

EmitWithTransform puts a typed token onto the token channel. The value is read from the input based on the current lexer position, passed through the provided transform function, and the result is placed on the token channel.

func (*Lexer) Errorf

func (lexer *Lexer) Errorf(format string, args ...interface{}) LexFn

Errorf emits a token carrying the formatted error information. Its signature conforms to the LexFn type so it can be returned from a state function.

func (*Lexer) Ignore

func (lexer *Lexer) Ignore()

Ignore disregards the current token by setting the lexer's start position to the current reading position.

func (*Lexer) Inc

func (lexer *Lexer) Inc(count int)

Inc moves the position tracker forward count characters.

func (*Lexer) InputToEnd

func (lexer *Lexer) InputToEnd() string

InputToEnd returns a slice of the input from the current lexer position to the end of the input string.

func (*Lexer) IsEOF

func (lexer *Lexer) IsEOF() bool

IsEOF returns true if the lexer is at the end of the input stream.

func (*Lexer) IsNewline

func (lexer *Lexer) IsNewline() bool

IsNewline returns true if the current character is a newline character.

func (*Lexer) IsNumber

func (lexer *Lexer) IsNumber() bool

IsNumber returns true if the current character is a number.

func (*Lexer) IsWhitespace

func (lexer *Lexer) IsWhitespace() bool

IsWhitespace returns true if the current character is whitespace.

func (*Lexer) Next

func (lexer *Lexer) Next() rune

Next reads the next rune (character) from the input stream and advances the lexer position.

func (*Lexer) NextToken

func (lexer *Lexer) NextToken() Token

NextToken returns the next token from the channel.

func (*Lexer) Peek

func (lexer *Lexer) Peek() rune

Peek returns the next rune in the stream without consuming it: it reads the rune, then moves the lexer position back.
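The Next/Backup/Peek trio fits together through the Width field: Next decodes one rune and records its byte width, Backup rewinds by that width, and Peek is simply Next followed by Backup. The methods below are illustrative reconstructions of that relationship (field names mirror the Lexer struct above), not the package's source.

```go
package main

import (
	"fmt"
	"unicode/utf8"
)

type lexer struct {
	Input string
	Pos   int
	Width int
}

// Next decodes one rune at Pos, records its byte width, and advances.
func (l *lexer) Next() rune {
	if l.Pos >= len(l.Input) {
		l.Width = 0
		return 0 // EOF, matching the package's EOF rune constant
	}
	r, w := utf8.DecodeRuneInString(l.Input[l.Pos:])
	l.Width = w
	l.Pos += w
	return r
}

// Backup rewinds by the width of the last rune read.
func (l *lexer) Backup() { l.Pos -= l.Width }

// Peek reads the next rune without consuming it.
func (l *lexer) Peek() rune {
	r := l.Next()
	l.Backup()
	return r
}

func main() {
	l := &lexer{Input: "héllo"}
	fmt.Println(string(l.Peek()), l.Pos) // peek does not advance: h 0
	fmt.Println(string(l.Next()), l.Pos) // next advances one byte:  h 1
}
```

Tracking Width per rune rather than assuming one byte per character is what keeps Backup correct on multi-byte UTF-8 input such as the 'é' above.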

func (*Lexer) PeekCharacters

func (lexer *Lexer) PeekCharacters(numCharacters int) string

PeekCharacters returns the next numCharacters characters in the input stream without consuming them.

func (*Lexer) Run

func (lexer *Lexer) Run()

Run starts the lexical analysis, feeding tokens into the token channel.

func (*Lexer) Shutdown

func (lexer *Lexer) Shutdown()

Shutdown closes the token channel.

func (*Lexer) SkipWhitespace

func (lexer *Lexer) SkipWhitespace()

SkipWhitespace skips whitespace characters until we get something meaningful.

type Token

type Token struct {
	Type  TokenType
	Value interface{}
}

A Token represents a parsed item in a source input. A token has a type and a value. These are used to determine what to do next.

func (Token) IsEOF

func (token Token) IsEOF() bool

func (Token) IsEmpty

func (token Token) IsEmpty() bool

func (Token) IsError

func (token Token) IsError() bool

func (Token) String

func (token Token) String() string

type TokenType

type TokenType int

A TokenType defines the types of tokens available. Create your own to describe your input.

const (
	TOKEN_ERROR TokenType = -2
	TOKEN_EOF   TokenType = -1
)

type TokenValueTransformer

type TokenValueTransformer func(tokenValue string) interface{}

A TokenValueTransformer is a function type used to transform a token's raw string value into a typed value.
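A transformer receives the raw string the lexer matched and returns any typed value, which EmitWithTransform then places on the token channel. The transformer below is a hypothetical example, not part of the package: it parses integer literals and falls back to the raw string when parsing fails.

```go
package main

import (
	"fmt"
	"strconv"
)

// tokenValueTransformer matches the shape of TokenValueTransformer.
type tokenValueTransformer func(tokenValue string) interface{}

// toInt converts a matched numeric literal into an int; on failure it
// returns the raw string unchanged so no information is lost.
var toInt tokenValueTransformer = func(tokenValue string) interface{} {
	n, err := strconv.Atoi(tokenValue)
	if err != nil {
		return tokenValue
	}
	return n
}

func main() {
	fmt.Println(toInt("42"), toInt("abc")) // 42 abc
}
```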
