Documentation ¶
Index ¶
- Constants
- type LexFn
- type Lexer
- func (lexer *Lexer) Backup()
- func (lexer *Lexer) CurrentCharacter() string
- func (lexer *Lexer) CurrentInput() string
- func (lexer *Lexer) Dec()
- func (lexer *Lexer) Discard(count int)
- func (lexer *Lexer) Emit(tokenType TokenType)
- func (lexer *Lexer) EmitWithTransform(tokenType TokenType, transformFn TokenValueTransformer)
- func (lexer *Lexer) Errorf(format string, args ...interface{}) LexFn
- func (lexer *Lexer) Ignore()
- func (lexer *Lexer) Inc(count int)
- func (lexer *Lexer) InputToEnd() string
- func (lexer *Lexer) IsEOF() bool
- func (lexer *Lexer) IsNewline() bool
- func (lexer *Lexer) IsNumber() bool
- func (lexer *Lexer) IsWhitespace() bool
- func (lexer *Lexer) Next() rune
- func (lexer *Lexer) NextToken() Token
- func (lexer *Lexer) Peek() rune
- func (lexer *Lexer) PeekCharacters(numCharacters int) string
- func (lexer *Lexer) Run()
- func (lexer *Lexer) Shutdown()
- func (lexer *Lexer) SkipWhitespace()
- type Token
- type TokenType
- type TokenValueTransformer
Constants ¶
const EOF rune = 0
End of file
const NEWLINE string = "\n"
Newline character
Variables ¶
This section is empty.
Functions ¶
This section is empty.
Types ¶
type LexFn ¶
LexFn defines a function type that lexer parsing functions must implement. These functions perform the actual parsing of the input text.
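The state-function style of lexing that LexFn enables can be sketched as below. All names here (miniLexer, stateFn, lexLetters) are illustrative stand-ins, not this package's types; the idea is that each state function lexes one piece of input and returns the next state function, or nil when lexing is done.

```go
package main

import "fmt"

// miniLexer is a hypothetical stand-in for this package's Lexer.
type miniLexer struct {
	input string
	pos   int
	out   []string
}

// stateFn mirrors the state-function pattern: lex something, then
// return the next state (or nil to stop).
type stateFn func(*miniLexer) stateFn

// lexLetters consumes one space-separated word and stays in the
// same state until the input is exhausted.
func lexLetters(l *miniLexer) stateFn {
	start := l.pos
	for l.pos < len(l.input) && l.input[l.pos] != ' ' {
		l.pos++
	}
	l.out = append(l.out, l.input[start:l.pos])
	if l.pos >= len(l.input) {
		return nil // end of input
	}
	l.pos++ // skip the separating space
	return lexLetters
}

// run drives the state machine until a state returns nil.
func run(l *miniLexer) {
	for state := lexLetters; state != nil; {
		state = state(l)
	}
}

func main() {
	l := &miniLexer{input: "ab cd ef"}
	run(l)
	fmt.Println(l.out) // [ab cd ef]
}
```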
type Lexer ¶
type Lexer struct {
	Name   string
	Input  string
	Tokens chan Token
	State  LexFn

	Start int
	Pos   int
	Width int
}
A Lexer holds the state of the parser and provides a channel on which tokens are emitted.
Based on work by Rob Pike http://cuddle.googlecode.com/hg/talk/lex.html#landing-slide
func NewLexer ¶
NewLexer starts a new lexer for a given input string. It returns the lexer instance and a channel of tokens; reading from this channel is how callers consume the parsed input and perform processing.
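The consumption side can be sketched with hypothetical stand-ins (token, tokenType, and emitWords are illustrative, not this package's API): a producer feeds tokens into a channel, and the caller ranges over that channel until it closes.

```go
package main

import "fmt"

type tokenType int

const (
	tokWord tokenType = iota
	tokEOF
)

type token struct {
	typ tokenType
	val string
}

// emitWords is a stand-in for a lexer's run loop: it sends one token
// per space-separated word, then an EOF token, then closes the channel.
func emitWords(input string, out chan token) {
	start := 0
	for i := 0; i <= len(input); i++ {
		if i == len(input) || input[i] == ' ' {
			if i > start {
				out <- token{tokWord, input[start:i]}
			}
			start = i + 1
		}
	}
	out <- token{tokEOF, ""}
	close(out)
}

func main() {
	tokens := make(chan token)
	go emitWords("hello lexer world", tokens)
	// Ranging over the channel is the consumption pattern the
	// documentation above describes.
	for tok := range tokens {
		fmt.Printf("%d %q\n", tok.typ, tok.val)
	}
}
```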
func (*Lexer) Backup ¶
func (lexer *Lexer) Backup()
Backup puts the position tracker back to the beginning of the last read token.
func (*Lexer) CurrentCharacter ¶
func (lexer *Lexer) CurrentCharacter() string
CurrentCharacter returns the character at the current position of the position tracker.
func (*Lexer) CurrentInput ¶
func (lexer *Lexer) CurrentInput() string
CurrentInput returns a slice of the input from the lexer's start position to the current position.
func (*Lexer) Dec ¶
func (lexer *Lexer) Dec()
Dec decrements the position tracker back a single character.
func (*Lexer) Emit ¶
func (lexer *Lexer) Emit(tokenType TokenType)
Emit puts a token onto the token channel. The value of the token is read from the input based on the current lexer position.
func (*Lexer) EmitWithTransform ¶
func (lexer *Lexer) EmitWithTransform(tokenType TokenType, transformFn TokenValueTransformer)
EmitWithTransform puts a typed token onto the channel. The value is read from the input based on the current lexer position and passed to the provided transform function; the transformed value is then placed on the token channel.
func (*Lexer) Errorf ¶
func (lexer *Lexer) Errorf(format string, args ...interface{}) LexFn
Errorf returns a token with error information. It conforms to the LexFn type.
func (*Lexer) Ignore ¶
func (lexer *Lexer) Ignore()
Ignore disregards the current token by setting the lexer's start position to the current reading position.
func (*Lexer) InputToEnd ¶
func (lexer *Lexer) InputToEnd() string
InputToEnd returns a slice of the input from the current lexer position to the end of the input string.
func (*Lexer) IsWhitespace ¶
func (lexer *Lexer) IsWhitespace() bool
IsWhitespace returns true if the current character is whitespace.
func (*Lexer) Next ¶
func (lexer *Lexer) Next() rune
Next reads the next rune (character) from the input stream and advances the lexer position.
func (*Lexer) Peek ¶
func (lexer *Lexer) Peek() rune
Peek returns the next rune in the stream, then puts the lexer position back; in effect it reads the next rune without consuming it.
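The difference between Peek and Next can be sketched with a hypothetical cursor type (not this package's implementation): peek decodes the next rune without moving the position, while next advances by the rune's byte width, which matters for multi-byte UTF-8 input.

```go
package main

import (
	"fmt"
	"unicode/utf8"
)

// cursor is an illustrative stand-in for a lexer's position tracking.
type cursor struct {
	input string
	pos   int
}

// next decodes the rune at pos and advances pos by its byte width.
func (c *cursor) next() rune {
	r, w := utf8.DecodeRuneInString(c.input[c.pos:])
	c.pos += w
	return r
}

// peek decodes the rune at pos but leaves pos unchanged.
func (c *cursor) peek() rune {
	r, _ := utf8.DecodeRuneInString(c.input[c.pos:])
	return r
}

func main() {
	c := &cursor{input: "héllo"}
	fmt.Println(string(c.peek()), c.pos) // peek does not move pos: h 0
	fmt.Println(string(c.next()), c.pos) // next advances by rune width: h 1
	fmt.Println(string(c.next()), c.pos) // 'é' is 2 bytes, so pos jumps to 3
}
```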
func (*Lexer) PeekCharacters ¶
func (lexer *Lexer) PeekCharacters(numCharacters int) string
PeekCharacters returns the next numCharacters characters in the input stream.
func (*Lexer) Run ¶
func (lexer *Lexer) Run()
Run starts the lexical analysis and feeds tokens into the token channel.
func (*Lexer) SkipWhitespace ¶
func (lexer *Lexer) SkipWhitespace()
SkipWhitespace skips whitespace characters until we get something meaningful.
type Token ¶
type Token struct {
	Type  TokenType
	Value interface{}
}
A Token represents a parsed item in a source input. A token has a type and a value. These are used to determine what to do next.
type TokenType ¶
type TokenType int
A TokenType defines the types of tokens available. Create your own token types to describe your input.
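Defining your own token types is typically done with an iota-based constant block. The names below (TokenError, TokenNumber, and so on) are a hypothetical example for a small grammar, and TokenType is declared locally to keep the sketch self-contained:

```go
package main

import "fmt"

// TokenType mirrors the package's `type TokenType int`, declared
// locally so this example stands alone.
type TokenType int

// Hypothetical token types for a small calculator-style grammar.
const (
	TokenError TokenType = iota
	TokenNumber
	TokenPlus
	TokenEOF
)

// String gives the types readable names for debugging output.
func (t TokenType) String() string {
	return [...]string{"ERROR", "NUMBER", "PLUS", "EOF"}[t]
}

func main() {
	fmt.Println(TokenNumber, TokenPlus) // NUMBER PLUS
}
```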
type TokenValueTransformer ¶
type TokenValueTransformer func(tokenValue string) interface{}
A TokenValueTransformer is a function used to transform a token's raw string value into a typed interface{} value.
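A transformer matching this shape might parse a number token's text into an int, as in the sketch below. The transformer type is declared locally with the same signature so the example is self-contained; the fallback-to-raw-string behavior is an illustrative choice, not something the package prescribes.

```go
package main

import (
	"fmt"
	"strconv"
)

// transformer has the same shape as TokenValueTransformer.
type transformer func(tokenValue string) interface{}

// toInt converts a token's string value to an int, falling back to
// the raw string when the value does not parse.
var toInt transformer = func(s string) interface{} {
	n, err := strconv.Atoi(s)
	if err != nil {
		return s // illustrative fallback on parse failure
	}
	return n
}

func main() {
	fmt.Println(toInt("42"))  // typed as int
	fmt.Println(toInt("abc")) // falls back to the raw string
}
```

A transformer like this would be passed to EmitWithTransform so that consumers receive typed values on the token channel instead of raw strings.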