parse

README

A simple parser library
=======================

Includes:

* a way of building lexers modelled after Rob Pike's state-function concept

* a Pratt operator precedence parser
 

Documentation

Constants

const Eof rune = -1

Out-of-band representation for EOF as a character.

Variables

This section is empty.

Functions

func Expect

func Expect(s string, t Token)

Panic with an Expected error unless the token's text matches s.
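
For example, the tail of a handler for a parenthesised group might look like the following; this is a hypothetical snippet, where p and l are the Parser and Lexer passed to the handler:

inner := p.Parse(l, 0) // parse the grouped expression
Expect(")", l.Next())  // panic unless the next token's text is ")"
return inner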

func Expected

func Expected(s string, t Token) error

func TokenError

func TokenError(format string, t Token, args ...interface{}) error

func Unexpected

func Unexpected(t Token) error

func UnexpectedEof

func UnexpectedEof() error

Types

type Both

type Both interface {
	Prefix
	Infix
}

Handles both prefix and infix.

type Infix

type Infix interface {
	Precedence(t Token) int
	Infix(p *Parser, l *Lexer, left *Node, t Token) *Node
}

Handles tokens after the first in a particular call to Parse().
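
A minimal sketch of an Infix implementation for a left-associative binary operator might look like the following; the binOp type and its fields are illustrative, not part of the package:

type binOp struct {
	prec int // binding power reported to the parser
	kind int // Kind assigned to the resulting Node
}

func (b binOp) Precedence(t Token) int { return b.prec }

func (b binOp) Infix(p *Parser, l *Lexer, left *Node, t Token) *Node {
	// Parse the right-hand operand at this operator's own precedence,
	// then attach both operands to a new operator node.
	right := p.Parse(l, b.prec)
	n := &Node{Kind: b.kind, Token: t}
	return n.Add(left, right)
}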

type Lexer

type Lexer struct {
	// contains filtered or unexported fields
}

A lexical analyser where each state is a function.

func (*Lexer) Init

func (l *Lexer) Init(src io.Reader, nm string, start State) *Lexer

Initialise a Lexer with a source and a start state before beginning processing.

func (*Lexer) Lookahead

func (l *Lexer) Lookahead() Token

Parse a token from the source as in Next(), but remember that token so that it is returned from the next call to Next() or Lookahead().
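
For example, a handler might peek at the upcoming token before deciding whether to consume it (hypothetical usage):

if l.Lookahead().Text == "," {
	l.Next() // the comma belongs to this construct, so consume it
}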

func (*Lexer) Next

func (l *Lexer) Next() Token

Parse a token from the source.

Matching process:

  1. Begin with the start state passed into Init().
  2. Call the current state, passing in the Lexer's Source.
  3. The call returns the next state.
  4. If the returned state is nil, halt.
  5. Otherwise make the returned state the current state and go back to step 2.

States should issue calls to Read(), Peek(), Clear() and Save() as required to move through the source and match tokens. These methods should not be called outside the dynamic extent of Next().

A given run through the matching process may produce no tokens (in which case the matching process is repeated), one token (in which case it is returned), or several tokens (in which case the first is returned and the remainder are remembered, so subsequent calls return them without running the matching process again).
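
As a concrete illustration, the hypothetical start state below skips spaces, recognises simple identifiers, and emits any other character as a one-rune token. The kind constants and helper function are not part of the package:

const (
	kindEof = iota // hypothetical token kinds; not defined by the package
	kindIdent
	kindPunct
)

// isLetter reports whether r is an ASCII letter (kept simple to avoid
// importing unicode in this sketch).
func isLetter(r rune) bool {
	return ('a' <= r && r <= 'z') || ('A' <= r && r <= 'Z')
}

// lexStart skips spaces, recognises identifiers and emits any other
// character as a one-rune token.
func lexStart(s *Source) State {
	switch r := s.Read(); {
	case r == Eof:
		s.Save(kindEof) // emit an end-of-input token and halt
		return nil
	case r == ' ' || r == '\t' || r == '\n':
		s.Clear() // discard the whitespace; no token yet, keep matching
		return lexStart
	case isLetter(r):
		return lexIdent
	default:
		s.Save(kindPunct) // emit the single character and end this run
		return nil
	}
}

// lexIdent consumes the remaining letters of an identifier and saves them,
// together with the letter already read by lexStart, as one token.
func lexIdent(s *Source) State {
	for isLetter(s.Peek()) {
		s.Read()
	}
	s.Save(kindIdent)
	return nil
}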

func (*Lexer) Scanned

func (l *Lexer) Scanned() string

Return the text that the lexer has scanned.

type Node

type Node struct {
	Kind   int
	Token  Token
	Data   interface{}
	Parent *Node
	Child  []*Node
}

Node in the syntax tree.

func (*Node) Add

func (n *Node) Add(cs ...*Node) *Node

Add child nodes, setting each child's Parent field to the receiver.

func (*Node) Scan

func (n *Node) Scan(f func(*Node) bool)

Call a function on the node. Recursively scan the node's children if that function returns true.
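
For example, assuming the fmt package is imported, the following prints the text of every token in a tree rooted at root:

root.Scan(func(n *Node) bool {
	fmt.Println(n.Token.Text) // visit this node
	return true               // descend into n.Child as well
})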

func (*Node) String

func (n *Node) String() string

type Parser

type Parser struct {
	// contains filtered or unexported fields
}

A Pratt operator precedence parser.

func (*Parser) Parse

func (p *Parser) Parse(l *Lexer, prec int) *Node

Parse an expression at a given precedence level, returning a syntax tree.

func (*Parser) ParseWith

func (p *Parser) ParseWith(l *Lexer, prec int, t Token) *Node

Parse an expression at a given precedence level, given an initial token.

func (*Parser) RegBoth

func (p *Parser) RegBoth(k int, s string, bp Both)

Register a handler for both infix and prefix.

func (*Parser) RegElse

func (p *Parser) RegElse(ep Prefix)

Register a handler for when no other handler matches.

func (*Parser) RegInfix

func (p *Parser) RegInfix(k int, s string, ip Infix)

Register an infix handler.

func (*Parser) RegPrefix

func (p *Parser) RegPrefix(k int, s string, pp Prefix)

Register a prefix handler.

type ParserFunc

type ParserFunc func(p *Parser, l *Lexer, t Token) *Node

A simple function wrapper that implements Prefix, so plain functions can be used as handlers.

func (ParserFunc) Prefix

func (f ParserFunc) Prefix(p *Parser, l *Lexer, t Token) *Node
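
Putting the pieces together, a hypothetical sketch of wiring up a parser for simple arithmetic might look like the following. It reuses the illustrative binOp handler, lexStart state and kind constants from the sketches above, assumes the strings package is imported, and assumes the zero value of Parser is ready to use; how the kind and string arguments to the Reg functions are matched is also an assumption here:

const kindAdd, kindMul = 10, 11 // hypothetical node kinds for the operators

// parseExample wires the sketches above into a parser for expressions such
// as "a + b * c" and returns the resulting syntax tree.
func parseExample() *Node {
	var p Parser // assumption: the zero value is usable for registration

	// Identifiers in prefix position become leaf nodes. An empty string is
	// assumed to match any token text of the given kind.
	p.RegPrefix(kindIdent, "", ParserFunc(func(_ *Parser, _ *Lexer, t Token) *Node {
		return &Node{Kind: kindIdent, Token: t}
	}))

	// "+" binds less tightly than "*", so "*" groups first.
	p.RegInfix(kindPunct, "+", binOp{prec: 10, kind: kindAdd})
	p.RegInfix(kindPunct, "*", binOp{prec: 20, kind: kindMul})

	l := new(Lexer).Init(strings.NewReader("a + b * c"), "example", lexStart)
	return p.Parse(l, 0) // parse the whole expression at the lowest precedence
}

Under the usual Pratt convention this should yield a "+" node whose second child is the "*" subtree, since "*" reports the higher precedence.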

type Prefix

type Prefix interface {
	Prefix(p *Parser, l *Lexer, t Token) *Node
}

Handles the token at the start of the expression.

type Source

type Source struct {
	// contains filtered or unexported fields
}

func (*Source) Clear

func (s *Source) Clear()

Ignore the portion of text currently under consideration.

func (*Source) Peek

func (s *Source) Peek() rune

Read a character from the source as in Read(), but remember that character so that it is returned from the next call to Read() or Peek().

func (*Source) Pos

func (s *Source) Pos() (int, int)

func (*Source) Read

func (s *Source) Read() rune

Read a character from the source. Will return Eof when at the end of the source and panic for any other errors.

func (*Source) Save

func (s *Source) Save(k int)

Save the portion of text currently under consideration as a token of the provided Kind.

func (*Source) SetPos

func (s *Source) SetPos(i, n int)

type State

type State func(l *Source) State

A state in the lexical analysis. The programmer should provide these to the Lexer to allow it to process text.

type Token

type Token struct {
	Kind, Line int
	File, Text string
}

A token represents a section of a source text.
