Documentation ¶
Index ¶
Constants ¶
const (
	TokenT = iota
	TokenNT
	TokenEq
	TokenBar
	TokenEOL
)
Variables ¶
This section is empty.
Functions ¶
func Tokenize ¶
func Tokenize(r io.Reader) (*TokenStream, *ParseError)
Tokenize takes any input (accessible from a Reader) and produces a token stream. If an error occurs while tokenizing, it returns the error along with a partial token stream.
Types ¶
type Expr ¶
type Expr struct {
	// Symbols is a slice of Symbols which would all be required for us
	// to match.
	Symbols []Symbol

	// OrMatch is an Expr which can be considered if this Expr is not a
	// match; it may itself link to another Expr.
	OrMatch *Expr
}
An Expr is an expression which matches one conditional branch of logic for a given input. Every symbol in an expression is evaluated using boolean AND logic; boolean OR is expressed by falling through to the next expression via OrMatch.
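The AND/OR evaluation described above can be sketched with a simplified matcher. Plain strings stand in for the Symbol interface and prefix comparison stands in for the Scanner; both are assumptions for illustration:

```go
package main

import (
	"fmt"
	"strings"
)

// expr mirrors the Expr shape: Symbols are ANDed in sequence, and
// OrMatch chains to the next alternative.
type expr struct {
	Symbols []string
	OrMatch *expr
}

// match tries each alternative in turn. Within one expr, every symbol
// must match consecutively (boolean AND); on failure we fall through
// to OrMatch (boolean OR).
func match(e *expr, input string) bool {
	for ; e != nil; e = e.OrMatch {
		rest, ok := input, true
		for _, s := range e.Symbols {
			if !strings.HasPrefix(rest, s) {
				ok = false
				break
			}
			rest = rest[len(s):]
		}
		if ok && rest == "" {
			return true
		}
	}
	return false
}

func main() {
	// <greeting> ::= "hello" "world" | "hi"
	e := &expr{
		Symbols: []string{"hello", "world"},
		OrMatch: &expr{Symbols: []string{"hi"}},
	}
	fmt.Println(match(e, "helloworld")) // both symbols AND-matched
	fmt.Println(match(e, "hi"))         // matched via the OrMatch alternative
	fmt.Println(match(e, "hello"))      // no alternative consumes all input
}
```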
type Grammar ¶
func NewGrammar ¶
func NewGrammar(stream *TokenStream) (*Grammar, error)
NewGrammar takes a TokenStream and proceeds to build a grammar from it.
func (*Grammar) Build ¶
func (g *Grammar) Build(stream *TokenStream) error
Build will create all of the rules for a grammar based on an input stream of tokens.
func (*Grammar) DefineRule ¶
func (*Grammar) Match ¶
func (g *Grammar) Match(str string) *ParseError
type Nonterminal ¶
type Nonterminal struct {
Name string
}
Nonterminals are symbols which represent other rules.
func NewNonterminal ¶
func NewNonterminal(_ *Grammar, val string) *Nonterminal
NewNonterminal returns a new Nonterminal object which is named for val.
func (*Nonterminal) Match ¶
func (n *Nonterminal) Match(g *Grammar, scan *Scanner) *ParseError
type ParseError ¶
func (*ParseError) Error ¶
func (p *ParseError) Error() string
type Rule ¶
type Rule struct {
	// Name is the name of the rule (the left-hand side of the rule
	// definition).
	Name string

	// Condition is the expression which must be matched for a rule to
	// accept certain input.
	Condition *Expr
}
A Rule is a named record that encapsulates some conditional logic for a given set of input. It's the entirety of a `<foo> ::= "..."` construct in BNF.
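As a sketch of how a `<foo> ::= "..."` construct maps onto these structures, the snippet below builds the shape by hand. The types are local stand-ins (string symbols instead of the Symbol interface), and `countAlts` is a hypothetical helper added for illustration:

```go
package main

import "fmt"

// Local stand-ins for the package's Expr and Rule shapes.
type expr struct {
	Symbols []string
	OrMatch *expr
}

type rule struct {
	Name      string
	Condition *expr
}

// countAlts walks the OrMatch chain and counts the alternatives.
func countAlts(e *expr) int {
	n := 0
	for ; e != nil; e = e.OrMatch {
		n++
	}
	return n
}

func main() {
	// <greeting> ::= "hello" | "hi"  becomes:
	r := rule{
		Name: "greeting",
		Condition: &expr{
			Symbols: []string{`"hello"`},
			OrMatch: &expr{Symbols: []string{`"hi"`}},
		},
	}
	fmt.Printf("%s has %d alternatives\n", r.Name, countAlts(r.Condition))
}
```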
type Scanner ¶
type Scanner struct {
// contains filtered or unexported fields
}
func NewScanner ¶
func (*Scanner) FastForward ¶
func (*Scanner) StartsWith ¶
type Symbol ¶
type Symbol interface {
Match(*Grammar, *Scanner) *ParseError
}
Symbols are data that can be matched against input. The interface is deliberately minimal, requiring only Match, because I don't yet know what the final API should be.
type Terminal ¶
type Terminal struct {
Value string
}
Terminals are symbols which represent literal string values.
func NewTerminal ¶
NewTerminal returns a new Terminal object that has a literal value of val.
type TokenStream ¶
type TokenStream struct {
// contains filtered or unexported fields
}
A TokenStream is an ordered sequence of tokens.
func (*TokenStream) At ¶
func (s *TokenStream) At(first int, types ...int) bool
At will return true if the current tokens match the types given. For example, `stream.At(TokenNT, TokenEq)` would return true for tokens like `<foo> ::=`. We require at least one type, which is why the first parameter is not folded into the types slice.
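One plausible implementation of this lookahead over the stream's position looks like the sketch below. The `tokenStream` fields are assumptions (the real ones are unexported), and only token types are stored for brevity:

```go
package main

import "fmt"

const (
	TokenT = iota
	TokenNT
	TokenEq
	TokenBar
	TokenEOL
)

// tokenStream is a simplified stand-in holding token types and a
// current position.
type tokenStream struct {
	tokens []int
	pos    int
}

// at reports whether the tokens at the current position match the
// given types in order. Taking `first` separately guarantees that at
// least one type is supplied, mirroring At's signature.
func (s *tokenStream) at(first int, types ...int) bool {
	want := append([]int{first}, types...)
	if s.pos+len(want) > len(s.tokens) {
		return false
	}
	for i, t := range want {
		if s.tokens[s.pos+i] != t {
			return false
		}
	}
	return true
}

func main() {
	// Tokens for: <foo> ::= "bar"
	s := &tokenStream{tokens: []int{TokenNT, TokenEq, TokenT, TokenEOL}}
	fmt.Println(s.at(TokenNT, TokenEq)) // true: <foo> ::=
	fmt.Println(s.at(TokenT))           // false: the stream starts at a nonterminal
}
```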
func (*TokenStream) Ended ¶
func (s *TokenStream) Ended() bool
Ended will return true if the stream has no more tokens to provide. This is effectively the same as testing if `stream.Next() == nil`.
func (*TokenStream) Next ¶
func (s *TokenStream) Next() *Token
Next will return the next token according to the pos field. It will also increment pos to the next place. If we've reached the end of the stream, Next will return nil.
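Next's advance-and-return behavior, together with Ended, can be sketched as follows. The field names are assumptions standing in for the package's unexported internals:

```go
package main

import "fmt"

type token struct{ Type int }

// tokenStream is a simplified stand-in with a token slice and a
// current position.
type tokenStream struct {
	tokens []token
	pos    int
}

// next returns the token at pos and increments pos, or returns nil
// once the end of the stream is reached.
func (s *tokenStream) next() *token {
	if s.pos >= len(s.tokens) {
		return nil
	}
	t := &s.tokens[s.pos]
	s.pos++
	return t
}

// ended reports whether the stream has no more tokens to provide,
// without consuming one.
func (s *tokenStream) ended() bool {
	return s.pos >= len(s.tokens)
}

func main() {
	s := &tokenStream{tokens: []token{{0}, {1}}}
	for t := s.next(); t != nil; t = s.next() {
		fmt.Println(t.Type)
	}
	fmt.Println(s.ended()) // true once the loop has drained the stream
}
```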