package bnf
Published: Sep 6, 2022 License: MIT Imports: 7 Imported by: 0

Documentation

Index

Constants

View Source
const (
	TokenT = iota
	TokenNT
	TokenEq
	TokenBar
	TokenEOL
)

Variables

This section is empty.

Functions

func Tokenize

func Tokenize(r io.Reader) (*TokenStream, *ParseError)

Tokenize will take any input (accessible from a Reader) and produce a token stream. If an error occurs while tokenizing, it will return the partial token stream built so far along with the error.

Types

type Expr

type Expr struct {
	// Symbols is a slice of Symbols which would be required for us to
	// match.
	Symbols []Symbol

	// OrMatch is an expr which can be considered if this expr is not a
	// match, which itself may link to another expr.
	OrMatch *Expr
}

An Expr is an expression which would match a conditional branch of logic for a given input. Every symbol in an expression is evaluated using boolean AND logic; boolean OR is expressed by iterating to the next expression in OrMatch.

func (*Expr) Match

func (e *Expr) Match(g *Grammar, scan *Scanner) *ParseError

type Grammar

type Grammar struct {
	MainRule *Rule
	Rules    map[string]*Rule
}

func NewGrammar

func NewGrammar(stream *TokenStream) (*Grammar, error)

NewGrammar takes a TokenStream and proceeds to build a grammar from it.

func (*Grammar) Build

func (g *Grammar) Build(stream *TokenStream) error

Build will create all of the rules for a grammar based on an input stream of tokens.

func (*Grammar) DefineRule

func (g *Grammar) DefineRule(r *Rule)

func (*Grammar) Match

func (g *Grammar) Match(str string) *ParseError

type Nonterminal

type Nonterminal struct {
	Name string
}

Nonterminals are symbols which represent other rules.

func NewNonterminal

func NewNonterminal(_ *Grammar, val string) *Nonterminal

NewNonterminal returns a new Nonterminal object which is named for val.

func (*Nonterminal) Match

func (n *Nonterminal) Match(g *Grammar, scan *Scanner) *ParseError

type ParseError

type ParseError struct {
	File      string
	Line      int
	Incidence string
	Err       error
}

func (*ParseError) Error

func (p *ParseError) Error() string

type Rule

type Rule struct {
	// Name is the name of the rule (the left-hand side of the rule
	// definition)
	Name string

	// Condition is the expression which must be matched for a rule to accept
	// certain input.
	Condition *Expr
}

A Rule is a named record that encapsulates some conditional logic for a given set of input. It's the entirety of a `<foo> ::= "..."` construct in BNF.

func NewRule

func NewRule(_ *Grammar, name string) *Rule

NewRule returns a new rule object with an expression already allocated. (There is no practical time where you would expect to see a rule without a condition.)

func (*Rule) Match

func (r *Rule) Match(g *Grammar, scan *Scanner) *ParseError

type Scanner

type Scanner struct {
	// contains filtered or unexported fields
}

func NewScanner

func NewScanner(s string) *Scanner

func (*Scanner) FastForward

func (s *Scanner) FastForward(n int)

func (*Scanner) Revert

func (s *Scanner) Revert()

func (*Scanner) Save

func (s *Scanner) Save()

func (*Scanner) Show

func (s *Scanner) Show() string

func (*Scanner) StartsWith

func (s *Scanner) StartsWith(input string) bool

type Symbol

type Symbol interface {
	Match(*Grammar, *Scanner) *ParseError
}

Symbols are data that can be matched against input. Each symbol type implements a single Match method.

type Terminal

type Terminal struct {
	Value string
}

Terminals are symbols which represent literal string values.

func NewTerminal

func NewTerminal(_ *Grammar, val string) *Terminal

NewTerminal returns a new Terminal object that has a literal value of val.

func (*Terminal) Match

func (t *Terminal) Match(_ *Grammar, scan *Scanner) *ParseError

type Token

type Token struct {
	Type  int
	Value string
}

A Token is an element of BNF grammar; it's any single thing we might parse.

type TokenStream

type TokenStream struct {
	// contains filtered or unexported fields
}

A TokenStream is an ordered sequence of tokens.

func (*TokenStream) At

func (s *TokenStream) At(first int, types ...int) bool

At will return true if the current tokens match the types given. For example, `stream.At(TokenNT, TokenEq)` would return true for tokens like `<foo> ::=`. We require at least one type, which is why the first parameter is not folded into the types slice.

func (*TokenStream) Ended

func (s *TokenStream) Ended() bool

Ended will return true if the stream has no more tokens to provide. This is effectively the same as testing if `stream.Next() == nil`.

func (*TokenStream) Next

func (s *TokenStream) Next() *Token

Next will return the next token according to the pos field. It will also increment pos to the next place. If we've reached the end of the stream, Next will return nil.
