parser

package
v1.9.0 Latest
Warning

This package is not in the latest version of its module.

Published: Dec 6, 2015 License: MIT Imports: 5 Imported by: 1

Documentation


Constants

const EOF rune = 0

EOF is the rune returned when the end of the input has been reached

const LeftBracket string = "[["

LeftBracket opens a multi-line comment

const MinusSign = "--"

MinusSign starts a single-line comment

const NEWLINE string = "\n"

NEWLINE is the newline character

const RightBracket string = "]]"

RightBracket closes a multi-line comment

Variables

var ErrNoMatch = fmt.Errorf("No matching comment")

ErrNoMatch is returned when no matching comment can be found for a row

Functions

This section is empty.

Types

type Comment

type Comment struct {
	Value string
	Row   int
}

Comment holds the text of a parsed comment and the row on which it was found

type Comments

type Comments []Comment

Comments is a collection of parsed comments

func File

func File(file string) (Comments, error)

File parses the file at the given path and returns the comments found in it

func String

func String(str string) Comments

String parses a string and returns the comments found in it

func (Comments) Get

func (c Comments) Get(row int) (*Comment, error)

Get returns the Comment found on the given row, or ErrNoMatch if there is none
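As a sketch of how the Comments API behaves: the Get lookup and the ErrNoMatch sentinel follow the signatures above, while the minimal String here is an illustrative stand-in that only recognizes "--" single-line comments (the real package uses its lexer, and whether rows are 1-indexed is an assumption made for this sketch):

```go
package main

import (
	"errors"
	"fmt"
	"strings"
)

// Comment mirrors the documented type: a comment's text and its row.
type Comment struct {
	Value string
	Row   int
}

// Comments is a collection of parsed comments.
type Comments []Comment

// ErrNoMatch mirrors the package's sentinel error.
var ErrNoMatch = errors.New("No matching comment")

// Get returns the comment found on the given row, or ErrNoMatch.
func (c Comments) Get(row int) (*Comment, error) {
	for i := range c {
		if c[i].Row == row {
			return &c[i], nil
		}
	}
	return nil, ErrNoMatch
}

// String is a simplified stand-in for the package's String that
// only extracts "--" single-line comments, row by row.
func String(str string) Comments {
	var out Comments
	for i, line := range strings.Split(str, "\n") {
		if idx := strings.Index(line, "--"); idx >= 0 {
			out = append(out, Comment{
				Value: strings.TrimSpace(line[idx+2:]),
				Row:   i + 1, // assumes rows are 1-indexed
			})
		}
	}
	return out
}

func main() {
	comments := String("local x = 1 -- the answer\nprint(x)")
	if c, err := comments.Get(1); err == nil {
		fmt.Println(c.Value)
	}
}
```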

type LexFn

type LexFn func(*Lexer) LexFn

LexFn is a lexer state function: it consumes some input, emits tokens, and returns the next state function to run

func LexBegin

func LexBegin(lexer *Lexer) LexFn

LexBegin is the initial lexer function

func LexCode

func LexCode(lexer *Lexer) LexFn

LexCode lexes arbitrary code between comments

func LexMultiLineComment

func LexMultiLineComment(lexer *Lexer) LexFn

LexMultiLineComment lexes a multi-line comment

func LexSingleLineComment

func LexSingleLineComment(lexer *Lexer) LexFn

LexSingleLineComment lexes a single-line comment

type Lexer

type Lexer struct {
	Name   string
	Input  string
	Tokens chan Token
	State  LexFn

	Start int
	Pos   int
	Width int
	Row   int
}

Lexer holds the state of the lexer: the input being scanned, the token output channel, the current state function, and the scan position

func BeginLexing

func BeginLexing(name, input string) *Lexer

BeginLexing returns a new lexer

func (*Lexer) Backup

func (l *Lexer) Backup()

Backup moves the lexer back over the last read rune

func (*Lexer) Dec

func (l *Lexer) Dec()

Dec decrements the position

func (*Lexer) Emit

func (l *Lexer) Emit(tokenType TokenType)

Emit puts a token onto the token channel. The token's value is read from the input based on the current lexer position.

func (*Lexer) Ignore

func (l *Lexer) Ignore()

Ignore discards the currently parsed data

func (*Lexer) Inc

func (l *Lexer) Inc()

Inc increments the position

func (*Lexer) InputToEnd

func (l *Lexer) InputToEnd() string

InputToEnd returns a slice of the input from the current lexer position to the end of the input string.

func (*Lexer) IsEOF

func (l *Lexer) IsEOF() bool

IsEOF checks whether the end of the input has been reached

func (*Lexer) Next

func (l *Lexer) Next() rune

Next reads the next rune (character) from the input stream and advances the lexer position.
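Next, Peek, and Backup follow the usual rune-reading contract: Next decodes one rune and advances Pos by that rune's byte width, Peek looks one rune ahead without moving, and Backup rewinds by the width of the last rune read. A self-contained sketch of that contract (field names echo the Lexer struct above, and the eof value mirrors the EOF constant; the package's actual implementation may differ in detail):

```go
package main

import (
	"fmt"
	"unicode/utf8"
)

const eof rune = 0 // mirrors the package's EOF constant

type lexer struct {
	input string
	pos   int
	width int // byte width of the last rune read, used by backup
}

// next decodes the rune at the current position and advances past it.
func (l *lexer) next() rune {
	if l.pos >= len(l.input) {
		l.width = 0
		return eof
	}
	r, w := utf8.DecodeRuneInString(l.input[l.pos:])
	l.width = w
	l.pos += w
	return r
}

// backup steps back over the last rune read by next.
func (l *lexer) backup() {
	l.pos -= l.width
}

// peek returns the next rune without consuming it.
func (l *lexer) peek() rune {
	r := l.next()
	l.backup()
	return r
}

func main() {
	l := &lexer{input: "héllo"}
	fmt.Println(string(l.peek())) // peek does not move the position
	l.next()                      // consume 'h'
	fmt.Println(string(l.next())) // 'é' is 2 bytes, so pos advances by 2
	l.backup()
	fmt.Println(string(l.next())) // 'é' again after backup
}
```

Tracking Width is what makes Backup safe for multi-byte UTF-8 runes: stepping back one byte would land mid-rune.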

func (*Lexer) NextToken

func (l *Lexer) NextToken() Token

NextToken returns the next token from the channel

func (*Lexer) Peek

func (l *Lexer) Peek() rune

Peek returns the next rune without advancing the lexer

func (*Lexer) Run

func (l *Lexer) Run()

Run the parser

func (*Lexer) Shutdown

func (l *Lexer) Shutdown()

Shutdown the parser

func (*Lexer) SkipWhitespace

func (l *Lexer) SkipWhitespace()

SkipWhitespace skips over whitespace

type Token

type Token struct {
	Type  TokenType
	Value string
	Row   int
}

Token is a parsed token

type TokenType

type TokenType int

TokenType identifies the type of a parsed token

const (
	// TError defines a parse error
	TError TokenType = iota

	// TEOF is End of File
	TEOF

	// TSLComment is --
	TSLComment

	// TRMLComment is --[[
	TRMLComment

	// TLMLComment is --]]
	TLMLComment

	// TNewline is \n or \r
	TNewline

	// TComment is the comment
	TComment
)
