lexer

package v0.0.0-...-fa3d426

Published: Feb 16, 2016 License: MIT Imports: 5 Imported by: 0

Documentation

Overview

Package lexer provides a handlebars tokenizer.

Example

package lexer

import "fmt"

func Example() {
	source := "You know {{nothing}} John Snow"

	output := ""

	lex := Scan(source)
	for {
		// consume next token
		token := lex.NextToken()

		output += fmt.Sprintf(" %s", token)

		// stop when all tokens have been consumed, or on error
		if token.Kind == TokenEOF || token.Kind == TokenError {
			break
		}
	}

	fmt.Print(output)
}
Output:

Content{"You know "} Open{"{{"} ID{"nothing"} Close{"}}"} Content{" John Snow"} EOF

Constants

const (
	// Mustache delimiters
	ESCAPED_ESCAPED_OPEN_MUSTACHE  = "\\\\{{"
	ESCAPED_OPEN_MUSTACHE          = "\\{{"
	OPEN_MUSTACHE                  = "{{"
	CLOSE_MUSTACHE                 = "}}"
	CLOSE_STRIP_MUSTACHE           = "~}}"
	CLOSE_UNESCAPED_STRIP_MUSTACHE = "}~}}"
)

const (
	// Option to include the token position in its string representation
	DUMP_TOKEN_POS = false

	// Option to include values of all token kinds in their string representations
	DUMP_ALL_TOKENS_VAL = true
)
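
These delimiter constants are exported strings, so they can be matched directly against raw template text. The following is a minimal self-contained sketch; it mirrors two of the constants instead of importing the package, and the sample input is arbitrary:

package main

import (
	"fmt"
	"strings"
)

// Mirrored from the lexer package so the sketch is self-contained.
const (
	ESCAPED_OPEN_MUSTACHE = "\\{{" // a backslash followed by "{{"
	OPEN_MUSTACHE         = "{{"
)

func main() {
	input := `\{{escaped}} and {{plain}}`

	// The input starts with an escaped open mustache.
	fmt.Println(strings.HasPrefix(input, ESCAPED_OPEN_MUSTACHE)) // true

	// A plain open mustache appears in the text after the escape.
	rest := input[len(ESCAPED_OPEN_MUSTACHE):] // "escaped}} and {{plain}}"
	fmt.Println(strings.Contains(rest, OPEN_MUSTACHE)) // true
}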

Variables

This section is empty.

Functions

This section is empty.

Types

type Lexer

type Lexer struct {
	// contains filtered or unexported fields
}

Lexer is a lexical analyzer.

func Scan

func Scan(input string) *Lexer

Scan scans the given input.

Tokens can then be fetched sequentially using the NextToken() function on the returned lexer.

func (*Lexer) Line

func (l *Lexer) Line() int

Line returns the current line number.

func (*Lexer) NextToken

func (l *Lexer) NextToken() Token

NextToken returns the next scanned token.

func (*Lexer) Pos

func (l *Lexer) Pos() int

Pos returns the current byte position.
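
A small sketch combining Line and Pos, written from inside the package like the overview example. The helper name reportStop is hypothetical, and whether a given input actually produces a TokenError depends on the lexer's rules:

package lexer

import "fmt"

// reportStop drains all tokens and reports the lexer's current line
// and byte position once scanning stops.
func reportStop(input string) {
	lex := Scan(input)
	for {
		token := lex.NextToken()
		if token.Kind == TokenError {
			fmt.Printf("error at line %d, byte %d: %s\n", lex.Line(), lex.Pos(), token.Val)
			return
		}
		if token.Kind == TokenEOF {
			fmt.Printf("done at line %d, byte %d\n", lex.Line(), lex.Pos())
			return
		}
	}
}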

type Token

type Token struct {
	Kind TokenKind // Token kind
	Val  string    // Token value

	Pos  int // Byte position in input string
	Line int // Line number in input string
}

Token represents a scanned token.
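
Because Kind, Val, Pos and Line are exported, callers can inspect a token's fields directly instead of going through its String method. A minimal sketch, again written from inside the package, using a hypothetical helper dumpFields:

package lexer

import "fmt"

// dumpFields prints the fields of every scanned token.
func dumpFields(input string) {
	lex := Scan(input)
	for {
		token := lex.NextToken()
		fmt.Printf("kind=%s val=%q pos=%d line=%d\n", token.Kind, token.Val, token.Pos, token.Line)
		if token.Kind == TokenEOF || token.Kind == TokenError {
			break
		}
	}
}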

func Collect

func Collect(input string) []Token

Collect scans and collects all tokens.

This should be used for debugging purposes only. Use the Scan() and NextToken() functions instead.
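
For instance, a debugging dump with Collect could look like the following sketch (the template string is arbitrary, and dumpTokens is a hypothetical helper):

package lexer

import "fmt"

// dumpTokens prints every token of a template in one call.
func dumpTokens() {
	for _, token := range Collect("{{#if ok}}yes{{/if}}") {
		fmt.Println(token)
	}
}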

func (Token) String

func (t Token) String() string

String returns the token string representation for debugging.

type TokenKind

type TokenKind int

TokenKind represents a Token type.

const (
	TokenError TokenKind = iota
	TokenEOF

	// mustache delimiters
	TokenOpen             // OPEN
	TokenClose            // CLOSE
	TokenOpenRawBlock     // OPEN_RAW_BLOCK
	TokenCloseRawBlock    // CLOSE_RAW_BLOCK
	TokenOpenEndRawBlock  // END_RAW_BLOCK
	TokenOpenUnescaped    // OPEN_UNESCAPED
	TokenCloseUnescaped   // CLOSE_UNESCAPED
	TokenOpenBlock        // OPEN_BLOCK
	TokenOpenEndBlock     // OPEN_ENDBLOCK
	TokenInverse          // INVERSE
	TokenOpenInverse      // OPEN_INVERSE
	TokenOpenInverseChain // OPEN_INVERSE_CHAIN
	TokenOpenPartial      // OPEN_PARTIAL
	TokenComment          // COMMENT

	// inside mustaches
	TokenOpenSexpr        // OPEN_SEXPR
	TokenCloseSexpr       // CLOSE_SEXPR
	TokenEquals           // EQUALS
	TokenData             // DATA
	TokenSep              // SEP
	TokenOpenBlockParams  // OPEN_BLOCK_PARAMS
	TokenCloseBlockParams // CLOSE_BLOCK_PARAMS

	// tokens with content
	TokenContent // CONTENT
	TokenID      // ID
	TokenString  // STRING
	TokenNumber  // NUMBER
	TokenBoolean // BOOLEAN
)

func (TokenKind) String

func (k TokenKind) String() string

String returns the token kind string representation for debugging.
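
The comments on the TokenKind constants above suggest the names returned by String (e.g. TokenID prints as "ID"). A sketch, with the exact output being an assumption:

package lexer

import "fmt"

// printKinds prints readable names for a few token kinds.
func printKinds() {
	fmt.Println(TokenContent, TokenID, TokenEOF) // assumed output: "CONTENT ID EOF"
}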
