lexer

package
v0.0.0-...-129e209

Published: Oct 9, 2023 · License: ISC · Imports: 6 · Imported by: 0

Documentation

Overview

Package lexer implements a tokenizer (lexer) for Aurora Lyrics Format (ALF) source files.

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type Item

type Item struct {
	Token   Token
	Literal string
	Line    int // Starts from 0.
	Col     int // Starts from 0.
}

Item contains a token, its literal text and its location in the source file.

func (Item) String

func (i Item) String() string

type Lexer

type Lexer struct {
	// contains filtered or unexported fields
}

Lexer is a token generator for Aurora Lyrics Format (ALF) source files.

func New

func New(r io.Reader) (*Lexer, <-chan Item)

New creates and initializes a new `Lexer` and immediately begins scanning for tokens in a separate goroutine. The `r` argument must provide source code in ALF format.

The first value returned is a pointer to a `Lexer` structure, provided for monitoring purposes (it currently does nothing).

The second value returned is a read-only channel on which `Item` structures carrying the scanned tokens are sent.
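
A minimal usage sketch follows. It assumes the import path is a placeholder to be replaced with the real module path, that the lexer eventually emits a `TokenEOF` item, and that `Error` reports any failure once scanning has finished; the ALF snippet itself is only illustrative.

package main

import (
	"fmt"
	"log"
	"strings"

	lexer "example.com/aurora/lexer" // hypothetical import path; substitute the package's real module path
)

func main() {
	// A hypothetical ALF snippet; the exact syntax here is only illustrative.
	src := strings.NewReader("title: Example Song\n")

	l, items := lexer.New(src)

	// Items arrive on the channel while the lexer scans the input concurrently.
	for it := range items {
		fmt.Printf("%d:%d %v %q\n", it.Line, it.Col, it.Token, it.Literal)
		if it.Token == lexer.TokenEOF {
			break
		}
	}

	// Assumption: Error reports any failure encountered while scanning.
	if err := l.Error(); err != nil {
		log.Fatal(err)
	}
}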

func (*Lexer) Error

func (l *Lexer) Error() error

type Token

type Token int

Token is a unique identifier for each part of the Aurora Lyrics Format (ALF) grammar.

const (
	TokenError Token = iota
	TokenEOF
	TokenNewline
	TokenWhitespace
	TokenIndent
	TokenColon
	TokenComment
	TokenName
	TokenText
	TokenList
)

Aurora Lyrics Format (ALF) grammar tokens.
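
As an illustration of how a consumer might dispatch on these tokens, the sketch below assigns a plausible role to each kind; those roles are assumptions, not behavior documented by the package, and the import path is again a placeholder.

package parser // hypothetical consumer package

import (
	"fmt"

	lexer "example.com/aurora/lexer" // hypothetical import path; substitute the package's real module path
)

// handleItem shows one possible way to branch on the grammar tokens.
func handleItem(it lexer.Item) {
	switch it.Token {
	case lexer.TokenComment, lexer.TokenWhitespace:
		// A parser would typically skip these.
	case lexer.TokenName:
		fmt.Printf("field name %q at %d:%d\n", it.Literal, it.Line, it.Col)
	case lexer.TokenText:
		fmt.Printf("text %q\n", it.Literal)
	case lexer.TokenError:
		fmt.Printf("lex error at %d:%d: %s\n", it.Line, it.Col, it.Literal)
	}
}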

func (Token) String

func (i Token) String() string
