lexer

package
v0.0.0-...-cf91f57 Latest
Warning

This package is not in the latest version of its module.

Published: Mar 8, 2022 License: GPL-3.0 Imports: 7 Imported by: 0

Documentation

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

func DescribeToken

func DescribeToken(token *Token) string

DescribeToken returns a human-readable token description, including contextual information where applicable, such as the register number.

func DescribeTokenKind

func DescribeTokenKind(kind TokenKind) string

DescribeTokenKind returns a human-readable description of a TokenKind.

func GetTokenOpOpcode

func GetTokenOpOpcode(opKind TokenKind) arch.Opcode

GetTokenOpOpcode returns the arch.Opcode for the given TokenKind.

func IsTokenDirective

func IsTokenDirective(kind TokenKind) bool

IsTokenDirective returns whether the token is in the Directive category.

func IsTokenIdentifier

func IsTokenIdentifier(kind TokenKind) bool

IsTokenIdentifier returns whether the token is in the Identifier category.

func IsTokenImm

func IsTokenImm(kind TokenKind) bool

IsTokenImm returns whether the token is in the Imm category.

func IsTokenOp

func IsTokenOp(kind TokenKind) bool

IsTokenOp returns whether the token is in the Op category.

Types

type Token

type Token struct {
	Kind   TokenKind
	Value  string
	Row    int
	Column int
}

Token describes a lexeme within an input.

type TokenKind

type TokenKind int
const (
	ADD TokenKind
	ADDI
	SUB
	SUBI
	AND
	OR
	XOR
	LSL
	LSR
	CMP
	CMPI
	LDREG
	LDWORD
	LDHWRD
	LDBYTE
	STREG
	STWORD
	STHWRD
	STBYTE
	ADR
	MOV
	MOVZ
	MOVK
	B
	BREG
	BLR
	B_EQ
	B_NEQ
	B_LT
	B_LE
	B_GT
	B_GE
	BL
	PUSH
	POP
	SYSCALL
	HALT
	NOOP

	REGISTER
	IDENTIFIER

	BASE_8_IMM
	BASE_10_IMM
	BASE_16_IMM

	LABEL_DECLARATION
	SECTION
	FILL_STATEMENT
	STRING_STATEMENT
	ADDRESS_OF

	STRING
	LABEL
	COMMA
	LBRACKET
	RBRACKET
	COMMENT
	LINE_END
)
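The category predicates above (IsTokenOp, IsTokenImm, and so on) plausibly test whether a kind falls within a contiguous run of these iota-ordered constants. A self-contained sketch of that technique, with a reduced constant set (the helper names and exact range boundaries here are assumptions, not the package's actual implementation):

```go
package main

import "fmt"

// TokenKind mirrors the package's iota-ordered constant style.
type TokenKind int

const (
	ADD TokenKind = iota // first op kind in this sketch
	SUB
	CMP // last op kind in this sketch

	BASE_8_IMM // first immediate kind
	BASE_10_IMM
	BASE_16_IMM // last immediate kind
)

// isOp checks membership in the contiguous op range.
func isOp(k TokenKind) bool { return k >= ADD && k <= CMP }

// isImm checks membership in the contiguous immediate range.
func isImm(k TokenKind) bool { return k >= BASE_8_IMM && k <= BASE_16_IMM }

func main() {
	fmt.Println(isOp(SUB), isOp(BASE_10_IMM))   // true false
	fmt.Println(isImm(BASE_16_IMM), isImm(CMP)) // true false
}
```

A range check like this stays correct only while the constants in each category remain contiguous, which is why the declaration order in the const block matters.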

type TokenStream

type TokenStream struct {
	// contains filtered or unexported fields
}

TokenStream describes a stream of lexer Tokens.

func NewTokenStream

func NewTokenStream(tokens []*Token) *TokenStream

NewTokenStream returns a TokenStream encapsulating tokens.

tokens should not be modified while TokenStream is in use.

func (*TokenStream) HasNext

func (ts *TokenStream) HasNext() bool

HasNext returns whether the stream has more tokens.

func (*TokenStream) Jump

func (ts *TokenStream) Jump(pos int)

Jump moves the stream to an arbitrary position, typically one previously obtained from Pos.

func (*TokenStream) Peek

func (ts *TokenStream) Peek() *Token

Peek returns the next Token without removing it.

func (*TokenStream) Pop

func (ts *TokenStream) Pop() *Token

Pop removes and returns the next Token, advancing the stream.

func (*TokenStream) Pos

func (ts *TokenStream) Pos() int

Pos returns the current position in the TokenStream.

func (*TokenStream) Remaining

func (ts *TokenStream) Remaining() int

Remaining returns how many tokens remain in the stream.
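A stream with this API can be sketched as a token slice plus a cursor; Pos and Jump together give a parser cheap backtracking. This illustrative, self-contained version (not the package's actual internals) shows how the methods interact:

```go
package main

import "fmt"

// Token is a trimmed-down stand-in for lexer.Token.
type Token struct {
	Value string
}

// TokenStream holds the tokens and a cursor position.
type TokenStream struct {
	tokens []*Token
	pos    int
}

func NewTokenStream(tokens []*Token) *TokenStream {
	return &TokenStream{tokens: tokens}
}

// HasNext reports whether any tokens remain.
func (ts *TokenStream) HasNext() bool { return ts.pos < len(ts.tokens) }

// Peek returns the next token without advancing the cursor.
func (ts *TokenStream) Peek() *Token { return ts.tokens[ts.pos] }

// Pop returns the next token and advances the cursor.
func (ts *TokenStream) Pop() *Token {
	t := ts.tokens[ts.pos]
	ts.pos++
	return t
}

// Pos returns the cursor; Jump restores it, enabling backtracking.
func (ts *TokenStream) Pos() int       { return ts.pos }
func (ts *TokenStream) Jump(pos int)   { ts.pos = pos }
func (ts *TokenStream) Remaining() int { return len(ts.tokens) - ts.pos }

func main() {
	ts := NewTokenStream([]*Token{{Value: "ADD"}, {Value: "r1"}, {Value: "r2"}})
	mark := ts.Pos() // save a backtrack point
	fmt.Println(ts.Pop().Value, ts.Remaining())
	ts.Jump(mark) // rewind to the saved point
	fmt.Println(ts.Peek().Value, ts.Remaining())
}
```

Note that, as with the real API, this sketch assumes the caller checks HasNext before Peek or Pop; calling them on an exhausted stream would panic on an out-of-range index.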

type Tokenizer

type Tokenizer struct {
	// contains filtered or unexported fields
}

func New

func New(input []byte) (*Tokenizer, error)

New returns a Tokenizer over a byte slice.

func (*Tokenizer) Next

func (t *Tokenizer) Next() (*Token, error)

Next extracts the next token from the input buffer. It returns the token and any error encountered; once the input is exhausted, it returns io.EOF.

