lexer

package
v0.9.0
Published: Jan 21, 2024 License: Apache-2.0 Imports: 6 Imported by: 0

Documentation

Overview

Package lexer contains a TraceQL lexer.

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type Error

type Error struct {
	Msg string
	Pos scanner.Position
}

Error is a lexing error.

func (*Error) Error

func (e *Error) Error() string

Error implements error.
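
A short, illustrative sketch of surfacing a lexing error with its position follows. The import path, the failing input, and whether Tokenize returns the error as *Error directly or wrapped are assumptions:

package main

import (
	"errors"
	"fmt"

	// Assumed import path; substitute the real module path for this package.
	"example.com/oteldb/traceql/lexer"
)

func main() {
	// An input intended to fail lexing (illustrative).
	_, err := lexer.Tokenize(`{ "unterminated `, lexer.TokenizeOptions{Filename: "bad.traceql"})
	if err != nil {
		var lexErr *lexer.Error
		if errors.As(err, &lexErr) {
			// Pos is a text/scanner Position, so it prints as file:line:column.
			fmt.Printf("lex error at %s: %s\n", lexErr.Pos, lexErr.Msg)
			return
		}
		fmt.Println("error:", err)
	}
}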

type Token

type Token struct {
	Type TokenType
	Text string
	Pos  scanner.Position
}

Token is a TraceQL token.

func Tokenize

func Tokenize(s string, opts TokenizeOptions) ([]Token, error)

Tokenize scans the given string into TraceQL tokens.
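
A minimal usage sketch, assuming an import path and an illustrative query (neither is given on this page):

package main

import (
	"fmt"
	"log"

	// Assumed import path; substitute the real module path for this package.
	"example.com/oteldb/traceql/lexer"
)

func main() {
	// The query is an illustrative TraceQL-style expression.
	tokens, err := lexer.Tokenize(`{ name = "GET /api" && duration > 100ms }`, lexer.TokenizeOptions{})
	if err != nil {
		log.Fatalln("tokenize:", err)
	}
	for _, tok := range tokens {
		fmt.Printf("%v %q\n", tok.Type, tok.Text)
	}
}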

type TokenType

type TokenType int

TokenType defines the TraceQL token type.

const (
	Invalid TokenType = iota
	EOF
	Ident
	// Literals
	String
	Integer
	Number
	Duration

	Comma
	Dot
	OpenBrace
	CloseBrace
	OpenParen
	CloseParen
	Eq
	NotEq
	Re
	NotRe
	Gt
	Gte
	Lt
	Lte
	Add
	Sub
	Div
	Mod
	Mul
	Pow
	True
	False
	Nil
	StatusOk
	StatusError
	StatusUnset
	KindUnspecified
	KindInternal
	KindServer
	KindClient
	KindProducer
	KindConsumer
	And
	Or
	Not
	Pipe
	Desc
	Tilde
	SpanDuration
	ChildCount
	Name
	Status
	Kind
	RootName
	RootServiceName
	TraceDuration
	Parent
	Count
	Avg
	Max
	Min
	Sum
	By
	Coalesce
	Select
)

func (TokenType) String

func (i TokenType) String() string
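
Because TokenType provides String, token types print readably in diagnostics, and callers can dispatch on Token.Type directly. The helper below is hypothetical, not part of the package, with an assumed import path:

package traceqlutil

import (
	// Assumed import path; substitute the real module path for this package.
	"example.com/oteldb/traceql/lexer"
)

// identNames collects the text of every Ident token in a lexed query.
// It is shown only to illustrate dispatching on Token.Type.
func identNames(tokens []lexer.Token) []string {
	var names []string
	for _, tok := range tokens {
		if tok.Type == lexer.Ident {
			names = append(names, tok.Text)
		}
	}
	return names
}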

type TokenizeOptions

type TokenizeOptions struct {
	// Filename sets filename for the scanner.
	Filename string
}

TokenizeOptions defines options for Tokenize.
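
A short sketch of passing a Filename, which is presumably carried into the scanner positions attached to tokens; the file name, query, and import path are assumptions:

package main

import (
	"fmt"
	"log"

	// Assumed import path; substitute the real module path for this package.
	"example.com/oteldb/traceql/lexer"
)

func main() {
	opts := lexer.TokenizeOptions{Filename: "dashboard.traceql"}
	tokens, err := lexer.Tokenize(`{ duration > 1s }`, opts)
	if err != nil {
		log.Fatal(err)
	}
	if len(tokens) > 0 {
		// The position string is prefixed with the configured filename.
		fmt.Println(tokens[0].Pos)
	}
}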
