jlexer

package v1.1.0 (not the latest version of its module)
Published: Apr 23, 2024 License: MIT Imports: 12 Imported by: 0

Documentation

Overview

Package jlexer contains a JSON lexer implementation.

It is expected that it is mostly used with generated parser code, so the interface is tuned for a parser that knows what kind of data is expected.
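To illustrate that tuning, the sketch below shows the shape of code a parser generator typically emits against this lexer: the decoder states what it expects at every position (Delim, WantColon, String, Int, ...). To keep the sketch self-contained and runnable, jsonLexer is a local interface listing only the Lexer methods used, and fakeLexer replays a fixed token stream; in real use these calls would go to a *jlexer.Lexer positioned over raw JSON bytes.

```go
package main

import "fmt"

// jsonLexer lists just the Lexer methods this sketch uses,
// with signatures taken from the documentation below.
type jsonLexer interface {
	IsDelim(c byte) bool
	Delim(c byte)
	UnsafeFieldName(skipUnescape bool) string
	WantColon()
	WantComma()
	String() string
	Int() int
	SkipRecursive()
}

type user struct {
	Name string
	Age  int
}

// decodeUser follows the shape of a generated unmarshal method:
// it already knows which kind of value each field holds.
func decodeUser(in jsonLexer, out *user) {
	in.Delim('{')
	for !in.IsDelim('}') {
		key := in.UnsafeFieldName(false)
		in.WantColon()
		switch key {
		case "name":
			out.Name = in.String()
		case "age":
			out.Age = in.Int()
		default:
			in.SkipRecursive() // unknown field: skip its entire value
		}
		in.WantComma()
	}
	in.Delim('}')
}

// fakeLexer replays a pre-tokenized stream: delimiters as bytes,
// field names and string values as strings, numbers as ints.
type fakeLexer struct {
	toks []any
	pos  int
}

func (f *fakeLexer) next() any                     { t := f.toks[f.pos]; f.pos++; return t }
func (f *fakeLexer) IsDelim(c byte) bool           { b, ok := f.toks[f.pos].(byte); return ok && b == c }
func (f *fakeLexer) Delim(c byte)                  { f.next() }
func (f *fakeLexer) UnsafeFieldName(_ bool) string { return f.next().(string) }
func (f *fakeLexer) WantColon()                    {}
func (f *fakeLexer) WantComma()                    {}
func (f *fakeLexer) String() string                { return f.next().(string) }
func (f *fakeLexer) Int() int                      { return f.next().(int) }
func (f *fakeLexer) SkipRecursive()                { f.next() }

func main() {
	// Token stream for {"name":"ada","age":36}
	in := &fakeLexer{toks: []any{byte('{'), "name", "ada", "age", 36, byte('}')}}
	var u user
	decodeUser(in, &u)
	fmt.Println(u.Name, u.Age) // ada 36
}
```

Because the decoder drives the lexer rather than inspecting a generic token stream, no intermediate token values need to be allocated.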

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type FieldState added in v1.0.1

type FieldState int
const (
	FieldMissing FieldState = iota
	FieldNull
	FieldPresent
	FieldDuplicate
)

type FieldsTracker added in v1.0.1

type FieldsTracker map[string]FieldState

func (FieldsTracker) Add added in v1.0.1

func (f FieldsTracker) Add(field string)

func (FieldsTracker) AddNull added in v1.0.1

func (f FieldsTracker) AddNull(field string)

func (FieldsTracker) GetState added in v1.0.1

func (f FieldsTracker) GetState(field string) FieldState

type Lexer

type Lexer struct {
	Data []byte // Input data given to the lexer.

	UseMultipleErrors            bool // Collect multiple errors instead of stopping at the first one.
	VerboseErrorsMode            bool // Include additional information in each collected error.
	RootObjectValidationCritical bool // Stop the parsing process entirely if the root object is invalid.
	// contains filtered or unexported fields
}

Lexer is a JSON lexer: it iterates over JSON tokens in a byte slice.

func (*Lexer) AddError

func (r *Lexer) AddError(e error)

func (*Lexer) AddNonFatalError

func (r *Lexer) AddNonFatalError(e error)

func (*Lexer) Bool

func (r *Lexer) Bool() bool

Bool reads a true or false boolean keyword.

func (*Lexer) Bytes

func (r *Lexer) Bytes() []byte

Bytes reads a string literal and base64 decodes it into a byte slice.
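This matches the convention of the standard library, where encoding/json marshals a []byte as a base64 string. A minimal stdlib sketch of the same decode step, assuming standard (padded) base64 encoding:

```go
package main

import (
	"encoding/base64"
	"fmt"
)

// decodeJSONBytes mirrors what Lexer.Bytes does conceptually:
// take the contents of a JSON string literal and base64-decode it.
func decodeJSONBytes(literal string) ([]byte, error) {
	return base64.StdEncoding.DecodeString(literal)
}

func main() {
	// "aGVsbG8=" is how encoding/json would marshal []byte("hello").
	b, err := decodeJSONBytes("aGVsbG8=")
	fmt.Println(string(b), err) // hello <nil>
}
```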

func (*Lexer) Consumed

func (r *Lexer) Consumed()

Consumed reads all remaining bytes from the input, publishing an error if there is anything but whitespace remaining.

func (*Lexer) CurrentToken

func (r *Lexer) CurrentToken() TokenKind

CurrentToken returns the current token kind if there were no errors, and TokenUndef otherwise.

func (*Lexer) Delim

func (r *Lexer) Delim(c byte)

Delim consumes a token and verifies that it is the given delimiter.

func (*Lexer) Error

func (r *Lexer) Error() error

func (*Lexer) FetchToken

func (r *Lexer) FetchToken()

FetchToken scans the input for the next token.

func (*Lexer) Float32

func (r *Lexer) Float32() float32

func (*Lexer) Float32Str

func (r *Lexer) Float32Str() float32

func (*Lexer) Float64

func (r *Lexer) Float64() float64

func (*Lexer) Float64Str

func (r *Lexer) Float64Str() float64

func (*Lexer) GetNonFatalErrors

func (r *Lexer) GetNonFatalErrors() []*LexerError

func (*Lexer) GetPos

func (r *Lexer) GetPos() int

func (*Lexer) Int

func (r *Lexer) Int() int

func (*Lexer) Int16

func (r *Lexer) Int16() int16

func (*Lexer) Int16Str

func (r *Lexer) Int16Str() int16

func (*Lexer) Int32

func (r *Lexer) Int32() int32

func (*Lexer) Int32Str

func (r *Lexer) Int32Str() int32

func (*Lexer) Int64

func (r *Lexer) Int64() int64

func (*Lexer) Int64Str

func (r *Lexer) Int64Str() int64

func (*Lexer) Int8

func (r *Lexer) Int8() int8

func (*Lexer) Int8Str

func (r *Lexer) Int8Str() int8

func (*Lexer) IntStr

func (r *Lexer) IntStr() int

func (*Lexer) Interface

func (r *Lexer) Interface() interface{}

Interface fetches an interface{} analogous to the 'encoding/json' package.

func (*Lexer) IsBool added in v1.0.4

func (r *Lexer) IsBool() bool

IsBool returns true if the next token is a boolean.

func (*Lexer) IsDelim

func (r *Lexer) IsDelim(c byte) bool

IsDelim returns true if there was no scanning error and the next token is the given delimiter.

func (*Lexer) IsNull

func (r *Lexer) IsNull() bool

IsNull returns true if the next token is a null keyword.

func (*Lexer) IsNumber added in v1.0.2

func (r *Lexer) IsNumber() bool

IsNumber returns true if the next token is a number.

func (*Lexer) IsStart

func (r *Lexer) IsStart() bool

IsStart returns whether the lexer is positioned at the start of an input string.

func (*Lexer) IsString added in v1.0.2

func (r *Lexer) IsString() bool

IsString returns true if the next token is a string.

func (*Lexer) JsonNumber

func (r *Lexer) JsonNumber() json.Number

JsonNumber fetches a json.Number from the 'encoding/json' package. Int, float, and string tokens are all valid values.

func (*Lexer) LastKey added in v1.0.3

func (r *Lexer) LastKey() string

LastKey returns the last traversed key in the object.

func (*Lexer) Null

func (r *Lexer) Null()

Null verifies that the next token is null and consumes it.

func (*Lexer) Ok

func (r *Lexer) Ok() bool

Ok returns true if no error (including io.EOF) was encountered during scanning.

func (*Lexer) Raw

func (r *Lexer) Raw() []byte

Raw fetches the next item recursively as a raw byte slice of the input.

func (*Lexer) Skip

func (r *Lexer) Skip()

Skip skips a single token.

func (*Lexer) SkipRecursive

func (r *Lexer) SkipRecursive()

SkipRecursive skips next array or object completely, or just skips a single token if not an array/object.

Note: no syntax validation is performed on the skipped data.
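A skip like this can be done by counting matching delimiters. The sketch below is an assumed, simplified version of the idea (not this package's actual implementation): it tracks string literals so braces inside strings do not affect the depth, but performs no other syntax validation, matching the documented behavior.

```go
package main

import "fmt"

// skipValue returns the index just past the JSON array or object
// starting at data[i], by counting matching delimiters.
func skipValue(data []byte, i int) int {
	depth := 0
	inString := false
	for ; i < len(data); i++ {
		c := data[i]
		switch {
		case inString:
			if c == '\\' {
				i++ // skip the escaped character
			} else if c == '"' {
				inString = false
			}
		case c == '"':
			inString = true
		case c == '{' || c == '[':
			depth++
		case c == '}' || c == ']':
			depth--
			if depth == 0 {
				return i + 1
			}
		}
	}
	return i
}

func main() {
	data := []byte(`{"a": {"b": "}"}, "c": 1} trailing`)
	end := skipValue(data, 0)
	fmt.Println(string(data[:end])) // {"a": {"b": "}"}, "c": 1}
}
```

Note how the `"}"` inside the string literal is correctly ignored while the nested object's real delimiters are counted.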

func (*Lexer) String

func (r *Lexer) String() string

String reads a string literal.

func (*Lexer) StringIntern

func (r *Lexer) StringIntern() string

StringIntern reads a string literal, and performs string interning on it.
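Interning pays off when the same strings (typically object keys) recur many times in the input: each repeat can reuse one canonical allocation instead of creating a new string. A sketch of the idea using a cache map; the package's actual caching strategy may differ.

```go
package main

import "fmt"

// interner caches canonical copies of strings.
type interner map[string]string

// intern returns a canonical copy of b's contents, so repeated
// occurrences of the same key share one string allocation.
// (Go avoids allocating for the string(b) conversion used as a
// map index, so cache hits are allocation-free.)
func (in interner) intern(b []byte) string {
	if s, ok := in[string(b)]; ok {
		return s
	}
	s := string(b)
	in[s] = s
	return s
}

func main() {
	in := interner{}
	a := in.intern([]byte("user_id"))
	b := in.intern([]byte("user_id"))
	fmt.Println(a == b) // true: same contents, single cached copy
}
```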

func (*Lexer) Uint

func (r *Lexer) Uint() uint

func (*Lexer) Uint16

func (r *Lexer) Uint16() uint16

func (*Lexer) Uint16Str

func (r *Lexer) Uint16Str() uint16

func (*Lexer) Uint32

func (r *Lexer) Uint32() uint32

func (*Lexer) Uint32Str

func (r *Lexer) Uint32Str() uint32

func (*Lexer) Uint64

func (r *Lexer) Uint64() uint64

func (*Lexer) Uint64Str

func (r *Lexer) Uint64Str() uint64

func (*Lexer) Uint8

func (r *Lexer) Uint8() uint8

func (*Lexer) Uint8Str

func (r *Lexer) Uint8Str() uint8

func (*Lexer) UintStr

func (r *Lexer) UintStr() uint

func (*Lexer) UintptrStr

func (r *Lexer) UintptrStr() uintptr

func (*Lexer) UnsafeBytes

func (r *Lexer) UnsafeBytes() []byte

UnsafeBytes returns the byte slice if the token is a string literal.

func (*Lexer) UnsafeFieldName

func (r *Lexer) UnsafeFieldName(skipUnescape bool) string

UnsafeFieldName returns the current member name string token.

func (*Lexer) UnsafeString

func (r *Lexer) UnsafeString() string

UnsafeString returns the string value if the token is a string literal.

Warning: returned string may point to the input buffer, so the string should not outlive the input buffer. Intended pattern of usage is as an argument to a switch statement.
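The following runnable sketch shows why the warning matters, by building an aliased string the way such zero-copy APIs do (assumption: UnsafeString aliases the buffer in this manner): mutating or recycling the buffer silently changes the string.

```go
package main

import (
	"fmt"
	"unsafe"
)

// aliasString builds a string header that points directly at b's
// bytes, with no copy -- the kind of value a zero-copy string
// accessor returns.
func aliasString(b []byte) string {
	return unsafe.String(unsafe.SliceData(b), len(b))
}

func main() {
	buf := []byte("hello")
	s := aliasString(buf)
	fmt.Println(s) // hello
	buf[0] = 'j'   // mutating the buffer changes the aliased string too
	fmt.Println(s) // jello
}
```

This is why the intended usage is a short-lived comparison, such as the operand of a switch statement, rather than storing the string in a decoded result.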

func (*Lexer) WantColon

func (r *Lexer) WantColon()

WantColon requires a colon to be present before fetching next token.

func (*Lexer) WantComma

func (r *Lexer) WantComma()

WantComma requires a comma to be present before fetching next token.

type LexerError

type LexerError struct {
	Reason string
	Offset int
	Data   string

	// Fields that are filled in when verbose mode is enabled.
	Key              string
	IsUnsupportedKey bool
	IsInvalidValue   bool
	IsMissingKey     bool
	IsDuplicateKey   bool
	Flags            uint64
}

LexerError implements the error interface and represents all possible errors that can occur while parsing JSON data.

func (*LexerError) Error

func (l *LexerError) Error() string

func (*LexerError) VerboseError

func (l *LexerError) VerboseError() string

type TokenKind

type TokenKind byte

TokenKind determines type of a token.

const (
	TokenUndef  TokenKind = iota // No token.
	TokenDelim                   // Delimiter: one of '{', '}', '[' or ']'.
	TokenString                  // A string literal, e.g. "abc\u1234"
	TokenNumber                  // Number literal, e.g. 1.5e5
	TokenBool                    // Boolean literal: true or false.
	TokenNull                    // null keyword.
)
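A convenient property of the JSON grammar is that every token kind is recognizable from its first byte, which is how a lexer can classify a token before consuming it. A sketch of such a classifier (an illustration, not this package's actual scanner):

```go
package main

import "fmt"

type TokenKind byte

const (
	TokenUndef TokenKind = iota
	TokenDelim
	TokenString
	TokenNumber
	TokenBool
	TokenNull
)

// kindOf classifies a JSON token by its first byte.
func kindOf(c byte) TokenKind {
	switch {
	case c == '{' || c == '}' || c == '[' || c == ']':
		return TokenDelim
	case c == '"':
		return TokenString
	case c == '-' || (c >= '0' && c <= '9'):
		return TokenNumber
	case c == 't' || c == 'f': // true / false
		return TokenBool
	case c == 'n': // null
		return TokenNull
	}
	return TokenUndef
}

func main() {
	fmt.Println(kindOf('{') == TokenDelim)  // true
	fmt.Println(kindOf('"') == TokenString) // true
	fmt.Println(kindOf('7') == TokenNumber) // true
}
```

This single-byte dispatch is what makes methods like IsBool, IsNumber, and IsString cheap to call before committing to a read.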
