shlex: github.com/google/shlex

package shlex

import "github.com/google/shlex"

Package shlex implements a simple lexer which splits input into tokens using shell-style rules for quoting and commenting.

In the basic use case, the default ASCII lexer splits a string into substrings:

shlex.Split("one \"two three\" four") -> []string{"one", "two three", "four"}

To process a stream of strings:

	l := NewLexer(os.Stdin)
	for {
		token, err := l.Next()
		if err != nil {
			break // Next returns io.EOF once the input is exhausted
		}
		// process token
	}

To access the raw token stream (which includes tokens for comments):

	t := NewTokenizer(os.Stdin)
	for {
		token, err := t.Next()
		if err != nil {
			break // Next returns io.EOF once the input is exhausted
		}
		// process token
	}

Index

Package Files

shlex.go

func Split

func Split(s string) ([]string, error)

Split partitions a string into a slice of words, applying shell-style quoting and comment rules.

type Lexer

type Lexer Tokenizer

Lexer turns an input stream into a sequence of tokens. Whitespace and comments are skipped.

func NewLexer

func NewLexer(r io.Reader) *Lexer

NewLexer creates a new lexer from an input stream.

func (*Lexer) Next

func (l *Lexer) Next() (string, error)

Next returns the next word, or an error. If there are no more words, the error will be io.EOF.

type Token

type Token struct {
    // contains filtered or unexported fields
}

Token is a (type, value) pair representing a lexical token.

func (*Token) Equal

func (a *Token) Equal(b *Token) bool

Equal reports whether tokens a and b are equal. Two tokens are equal if both their types and values are equal. A nil token can never be equal to another token.

type TokenType

type TokenType int

TokenType is a top-level token classification: a word, space, comment, or unknown token.

const (
    UnknownToken TokenType = iota
    WordToken
    SpaceToken
    CommentToken
)

Classes of lexical token.

type Tokenizer

type Tokenizer struct {
    // contains filtered or unexported fields
}

Tokenizer turns an input stream into a sequence of typed tokens.

func NewTokenizer

func NewTokenizer(r io.Reader) *Tokenizer

NewTokenizer creates a new tokenizer from an input stream.

func (*Tokenizer) Next

func (t *Tokenizer) Next() (*Token, error)

Next returns the next token in the stream.

Package shlex imports 4 packages and is imported by 142 packages. Updated 2019-10-04.