tokiponatokens

package
v1.9.0
Published: Feb 24, 2024 License: CC0-1.0 Imports: 5 Imported by: 0

Documentation

Overview

Package tokiponatokens is a wrapper around a Toki Pona tokenizer. I have an instance set up here: https://us-central1-golden-cove-408.cloudfunctions.net/function-1

Index

Constants

const (
	// Who/what the sentence is addressed to in Parts.
	PartAddress      = `address`
	PartSubject      = `subject`
	PartObjectMarker = `objectMarker`
	PartVerbMarker   = `verbMarker`
	PartPrepPhrase   = `prepPhrase`
	PartInterjection = `interjection`
	// A foreign name.
	PartCartouche = `cartouche`
	// Most sentences will end in this.
	PartPunctuation = `punctuation`
)

Individual part type values.
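
As a hedged illustration, the sketch below walks a Sentence and collects the tokens of every subject part by comparing Part.Type against these constants. The import path and the hand-built sentence are placeholders, not taken from the package itself.

package main

import (
	"fmt"

	tpt "example.com/tokiponatokens" // placeholder import path; substitute the real module path
)

// subjects returns the token lists of every subject part in a sentence.
func subjects(s tpt.Sentence) [][]string {
	var out [][]string
	for _, p := range s {
		if p.Type == tpt.PartSubject {
			out = append(out, p.Tokens)
		}
	}
	return out
}

func main() {
	// A hand-built sentence for illustration; real sentences come from Tokenize.
	s := tpt.Sentence{
		{Type: tpt.PartSubject, Tokens: []string{"jan", "Ken"}},
		{Type: tpt.PartVerbMarker, Tokens: []string{"li", "toki"}},
		{Type: tpt.PartPunctuation, Tokens: []string{"period"}},
	}
	fmt.Println(subjects(s)) // [[jan Ken]]
}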

const (
	PunctPeriod      = `period`
	PunctQuestion    = `question`
	PunctExclamation = `exclamation`
	PunctComma       = `comma`
)

Punctuation constants.

Variables

This section is empty.

Functions

This section is empty.

Types

type Part

type Part struct {
	Type   string   `json:"part"`
	Sep    *string  `json:"sep"`
	Tokens []string `json:"tokens"`
	Parts  []*Part  `json:"parts"`
}

Part is an individual part of a sentence.
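The JSON tags above suggest the wire shape the tokenizer returns. As a sketch, the snippet below unmarshals a small hand-written payload into a Part to show how the fields map; the payload and import path are illustrative assumptions, not captured from the live service.

package main

import (
	"encoding/json"
	"fmt"

	tpt "example.com/tokiponatokens" // placeholder import path; substitute the real module path
)

func main() {
	// Hand-written payload matching the struct tags above.
	raw := []byte(`{"part": "subject", "sep": null, "tokens": ["mi"], "parts": null}`)

	var p tpt.Part
	if err := json.Unmarshal(raw, &p); err != nil {
		panic(err)
	}
	fmt.Println(p.Type, p.Tokens) // subject [mi]
}
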

func (Part) Braces

func (p Part) Braces() string

type Sentence

type Sentence []Part

Sentence is a series of sentence parts. One Sentence corresponds to one Toki Pona sentence.

func Tokenize

func Tokenize(aurl, text string) ([]Sentence, error)

Tokenize sends text to the tokenizer at aurl and returns the tokenized Toki Pona sentences.
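
A minimal usage sketch, assuming the hosted instance from the overview is reachable; the sample sentence is arbitrary and the import path is a placeholder.

package main

import (
	"fmt"
	"log"

	tpt "example.com/tokiponatokens" // placeholder import path; substitute the real module path
)

func main() {
	const aurl = "https://us-central1-golden-cove-408.cloudfunctions.net/function-1"

	sentences, err := tpt.Tokenize(aurl, "mi olin e sina.")
	if err != nil {
		log.Fatal(err)
	}

	// Print each part's type and tokens, sentence by sentence.
	for _, s := range sentences {
		for _, p := range s {
			fmt.Printf("%s: %v\n", p.Type, p.Tokens)
		}
	}
}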
