go-gpt-3-encoder
Go BPE tokenizer (Encoder+Decoder) for GPT2 and GPT3.
About
GPT-2 and GPT-3 use byte pair encoding (BPE) to turn text into a sequence of integer tokens to feed into the model. This is a Go implementation of OpenAI's original Python encoder/decoder, which can be found here.
This code was inspired by a JavaScript implementation and was partially generated by OpenAI itself!
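The core BPE idea can be sketched in a few lines of Go: starting from individual characters, a learned list of merge rules repeatedly fuses adjacent symbol pairs into larger units. The word, merge list, and helper below are toy examples for illustration only; the real GPT-2 tokenizer applies roughly 50,000 learned merges at the byte level.

```go
package main

import "fmt"

// mergePair replaces every adjacent occurrence of (a, b) in syms with the
// merged symbol a+b. This is the core step of byte pair encoding.
func mergePair(syms []string, a, b string) []string {
	var out []string
	for i := 0; i < len(syms); i++ {
		if i+1 < len(syms) && syms[i] == a && syms[i+1] == b {
			out = append(out, a+b)
			i++ // skip the second symbol of the merged pair
		} else {
			out = append(out, syms[i])
		}
	}
	return out
}

func main() {
	// Start from the individual characters of a word.
	syms := []string{"l", "o", "w", "e", "r"}
	// Apply a toy merge list (hypothetical; real merges are learned from data).
	merges := [][2]string{{"l", "o"}, {"lo", "w"}, {"e", "r"}}
	for _, m := range merges {
		syms = mergePair(syms, m[0], m[1])
	}
	fmt.Println(syms) // [low er]
}
```

Each surviving symbol would then be mapped to an integer ID via a vocabulary table, which is what `Encode` returns.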
Install
go get github.com/danielswe88/go-gpt-3-encoder
Usage
package main

import (
	"fmt"
	"log"

	tokenizer "github.com/danielswe88/go-gpt-3-encoder"
)

func main() {
	encoder, err := tokenizer.NewEncoder()
	if err != nil {
		log.Fatal(err)
	}

	str := "This is an example sentence to try encoding out on!"
	encoded, err := encoder.Encode(str)
	if err != nil {
		log.Fatal(err)
	}

	fmt.Printf("String contains %d tokens\n", len(encoded))
	fmt.Println("We can look at each token and what it represents:")
	for _, token := range encoded {
		fmt.Printf("%d -- %s\n", token, encoder.Decode([]int{token}))
	}

	decoded := encoder.Decode(encoded)
	fmt.Printf("We can decode it back into: %s\n", decoded)
}
Contribute
Some corner cases are not yet covered by this library; see the @TODO comments in the tests. Pull requests are welcome.