deepmind

package module
v0.0.0-...-501b8d2
Published: Nov 12, 2022 License: MIT Imports: 13 Imported by: 0

README

deepmind

The deepmind package is a collection of basic machine learning tools built on top of the Gorgonia library. This repository was originally cloned from zltgo.

Documentation

Index

Constants

This section is empty.

Variables

var (
	ErrorEmptyInitializer = errors.New("the initializer string is empty")

	// cache all the initializer functions
	Initializers initializerMap
)
var (
	OneF32   = NewConstant(float32(1.0))
	OneF64   = NewConstant(float64(1.0))
	OneInt   = NewConstant(int(1))
	OneInt64 = NewConstant(int64(1))
	OneInt32 = NewConstant(int32(1))
)
var Activations activationMap

Functions

func AddBias

func AddBias(x, b *Node) (rv *Node, err error)

AddBias adds b to every sample of x; x is expected to have a batch dimension.
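
A minimal sketch of how AddBias might be used. The import path is hypothetical, and it assumes the package's *Node and *ExprGraph alias the Gorgonia types (the re-exported signatures suggest so); the sketches below reuse these aliases and this setup:

package main

import (
	"fmt"

	dm "example.com/deepmind" // hypothetical import path; substitute the real module path
	G "gorgonia.org/gorgonia"
	"gorgonia.org/tensor"
)

func main() {
	g := G.NewGraph()
	// x: batch of 4 samples with 3 features; b: one bias per feature.
	x := G.NewMatrix(g, tensor.Float64, G.WithShape(4, 3), G.WithName("x"), G.WithInit(G.Ones()))
	b := G.NewVector(g, tensor.Float64, G.WithShape(3), G.WithName("b"), G.WithInit(G.Ones()))

	// AddBias broadcasts b over every sample (row) of x.
	rv, err := dm.AddBias(x, b)
	if err != nil {
		panic(err)
	}

	vm := G.NewTapeMachine(g)
	defer vm.Close()
	if err := vm.RunAll(); err != nil {
		panic(err)
	}
	fmt.Println(rv.Value()) // every element is 1 + 1 = 2
}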

func AnyToF64

func AnyToF64(val interface{}) float64

func BinaryCrossEntropy

func BinaryCrossEntropy(output, target *Node) (*Node, error)

BinaryCrossEntropy is a convenience function for computing binary cross-entropy. It is the loss function of choice for two-class classification problems with sigmoid output units. The formula is as below: BCE(p, t) = -Mean{ t * log(p) + (1 - t) * log(1-p) }
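
A hedged sketch, reusing the aliases and graph g from the AddBias example; output holds sigmoid probabilities and target holds 0/1 labels:

// output and target are both shaped (batch, 1).
output := G.NewMatrix(g, tensor.Float64, G.WithShape(4, 1), G.WithName("p"), G.WithInit(G.Uniform(0.1, 0.9)))
target := G.NewMatrix(g, tensor.Float64, G.WithShape(4, 1), G.WithName("t"), G.WithInit(G.Zeroes()))

// BCE(p, t) = -Mean{ t*log(p) + (1-t)*log(1-p) }
bce, err := dm.BinaryCrossEntropy(output, target)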

func CloneSelf

func CloneSelf(axis int, x *Node, n int) (rv *Node, err error)

CloneSelf concatenates x with itself n times along the provided axis.

func CrossEntropy

func CrossEntropy(output, target *Node) (*Node, error)

CrossEntropy computes categorical cross-entropy. It is the loss function of choice for multi-class classification problems with softmax output units. The formula is as below: CCE(p, t) = -Mean{ t * log(p) }

func DefaultGetInitWFn

func DefaultGetInitWFn(master, slaver string) (InitWFn, error)

func DtypeOf

func DtypeOf(x *Node) (tensor.Dtype, error)

func F64ToAny

func F64ToAny(v float64, dt tensor.Dtype) interface{}

func F64ToSlice

func F64ToSlice(f64 []float64, dt tensor.Dtype) interface{}

func Fxwb

func Fxwb(act Activation, x, w, b *Node) (*Node, error)

Fxwb performs f(x*w + b).
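
For example (same alias and setup assumptions as above), building a sigmoid Activation with NewActivation and passing it to Fxwb computes sigmoid(x*w + b):

// Wrap Gorgonia's sigmoid op in the package's Activation interface.
sigmoid := dm.NewActivation("sigmoid", func(n *G.Node) (*G.Node, error) {
	return G.Sigmoid(n)
})

w := G.NewMatrix(g, tensor.Float64, G.WithShape(3, 2), G.WithName("w"), G.WithInit(G.GlorotU(1)))
bias := G.NewVector(g, tensor.Float64, G.WithShape(2), G.WithName("bias"), G.WithInit(G.Zeroes()))

// out = sigmoid(x*w + bias)
out, err := dm.Fxwb(sigmoid, x, w, bias)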

func GetBackingF64

func GetBackingF64(n *Node) []float64

func GetInitWFn

func GetInitWFn(str string) (InitWFn, error)

GetInitWFn parses an InitWFn from a string such as "Gaussian(0, 0.08)".
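
A short sketch; the string form follows the doc above, and the returned InitWFn plugs into node construction as usual:

initFn, err := dm.GetInitWFn("Gaussian(0, 0.08)")
if err != nil {
	panic(err)
}
// Use it like any Gorgonia InitWFn when constructing a weight node.
w2 := G.NewMatrix(g, tensor.Float64, G.WithShape(3, 3), G.WithName("w2"), G.WithInit(initFn))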

func Linear

func Linear(x *Node) (*Node, error)

Linear applies no activation; it returns x unchanged.

func Losses

func Losses(outputs, targets Nodes, f LossFunc) (cost *Node, err error)

Losses accumulates the loss f(output, target) over every output/target pair.
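
A hedged one-liner, assuming outputs and targets are equal-length Nodes paired elementwise:

// cost accumulates f(outputs[i], targets[i]) over all pairs.
cost, err := dm.Losses(outputs, targets, dm.CrossEntropy)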

func MeanSquared

func MeanSquared(output, target *Node) (*Node, error)

MeanSquared computes the mean squared error. The formula is as below:

MSE(y, y') = Mean{ (y - y')^2 }

func NodeFromMap

func NodeFromMap(g *ExprGraph, vs map[string][]float64, dt tensor.Dtype, s tensor.Shape, name string) (*Node, error)

If the shape s is nil, a scalar node is created.

func OneHotCE

func OneHotCE(output *Node, targetId int) (*Node, error)

OneHotCE computes categorical cross-entropy for the one-hot case, where the target is a single class index (targetId).

func OneHotCEBatch

func OneHotCEBatch(output *Node, targetIds []int) (cost *Node, err error)

OneHotCEBatch computes categorical cross-entropy for a batch; len(targetIds) must equal the batch size of output.
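
A sketch (same assumed setup); each entry of targetIds is the true class index for one sample:

// probs: softmax outputs shaped (batch=3, classes=5).
probs := G.NewMatrix(g, tensor.Float64, G.WithShape(3, 5), G.WithName("probs"), G.WithInit(G.Uniform(0.01, 0.99)))
cost, err := dm.OneHotCEBatch(probs, []int{2, 0, 4}) // len(targetIds) == batch size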

func OneSub

func OneSub(x *Node) (*Node, error)

OneSub performs 1 - x.

func ReshapeToMatrix

func ReshapeToMatrix(x *Node) (*Node, error)

If x has more than two dimensions, it is reshaped to a matrix. ReshapeToMatrix is needed because training mode has a batch dimension, unlike production (inference) mode.

func WithBacking

func WithBacking(f64 []float64) NodeConsOpt

The length of the backing slice may be longer than node.TotalSize().

Types

type Activation

type Activation interface {
	Activate(x *Node) (*Node, error)
	Name() string
}

func NewActivation

func NewActivation(name string, fn func(x *Node) (*Node, error)) Activation

NewActivation creates an Activation from an activation function.
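
For instance, a ReLU Activation can be built from Gorgonia's Rectify op (a sketch under the same alias assumptions as above):

relu := dm.NewActivation("relu", func(n *G.Node) (*G.Node, error) {
	return G.Rectify(n) // Gorgonia's built-in ReLU
})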

type FC

type FC struct {
	// contains filtered or unexported fields
}

FC is a fully connected layer that performs activate(x*w + b).

func (*FC) Forward

func (l *FC) Forward(x *Node, states States) (rv *Node, err error)

Forward computes activate(x*w + b); if x has more than two dimensions, it is reshaped to a matrix first.

func (*FC) Init

func (l *FC) Init(g *ExprGraph, dt tensor.Dtype, vs map[string][]float64) error

If vs is nil, the initializer indicated in the Options will be used.

func (*FC) Learnables

func (l *FC) Learnables() Nodes

Learnables must be called after Init.

func (*FC) Name

func (l *FC) Name() string

func (*FC) Options

func (l *FC) Options() interface{}

type FCOpts

type FCOpts struct {
	InputSize  int
	OutputSize int
	// Sigmoid, for example; see "active.go" for more activations.
	// Activation is optional; the default is Linear.
	Activation string

	// Gaussian(0.0, 0.08), for example; see "initializer.go" for more initializers.
	// Initializer is optional; the default is Uniform(-1, 1).
	Initializer string

	// Dropout is the probability of dropout: it randomly zeroes out elements of a
	// *Tensor using draws from a uniform distribution. Only float32 and float64
	// are supported. Optional; the default of zero means no dropout.
	Dropout float64
}
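
Putting FC together, a hedged sketch; the option values come from the comments above, and a nil vs in Init means the Initializer string is used:

fc, err := dm.NewFC("fc1", dm.FCOpts{
	InputSize:   3,
	OutputSize:  2,
	Activation:  "Sigmoid",             // see "active.go"
	Initializer: "Gaussian(0.0, 0.08)", // see "initializer.go"
})
if err != nil {
	panic(err)
}
if err := fc.Init(g, tensor.Float64, nil); err != nil {
	panic(err)
}
// FC keeps no recurrent state, but Forward still takes a States map.
out, err := fc.Forward(x, make(dm.States))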

type GRU

type GRU struct {
	// contains filtered or unexported fields
}

Gated Recurrent Unit

func (*GRU) Forward

func (l *GRU) Forward(x *Node, states States) (*Node, error)

func (*GRU) Init

func (l *GRU) Init(g *ExprGraph, dt tensor.Dtype, vs map[string][]float64) error

If vs is nil, the initializer indicated in the Options will be used.

func (*GRU) Learnables

func (l *GRU) Learnables() Nodes

func (*GRU) Name

func (l *GRU) Name() string

func (*GRU) Options

func (l *GRU) Options() interface{}

type GRUOpts

type GRUOpts struct {
	InputSize  int
	HiddenSize int
	// Sigmoid, for example; see "active.go" for more activations.
	// Activation is optional; the default is Tanh.
	Activation string

	// Gaussian(0.0, 0.08), for example; see "initializer.go" for more initializers.
	// The initializers are optional; the default is Uniform(-1, 1).
	InitWh string
	InitWr string
	InitWu string

	// Dropout is the probability of dropout: it randomly zeroes out elements of a
	// *Tensor using draws from a uniform distribution. Only float32 and float64
	// are supported. Optional; the default of zero means no dropout.
	Dropout float64
}

type Initializer

type Initializer func(opts []float64) (InitWFn, error)

type JsonSaver

type JsonSaver struct {
	reflectx.Reflector
	// contains filtered or unexported fields
}

func NewJsonSaver

func NewJsonSaver(dir string) JsonSaver

func (JsonSaver) Load

func (j JsonSaver) Load() (m *Model, err error)

func (JsonSaver) LoadData

func (j JsonSaver) LoadData() (map[string][]float64, error)

func (JsonSaver) LoadGraph

func (j JsonSaver) LoadGraph() (*Model, error)

func (JsonSaver) Save

func (j JsonSaver) Save(m *Model) error
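
A round-trip sketch, assuming model is an initialized *Model (see the Model type below):

saver := dm.NewJsonSaver("./checkpoints") // directory for the JSON files
if err := saver.Save(model); err != nil {
	panic(err)
}
restored, err := saver.Load()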

type LSTM

type LSTM struct {
	// contains filtered or unexported fields
}

Long Short Term Memory

func (*LSTM) Forward

func (l *LSTM) Forward(x *Node, states States) (*Node, error)

func (*LSTM) Init

func (l *LSTM) Init(g *ExprGraph, dt tensor.Dtype, vs map[string][]float64) error

If vs is nil, the initializer indicated in the Options will be used.

func (*LSTM) Learnables

func (l *LSTM) Learnables() Nodes

func (*LSTM) Name

func (l *LSTM) Name() string

func (*LSTM) Options

func (l *LSTM) Options() interface{}

type LSTMOpts

type LSTMOpts struct {
	InputSize  int
	HiddenSize int
	// Sigmoid, for example; see "active.go" for more activations.
	// Activation is optional; the default is Tanh.
	Activation string

	// Gaussian(0.0, 0.08), for example; see "initializer.go" for more initializers.
	// The initializers are optional; the default is Uniform(-1, 1).
	InitWf string
	InitWi string
	InitWo string
	InitWc string

	// Dropout is the probability of dropout: it randomly zeroes out elements of a
	// *Tensor using draws from a uniform distribution. Only float32 and float64
	// are supported. Optional; the default of zero means no dropout.
	Dropout float64
}

type Layer

type Layer interface {
	Forward(x *Node, states States) (rv *Node, err error)
	Learnables() Nodes

	// If vs is nil, the initializer indicated in the Options will be used.
	Init(g *ExprGraph, dt tensor.Dtype, vs map[string][]float64) error

	// Get the name of the layer
	Name() string
	// Get the options of the layer
	Options() interface{}
}

Layer is a set of neurons and a corresponding activation.

func NewFC

func NewFC(name string, opts FCOpts) (Layer, error)

func NewGRU

func NewGRU(name string, opts GRUOpts) (Layer, error)

func NewLSTM

func NewLSTM(name string, opts LSTMOpts) (Layer, error)

func NewLayers

func NewLayers(cfg LayerOpts) ([]Layer, error)

NewLayers creates layers from a LayerOpts.

func NewRNN

func NewRNN(name string, opts RNNOpts) (Layer, error)

type LayerOpts

type LayerOpts struct {
	Names []string
	Opts  map[string]interface{}
}
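
A hedged sketch of NewLayers. That Opts is keyed by layer name and holds each layer's options struct is an assumption inferred from the field shapes, not confirmed by the docs:

layers, err := dm.NewLayers(dm.LayerOpts{
	Names: []string{"fc1", "fc2"},
	Opts: map[string]interface{}{
		"fc1": dm.FCOpts{InputSize: 3, OutputSize: 4, Activation: "Sigmoid"}, // assumed keying
		"fc2": dm.FCOpts{InputSize: 4, OutputSize: 2},
	},
})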

type LossFunc

type LossFunc func(output, target *Node) (cost *Node, err error)

type Model

type Model struct {
	Layers []Layer

	// InitData is initialization data loaded from a Saver.
	InitData map[string][]float64
	// contains filtered or unexported fields
}

Model combines a group of layers.

func NewModel

func NewModel(layers ...Layer) *Model

func (*Model) Forward

func (m *Model) Forward(x *Node, states States) (rv *Node, err error)

states must be empty at the beginning; it stores the hidden states of the layers as necessary.

func (*Model) GetNode

func (m *Model) GetNode(name string) *Node

GetNode gets a learnable node by name.

func (*Model) Init

func (m *Model) Init(g *ExprGraph, dt tensor.Dtype) error

Init initializes the weights and biases at the beginning.

func (*Model) Learnables

func (m *Model) Learnables() Nodes

Learnables gets all learnable nodes.

func (*Model) LearnablesGrad

func (m *Model) LearnablesGrad() []ValueGrad

func (*Model) StepForward

func (m *Model) StepForward(ns Nodes) (rv Nodes, err error)

StepForward runs one forward step per input node; len(ns) equals the number of steps.
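
Tying Model together, a hedged end-to-end sketch (same alias and import-path assumptions; fc and x come from the earlier sketches):

model := dm.NewModel(fc)
if err := model.Init(g, tensor.Float64); err != nil {
	panic(err)
}
out, err := model.Forward(x, make(dm.States)) // states must start empty
if err != nil {
	panic(err)
}
// y: placeholder one-hot targets shaped like out (4, 2).
y := G.NewMatrix(g, tensor.Float64, G.WithShape(4, 2), G.WithName("y"), G.WithInit(G.Zeroes()))
cost, err := dm.CrossEntropy(out, y)
if err != nil {
	panic(err)
}
// Differentiate as usual in Gorgonia; learnables come from the model.
_, err = G.Grad(cost, model.Learnables()...)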

type RNN

type RNN struct {
	// contains filtered or unexported fields
}

Basic Recurrent Neural Network

func (*RNN) Forward

func (l *RNN) Forward(x *Node, states States) (rv *Node, err error)

Forward computes h(t+1) = activate([x, h(t)]*w + b); if x has more than two dimensions, it is reshaped first.

func (*RNN) Init

func (l *RNN) Init(g *ExprGraph, dt tensor.Dtype, vs map[string][]float64) error

If vs is nil, the initializer indicated in the Options will be used.

func (*RNN) Learnables

func (l *RNN) Learnables() Nodes

Learnables must be called after Init.

func (*RNN) Name

func (l *RNN) Name() string

func (*RNN) Options

func (l *RNN) Options() interface{}

type RNNOpts

type RNNOpts struct {
	InputSize  int
	HiddenSize int
	// Sigmoid, for example; see "active.go" for more activations.
	// Activation is optional; the default is Tanh.
	Activation string

	// Gaussian(0.0, 0.08), for example; see "initializer.go" for more initializers.
	// Initializer is optional; the default is Uniform(-1, 1).
	Initializer string

	// Dropout is the probability of dropout: it randomly zeroes out elements of a
	// *Tensor using draws from a uniform distribution. Only float32 and float64
	// are supported. Optional; the default of zero means no dropout.
	Dropout float64
}

type Saver

type Saver interface {
	Load() (*Model, error)
	LoadGraph() (*Model, error)
	LoadData() (map[string][]float64, error)
	Save(m *Model) error
}

type States

type States map[string]*Node

func (States) Get

func (s States) Get(name string) *Node

func (States) Len

func (s States) Len() int

func (States) Update

func (s States) Update(n *Node)
