drago

package module
v0.0.0-...-995f28a Latest
Published: Jun 21, 2016 License: MIT Imports: 5 Imported by: 0

README

drago


A simple feed-forward neural network implementation. Some utility functions still need to be added, and the logic could be cleaned up in places, but the algorithms are implemented and the package is usable.

Usage:

// One activator per hidden layer; topology is {inputs, hidden..., outputs}
acts := []drago.Activator{new(drago.Sigmoid), new(drago.Sigmoid)}
net := drago.New(0.1, 25, []int{2, 2, 2, 1}, acts)
net.Learn([][][]float64{
    {{0, 0}, {1}},
    {{0, 1}, {0}},
    {{1, 1}, {0}},
})

// Predict a value
fmt.Println(net.Predict([]float64{1, 1}))

To add an activation function:

An activation function needs both the function itself and its derivative. See Sigmoid.go, Tanh.go, and ReLU.go for examples.

type YourActivationFunction struct {
}

func (y *YourActivationFunction) Apply(r, c int, val float64) float64 {
    // ...
}

func (y *YourActivationFunction) Derivative(r, c int, val float64) float64 {
    // ...
}
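As a concrete sketch of this pattern, here is a hypothetical Leaky ReLU activator. It is not part of drago; the type name and Alpha field are invented for illustration. As with the package's built-in activators, the row and column arguments are ignored.

```go
package main

import "fmt"

// LeakyReLU is a hypothetical activator, shown only to illustrate
// the Activator pattern; it is not included in drago.
type LeakyReLU struct {
	Alpha float64 // slope applied to negative inputs
}

// Apply returns val for positive inputs and Alpha*val otherwise.
func (l *LeakyReLU) Apply(r, c int, val float64) float64 {
	if val > 0 {
		return val
	}
	return l.Alpha * val
}

// Derivative returns 1 for positive inputs and Alpha otherwise.
func (l *LeakyReLU) Derivative(r, c int, val float64) float64 {
	if val > 0 {
		return 1
	}
	return l.Alpha
}

func main() {
	act := &LeakyReLU{Alpha: 0.5}
	fmt.Println(act.Apply(0, 0, -2.0))     // -1
	fmt.Println(act.Derivative(0, 0, 3.0)) // 1
}
```

Because both methods use pointer receivers, a `*LeakyReLU` (e.g. from `new`) satisfies the Activator interface, matching the `new(drago.Sigmoid)` usage above.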

Documentation

Overview

Package drago provides an implementation of a feed-forward neural network

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type Activator

type Activator interface {
	// Apply calculates the activation of a given layer given
	// the previous layer's activations
	Apply(int, int, float64) float64
	// Derivative is the calculation used to update weights during backprop
	Derivative(int, int, float64) float64
}

Activator represents the activation function for a given layer

type Criterion

type Criterion interface {
	// Apply calculates the error given the predicted (first) and
	// actual (second) labels
	Apply(*mat64.Dense, *mat64.Dense) float64
	// Derivative calculates the derivative of the error function (Apply)
	// given the predicted (first) and actual (second) labels
	Derivative(*mat64.Dense, *mat64.Dense) *mat64.Dense
}

Criterion calculates error in predicted value for a given sample
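The arithmetic a mean-squared-error Criterion performs can be sketched with plain slices. The real interface operates on *mat64.Dense values; the slice-based functions below are only an illustration, and the 2/n scaling in the gradient is a common convention rather than the package's confirmed choice.

```go
package main

import "fmt"

// mse computes mean squared error between predicted and actual values,
// the quantity an MSE Criterion's Apply reports. Plain slices stand in
// for *mat64.Dense purely for illustration.
func mse(pred, actual []float64) float64 {
	var sum float64
	for i := range pred {
		d := pred[i] - actual[i]
		sum += d * d
	}
	return sum / float64(len(pred))
}

// mseDerivative returns the elementwise gradient of the error with
// respect to each prediction: 2*(pred-actual)/n under the convention
// assumed here.
func mseDerivative(pred, actual []float64) []float64 {
	grad := make([]float64, len(pred))
	for i := range pred {
		grad[i] = 2 * (pred[i] - actual[i]) / float64(len(pred))
	}
	return grad
}

func main() {
	fmt.Println(mse([]float64{1, 2}, []float64{0, 0}))           // (1+4)/2 = 2.5
	fmt.Println(mseDerivative([]float64{1, 2}, []float64{0, 0})) // [1 2]
}
```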

type Linear

type Linear struct {
}

Linear represents linear activation function

func (*Linear) Apply

func (l *Linear) Apply(r, c int, value float64) float64

Apply calculates activation for layer given previous layer value

func (*Linear) Derivative

func (l *Linear) Derivative(r, c int, value float64) float64

Derivative returns linear derivative (always 1)

type MSE

type MSE struct {
}

MSE implements Mean Squared Error calculations for Criterion interface

func (*MSE) Apply

func (m *MSE) Apply(prediction, actual *mat64.Dense) float64

Apply calculates the MSE given predicted and actual labels

func (*MSE) Derivative

func (m *MSE) Derivative(prediction, actual *mat64.Dense) *mat64.Dense

Derivative calculates derivative of MSE

type Network

type Network struct {
	Activators   []Activator
	Activations  []*mat64.Dense
	Weights      []*mat64.Dense
	Errors       []*mat64.Dense
	Topology     []int
	Layers       int
	LearningRate float64
	Iterations   int
	Loss         Criterion

	// Controls logging output during training.
	Verbose bool
	// contains filtered or unexported fields
}

Network represents the neural network. Fields are exported only for ease of examining network state and should not be modified during training.

func New

func New(learnRate float64, iterations int, topology []int, acts []Activator) *Network

New creates a new neural network. Topology specifies the number of hidden layers and the nodes in each, as well as the sizes of samples and labels (first and last values, respectively). The acts slice should contain one Activator per hidden layer.
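The relationship between the topology slice and the activator count can be spelled out with a small helper. describeTopology is hypothetical, not part of drago; it only restates the rule above.

```go
package main

import "fmt"

// describeTopology is a hypothetical helper (not part of drago) that
// spells out what a topology slice implies: the first entry is the
// input size, the last is the output size, and every entry between
// them is a hidden layer, each needing one Activator.
func describeTopology(topology []int) (inputs, outputs, hidden int) {
	inputs = topology[0]
	outputs = topology[len(topology)-1]
	hidden = len(topology) - 2
	return
}

func main() {
	in, out, hid := describeTopology([]int{2, 2, 2, 1})
	fmt.Printf("inputs=%d outputs=%d hidden layers=%d (activators needed)\n", in, out, hid)
}
```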

func (*Network) Back

func (n *Network) Back(label []float64)

Back performs back propagation to update weights at each layer

func (*Network) Forward

func (n *Network) Forward(sample []float64)

Forward calculates activations at each layer for given sample

func (*Network) Learn

func (n *Network) Learn(dataset [][][]float64)

Learn trains the network on the provided dataset. Samples must have the number of features and labels specified by the topology given when constructing the network.
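The dataset layout implied by the [][][]float64 signature is a slice of (sample, label) pairs: element [i][0] is a feature vector and element [i][1] is the corresponding label vector. A sketch with made-up XOR-style values:

```go
package main

import "fmt"

func main() {
	// Each element pairs a sample with its label:
	// dataset[i][0] is the feature vector, dataset[i][1] the label vector.
	// The values here are illustrative; any data whose sample and label
	// lengths match the topology's first and last entries works.
	dataset := [][][]float64{
		{{0, 0}, {0}},
		{{0, 1}, {1}},
		{{1, 0}, {1}},
		{{1, 1}, {0}},
	}
	for _, pair := range dataset {
		fmt.Println("sample:", pair[0], "label:", pair[1])
	}
}
```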

func (*Network) Predict

func (n *Network) Predict(sample []float64) *mat64.Dense

Predict returns the predicted value for the provided sample. Dimensions must match the provided topology. Use only after training the network.

type ReLU

type ReLU struct {
}

ReLU struct represents a rectified linear unit

func (*ReLU) Apply

func (u *ReLU) Apply(r, c int, val float64) float64

Apply ReLU calculation to val

func (*ReLU) Derivative

func (u *ReLU) Derivative(r, c int, val float64) float64

Derivative calculates derivative of ReLU for val

type Sigmoid

type Sigmoid struct {
}

Sigmoid represents sigmoid activation function

func (*Sigmoid) Apply

func (s *Sigmoid) Apply(r, c int, value float64) float64

Apply calculates sigmoid of given value, r and c ignored

func (*Sigmoid) Derivative

func (s *Sigmoid) Derivative(r, c int, value float64) float64

Derivative calculates sigmoid derivative of given value, r and c ignored

type Tanh

type Tanh struct {
}

Tanh struct represents hyperbolic tangent activation function

func (*Tanh) Apply

func (t *Tanh) Apply(r, c int, val float64) float64

Apply calculates hyperbolic tangent of val, r and c ignored

func (*Tanh) Derivative

func (t *Tanh) Derivative(r, c int, val float64) float64

Derivative calculates derivative of hyperbolic tangent for val, r and c ignored
