neural

package module
v0.0.0-...-41a7f24
Published: May 22, 2016 License: MIT Imports: 6 Imported by: 3

README

Neural

Package neural implements a multi-layer neural network with an arbitrary number of layers. The goal is to provide a fast and flexible solution for building both regular MLPs and the more sophisticated structures used for deep learning.

Training is done concurrently as much as possible to utilize all CPU power.

Documentation


Index

Constants

This section is empty.

Variables

This section is empty.

Functions

func CalculateCorrectness

func CalculateCorrectness(nn Evaluator, cost Cost, samples []TrainExample) (avgCost float64, errors float64)

CalculateCorrectness evaluates the neural network across test samples to give the average cost and error rate.
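To illustrate what an "error rate" over test samples might mean for a classifier, here is a self-contained sketch that counts predictions whose largest output does not match the largest desired output. The helper names (`argmax`, `errorRate`) are illustrative assumptions, not this package's actual implementation:

```go
package main

import "fmt"

// argmax returns the index of the largest value - the predicted class.
func argmax(xs []float64) int {
	best := 0
	for i, x := range xs {
		if x > xs[best] {
			best = i
		}
	}
	return best
}

// errorRate counts the fraction of outputs whose argmax differs from
// the desired output's argmax - one plausible notion of "error".
func errorRate(outputs, desired [][]float64) float64 {
	var errs float64
	for i := range outputs {
		if argmax(outputs[i]) != argmax(desired[i]) {
			errs++
		}
	}
	return errs / float64(len(outputs))
}

func main() {
	outputs := [][]float64{{0.9, 0.1}, {0.3, 0.7}}
	desired := [][]float64{{1, 0}, {1, 0}}
	fmt.Println(errorRate(outputs, desired)) // 0.5
}
```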

func Load

func Load(nn Evaluator, r io.Reader) error

Load uses a reader to restore previously persisted data into a configured network. The network has to have the correct shape when loading the data.

func Save

func Save(nn Evaluator, w io.Writer) error

Save persists a trained network into a writer.

func Train

func Train(network Evaluator, trainExamples []TrainExample, options TrainOptions)

Train executes the training algorithm using the provided Trainers (built with TrainerFactory). Training happens in randomized batches where samples are processed concurrently.

Types

type Activator

type Activator interface {
	Activation(dst, potentials []float64)
	Derivative(dst, potentials []float64)
}

Activator calculates a neuron's activation and its derivative from the given potential.

func NewLinearActivator

func NewLinearActivator(a float64) Activator

NewLinearActivator creates an Activator that applies a linear function to the given potential.

Activation: a*potential + 0

Derivative: a

func NewRectActivator

func NewRectActivator() Activator

NewRectActivator creates an Activator that returns 0 for a non-positive potential and otherwise returns the potential. It is the rectified linear (ReLU) function.

Activation: 0 for potential < 0, potential otherwise

Derivative: 0 for potential < 0, 1 otherwise
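The activation and derivative above can be written directly. The sketch below mirrors the `Activation`/`Derivative` slice signatures of the Activator interface, but is an independent illustration rather than the package's source:

```go
package main

import "fmt"

// reluActivation writes max(0, p) for each potential into dst.
func reluActivation(dst, potentials []float64) {
	for i, p := range potentials {
		if p < 0 {
			dst[i] = 0
		} else {
			dst[i] = p
		}
	}
}

// reluDerivative writes 0 for negative potentials and 1 otherwise.
func reluDerivative(dst, potentials []float64) {
	for i, p := range potentials {
		if p < 0 {
			dst[i] = 0
		} else {
			dst[i] = 1
		}
	}
}

func main() {
	dst := make([]float64, 3)
	reluActivation(dst, []float64{-2, 0, 3})
	fmt.Println(dst) // [0 0 3]
	reluDerivative(dst, []float64{-2, 0, 3})
	fmt.Println(dst) // [0 1 1]
}
```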

func NewSigmoidActivator

func NewSigmoidActivator() Activator

NewSigmoidActivator creates an Activator that applies the sigmoid (logistic) function to the given potential.

Activation: 1/(1+exp(-potential))

Derivative: f(potential) * (1 - f(potential))
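Both formulas are easy to verify in isolation; the derivative identity means the sigmoid's output can be reused instead of recomputing the exponential. A minimal sketch (function names are illustrative, not the package's):

```go
package main

import (
	"fmt"
	"math"
)

// sigmoid computes 1/(1+exp(-p)).
func sigmoid(p float64) float64 { return 1 / (1 + math.Exp(-p)) }

// sigmoidDerivative uses the identity f'(p) = f(p) * (1 - f(p)),
// reusing the activation value rather than recomputing exp.
func sigmoidDerivative(p float64) float64 {
	f := sigmoid(p)
	return f * (1 - f)
}

func main() {
	fmt.Println(sigmoid(0))           // 0.5
	fmt.Println(sigmoidDerivative(0)) // 0.25
}
```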

func NewSoftmaxActivator

func NewSoftmaxActivator() Activator

NewSoftmaxActivator creates an Activator that scales the responses in a layer to between 0 and 1.

The responses in a layer sum to 1, so they can be interpreted as probabilities. This activator should be used in the last layer together with the log-likelihood cost. Derivative is not implemented, as it should not be needed; if called, it will panic.

func NewStepActivator

func NewStepActivator() Activator

NewStepActivator creates an Activator that returns only 0 or 1.

Activation: 1 if potential >= 0 else 0

Derivative: 1 (note: the mathematical derivative of the step function is 0 almost everywhere and undefined at 0)

func NewTanhActivator

func NewTanhActivator() Activator

NewTanhActivator creates an Activator that returns values between -1 and 1. It is very similar in shape to the sigmoid function.

Activation: tanh(potential)

Derivative: 1 - tanh(potential)^2

type Cost

type Cost interface {
	Cost(output, desired []float64) float64
}

The Cost interface represents a way of calculating the neural network cost. The Cost method should calculate the cost of a single example. It does not account for normalization; the normalization factor of 1/n should be applied further on.

type CostCostDerrivative

type CostCostDerrivative interface {
	Cost
	CostDerivative
}

CostCostDerrivative represents both a way of calculating the neural network cost and its derivative (delta).

func NewCrossEntropyCost

func NewCrossEntropyCost() CostCostDerrivative

NewCrossEntropyCost creates a cross-entropy cost function. Compared with the quadratic cost, its derivative is not affected by the activation function's derivative. That means the learning process is faster and avoids saturation of the sigmoid function. It should be used together with the sigmoid activation function in the last layer; if used with a different activator, CostDerivative is no longer correct.
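For a single example, the cross-entropy cost is C = -Σ [y·ln(a) + (1-y)·ln(1-a)] over the output neurons, with no 1/n normalization, matching the Cost contract described above. A minimal sketch, assuming the standard formula (the function name is illustrative, not the package's):

```go
package main

import (
	"fmt"
	"math"
)

// crossEntropyCost computes -sum(y*ln(a) + (1-y)*ln(1-a)) over the
// output neurons for a single example; a is the network output and
// y the desired output. No 1/n normalization is applied.
func crossEntropyCost(output, desired []float64) float64 {
	var c float64
	for i, a := range output {
		y := desired[i]
		c -= y*math.Log(a) + (1-y)*math.Log(1-a)
	}
	return c
}

func main() {
	// A near-perfect prediction gives a near-zero cost.
	fmt.Printf("%.4f\n", crossEntropyCost([]float64{0.99}, []float64{1})) // about 0.01
}
```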

func NewLogLikelihoodCost

func NewLogLikelihoodCost() CostCostDerrivative

NewLogLikelihoodCost creates a log-likelihood cost function. Like the cross-entropy function, it is faster than the quadratic cost function, but it should be used with the Softmax activator in the last layer for the math to be correct.

func NewQuadraticCost

func NewQuadraticCost() CostCostDerrivative

NewQuadraticCost creates a quadratic cost function, also known as mean squared error or just MSE.
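A common convention for the per-example quadratic cost is C = ½·Σ (a_i - y_i)², with the ½ chosen so the derivative is simply (a - y); whether this package includes the ½ factor is not shown here. A sketch under that assumption:

```go
package main

import "fmt"

// quadraticCost computes 0.5 * sum((output_i - desired_i)^2) for a
// single example. The 1/2 factor is a common convention that makes
// the derivative simply (output - desired).
func quadraticCost(output, desired []float64) float64 {
	var c float64
	for i, o := range output {
		d := o - desired[i]
		c += d * d
	}
	return c / 2
}

func main() {
	fmt.Println(quadraticCost([]float64{1, 2}, []float64{0, 0})) // 2.5
}
```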

type CostDerivative

type CostDerivative interface {
	CostDerivative(dst, output, desired, potentials []float64, activator Activator)
}

The CostDerivative method should calculate the derivative of the cost of a single example.

type EpocheCallback

type EpocheCallback func(epoche int, dt time.Duration)

EpocheCallback gets called at the end of every epoch with information about the state of the training.

type Evaluator

type Evaluator interface {
	Evaluate(input []float64) []float64
	Layers() []Layer
}

Evaluator wraps the main task of a neural network: evaluating input data.

func NewNeuralNetwork

func NewNeuralNetwork(neurons []int, layersFactories ...LayerFactory) Evaluator

NewNeuralNetwork initializes a neural network structure from neuron counts and layer factories.

type Layer

type Layer interface {
	Forward(dst, input []float64)
	Backward(dst, delta []float64)
	SetWeights(weights [][]float64, biases []float64)
	UpdateWeights(weights [][]float64, biases []float64, regularization float64)
	Shapes() (weightsRow, weightsCol, biasesCol int)
	Activator() Activator
	SaverLoader
}

Layer represents a single layer in a neural network.

type LayerFactory

type LayerFactory func(inputs, neurons int) Layer

LayerFactory builds a Layer of a certain type; it is used to build a network.

func NewFullyConnectedLayer

func NewFullyConnectedLayer(activator Activator) LayerFactory

NewFullyConnectedLayer creates a new neural network layer with all neurons fully connected to the previous layer. More accurately, each neuron uses all input values to calculate its own output.
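The fully connected computation described above is a weighted sum of all inputs plus a per-neuron bias, before the activation function is applied. A self-contained sketch of that forward step (the `forward` name and slice shapes are illustrative assumptions, not this package's internals):

```go
package main

import "fmt"

// forward computes dst[j] = sum_i(weights[j][i]*input[i]) + biases[j]
// for each neuron j: every output uses every input value, which is
// what makes the layer "fully connected".
func forward(dst []float64, weights [][]float64, biases, input []float64) {
	for j, row := range weights {
		sum := biases[j]
		for i, w := range row {
			sum += w * input[i]
		}
		dst[j] = sum
	}
}

func main() {
	// One neuron with two inputs: 1*3 + 2*4 + bias 1 = 12.
	dst := make([]float64, 1)
	forward(dst, [][]float64{{1, 2}}, []float64{1}, []float64{3, 4})
	fmt.Println(dst[0]) // 12
}
```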

type SaverLoader

type SaverLoader interface {
	Save(w io.Writer) error
	Load(r io.Reader) error
}

SaverLoader defines persisting a network and loading previously persisted data.

type TrainExample

type TrainExample struct {
	Input  []float64
	Output []float64
}

TrainExample represents an input-output pair of signals to train on, or to verify the training.

type TrainOptions

type TrainOptions struct {
	Epochs         int
	MiniBatchSize  int
	LearningRate   float64
	Regularization float64 // L2 lambda value
	Momentum       float64
	TrainerFactory TrainerFactory
	EpocheCallback EpocheCallback
	Cost           CostDerivative
}

TrainOptions defines the different switches used to train a network.

type Trainer

type Trainer interface {
	Process(sample TrainExample, weightUpdates *WeightUpdates)
}

Trainer implements the calculation of weight adjustments (WeightUpdates) in the network. It operates on a single training example to prepare a fractional result.

func NewBackpropagationTrainer

func NewBackpropagationTrainer(network Evaluator, cost CostDerivative) Trainer

NewBackpropagationTrainer builds a new trainer that uses the backpropagation algorithm.

type TrainerFactory

type TrainerFactory func(network Evaluator, cost CostDerivative) Trainer

TrainerFactory builds Trainers. Multiple trainers will be created at the beginning of the training.

type WeightUpdates

type WeightUpdates struct {
	Biases  [][]float64
	Weights [][][]float64
}

WeightUpdates is a per-layer representation of how to adjust the weights of the network.

func NewWeightUpdates

func NewWeightUpdates(network Evaluator) WeightUpdates

NewWeightUpdates creates a WeightUpdates according to the structure of the network (the neurons in each layer).
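Allocating such per-layer buffers from neuron counts looks roughly like the following. This is a sketch of the shape logic only; the helper name and return types are illustrative assumptions, not the package's actual code:

```go
package main

import "fmt"

// newWeightUpdates allocates zeroed per-layer weight and bias update
// buffers for a network described by neuron counts: {2, 3, 1} means
// 2 inputs, a hidden layer of 3 neurons, and 1 output neuron.
func newWeightUpdates(neurons []int) (weights [][][]float64, biases [][]float64) {
	for l := 1; l < len(neurons); l++ {
		in, out := neurons[l-1], neurons[l]
		w := make([][]float64, out) // one row per neuron in this layer
		for j := range w {
			w[j] = make([]float64, in) // one weight per input
		}
		weights = append(weights, w)
		biases = append(biases, make([]float64, out))
	}
	return weights, biases
}

func main() {
	w, b := newWeightUpdates([]int{2, 3, 1})
	fmt.Println(len(w), len(w[0]), len(w[0][0]), len(b[1])) // 2 3 2 1
}
```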

func (*WeightUpdates) Zero

func (w *WeightUpdates) Zero()

Zero sets all weight values to 0.

