Documentation ¶
Overview ¶
Package Drago provides an implementation of a feedforward neural network
Index ¶
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
This section is empty.
Types ¶
type Activator ¶
type Activator interface {
	// Apply calculates the activation of a given layer
	// given the previous layer's activations
	Apply(int, int, float64) float64

	// Derivative is the calculation used to update weights during backprop
	Derivative(int, int, float64) float64
}
Activator represents the activation function for a given layer
type Criterion ¶
type Criterion interface {
	// Apply calculates the error given the predicted (first) and
	// actual (second) labels
	Apply(*mat64.Dense, *mat64.Dense) float64

	// Derivative calculates the derivative of the error function (Apply)
	// given the predicted (first) and actual (second) labels
	Derivative(*mat64.Dense, *mat64.Dense) *mat64.Dense
}
Criterion calculates error in predicted value for a given sample
type Linear ¶
type Linear struct{}
Linear represents the linear activation function
type MSE ¶
type MSE struct{}
MSE implements Mean Squared Error calculations for the Criterion interface
type Network ¶
type Network struct {
	Activators   []Activator
	Activations  []*mat64.Dense
	Weights      []*mat64.Dense
	Errors       []*mat64.Dense
	Topology     []int
	Layers       int
	LearningRate float64
	Iterations   int
	Loss         Criterion

	// Controls logging output during training.
	Verbose bool
	// contains filtered or unexported fields
}
Network represents the neural network. Its fields are exported only for ease of examining network state; they should not be modified during training.
func New ¶
New creates a new neural network. Topology specifies the number of hidden layers and the nodes in each, as well as the sizes of samples and labels (the first and last values, respectively). The Acts slice should contain one activator for each hidden layer.
type ReLU ¶
type ReLU struct{}
ReLU represents a rectified linear unit
type Sigmoid ¶
type Sigmoid struct{}
Sigmoid represents the sigmoid activation function