package layers

v1.0.0
Published: Feb 15, 2024 License: MIT Imports: 12 Imported by: 0

Documentation

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type HeInitialization

type HeInitialization struct{}

HeInitialization is a weight initialization technique that samples from a normal distribution whose standard deviation is derived from the previous layer's size.

Mainly used with the ReLU activation function to compensate for the zeros it produces on the (-inf, 0] range.

func (HeInitialization) Generate

func (h HeInitialization) Generate(layerSize [2]int) float64
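
A common formulation of He initialization draws each weight from a normal distribution with standard deviation sqrt(2 / fanIn). The sketch below is an assumption about what Generate computes, not code from the package; it uses the standard library's math and math/rand packages and assumes layerSize[0] holds the previous layer's size.

	// heWeight is a hypothetical sketch of the He formula.
	func heWeight(layerSize [2]int) float64 {
		fanIn := float64(layerSize[0]) // assumed: index 0 is the previous layer size
		return rand.NormFloat64() * math.Sqrt(2.0/fanIn)
	}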

type Layer

type Layer interface {
	fmt.Stringer
	json.Marshaler
	json.Unmarshaler

	InputSize() [2]int
	OutputSize() [2]int

	IsTraining() bool

	Weights() Matrix[float64]
	Bias() Matrix[float64]
	Activation() activation.ActivationFunction

	ForwardPropagate(X Matrix[float64]) (Y [2]Matrix[float64], err error)
	BackPropagate(
		nextLayerPropagation, input Matrix[float64],
		output [2]Matrix[float64],
		parameters utils.NeuralNetworkParameters,
	) Matrix[float64]
	// contains filtered or unexported methods
}

func NewDense

func NewDense(W, b Matrix[float64], a activation.ActivationFunction) (Layer, error)

NewDense produces a new fully-connected layer of neurons using given weights and biases.

Returns an error if the weight and bias sizes are non-conformable.
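
A hypothetical usage sketch follows. W, b, and X are assumed to be already-constructed Matrix[float64] values, and activation.ReLU is an assumed name that may differ in the actual activation package.

	// Hypothetical sketch, not package code.
	layer, err := NewDense(W, b, activation.ReLU{})
	if err != nil {
		// the sizes of W and b were non-conformable
		log.Fatal(err)
	}
	Y, err := layer.ForwardPropagate(X)
	if err != nil {
		log.Fatal(err)
	}
	_ = Y // the layer's two output matrices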

func NewDropout

func NewDropout(inputSize int, rate float64) Layer

NewDropout produces a dropout layer, which nullifies random neurons to reduce overfitting of the model.

Neurons are nullified at random with probability `rate`, so different neurons are deactivated in different samples.
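
The effect of rate can be pictured with a conceptual sketch (this is not the package's implementation): during training, each activation is zeroed out with probability rate.

	// applyDropout is an illustration only: each value is kept with
	// probability 1-rate and zeroed with probability rate (math/rand).
	func applyDropout(activations []float64, rate float64) {
		for i := range activations {
			if rand.Float64() < rate {
				activations[i] = 0
			}
		}
	}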

func NewRandomDense

func NewRandomDense(weightSize [2]int, a activation.ActivationFunction, wi WeightInitialization) Layer

NewRandomDense produces a dense layer using a given weight initialization method.

Beware that different weight initialization techniques are better suited for different activation functions, for example:

  • ReLU - HeInitialization
  • Sigmoid - XavierUniformInitialization

and so on.
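
A hypothetical usage sketch, pairing a ReLU layer with He initialization; activation.ReLU is an assumed name and may differ in the actual activation package.

	// Hypothetical: the interpretation of the two weightSize dimensions
	// follows the package's weight layout.
	layer := NewRandomDense([2]int{784, 128}, activation.ReLU{}, HeInitialization{})
	_ = layer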

type RandomInitialization

type RandomInitialization struct {
	Min float64
	Max float64
}

RandomInitialization generates a number uniformly at random in the given range.

If the limits are not set, Generate produces zeros.

weight := r.Min + rand.Float64()*(r.Max-r.Min)

func (RandomInitialization) Generate

func (r RandomInitialization) Generate(layerSize [2]int) float64
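
For example, with illustrative limits:

	r := RandomInitialization{Min: -0.05, Max: 0.05}
	w := r.Generate([2]int{64, 32}) // a uniform value in [-0.05, 0.05); the layer size is not used
	_ = w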

type WeightInitialization

type WeightInitialization interface {
	Generate(layerSize [2]int) float64
}

WeightInitialization, even though it contains only one method, is used to produce the values of a layer's weights based on the layer's size.

Generate produces a float value to be used as a weight for the layer. Note that some weight initialization techniques, such as He initialization, may use only one of the layer's dimensions.
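
Because the interface has a single method, custom strategies are easy to plug in. A minimal hypothetical implementation:

	// ConstantInitialization is a hypothetical WeightInitialization that
	// ignores the layer size and always returns the same value.
	type ConstantInitialization struct{ Value float64 }

	func (c ConstantInitialization) Generate(layerSize [2]int) float64 {
		return c.Value
	}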

type XavierNormalInitialization

type XavierNormalInitialization struct{}

XavierNormalInitialization (also known as Glorot) is a weight initialization technique based on the normal distribution.

Mainly used with the tanh activation function.

func (XavierNormalInitialization) Generate

func (x XavierNormalInitialization) Generate(layerSize [2]int) float64
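
A common formulation of Xavier (Glorot) normal initialization uses a standard deviation of sqrt(2 / (fanIn + fanOut)). The sketch below is an assumption about the formula, not code from the package, and assumes layerSize holds the two fan values.

	// xavierNormalWeight is a hypothetical sketch of the Glorot normal formula.
	func xavierNormalWeight(layerSize [2]int) float64 {
		fanSum := float64(layerSize[0] + layerSize[1])
		return rand.NormFloat64() * math.Sqrt(2.0/fanSum)
	}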

type XavierUniformInitialization

type XavierUniformInitialization struct{}

XavierUniformInitialization (also known as Glorot) is a weight initialization technique based on the uniform distribution.

Mainly used with the sigmoid activation function.

func (XavierUniformInitialization) Generate

func (x XavierUniformInitialization) Generate(layerSize [2]int) float64
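
A common formulation of Xavier (Glorot) uniform initialization samples uniformly from [-limit, limit] with limit = sqrt(6 / (fanIn + fanOut)). Again, this sketch is an assumption about the formula rather than the package's code.

	// xavierUniformWeight is a hypothetical sketch of the Glorot uniform formula.
	func xavierUniformWeight(layerSize [2]int) float64 {
		limit := math.Sqrt(6.0 / float64(layerSize[0]+layerSize[1]))
		return -limit + rand.Float64()*2*limit
	}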
