wann

A Weight Agnostic Neural Network is a new type of neural network where the weights of all the neurons are shared and the structure of the network is what matters.

This package implements Weight Agnostic Neural Networks for Go, and is inspired by this paper from June 2019:

"Weight Agnostic Neural Networks" by Adam Gaier and David Ha. (PDF | Interactive version | Google AI blog post)

Features and limitations

  • All activation functions are benchmarked at the start of the program and the results are taken into account when calculating the complexity of a network.
  • All networks can be translated to a Go statement, using the wonderful jennifer package (work in progress; there are a few kinks that need to be ironed out).
  • Networks can be saved as SVG diagrams. This feature needs more testing.
  • Neural networks can be trained and used. See the cmd folder for examples.
  • A random weight is chosen when training, instead of looping over the range of the weight. The paper describes both methods.
  • After the network has been trained, the optimal weight is found by looping over all weights (with a step size of 0.0001). A rough sketch of this search follows the list.
  • Increased complexity counts negatively when evolving networks. This optimizes not only for less complex networks, but also for execution speed.
  • The diagram drawing routine plots the activation functions directly onto the nodes, together with a label. This can be saved as an SVG file.
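
The optimal-weight search from the list above can be pictured roughly like this. This is only an illustration: the weight range, the starting value and the scoring criterion are assumptions, not the package's internal logic; only SetWeight and Evaluate are documented methods (the sketch also assumes the math and github.com/xyproto/wann imports).

func findBestWeight(net *wann.Network, input []float64) float64 {
	// Illustrative assumptions: search the range [-2, 2] with the documented
	// step size of 0.0001, and treat a higher Evaluate score as better.
	bestWeight, bestScore := 0.0, math.Inf(-1)
	for w := -2.0; w <= 2.0; w += 0.0001 {
		net.SetWeight(w)
		if score := net.Evaluate(input); score > bestScore {
			bestWeight, bestScore = w, score
		}
	}
	return bestWeight
}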

Example program

This is a simple example, for creating a network that can recognize one of four shapes:

package main

import (
    "fmt"
    "os"

    "github.com/xyproto/wann"
)

func main() {
    // Here are four shapes, representing: up, down, left and right:

    up := []float64{
        0.0, 1.0, 0.0, //  o
        1.0, 1.0, 1.0} // ooo

    down := []float64{
        1.0, 1.0, 1.0, // ooo
        0.0, 1.0, 0.0} //  o

    left := []float64{
        1.0, 1.0, 1.0, // ooo
        0.0, 0.0, 1.0} //   o

    right := []float64{
        1.0, 1.0, 1.0, // ooo
        1.0, 0.0, 0.0} // o

    // Prepare the input data as a 2D slice
    inputData := [][]float64{
        up,
        down,
        left,
        right,
    }

    // Target scores for: up, down, left, right
    correctResultsForUp := []float64{1.0, -1.0, -1.0, -1.0}

    // Prepare a neural network configuration struct
    config := &wann.Config{
        InitialConnectionRatio: 0.2,
        Generations:            2000,
        PopulationSize:         500,
        Verbose:                true,
    }

    // Evolve a network, using the input data and the sought after results
    trainedNetwork, err := config.Evolve(inputData, correctResultsForUp)
    if err != nil {
        fmt.Fprintf(os.Stderr, "error: %s\n", err)
        os.Exit(1)
    }

    // Now to test the trained network on 4 different inputs and see if it passes the test
    upScore := trainedNetwork.Evaluate(up)
    downScore := trainedNetwork.Evaluate(down)
    leftScore := trainedNetwork.Evaluate(left)
    rightScore := trainedNetwork.Evaluate(right)

    if config.Verbose {
        if upScore > downScore && upScore > leftScore && upScore > rightScore {
            fmt.Println("Network training complete, the results are good.")
        } else {
            fmt.Println("Network training complete, but the results did not pass the test.")
        }
    }

    // Save the trained network as an SVG image
    if config.Verbose {
        fmt.Print("Writing network.svg...")
    }
    if err := trainedNetwork.WriteSVG("network.svg"); err != nil {
        fmt.Fprintf(os.Stderr, "error: %s\n", err)
        os.Exit(1)
    }
    if config.Verbose {
        fmt.Println("ok")
    }
}

Here is the resulting network generated by the above program:

[Network diagram]

This makes sense: taking the third number in the input data (index 2), running it through a swish function and then inverting it should be a usable detector for the up pattern (a rough sketch of such a detector follows below).

  • The generated networks may differ for each run.
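
As a rough, hand-written sketch of such a detector (not the code this package generates), with the swish function written inline:

func upDetector(inputData []float64) float64 {
	// swish(x) = x * sigmoid(x); invert the swish of the value at index 2
	swish := func(x float64) float64 { return x / (1.0 + math.Exp(-x)) }
	return -swish(inputData[2])
}

With the four patterns above, only up has 0.0 at index 2, so it gets the highest score (0, versus roughly -0.73 for the other three).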

Quick start

This requires Go 1.11 or later.

Clone the repository:

git clone https://github.com/xyproto/wann

Enter the cmd/evolve directory:

cd wann/cmd/evolve

Build and run the example:

go build && ./evolve

Take a look at the best network found for judging whether a set of numbers that are either 0 or 1 belongs to one category:

xdg-open network.svg

(If needed, use your favorite SVG viewer instead of the xdg-open command).

Ideas

  • Adding convolution nodes might give interesting results.

Generating Go code from a trained network

This is an experimental feature and a work in progress!

The idea is to generate one large expression from all the expressions that each node in the network represents.

Right now, this only works for networks that have a depth of 1.

For example, adding these two lines to cmd/evolve/main.go:

// Output a Go function for this network
fmt.Println(trainedNetwork.GoFunction())

Produces this output:

func f(x float64) float64 { return -x }

The plan is to output a function that takes the input data instead, and refers to the input data by index. Support for deeper networks also needs to be added.

There is a complete example for outputting Go code in cmd/gofunction.

Documentation

Constants

This section is empty.

Variables

ActivationFunctions is a collection of activation functions, where the keys are the ActivationFunctionIndex constants defined below (following https://github.com/google/brain-tokyo-workshop/blob/master/WANNRelease/WANN/wann_src/ind.py).

var ComplexityEstimate = make(map[ActivationFunctionIndex]float64)

ComplexityEstimate is a map for having an estimate of how complex each function is, based on a quick benchmark of each function. The complexity estimates will vary, depending on the performance.
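
A small sketch of inspecting these estimates; it assumes, as the description of Config.Init below suggests, that calling Init populates this map. The printed numbers will differ between machines and runs.

config := &wann.Config{}
config.Init() // benchmarks the activation functions and estimates their complexity
fmt.Println(wann.ComplexityEstimate[wann.Swish], wann.ComplexityEstimate[wann.Step])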

Functions

func ActivationStatement

func ActivationStatement(af ActivationFunctionIndex, w float64, inputStatements []*jen.Statement) *jen.Statement

ActivationStatement creates an activation function statement, given a weight and input statements. It returns: activationFunction(input0*w + input1*w + ...). The function calling this one is responsible for inserting network input values into the network input nodes.
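
A minimal sketch of building and rendering such a statement, assuming that jen refers to github.com/dave/jennifer/jen (the jennifer package mentioned in the README); the rendered expression depends on the chosen activation function.

inputs := []*jen.Statement{
	jen.Id("inputData").Index(jen.Lit(0)),
	jen.Id("inputData").Index(jen.Lit(1)),
}
statement := wann.ActivationStatement(wann.Sigmoid, 0.5, inputs)
fmt.Println(wann.Render(statement))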

func Render

func Render(inner *jen.Statement) string

Render renders a *jen.Statement to a string, if possible. If there is an error about an extra ")", that is because anonymous functions are not supported by jen. Only Render statements that could be placed at the top level of a Go program.

func RunStatementInputData

func RunStatementInputData(statement *jen.Statement, inputData []float64) (float64, error)

RunStatementInputData will run the given statement by wrapping it in a program and using "go run"
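
A sketch of combining this with StatementWithInputDataVariables (documented under Network below); trainedNetwork and up are assumed to be the trained network and input pattern from the README example. Since this shells out to "go run", it is slow and mainly useful for cross-checking generated code.

statement, err := trainedNetwork.StatementWithInputDataVariables()
if err != nil {
	fmt.Fprintf(os.Stderr, "error: %s\n", err)
	os.Exit(1)
}
result, err := wann.RunStatementInputData(statement, up)
if err != nil {
	fmt.Fprintf(os.Stderr, "error: %s\n", err)
	os.Exit(1)
}
fmt.Println(result)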

func RunStatementX

func RunStatementX(statement *jen.Statement, x float64) (float64, error)

RunStatementX will run the given statement by wrapping it in a program and using "go run"

func ScorePopulation

func ScorePopulation(population []*Network, weight float64, inputData [][]float64, incorrectOutputMultipliers []float64) (map[int]float64, float64)

ScorePopulation evaluates a population, given a slice of input numbers. It returns a map with scores, together with the sum of scores.
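
A rough sketch of calling ScorePopulation directly; the networks, weight and multiplier here are arbitrary illustration values, and incorrectOutputMultipliers is assumed to need one entry per input data row, as in the Evolve example in the README.

a, b := wann.NewNetwork(nil), wann.NewNetwork(nil)
population := []*wann.Network{&a, &b}
inputData := [][]float64{
	{0.0, 1.0, 0.0, 1.0, 1.0, 1.0}, // the "up" pattern from the README
}
scores, sum := wann.ScorePopulation(population, 0.5, inputData, []float64{1.0})
fmt.Println(scores, sum)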

Types

type ActivationFunctionIndex

type ActivationFunctionIndex int

ActivationFunctionIndex is a number that represents a specific activation function

const (
	// Step is a step function: first 0, then abruptly up to 1.
	Step ActivationFunctionIndex = iota
	// Linear is the linear activation function. Gradually from 0 to 1.
	Linear
	// Sin is the sine activation function
	Sin
	// Gauss is the Gaussian function, with a mean of 0 and a sigma of 1
	Gauss
	// Tanh is math.Tanh
	Tanh
	// Sigmoid is the optimized sigmoid function from github.com/xyproto/swish
	Sigmoid
	// Inv is the inverse linear function
	Inv
	// Abs is math.Abs
	Abs
	// ReLU is the rectified linear unit: first 0, then the linear function
	ReLU
	// Cos is the cosine activation function
	Cos
	// Squared increases rapidly
	Squared
	// Swish is a later invention than ReLU, _|
	Swish
	// SoftPlus is log(1 + exp(x))
	SoftPlus
)

func (ActivationFunctionIndex) Call

Call runs an activation function with the given float64 value. The activation function is chosen by one of the constants above.

Example
fmt.Println(Gauss.Call(2.0))
Output:

0.13427659965015956

func (ActivationFunctionIndex) GoRun

func (afi ActivationFunctionIndex) GoRun(x float64) (float64, error)

GoRun will first construct the expression using jennifer and then evaluate the result using "go run" and a source file in /tmp

Example
// Run the Gauss function directly
fmt.Println(ActivationFunctions[Gauss](0.5))
// Use Jennifer to generate a source file just for running the Gauss function, then use "go run" and fetch the result
if result, err := Gauss.GoRun(0.5); err == nil { // no error
	fmt.Println(result)
}
Output:

0.8824699625576026
0.8824969025845955

func (ActivationFunctionIndex) Name

func (afi ActivationFunctionIndex) Name() string

Name returns a name for each activation function

func (ActivationFunctionIndex) Statement

func (afi ActivationFunctionIndex) Statement(inner *jen.Statement) *jen.Statement

Statement returns the jen statement for this activation function, using the given inner statement

func (ActivationFunctionIndex) String

func (afi ActivationFunctionIndex) String() string

String returns the Go expression for this activation function, using "x" as the input variable name

type Config

type Config struct {

	// When initializing a network, this is the probability that a node will be connected to the output node
	InitialConnectionRatio float64

	// How many generations to train for, at a maximum?
	Generations int
	// How large population sizes to use per generation?
	PopulationSize int
	// For how many generations should the training go on, without any improvement in the best score? Disabled if 0.
	MaxIterationsWithoutBestImprovement int
	// RandomSeed, for initializing the random number generator. The current time is used for the seed if this is set to 0.
	RandomSeed int64
	// Verbose output
	Verbose bool
	// contains filtered or unexported fields
}

Config is a struct that is used when initializing new Network structs. The idea is that referring to fields by name is more explicit, and that it can be re-used in connection with having a configuration file, in the future.
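
A sketch of a Config that sets all of the exported fields; the specific numbers are arbitrary. A fixed RandomSeed makes runs reproducible, and MaxIterationsWithoutBestImprovement stops the evolution early if the best score stops improving.

config := &wann.Config{
	InitialConnectionRatio:              0.2,
	Generations:                         2000,
	PopulationSize:                      500,
	MaxIterationsWithoutBestImprovement: 100,
	RandomSeed:                          42,
	Verbose:                             true,
}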

func (*Config) Evolve

func (config *Config) Evolve(inputData [][]float64, incorrectOutputMultipliers []float64) (*Network, error)

Evolve evolves a neural network, given a slice of training data and a slice of correct output values. Will overwrite config.Inputs.

func (*Config) Init

func (config *Config) Init()

Init will initialize the pseudo-random number generator and estimate the complexity of the available activation functions

type Network

type Network struct {
	AllNodes   []Neuron      // Storing the actual neurons
	InputNodes []NeuronIndex // Pointers to the input nodes
	OutputNode NeuronIndex   // Pointer to the output node
	Weight     float64       // Shared weight
}

Network is a collection of nodes, an output node and a shared weight.

func NewNetwork

func NewNetwork(cs ...*Config) Network

NewNetwork creates a new minimal network, with the number of input nodes and the connection ratio taken from the given configuration. Passing "nil" as an argument is supported.
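
A small sketch of building a network by hand instead of evolving one, mirroring the NewInputNode examples further down; the chosen activation function is arbitrary.

net := wann.NewNetwork(nil)
if err := net.NewInputNode(wann.Sigmoid, true); err != nil {
	fmt.Fprintf(os.Stderr, "error: %s\n", err)
	os.Exit(1)
}
fmt.Println(net.String())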

func (*Network) AddConnection

func (net *Network) AddConnection(a, b NeuronIndex) error

AddConnection adds a connection from a to b. The order is swapped if needed, then a is added as an input to b.

func (*Network) All

func (net *Network) All() []*Neuron

All returns a slice with pointers to all nodes in this network

func (*Network) Complexity

func (net *Network) Complexity() float64

Complexity measures the network complexity. It returns 1.0 at a minimum.

func (*Network) Connected

func (net *Network) Connected() []NeuronIndex

Connected returns a slice of neuron indexes, that are all connected to the output node (directly or indirectly)

func (Network) Copy

func (net Network) Copy() *Network

Copy a Network to a new network

func (*Network) Depth

func (net *Network) Depth() int

Depth returns the maximum connection distance from the output node

func (*Network) Evaluate

func (net *Network) Evaluate(inputValues []float64) float64

Evaluate will return a weighted sum of the input nodes, using the .Value field if it is set and no input nodes are available. A shared weight can be given.

func (*Network) Exists

func (net *Network) Exists(ni NeuronIndex) bool

Exists checks if the given NeuronIndex exists in this Network

func (*Network) ForEachConnected

func (net *Network) ForEachConnected(f func(n *Neuron))

ForEachConnected will only go through nodes that are connected to the output node (directly or indirectly). Unconnected input nodes are not covered.

func (*Network) ForEachConnectedNodeIndex

func (net *Network) ForEachConnectedNodeIndex(f func(ni NeuronIndex))

ForEachConnectedNodeIndex will only go through nodes that are connected to the output node (directly or indirectly). Unconnected input nodes are not covered.

func (*Network) Get

func (net *Network) Get(i NeuronIndex) *Neuron

Get returns a pointer to a neuron, based on the given NeuronIndex

func (*Network) GetRandomInputNode

func (net *Network) GetRandomInputNode() NeuronIndex

GetRandomInputNode returns a random input node

func (*Network) GetRandomNode

func (net *Network) GetRandomNode() NeuronIndex

GetRandomNode will select a random neuron. This can be any node, including the output node.

func (*Network) InsertNode

func (net *Network) InsertNode(a, b NeuronIndex, newNodeIndex NeuronIndex) error

InsertNode takes two neurons and inserts a third neuron between them. It assumes that a is the leftmost node and b is the rightmost node.

Example
rand.Seed(commonSeed)
net := NewNetwork(&Config{
	inputs:                 3,
	InitialConnectionRatio: 1.0,
})
fmt.Println("Before insertion:")
fmt.Println(net)
_, nodeIndex := net.NewNeuron()
err := net.InsertNode(0, 1, nodeIndex)
if err != nil {
	fmt.Println("error: " + err.Error())
}
fmt.Println("After insertion:")
fmt.Println(net)
Output:

Before insertion:
Network (4 nodes, 3 input nodes, 1 output node)
	Connected inputs to output node: 3
	Output node ID 0 has these input connections: [1 2 3]
	 Input node ID 1 has these input connections: []
	 Input node ID 2 has these input connections: []
	 Input node ID 3 has these input connections: []

After insertion:
Network (5 nodes, 3 input nodes, 1 output node)
	Connected inputs to output node: 3
	Output node ID 0 has these input connections: [2 3 4]
	 Input node ID 1 has these input connections: []
	 Input node ID 2 has these input connections: []
	 Input node ID 3 has these input connections: []
	       Node ID 4 has these input connections: [1]

func (*Network) InsertRandomNode

func (net *Network) InsertRandomNode() bool

InsertRandomNode will try to insert a node at a random location, replacing an existing connection between two nodes: `a -> b` then becomes `a -> newNode -> b`. Returns true if a node was inserted, or false if the randomly chosen location wasn't fruitful.

func (*Network) IsInput

func (net *Network) IsInput(ni NeuronIndex) bool

IsInput checks if the given node is an input node

func (*Network) LeftRight

func (net *Network) LeftRight(a, b NeuronIndex) (NeuronIndex, NeuronIndex, bool)

LeftRight returns two neurons, such that the first one is the one that is most to the left (towards the input neurons) and the second one is the one that is most to the right (towards the output neuron). It assumes that a and b are not equal. The returned bool is true if there is no order (if the nodes are equal, both are output nodes, or both are input nodes).

func (*Network) Modify

func (net *Network) Modify(maxIterations int)

Modify the network using one of the three methods outlined in the paper:

  • Insert node
  • Add connection
  • Change activation function
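
A sketch of mutating a copy of a network so that the original is left untouched; net is assumed to be an existing *Network, and the argument 10 is an arbitrary choice (assuming maxIterations is an upper bound on mutation attempts).

child := net.Copy()
child.Modify(10) // apply one of the three modifications listed above
fmt.Println(child)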

func (*Network) NewBlankNeuron

func (net *Network) NewBlankNeuron() (*Neuron, NeuronIndex)

NewBlankNeuron creates a new Neuron, with the Step activation function as the default

func (*Network) NewInputNode

func (net *Network) NewInputNode(activationFunction ActivationFunctionIndex, connectToOutput bool) error

NewInputNode creates a new input node for this network, optionally connecting it to the output node

func (*Network) NewNeuron

func (net *Network) NewNeuron() (*Neuron, NeuronIndex)

NewNeuron creates a new *Neuron, with a randomly chosen activation function

func (*Network) OutputNodeStatementX

func (net *Network) OutputNodeStatementX(functionName string) string

OutputNodeStatementX returns a statement for the output node, using "x" for the variable

Example
rand.Seed(1)
net := NewNetwork(&Config{
	inputs:                 5,
	InitialConnectionRatio: 0.7,
	sharedWeight:           0.5,
})
fmt.Println(net.OutputNodeStatementX("f"))
//fmt.Println(net.Score())
Output:

f := math.Pow(x, 2.0)
Example (First)
// First create a network with only one output node, that has a step function
net := NewNetwork()
net.AllNodes[net.OutputNode].ActivationFunction = Step

fmt.Println(net.OutputNodeStatementX("score"))
Output:

score := func(s float64) float64 {
	if s >= 0 {
		return 1
	} else {
		return 0
	}
}(x)
Example (Fourth)
rand.Seed(1111113)
net := NewNetwork(&Config{
	inputs:                 5,
	InitialConnectionRatio: 0.7,
	sharedWeight:           0.5,
})
fmt.Println(net.OutputNodeStatementX("score"))
Output:

score := func(r float64) float64 {
	if r >= 0 {
		return r
	} else {
		return 0
	}
}(x)
Example (Second)
// Then create a network with an input node that has a sigmoid function and an output node that has an invert function
net := NewNetwork()
net.NewInputNode(Sigmoid, true)
net.AllNodes[net.OutputNode].ActivationFunction = Inv

// Output a Go expression for this network, using the given input variable names
fmt.Println(net.OutputNodeStatementX("score"))
Output:

score := -(x)
Example (Third)
rand.Seed(999)
net := NewNetwork(&Config{
	inputs:                 1,
	InitialConnectionRatio: 0.7,
	sharedWeight:           0.5,
})
fmt.Println(net.OutputNodeStatementX("score"))
Output:

score := math.Exp(-(math.Pow(x, 2.0)) / 2.0)

func (*Network) OutputSVG

func (net *Network) OutputSVG(w io.Writer) (int, error)

OutputSVG will output the current network as an SVG image to the given io.Writer. TODO: Clean up and refactor.

func (*Network) RandomizeActivationFunctionForRandomNeuron

func (net *Network) RandomizeActivationFunctionForRandomNeuron()

RandomizeActivationFunctionForRandomNeuron randomizes the activation function for a randomly selected neuron

func (Network) SetInputValues

func (net Network) SetInputValues(inputValues []float64)

SetInputValues will assign the given values to the network input nodes

func (*Network) SetWeight

func (net *Network) SetWeight(weight float64)

SetWeight will set a shared weight for the entire network

func (*Network) StatementWithInputDataVariables

func (net *Network) StatementWithInputDataVariables() (*jen.Statement, error)

StatementWithInputDataVariables traces the entire network, using statements for the input numbers

func (*Network) StatementWithInputValues

func (net *Network) StatementWithInputValues() (*jen.Statement, error)

StatementWithInputValues traces the entire network

func (Network) String

func (net Network) String() string

String creates a simple and not very useful ASCII representation of the input nodes and the output node. Nodes that are not input nodes are skipped. Input nodes that are not connected directly to the output node are drawn as non-connected, even if they are connected via another node.

func (*Network) Unconnected

func (net *Network) Unconnected() []NeuronIndex

Unconnected returns a slice of all unconnected neurons

func (*Network) UpdateNetworkPointers

func (net *Network) UpdateNetworkPointers()

UpdateNetworkPointers will update all the node.Net pointers to point to this network

func (*Network) WriteSVG

func (net *Network) WriteSVG(filename string) error

WriteSVG saves a drawing of the current network as an SVG file

type Neuron

type Neuron struct {
	Net                *Network
	InputNodes         []NeuronIndex // pointers to other neurons
	ActivationFunction ActivationFunctionIndex
	Value              *float64
	// contains filtered or unexported fields
}

Neuron is a list of input-neurons, and an activation function.

func NewUnconnectedNeuron

func NewUnconnectedNeuron() *Neuron

NewUnconnectedNeuron returns a new unconnected neuron with neuronIndex -1 and net pointer set to nil

func (*Neuron) AddInput

func (neuron *Neuron) AddInput(ni NeuronIndex) error

AddInput will add an input neuron

func (*Neuron) AddInputNeuron

func (neuron *Neuron) AddInputNeuron(n *Neuron) error

AddInputNeuron both adds a neuron to this network (if needed) and adds its neuron index to neuron.InputNodes

func (*Neuron) Connect

func (neuron *Neuron) Connect(net *Network)

Connect this neuron to a network, overwriting any existing connections. This will also clear any input nodes to this neuron, since the net is different. TODO: Find the input nodes from the neuron.Net, save those and re-assign if there are matches?

func (Neuron) Copy

func (neuron Neuron) Copy(net *Network) Neuron

Copy a Neuron to a new Neuron, and assign the pointer to the given network to .Net

func (*Neuron) FindInput

func (neuron *Neuron) FindInput(e NeuronIndex) (int, bool)

FindInput checks if the given neuron is an input neuron to this one, and also returns the index into InputNodes, if found.

func (*Neuron) GetActivationFunction

func (neuron *Neuron) GetActivationFunction() func(float64) float64

GetActivationFunction returns the activation function for this neuron

func (*Neuron) HasInput

func (neuron *Neuron) HasInput(e NeuronIndex) bool

HasInput checks if the given neuron is an input neuron to this one

func (*Neuron) In

func (neuron *Neuron) In(collection []NeuronIndex) bool

In checks if this neuron is in the given collection

func (*Neuron) InputNeuronsAreGood

func (neuron *Neuron) InputNeuronsAreGood() bool

InputNeuronsAreGood checks if all input neurons of this neuron exist in neuron.Net

func (*Neuron) InputStatement

func (neuron *Neuron) InputStatement() (*jen.Statement, error)

InputStatement returns a statement like "inputData[0]", if this node is a network input node

Example

rand.Seed(1)
net := NewNetwork(&Config{
	inputs:                 6,
	InitialConnectionRatio: 0.7,
	sharedWeight:           0.5,
})

// 1.234 should not appear in the output statement
net.SetInputValues([]float64{1.234, 1.234, 1.234, 1.234, 1.234, 1.234})

inputStatement2, err := net.AllNodes[net.InputNodes[2]].InputStatement()
if err != nil {
	panic(err)
}
fmt.Println(Render(inputStatement2))
Output:

inputData[2]

func (*Neuron) Is

func (neuron *Neuron) Is(e NeuronIndex) bool

Is checks if the given NeuronIndex points to this neuron

func (*Neuron) IsInput

func (neuron *Neuron) IsInput() bool

IsInput returns true if this is an input node. Returns false if the neuron is nil.

func (*Neuron) IsOutput

func (neuron *Neuron) IsOutput() bool

IsOutput returns true if this is an output node. Returns false if the neuron is nil.

func (Neuron) NetworkStatementWithInputDataVariables

func (neuron Neuron) NetworkStatementWithInputDataVariables(visited *[]NeuronIndex) (*jen.Statement, error)

NetworkStatementWithInputDataVariables will trace the network by visiting all nodes from the output node and to the left, using the input data variables instead of the input values

func (Neuron) NetworkStatementWithInputValues

func (neuron Neuron) NetworkStatementWithInputValues(visited *[]NeuronIndex) (*jen.Statement, error)

NetworkStatementWithInputValues will trace the network by visiting all nodes from the output node and to the left

func (*Neuron) RandomizeActivationFunction

func (neuron *Neuron) RandomizeActivationFunction()

RandomizeActivationFunction will choose a random activation function for this neuron

func (*Neuron) RemoveInput

func (neuron *Neuron) RemoveInput(e NeuronIndex) error

RemoveInput will remove an input neuron

func (*Neuron) SetValue

func (neuron *Neuron) SetValue(x float64)

SetValue can be used for setting a value for this neuron instead of using input neurons. This changes how the Evaluate method behaves.

func (*Neuron) String

func (neuron *Neuron) String() string

String will return a string containing both the pointer address and the number of input neurons

type NeuronIndex

type NeuronIndex int

NeuronIndex is an index into the AllNodes slice

func Combine

func Combine(a, b []NeuronIndex) []NeuronIndex

Combine will combine two lists of indices

Example
ac := []NeuronIndex{0, 1, 2, 3, 4}
bc := []NeuronIndex{5, 6, 7, 8, 9}
fmt.Println(Combine(ac, bc))
Output:

[0 1 2 3 4 5 6 7 8 9]

func (NeuronIndex) In

func (ni NeuronIndex) In(nodes *[]NeuronIndex) bool

In returns true if this NeuronIndex is in the given *[]NeuronIndex slice

type NormalizationInfo

type NormalizationInfo struct {
	// contains filtered or unexported fields
}

NormalizationInfo specifies whether and how the score function should be normalized

func NewNormalizationInfo

func NewNormalizationInfo(enable bool) *NormalizationInfo

NewNormalizationInfo returns a new struct, specifying whether and how the score function should be normalized

func (*NormalizationInfo) Disable

func (norm *NormalizationInfo) Disable()

Disable signifies that normalization is disabled when this struct is used

func (*NormalizationInfo) Enable

func (norm *NormalizationInfo) Enable()

Enable signifies that normalization is enabled when this struct is used

func (*NormalizationInfo) Get

func (norm *NormalizationInfo) Get() (float64, float64)

Get retrieves the multiplication and addition numbers that can be used for normalization

func (*NormalizationInfo) Set

func (norm *NormalizationInfo) Set(mul, add float64)

Set sets the multiplication and addition numbers that can be used for normalization
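
A minimal sketch of the Set/Get round trip; it assumes that Get returns the numbers most recently passed to Set.

norm := wann.NewNormalizationInfo(true)
norm.Set(2.0, 0.5)
mul, add := norm.Get()
fmt.Println(mul, add)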

type Pair

type Pair struct {
	Key   int
	Value float64
}

Pair is used for sorting dictionaries by value. Thanks https://stackoverflow.com/a/18695740/131264

type PairList

type PairList []Pair

PairList is a slice of Pair

func SortByValue

func SortByValue(m map[int]float64) PairList

SortByValue sorts a map[int]float64 by value
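
A small sketch of sorting a score map, for instance one returned by ScorePopulation; whether the result is ascending or descending is left unspecified here.

scores := map[int]float64{0: 0.9, 1: 0.2, 2: 0.7}
for _, pair := range wann.SortByValue(scores) {
	fmt.Println(pair.Key, pair.Value)
}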

func (PairList) Len

func (p PairList) Len() int

func (PairList) Less

func (p PairList) Less(i, j int) bool

func (PairList) Swap

func (p PairList) Swap(i, j int)

Directories

cmd
