gophy

package module
v0.0.0-...-c430e3d
Published: May 25, 2023 License: GPL-3.0 Imports: 23 Imported by: 3

README

GoPHY

This is a development toolkit for phylogenetics. The many main packages typically live in subdirectories; for example, the bipartition analyzer is in gophy/bp/ and is called bp.go. To get this code, run go get github.com/FePhyFoFum/gophy. You can then build a tool with go build github.com/FePhyFoFum/gophy/bp/bp.go. Other mains live in other subdirectories, with more to come.

You can find more information here.

Documentation

Index

Constants

View Source
const (
	Nucleotide DataType = "nuc"
	AminoAcid           = "aa"
	MultiState          = "mult"
)

datatype constants
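
The constants above tag the alphabet a model works over. A minimal sketch of how such a string-typed DataType is typically consumed; numStates is a hypothetical helper, not part of gophy:

```go
package main

import "fmt"

// DataType mirrors gophy's string-typed alphabet tag; this is a sketch,
// not the package's own definition.
type DataType string

const (
	Nucleotide DataType = "nuc"
	AminoAcid  DataType = "aa"
	MultiState DataType = "mult"
)

// numStates switches on the tag to pick the alphabet size: 4 for
// nucleotides, 20 for amino acids; multistate sizes come from the data.
func numStates(dt DataType) int {
	switch dt {
	case Nucleotide:
		return 4
	case AminoAcid:
		return 20
	default:
		return -1 // multistate: size depends on the data
	}
}

func main() {
	fmt.Println(numStates(Nucleotide)) // prints 4
}
```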

Variables

This section is empty.

Functions

func AdjustBLNR

func AdjustBLNR(node *Node, x *DiscreteModel, patternvals []float64, t *Tree, wks int, threshold float64)

AdjustBLNR performs single-edge Newton-Raphson (NR) branch length adjustment

func AdjustBLNRMult

func AdjustBLNRMult(node *Node, models []*DiscreteModel, nodemodels map[*Node]int, patternvals []float64, t *Tree, wks int, threshold float64)

AdjustBLNRMult performs single-edge Newton-Raphson (NR) branch length adjustment with multiple models

func BipartSliceContains

func BipartSliceContains(bps []Bipart, bp Bipart) (ind int)

BipartSliceContains checks to see if the bipart slice contains the bipart and returns the index

func CalcAIC

func CalcAIC(ln float64, k float64) (x float64)

CalcAIC calculates the AIC; k = number of parameters

func CalcAICC

func CalcAICC(lnl float64, k float64, n int) (x float64)

CalcAICC calculates the corrected AIC (AICc); k = number of parameters, n = sample size

func CalcAncStates

func CalcAncStates(x *DiscreteModel, tree *Tree, patternval []float64) (retstates map[*Node][][]float64)

CalcAncStates calculates the ancestral states for each node based on the calculations above

func CalcBIC

func CalcBIC(ln float64, k float64, n int) (x float64)

CalcBIC calculates the BIC; k = number of parameters, n = sample size

func CalcConfIntLgLike

func CalcConfIntLgLike(nparams float64, perc float64) float64

CalcConfIntLgLike calculates the distance from the maximum likelihood. This is based on p. 66 of In All Likelihood: -.5*s.chi2(df).ppf(1-i), or -.5*s.norm(df,2*df).ppf(1-i) for large df

func CalcExpPDFLog

func CalcExpPDFLog(val, loc, scale float64) float64

func CalcLikeFrontBack

func CalcLikeFrontBack(x *DiscreteModel, tree *Tree, patternval []float64)

CalcLikeFrontBack ...

func CalcLikeFrontBackMult

func CalcLikeFrontBackMult(models []*DiscreteModel, nodemodels map[*Node]int, tree *Tree, patternval []float64)

CalcLikeFrontBackMult ...

func CalcLikeNode

func CalcLikeNode(nd *Node, model *DiscreteModel, site int)

CalcLikeNode calculate the likelihood of a node

func CalcLikeNodeGamma

func CalcLikeNodeGamma(nd *Node, model *DiscreteModel, site int, gammav float64)

CalcLikeNodeGamma calculate the likelihood of a node

func CalcLikeOneSite

func CalcLikeOneSite(t *Tree, x *DiscreteModel, site int) float64

CalcLikeOneSite just one site

func CalcLikeOneSiteGamma

func CalcLikeOneSiteGamma(t *Tree, x *DiscreteModel, site int) float64

CalcLikeOneSiteGamma just one site

func CalcLikeOneSiteGammaMul

func CalcLikeOneSiteGammaMul(t *Tree, models []*DiscreteModel, nodemodels map[*Node]int, site int) float64

CalcLikeOneSiteGammaMul just one site

func CalcLikeOneSiteMarked

func CalcLikeOneSiteMarked(t *Tree, x *DiscreteModel, site int) float64

CalcLikeOneSiteMarked this uses the marked machinery to recalculate

func CalcLikeOneSiteMarkedGamma

func CalcLikeOneSiteMarkedGamma(t *Tree, x *DiscreteModel, site int) float64

CalcLikeOneSiteMarkedGamma this uses the marked machinery to recalculate

func CalcLikeOneSiteMarkedMul

func CalcLikeOneSiteMarkedMul(t *Tree, models []*DiscreteModel, nodemodels map[*Node]int, site int) float64

CalcLikeOneSiteMarkedMul this uses the marked machinery to recalculate

func CalcLikeWork

func CalcLikeWork(t *Tree, x *DiscreteModel, jobs <-chan int, results chan<- LikeResult)

CalcLikeWork this is the worker

func CalcLikeWorkGamma

func CalcLikeWorkGamma(t *Tree, x *DiscreteModel, jobs <-chan int, results chan<- LikeResult)

CalcLikeWorkGamma ...

func CalcLikeWorkGammaMul

func CalcLikeWorkGammaMul(t *Tree, models []*DiscreteModel, nodemodels map[*Node]int, jobs <-chan int, results chan<- LikeResult)

func CalcLikeWorkMarked

func CalcLikeWorkMarked(t *Tree, x *DiscreteModel, jobs <-chan int, results chan<- LikeResult)

CalcLikeWorkMarked this is intended to calculate only on the marked nodes back to the root

func CalcLikeWorkMarkedGamma

func CalcLikeWorkMarkedGamma(t *Tree, x *DiscreteModel, jobs <-chan int, results chan<- LikeResult)

CalcLikeWorkMarkedGamma this is intended to calculate only on the marked nodes back to the root

func CalcLikeWorkMarkedMul

func CalcLikeWorkMarkedMul(t *Tree, models []*DiscreteModel, nodemodels map[*Node]int, jobs <-chan int, results chan<- LikeResult)

CalcLikeWorkMarkedMul this is intended to calculate only on the marked nodes back to the root

func CalcLogLikeNode

func CalcLogLikeNode(nd *Node, model *DiscreteModel, site int)

CalcLogLikeNode calculates likelihood for node

func CalcLogLikeNodeGamma

func CalcLogLikeNodeGamma(nd *Node, model *DiscreteModel, site int, gammav float64)

CalcLogLikeNodeGamma calculates likelihood for node

func CalcLogLikeOneSite

func CalcLogLikeOneSite(t *Tree, x *DiscreteModel, site int) float64

CalcLogLikeOneSite just calculates the likelihood of one site; probably used to populate the PDict in the DNA model so that we can reuse the calculations

func CalcLogLikeOneSiteBack

func CalcLogLikeOneSiteBack(t *Tree, nb *Node, x *DiscreteModel, site int) float64

CalcLogLikeOneSiteBack like the one above but from nb to the root only

func CalcLogLikeOneSiteGamma

func CalcLogLikeOneSiteGamma(t *Tree, x *DiscreteModel, site int) float64

CalcLogLikeOneSiteGamma ...

func CalcLogLikeOneSiteGammaMul

func CalcLogLikeOneSiteGammaMul(t *Tree, models []*DiscreteModel, nodemodels map[*Node]int, site int) float64

CalcLogLikeOneSiteGammaMul just one site

func CalcLogLikeOneSiteMarked

func CalcLogLikeOneSiteMarked(t *Tree, x *DiscreteModel, site int) float64

CalcLogLikeOneSiteMarked this uses the marked machinery to recalculate

func CalcLogLikeOneSiteMul

func CalcLogLikeOneSiteMul(t *Tree, models []*DiscreteModel, nodemodels map[*Node]int, site int) float64

CalcLogLikeOneSiteMul just one site

func CalcLogLikeWork

func CalcLogLikeWork(t *Tree, x *DiscreteModel, jobs <-chan int, results chan<- LikeResult)

CalcLogLikeWork this is intended for a worker that will be executing this per site

func CalcLogLikeWorkBack

func CalcLogLikeWorkBack(t *Tree, nb *Node, x *DiscreteModel, jobs <-chan int, results chan<- float64)

CalcLogLikeWorkBack this is intended for a worker that will be executing this per site

func CalcLogLikeWorkGamma

func CalcLogLikeWorkGamma(t *Tree, x *DiscreteModel, jobs <-chan int, results chan<- LikeResult)

CalcLogLikeWorkGamma this is intended for a worker that will be executing this per site

func CalcLogLikeWorkGammaMul

func CalcLogLikeWorkGammaMul(t *Tree, models []*DiscreteModel, nodemodels map[*Node]int, jobs <-chan int, results chan<- LikeResult)

func CalcLogLikeWorkMarked

func CalcLogLikeWorkMarked(t *Tree, x *DiscreteModel, jobs <-chan int, results chan<- float64)

CalcLogLikeWorkMarked this is intended to calculate only on the marked nodes back to the root

func CalcLogLikeWorkMul

func CalcLogLikeWorkMul(t *Tree, models []*DiscreteModel, nodemodels map[*Node]int, jobs <-chan int, results chan<- LikeResult)

CalcLogLikeWorkMul this is the worker

func CalcLogNormLocScalePDF

func CalcLogNormLocScalePDF(val, shape, loc, scale float64) float64

func CalcLogNormPDF

func CalcLogNormPDF(val, shape float64) float64

func CalcLogNormPDFLog

func CalcLogNormPDFLog(val, mn, std float64) float64

func CalcNormPDF

func CalcNormPDF(val, mn, std float64) float64

func CalcNormPDFLog

func CalcNormPDFLog(val, mn, std float64) float64

func CalcSankParsAncStateSingleSite

func CalcSankParsAncStateSingleSite(t *Tree, numstates int, site int)

CalcSankParsAncStateSingleSite ... assumes a bifurcating tree for now

func CalcSankParsNode

func CalcSankParsNode(nd *Node, numstates int, site int)

CalcSankParsNode ...

func CalcSankParsWork

func CalcSankParsWork(t *Tree, numstates int, jobs <-chan int, results chan<- ParsResult)

CalcSankParsWork ...

func CalcSliceIntDifference

func CalcSliceIntDifference(a, b []int) []int

CalcSliceIntDifference calculates the difference (set) between two int slices

func CalcSliceIntDifferenceInt

func CalcSliceIntDifferenceInt(a, b []int) int

CalcSliceIntDifferenceInt calculates the size of the difference (set) between two int slices
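
A plausible sketch of the set difference these helpers describe; calcSliceIntDifference is an illustrative stand-in, not the gophy implementation:

```go
package main

import "fmt"

// calcSliceIntDifference returns the elements of a that are not in b
// (the set difference a \ b), preserving a's order.
func calcSliceIntDifference(a, b []int) []int {
	inB := make(map[int]bool, len(b))
	for _, v := range b {
		inB[v] = true
	}
	diff := []int{}
	for _, v := range a {
		if !inB[v] {
			diff = append(diff, v)
		}
	}
	return diff
}

func main() {
	fmt.Println(calcSliceIntDifference([]int{1, 2, 3, 4}, []int{2, 4})) // prints [1 3]
}
```

The size variant (CalcSliceIntDifferenceInt) would simply return the length of this slice.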

func CalcStochMap

func CalcStochMap(x *DiscreteModel, tree *Tree, patternval []float64, time bool, from int, to int) (retstates map[*Node][][]float64)

CalcStochMap calculates a stochastic map. This has been taken from the multistate code, so make sure it works for other data types

func CalcSupLikeNode

func CalcSupLikeNode(nd *Node, model *DiscreteModel, site int)

func CalcSupLikeWork

func CalcSupLikeWork(t *Tree, x *DiscreteModel, jobs <-chan int, results chan<- LikeSupResult)

CalcSupLikeWork this is the worker

func CompareTreeToBiparts

func CompareTreeToBiparts(bps []Bipart, comptreebps []Bipart, workers int, mapints map[int]string, verbose bool, treeverbose bool)

CompareTreeToBiparts takes biparts from one set, comptreebps, and compares them to another set, bps. This one is complicated, so stay with it

func ConfInt95TF

func ConfInt95TF(nums []float64) (float64, float64)

ConfInt95TF returns the 95% confidence interval using the t statistic

func EstParsBL

func EstParsBL(t *Tree, numstates int, patternval []float64, totalsites int)

EstParsBL estimate the parsimony branch lengths

func GetEmpiricalBaseFreqs

func GetEmpiricalBaseFreqs(seqs map[string]string) (bf []float64)

GetEmpiricalBaseFreqs get the empirical base freqs from the seqs
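
Empirical base frequencies are just normalized character counts over the alignment. A sketch that skips gaps and ambiguity codes; the real function may treat those differently:

```go
package main

import "fmt"

// getEmpiricalBaseFreqs counts A, C, G, T across all sequences and
// normalizes; anything else (gaps, ambiguities) is ignored here.
func getEmpiricalBaseFreqs(seqs map[string]string) []float64 {
	counts := map[byte]float64{'A': 0, 'C': 0, 'G': 0, 'T': 0}
	total := 0.0
	for _, s := range seqs {
		for i := 0; i < len(s); i++ {
			if _, ok := counts[s[i]]; ok {
				counts[s[i]]++
				total++
			}
		}
	}
	// order matches the BF ordering documented on DiscreteModel: A, C, G, T
	return []float64{counts['A'] / total, counts['C'] / total,
		counts['G'] / total, counts['T'] / total}
}

func main() {
	fmt.Println(getEmpiricalBaseFreqs(map[string]string{"t1": "ACGT", "t2": "AAGT"}))
}
```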

func GetEmpiricalBaseFreqsMS

func GetEmpiricalBaseFreqsMS(seqs []MSeq, numstates int) (bf []float64)

GetEmpiricalBaseFreqsMS get the empirical base freqs from the seqs

func GetEmpiricalBaseFreqsProt

func GetEmpiricalBaseFreqsProt(seqs map[string]string) (bf []float64)

GetEmpiricalBaseFreqsProt get the empirical base freqs from the seqs

func GetGammaCats

func GetGammaCats(alpha float64, cats int, median bool) []float64

GetGammaCats for likelihood calculators

func GetMap

func GetMap(numStates int) (charMap map[string][]int)

GetMap based on states without the MultStateModel struct

func GetMultMap

func GetMultMap(numStates int) (charMap map[string][]int)

GetMultMap based on states without the MultStateModel struct

func GetNucMap

func GetNucMap() (charMap map[string][]int)

GetNucMap get the int map for DNA with ambiguities
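
A sketch of what such a character map can look like, using the standard IUPAC ambiguity codes and assuming the state order A=0, C=1, G=2, T=3 (the BF ordering documented on DiscreteModel); this is illustrative, not gophy's exact map:

```go
package main

import "fmt"

// getNucMap maps each DNA character, including IUPAC ambiguity codes,
// to the list of state indices it could represent.
func getNucMap() map[string][]int {
	return map[string][]int{
		"A": {0}, "C": {1}, "G": {2}, "T": {3},
		"R": {0, 2}, "Y": {1, 3}, "S": {1, 2}, "W": {0, 3},
		"K": {2, 3}, "M": {0, 1},
		"B": {1, 2, 3}, "D": {0, 2, 3}, "H": {0, 1, 3}, "V": {0, 1, 2},
		"N": {0, 1, 2, 3}, "-": {0, 1, 2, 3},
	}
}

func main() {
	fmt.Println(getNucMap()["R"]) // prints [0 2]
}
```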

func GetProtMap

func GetProtMap() (charMap map[string][]int)

GetProtMap get the int map for amino acids with ambiguities

func GetRevNucMap

func GetRevNucMap() (charMap map[int]string)

GetRevNucMap ...

func GetSitePatterns

func GetSitePatterns(seqs map[string]string, nsites int, seqnames []string) (patterns map[string][]int,
	patternsint map[int]float64, gapsites []int, constant []int, uninformative []int, fullpattern []int)

GetSitePatterns returns the site patterns when the alignment is a map[string]string

func GetSitePatternsMS

func GetSitePatternsMS(seqs []MSeq, charMap map[string][]int, numstates int) (patterns map[string][]int,
	patternsint map[int]float64, gapsites []int, constant []int, uninformative []int, fullpattern []int)

GetSitePatternsMS returns the site patterns when the alignment is a slice of MSeq

func GetSitePatternsProt

func GetSitePatternsProt(seqs map[string]string, nsites int, seqnames []string) (patterns map[string][]int,
	patternsint map[int]float64, gapsites []int, constant []int, uninformative []int, fullpattern []int)

GetSitePatternsProt returns the site patterns when the alignment is a map[string]string of protein sequences

func IntMapDifferenceRet

func IntMapDifferenceRet(a, b map[int]bool) []int

IntMapDifferenceRet calculates the difference (set) between two map[int]bool sets and returns it as a slice

func IntMapIntersects

func IntMapIntersects(s1 map[int]bool, s2 map[int]bool) (in bool)

IntMapIntersects checks to see if the two map[int]bool intersect (in the set sense)

func IntMapIntersects2

func IntMapIntersects2(s1 map[int]bool, s2 map[int]bool) (in bool)

IntMapIntersects2 checks to see if the two map[int]bool intersect (in the set sense) with at least 2 matches

func IntMapIntersectsRet

func IntMapIntersectsRet(s1, s2 map[int]bool) (r []int)

IntMapIntersectsRet checks to see if the two map[int]bool intersect and returns the intersection (in the set sense)

func IntMapSetString

func IntMapSetString(intmap map[int]bool) (s string)

IntMapSetString get a string for printing off a set

func IntSliceContains

func IntSliceContains(is []int, s int) (rb bool)

IntSliceContains checks to see if the int slice contains an int and returns the bool

func IntSliceIntersects

func IntSliceIntersects(a, b []int) (rb bool)

IntSliceIntersects checks to see whether two int slices intersect

func Log1exp

func Log1exp(x float64) float64

Log1exp returns log(1+e^x) when e^x is in range

func LogFact

func LogFact(k float64) float64

LogFact calculates the log factorial; this is based on Stirling's approximation and is faster than LogFactorial

func LogFactorial

func LogFactorial(val int) (x float64)

LogFactorial is the slow method to calculate the log factorial: log(1) + log(2) + ... + log(n)

func MapContinuous

func MapContinuous(t *Tree, traitfl string)

MapContinuous maps the traits from a file to the tips of a tree and initializes slices of the same length for the internal nodes

func MaxF

func MaxF(n []float64) float64

MaxF max

func MedianF

func MedianF(n []float64) float64

MedianF calculate the "median" value

func MinF

func MinF(n []float64) float64

MinF min

func NNIMoves

func NNIMoves(tr *Tree) [][]*Node

NNIMoves looks at the root and returns the NNIs

func NW

func NW(seqs []Seq, in1 int, in2 int)

NW toy example, scores are all default

func NodeNamesSliceIntersects

func NodeNamesSliceIntersects(a, b []*Node) (rb bool)

NodeNamesSliceIntersects checks to see whether two node slices intersect by name

func NodeSliceContains

func NodeSliceContains(s []*Node, e *Node) bool

NodeSliceContains tells you whether the node e is in the slice

func NodeSlicePosition

func NodeSlicePosition(sl []*Node, nd *Node) (x int)

NodeSlicePosition takes a []*Node slice and gets the index of the element node

func OptimizeAACompSharedRM

func OptimizeAACompSharedRM(t *Tree, models []*DiscreteModel, nodemodels map[*Node]int,
	patternvals []float64, log bool, wks int)

OptimizeAACompSharedRM optimize GTR base composition but share rate matrix for the different parts

func OptimizeAACompSharedRMNLOPT

func OptimizeAACompSharedRMNLOPT(t *Tree, models []*DiscreteModel, nodemodels map[*Node]int,
	patternvals []float64, log bool, wks int)

func OptimizeAACompSharedRMSingleModel

func OptimizeAACompSharedRMSingleModel(t *Tree, models []*DiscreteModel,
	nodemodels map[*Node]int, usemodelvals bool, whichmodel int,
	patternvals []float64, log bool, wks int)

OptimizeAACompSharedRMSingleModel optimize AA base comp but share rate matrix for different parts

func OptimizeBF

func OptimizeBF(t *Tree, x *DiscreteModel, patternvals []float64, log bool, wks int)

OptimizeBF optimizing the basefreq model but for a clade

func OptimizeBFDNARMSubClade

func OptimizeBFDNARMSubClade(t *Tree, n *Node, excl bool, x *DiscreteModel, patternvals []float64, wks int)

OptimizeBFDNARMSubClade optimizing the basefreq model but for a subclade

func OptimizeBFSubClade

func OptimizeBFSubClade(t *Tree, n *Node, excl bool, x *DiscreteModel, patternvals []float64, log bool, wks int)

OptimizeBFSubClade optimizing the basefreq model but for a subclade

func OptimizeBFSubCladeNLOPT

func OptimizeBFSubCladeNLOPT(t *Tree, n *Node, excl bool, x *DiscreteModel, patternvals []float64, log bool, wks int)

func OptimizeBL

func OptimizeBL(nd *Node, t *Tree, x *DiscreteModel, patternvals []float64, wks int)

OptimizeBL This uses the standard gonum optimizers. Not great.

func OptimizeBLNR

func OptimizeBLNR(t *Tree, x *DiscreteModel, patternvals []float64, wks int)

OptimizeBLNR Newton-Raphson for each branch. Does 4 passes

func OptimizeBLNRGN

func OptimizeBLNRGN(t *Tree, x *DiscreteModel, patternvals []float64, wks int)

func OptimizeBLNRMult

func OptimizeBLNRMult(t *Tree, models []*DiscreteModel, nodemodels map[*Node]int, patternvals []float64, wks int)

OptimizeBLNRMult Newton-Raphson for each branch. Does 4 passes

func OptimizeBLS

func OptimizeBLS(t *Tree, x *DiscreteModel, patternvals []float64, wks int)

OptimizeBLS optimize all branch lengths

func OptimizeBLSCLock

func OptimizeBLSCLock(t *Tree, x *DiscreteModel, patternvals []float64, wks int)

OptimizeBLSCLock optimize all branch lengths assuming a clock

func OptimizeBLSCLockNL

func OptimizeBLSCLockNL(t *Tree, x *DiscreteModel, patternvals []float64, wks int)

OptimizeBLSCLockNL optimize all branch lengths assuming a clock

func OptimizeBLSMul

func OptimizeBLSMul(t *Tree, models []*DiscreteModel, nodemodels map[*Node]int, patternvals []float64, wks int) float64

OptimizeBLSMul optimize all branch lengths

func OptimizeBLSNL

func OptimizeBLSNL(t *Tree, x *DiscreteModel, patternvals []float64, wks int)

OptimizeBLSNL optimize all branch lengths

func OptimizeGTRBPDNAMul

func OptimizeGTRBPDNAMul(t *Tree, models []*DiscreteModel, nodemodels map[*Node]int, usemodelvals bool,
	patternvals []float64, log bool, wks int)

OptimizeGTRBPDNAMul optimize GTR and base composition for the different parts

func OptimizeGTRDNA

func OptimizeGTRDNA(t *Tree, x *DiscreteModel, patternvals []float64,
	sup bool, wks int) []float64

OptimizeGTRDNA optimize GTR

func OptimizeGTRDNACompSharedRM

func OptimizeGTRDNACompSharedRM(t *Tree, models []*DiscreteModel, nodemodels map[*Node]int,
	usemodelvals bool, patternvals []float64, log bool, wks int)

OptimizeGTRDNACompSharedRM optimize GTR base composition but share rate matrix for the different parts

func OptimizeGTRDNACompSharedRMNL

func OptimizeGTRDNACompSharedRMNL(t *Tree, models []*DiscreteModel, nodemodels map[*Node]int,
	usemodelvals bool, patternvals []float64, log bool, wks int)

func OptimizeGTRDNACompSharedRMSingleModel

func OptimizeGTRDNACompSharedRMSingleModel(t *Tree, models []*DiscreteModel,
	nodemodels map[*Node]int, usemodelvals bool, whichmodel int,
	patternvals []float64, log bool, wks int)

OptimizeGTRDNACompSharedRMSingleModel optimize GTR base composition but share rate matrix for the different parts

You send an int that identifies which model can be adjusted, and only that one will be.

func OptimizeGTRDNACompSharedRMSubClade

func OptimizeGTRDNACompSharedRMSubClade(t *Tree, n *Node, excl bool, models []*DiscreteModel, nodemodels map[*Node]int,
	usemodelvals bool, patternvals []float64, wks int)

OptimizeGTRDNACompSharedRMSubClade optimize GTR base composition but share rate matrix for the different parts

func OptimizeGTRDNAMul

func OptimizeGTRDNAMul(t *Tree, models []*DiscreteModel, nodemodels map[*Node]int, patternvals []float64, wks int)

OptimizeGTRDNAMul optimize GTR

func OptimizeGTRDNASubClade

func OptimizeGTRDNASubClade(t *Tree, n *Node, excl bool, x *DiscreteModel, patternvals []float64, wks int)

OptimizeGTRDNASubClade optimizing the GTR model but for a subclade

func OptimizeGamma

func OptimizeGamma(t *Tree, x *DiscreteModel, patternvals []float64, log bool, wks int)

OptimizeGamma ...

func OptimizeGammaAndBL

func OptimizeGammaAndBL(t *Tree, x *DiscreteModel, patternvals []float64, log bool, wks int)

OptimizeGammaAndBL ...

func OptimizeGammaAndBLMult

func OptimizeGammaAndBLMult(t *Tree, models []*DiscreteModel, nodemodels map[*Node]int,
	patternvals []float64, log bool, wks int)

func OptimizeGammaBLS

func OptimizeGammaBLS(t *Tree, x *DiscreteModel, patternvals []float64, wks int)

OptimizeGammaBLS optimize all branch lengths

func OptimizeGammaBLSMult

func OptimizeGammaBLSMult(t *Tree, models []*DiscreteModel, nodemodels map[*Node]int, patternvals []float64, wks int)

OptimizeGammaBLSMult optimize all branch lengths

func OptimizeGammaBLSNL

func OptimizeGammaBLSNL(t *Tree, x *DiscreteModel, patternvals []float64, wks int)

func OptimizeGammaBLSNLMult

func OptimizeGammaBLSNLMult(t *Tree, models []*DiscreteModel, nodemodels map[*Node]int,
	patternvals []float64, log bool, wks int)

func OptimizeGammaMult

func OptimizeGammaMult(t *Tree, models []*DiscreteModel, nodemodels map[*Node]int,
	patternvals []float64, log bool, wks int)

func OptimizeMKMS

func OptimizeMKMS(t *Tree, x *DiscreteModel, startv float64, patternvals []float64, sym bool, wks int)

OptimizeMKMS optimizes the Mk multistate model (symmetrical if sym is true) and scales the last rate to 1

func OptimizeMS1R

func OptimizeMS1R(t *Tree, x *DiscreteModel, patternvals []float64, wks int)

OptimizeMS1R ... a holdover from other things that will probably change; this is for specific multistate models

func OptimizeScaleNL

func OptimizeScaleNL(t *Tree, x *DiscreteModel, patternvals []float64, wks int)

OptimizeScaleNL scale tree using a single scalar

func OptimizeSharedGammaAndBLMult

func OptimizeSharedGammaAndBLMult(t *Tree, models []*DiscreteModel, nodemodels map[*Node]int,
	patternvals []float64, log bool, wks int)

func OutputEdges

func OutputEdges(mapints map[int]string, bps []Bipart, ntrees int, verb bool)

OutputEdges just prints the edges. mapints: int-to-string names for the taxa; bps: list of biparts; ntrees: number of trees

func PCalcLike

func PCalcLike(t *Tree, x *DiscreteModel, nsites int, wks int) (fl float64)

PCalcLike parallel calculate likelihood

func PCalcLikePatterns

func PCalcLikePatterns(t *Tree, x *DiscreteModel, patternval []float64, wks int) (fl float64)

PCalcLikePatterns parallel calculation of likelihood with patterns

func PCalcLikePatternsGamma

func PCalcLikePatternsGamma(t *Tree, x *DiscreteModel, patternval []float64, wks int) (fl float64)

PCalcLikePatternsGamma parallel calculation of likelihood with patterns with gamma

func PCalcLikePatternsGammaMul

func PCalcLikePatternsGammaMul(t *Tree, models []*DiscreteModel, nodemodels map[*Node]int,
	patternval []float64, wks int) (fl float64)

PCalcLikePatternsGammaMul parallel calculation of likelihood with patterns with gamma

func PCalcLikePatternsMarked

func PCalcLikePatternsMarked(t *Tree, x *DiscreteModel, patternval []float64, wks int) (fl float64)

PCalcLikePatternsMarked parallel likelihood calculation with patterns that just updates the values

func PCalcLikePatternsMarkedGamma

func PCalcLikePatternsMarkedGamma(t *Tree, x *DiscreteModel, patternval []float64, wks int) (fl float64)

PCalcLikePatternsMarkedGamma parallel likelihood calculation with patterns that just updates the values

func PCalcLikePatternsMarkedMul

func PCalcLikePatternsMarkedMul(t *Tree, models []*DiscreteModel, nodemodels map[*Node]int, patternval []float64, wks int) (fl float64)

PCalcLikePatternsMarkedMul parallel likelihood calculation with patterns that just updates the values

func PCalcLikePatternsMul

func PCalcLikePatternsMul(t *Tree, models []*DiscreteModel, nodemodels map[*Node]int,
	patternval []float64, wks int) (fl float64)

PCalcLikePatternsMul parallel calculation of likelihood with patterns

func PCalcLikePatternsMulSubClade

func PCalcLikePatternsMulSubClade(t *Tree, n *Node, excl bool, models []*DiscreteModel,
	nodemodels map[*Node]int, patternval []float64, wks int) (fl float64)

PCalcLikePatternsMulSubClade parallel log likelihood calculation including patterns

func PCalcLikePatternsSubClade

func PCalcLikePatternsSubClade(t *Tree, n *Node, excl bool, x *DiscreteModel, patternval []float64, wks int) (fl float64)

PCalcLikePatternsSubClade parallel log likelihood calculation including patterns

func PCalcLogLike

func PCalcLogLike(t *Tree, x *DiscreteModel, nsites int, wks int) (fl float64)

PCalcLogLike this will calculate log like in parallel

func PCalcLogLikeBack

func PCalcLogLikeBack(t *Tree, n *Node, x *DiscreteModel, nsites int, wks int) (fl float64)

PCalcLogLikeBack a bit of a shortcut. Could do better, but walks back from the n node to the root

func PCalcLogLikeMarked

func PCalcLogLikeMarked(t *Tree, x *DiscreteModel, nsites int, wks int) (fl float64)

PCalcLogLikeMarked parallel calculation of loglike with just updating

func PCalcLogLikePatterns

func PCalcLogLikePatterns(t *Tree, x *DiscreteModel, patternval []float64, wks int) (fl float64)

PCalcLogLikePatterns parallel log likelihood calculation including patterns

func PCalcLogLikePatternsGamma

func PCalcLogLikePatternsGamma(t *Tree, x *DiscreteModel, patternval []float64, wks int) (fl float64)

PCalcLogLikePatternsGamma parallel log likelihood calculation including patterns

func PCalcLogLikePatternsGammaMul

func PCalcLogLikePatternsGammaMul(t *Tree, models []*DiscreteModel, nodemodels map[*Node]int,
	patternval []float64, wks int) (fl float64)

func PCalcLogLikePatternsMul

func PCalcLogLikePatternsMul(t *Tree, models []*DiscreteModel, nodemodels map[*Node]int, patternval []float64, wks int) (fl float64)

PCalcLogLikePatternsMul ...

func PCalcLogLikePatternsSubClade

func PCalcLogLikePatternsSubClade(t *Tree, n *Node, excl bool, x *DiscreteModel, patternval []float64, wks int) (fl float64)

PCalcLogLikePatternsSubClade parallel log likelihood calculation including patterns

func PCalcRFDistancesPartial

func PCalcRFDistancesPartial(bpts map[int][]int, bps []Bipart, jobs <-chan []int, results chan<- []int)

PCalcRFDistancesPartial calculates the partial RF distance. bpts is the tree index -> bipart index list map; bps is the list of biparts

func PCalcRFDistancesPartialWeighted

func PCalcRFDistancesPartialWeighted(bpts map[int][]int, tippenalty bool, bps []Bipart, jobs <-chan []int, results chan<- Rfwresult)

PCalcRFDistancesPartialWeighted includes the branch lengths

func PCalcSankParsPatterns

func PCalcSankParsPatterns(t *Tree, numstates int, patternval []float64, wks int) (fl float64)

PCalcSankParsPatterns parallel calculation of parsimony costs with patterns

func PCalcSliceIntDifferenceInt

func PCalcSliceIntDifferenceInt(bpts map[int][]int, jobs <-chan []int, results chan<- []int)

PCalcSliceIntDifferenceInt calculates the size of the difference (set) between two int slices in parallel. Used for RF distance, where bpts is the tree index -> bipart index list map

func PCalcSupLikePatterns

func PCalcSupLikePatterns(t *Tree, x *DiscreteModel, patternval []float64, wks int) (fl float64)

func PConcordance

func PConcordance(bps []Bipart, jobs <-chan []int, results chan<- []int)

PConcordance is similar to the other comparison code but for concordance. The input jobs are the i, j pairs for the bipart comparisons. The results are i, j, and 0 for not concordant or 1 for concordant

func PConcordanceTwoSets

func PConcordanceTwoSets(comp []Bipart, bps []Bipart, jobs <-chan []int, results chan<- []int)

PConcordanceTwoSets same as the one above but where there are two sets

func PConflicts

func PConflicts(bps []Bipart, jobs <-chan []int, results chan<- []int)

PConflicts is a parallel conflict check. The slice is sent. The jobs are the two indices to check. The results are the two indices and an int: 1 for conflict, 0 for no conflict
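
The P* functions share a worker pattern: index pairs arrive on a jobs channel and verdicts leave on a results channel. A self-contained sketch with a simplified conflict test over map[int]bool sets; the real Bipart logic is more involved:

```go
package main

import (
	"fmt"
	"sync"
)

// bipart is a cut-down stand-in for gophy's Bipart: two sets of taxon ids.
type bipart struct{ lt, rt map[int]bool }

func intersects(a, b map[int]bool) bool {
	for k := range a {
		if b[k] {
			return true
		}
	}
	return false
}

// conflictsWith: simplified check — two splits conflict when all four
// side-by-side intersections (lt/lt, lt/rt, rt/lt, rt/rt) are nonempty.
func conflictsWith(a, b bipart) bool {
	return intersects(a.lt, b.lt) && intersects(a.lt, b.rt) &&
		intersects(a.rt, b.lt) && intersects(a.rt, b.rt)
}

// pConflicts mirrors the PConflicts worker shape: read index pairs from
// jobs, write {i, j, 0-or-1} to results.
func pConflicts(bps []bipart, jobs <-chan []int, results chan<- []int) {
	for j := range jobs {
		c := 0
		if conflictsWith(bps[j[0]], bps[j[1]]) {
			c = 1
		}
		results <- []int{j[0], j[1], c}
	}
}

func main() {
	set := func(xs ...int) map[int]bool {
		m := map[int]bool{}
		for _, x := range xs {
			m[x] = true
		}
		return m
	}
	bps := []bipart{
		{set(1, 2), set(3, 4)},
		{set(1, 3), set(2, 4)}, // conflicts with the first split
	}
	jobs := make(chan []int, 1)
	results := make(chan []int, 1)
	var wg sync.WaitGroup
	for w := 0; w < 2; w++ {
		wg.Add(1)
		go func() { defer wg.Done(); pConflicts(bps, jobs, results) }()
	}
	jobs <- []int{0, 1}
	close(jobs)
	wg.Wait()
	close(results)
	for r := range results {
		fmt.Println(r) // prints [0 1 1]
	}
}
```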

func PConflictsCompTree

func PConflictsCompTree(bps []Bipart, comptreebps []Bipart, jobs <-chan []int, results chan<- []int)

PConflictsCompTree is similar to PConflicts, but the first index refers to the first Bipart slice and the second refers to the second Bipart slice

func PNW

func PNW(seqs []Seq, jobs <-chan []int, results chan<- float32)

PNW NW but parallel

func PreparePatternVecs

func PreparePatternVecs(t *Tree, patternsint map[int]float64, seqs map[string]string) (patternval []float64, patternvec []int)

PreparePatternVecs for tree calculations

func PreparePatternVecsMS

func PreparePatternVecsMS(t *Tree, patternsint map[int]float64, seqs map[string][]string,
	charMap map[string][]int, numstates int) (patternval []float64, patternvec []int)

PreparePatternVecsMS for tree calculations

func PreparePatternVecsProt

func PreparePatternVecsProt(t *Tree, patternsint map[int]float64, seqs map[string]string) (patternval []float64, patternvec []int)

PreparePatternVecsProt for tree calculations

func RTMultconditionals

func RTMultconditionals(models []*DiscreteModel, nodemodels map[*Node]int, node *Node, patternval []float64)

RTMultconditionals ...

func RTconditionals

func RTconditionals(x *DiscreteModel, node *Node, patternval []float64)

RTconditionals tipconds calculated to the rt (including BL)

func RVMultconditionals

func RVMultconditionals(models []*DiscreteModel, nodemodels map[*Node]int, node *Node, patternval []float64)

RVMultconditionals ...

func RVTPconditionals

func RVTPconditionals(x *DiscreteModel, node *Node, patternval []float64)

RVTPconditionals ...

func RVconditionals

func RVconditionals(x *DiscreteModel, node *Node, patternval []float64)

RVconditionals takes the parent RvTpConds and applies the branch length

func ReadLine

func ReadLine(path string) (ln []string)

ReadLine is like the Python readline() and readlines()

func ReadPatternsMSeqsFromFile

func ReadPatternsMSeqsFromFile(sfn string) (seqs map[string][]string,
	patternsint map[int]float64, nsites int, bf []float64, numstates int)

ReadPatternsMSeqsFromFile return the seqs and patternsint

func ReadPatternsSeqsFromFile

func ReadPatternsSeqsFromFile(sfn string, nucleotide bool) (seqs map[string]string, patternsint map[int]float64, nsites int, bf []float64)

ReadPatternsSeqsFromFile return the seqs and patternsint

func Reroot

func Reroot(inroot *Node, tr *Tree)

Reroot basic reroot function

func Round

func Round(val float64, roundOn float64, places int) (newVal float64)

Round to the given number of places; e.g., val = num, roundOn = 0.5, places = 5

func SetHeights

func SetHeights(tree *Tree)

SetHeights set tree height

func StdFloatVec

func StdFloatVec(x []float64) (s float64)

func StochasticNNI

func StochasticNNI()

StochasticNNI should make NNI moves

func StringSliceContains

func StringSliceContains(s []string, e string) bool

StringSliceContains tells you whether the e string is in the slice

func SumFloatVec

func SumFloatVec(x []float64) (s float64)

SumFloatVec sum the float vectors

func SumLogExp

func SumLogExp(a, b float64) float64

SumLogExp computes the log of a sum of exponentials: log(e^a + e^b)

func SwapBranch

func SwapBranch(nd1 *Node, nd2 *Node) bool

SwapBranch swaps two branches. To get the branches that would be NNIs, use NNIMoves and then send the relevant nodes here

func TPconditionals

func TPconditionals(x *DiscreteModel, node *Node, patternval []float64)

TPconditionals regular tip conditionals

func TritomyRoot

func TritomyRoot(tr *Tree)

TritomyRoot makes the root a tritomy

Types

type Bipart

type Bipart struct {
	Lt          map[int]bool
	Rt          map[int]bool
	Ct          int           // counts
	TreeIndices []int         // index of which trees this is in
	Nds         []*Node       // nodes associated with the bipart
	NdsM        map[int]*Node // nodes associated with the bipart by treeindex
	Index       int           // just a unique id
}

Biparts are represented as two map[int]bool sets, one for the left side and one for the right

func (Bipart) CompatibleWith

func (b Bipart) CompatibleWith(ib Bipart) (con bool)

CompatibleWith checks that it isn't conflicting but can be nested

func (Bipart) ConcordantWith

func (b Bipart) ConcordantWith(ib Bipart) (con bool)

ConcordantWith tests whether something is concordant (not conflicting or nested, etc)

func (Bipart) ConflictsWith

func (b Bipart) ConflictsWith(ib Bipart) (con bool)

ConflictsWith checks whether two biparts conflict

func (Bipart) Equals

func (b Bipart) Equals(ib Bipart) (eq bool)

Equals checks bipart equality; written with speed in mind

func (Bipart) NewickWithNames

func (b Bipart) NewickWithNames(nmmap map[int]string) (ret string)

NewickWithNames does similar things to StringWithNames but sends a newick back

func (Bipart) StringWithNames

func (b Bipart) StringWithNames(nmmap map[int]string) (ret string)

StringWithNames converts the ints to the strings from nmmap

type DNAModel

type DNAModel struct {
	M DiscreteModel
}

func NewDNAModel

func NewDNAModel() *DNAModel

func (*DNAModel) DeepCopyDNAModel

func (d *DNAModel) DeepCopyDNAModel() *DNAModel

DeepCopyDNAModel ...

type DataType

type DataType string

DataType type for alphabet

type DiscreteModel

type DiscreteModel struct {
	Alph      DataType  // nuc, prot or multstate model
	BF        []float64 // base frequencies, order is A,C,G,T or A,R,N,D,C,Q,E,G,H,I,L,K,M,F,P,S,T,W,Y,V
	MBF       []float64 // model base frequencies
	EBF       []float64 // empirical base freqs
	R         *mat.Dense
	Q         *mat.Dense // common use
	Ex        string     // PAML-formatted exchangeabilities for Prot models
	CharMap   map[string][]int
	NumStates int
	//sync.RWMutex
	Ps  map[float64]*mat.Dense
	PsL map[float64]*mat.Dense
	X   *mat.Dense
	P   mat.Dense
	//for decomposing
	QS         *mat.Dense
	EigenVals  []float64  // to be exponentiated
	EigenVecs  *mat.Dense //
	EigenVecsI *mat.Dense
	X1         *mat.Dense
	X2         *mat.Dense
	GammaAlpha float64
	GammaNCats int
	GammaCats  []float64
}

DiscreteModel overall model struct

func NewDiscreteModel

func NewDiscreteModel() *DiscreteModel

NewDiscreteModel get new model pointer

func (*DiscreteModel) DecomposeQ

func (d *DiscreteModel) DecomposeQ()

DecomposeQ decomposes Q; this is just for NR optimization of branch lengths

func (*DiscreteModel) DeepCopyDiscreteModel

func (d *DiscreteModel) DeepCopyDiscreteModel() *DiscreteModel

func (*DiscreteModel) EmptyPDict

func (d *DiscreteModel) EmptyPDict()

EmptyPDict empties the P-matrix dictionary (to save memory)

func (*DiscreteModel) EmptyPLDict

func (d *DiscreteModel) EmptyPLDict()

EmptyPLDict empties the logged P-matrix dictionary

func (*DiscreteModel) ExpValue

func (d *DiscreteModel) ExpValue(iv []float64, blen float64)

ExpValue used for the matrix exponential

func (*DiscreteModel) ExpValueFirstD

func (d *DiscreteModel) ExpValueFirstD(blen float64) (x *mat.Dense)

ExpValueFirstD gets the first derivative for NR

func (*DiscreteModel) ExpValueSecondD

func (d *DiscreteModel) ExpValueSecondD(blen float64) (x *mat.Dense)

ExpValueSecondD gets the second derivative for NR

func (*DiscreteModel) GetBF

func (d *DiscreteModel) GetBF() []float64

GetBF return the base frequencies

func (*DiscreteModel) GetCharMap

func (d *DiscreteModel) GetCharMap() map[string][]int

GetCharMap get the int map for states with ambiguities

func (*DiscreteModel) GetNumStates

func (d *DiscreteModel) GetNumStates() int

GetNumStates return the number of states

func (*DiscreteModel) GetPCalc

func (d *DiscreteModel) GetPCalc(blen float64) *mat.Dense

GetPCalc calculate P matrix

func (*DiscreteModel) GetPMap

func (d *DiscreteModel) GetPMap(blen float64) *mat.Dense

GetPMap get the Ps from the dictionary

func (*DiscreteModel) GetPMapLogged

func (d *DiscreteModel) GetPMapLogged(blen float64) *mat.Dense

GetPMapLogged get the Ps from the dictionary

func (*DiscreteModel) GetStochMapMatrices

func (d *DiscreteModel) GetStochMapMatrices(dur float64, from int, to int) (summed *mat.Dense, summedR *mat.Dense)

GetStochMapMatrices return matrices for stochastic mapping

func (*DiscreteModel) SetBaseFreqs

func (d *DiscreteModel) SetBaseFreqs(basefreq []float64)

SetBaseFreqs needs to be done before doing SetupQGTR

func (*DiscreteModel) SetEmpiricalBF

func (d *DiscreteModel) SetEmpiricalBF()

SetEmpiricalBF set all to empirical

func (*DiscreteModel) SetEqualBF

func (d *DiscreteModel) SetEqualBF()

SetEqualBF set all state frequencies equal

func (*DiscreteModel) SetModelBF

func (d *DiscreteModel) SetModelBF()

SetModelBF set all to frequencies from empirical model

func (*DiscreteModel) SetP

func (d *DiscreteModel) SetP(blen float64)

SetP uses the standard spectral decomposition

func (*DiscreteModel) SetPSimple

func (d *DiscreteModel) SetPSimple(blen float64)

SetPSimple uses the gonum matrix exponential (seems faster)

func (*DiscreteModel) SetRateMatrix

func (d *DiscreteModel) SetRateMatrix(params []float64)

SetRateMatrix sets the rate matrix from params, whose length is (((numstates*numstates)-numstates)/2) - 1 or (numstates*numstates) - numstates. This is for scaled branch lengths and matrices. TODO: CHANGE THIS TO UNSCALED

func (*DiscreteModel) SetScaledRateMatrix

func (d *DiscreteModel) SetScaledRateMatrix(params []float64, sym bool)

SetScaledRateMatrix must be called before SetupQGTR. Just send along the rates and this will build the whole matrix; "scaled" means the last rate is assumed to be 1. TODO: is this the same as SetRateMatrix?

func (*DiscreteModel) SetupQGTR

func (d *DiscreteModel) SetupQGTR()

SetupQGTR sets up a scaled GTR Q matrix (mainly for nucleotides). It could also be used to estimate PROTGTR, but that's a bad idea

func (*DiscreteModel) SetupQJC

func (d *DiscreteModel) SetupQJC()

SetupQJC setup Q matrix

This is scaled so that change is reflected in the branch lengths.
You don't need to use SetScaledRateMatrix.
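As a concrete illustration of such scaling (a sketch, not gophy's implementation): for an n-state JC-style matrix with equal frequencies, off-diagonals of 1/(n-1) and diagonals of -1 give an expected substitution rate of 1, so branch lengths are in expected substitutions per site.

```go
package main

import (
	"fmt"
	"math"
)

// jcQ builds an n-state Jukes-Cantor-style Q matrix, scaled so the
// expected number of substitutions per unit branch length is 1:
// with equal frequencies 1/n, -sum_i (1/n) * Q[i][i] = 1.
func jcQ(n int) [][]float64 {
	q := make([][]float64, n)
	for i := range q {
		q[i] = make([]float64, n)
		for j := range q[i] {
			if i == j {
				q[i][j] = -1.0
			} else {
				q[i][j] = 1.0 / float64(n-1)
			}
		}
	}
	return q
}

func main() {
	q := jcQ(4)
	// every row of a valid rate matrix sums to zero
	for _, row := range q {
		s := 0.0
		for _, v := range row {
			s += v
		}
		if math.Abs(s) > 1e-12 {
			panic("row does not sum to zero")
		}
	}
	fmt.Println("rows sum to zero")
}
```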

func (*DiscreteModel) SetupQJC1Rate

func (d *DiscreteModel) SetupQJC1Rate(rt float64)

SetupQJC1Rate sets up a Q matrix with one rate, probably for ancestral multistate reconstruction

These are unscaled, so the branch lengths will be in time (or something else)
and not relative to these rates.
BF is taken from elsewhere.

func (*DiscreteModel) SetupQMk

func (d *DiscreteModel) SetupQMk(rt []float64, sym bool)

SetupQMk setup Q matrix

This is unscaled (so the branch lengths will be proportional to some other change
and not to these rates).
BF is taken from elsewhere.

type LikeFunction

type LikeFunction struct {
	// contains filtered or unexported fields
}

func (LikeFunction) EvaluateFunction

func (sf LikeFunction) EvaluateFunction(pa []float64) float64

func (LikeFunction) EvaluateGradient

func (sf LikeFunction) EvaluateGradient(point []float64) (gradient []float64)

type LikeResult

type LikeResult struct {
	// contains filtered or unexported fields
}

LikeResult likelihood value and site

type LikeSupResult

type LikeSupResult struct {
	// contains filtered or unexported fields
}

type MSeq

type MSeq struct {
	NM  string
	SQ  string
	SQs []string //SQ separated by spaces
}

MSeq minimal seq struct

func ReadMSeqsFromFile

func ReadMSeqsFromFile(filen string) (seqs []MSeq, numstates int)

ReadMSeqsFromFile reads seqs from a file; the file should be fasta-formatted with states separated by spaces, numbered from 0 upward

type MultStateModel

type MultStateModel struct {
	M DiscreteModel
}

MultStateModel multistate model struct

func NewMultStateModel

func NewMultStateModel(numstates int) *MultStateModel

NewMultStateModel get new MULTModel pointer

type Node

type Node struct {
	Par       *Node   //parent
	Chs       []*Node //children
	Nam       string  //name
	SData     map[string]string
	FData     map[string]float64
	IData     map[string]int
	Num       int
	Len       float64     //branch length
	Data      [][]float64 // [site][states]
	BData     [][]*SupFlo //[site][states]
	ContData  []float64   //[site] cont
	ContData2 []float64   //[site] another cont
	Mis       []bool      //[site] missing site
	Marked    bool        //just for like calculations
	Height    float64
	MarkedMap map[float64]bool //the float is for an id for the query
	//anc+bl
	// X------>--X
	// TP        RT
	//
	// X--<------X
	// RvTp      RV
	TpConds   [][]float64 //[site][states]
	RtConds   [][]float64
	RvConds   [][]float64
	RvTpConds [][]float64
	//TODO: need comments for these below
	//FAD         float64
	//LAD         float64
	//FINDS       float64
	//TimeLen     float64
	PruneLen    float64
	ConPruneLen []float64 // prevent race condition when calculating BM likelihood
	BMLen       float64
	Anc         bool
	ClustLen    map[int]float64
}

Node minimal node struct

func GetMrca

func GetMrca(nds []*Node, root *Node) (mrca *Node)

GetMrca gets the MRCA (most recent common ancestor) of the nodes

func GetSubtreeRoot

func GetSubtreeRoot(n *Node, higherNode *Node) (subtreeRoot *Node)

GetSubtreeRoot will return the root of the subtree to which a node belongs that descends from a higher node (up to the root)

func ReadNewickString

func ReadNewickString(ts string) (root *Node)

ReadNewickString parses a newick string and returns a pointer to the root node

func (Node) BMPhylogram

func (n Node) BMPhylogram() (ret string)

BMPhylogram returns a string newick with brownian motion branch lengths

func (*Node) GetBackbone

func (n *Node) GetBackbone(higherNode *Node) (backbone []*Node)

GetBackbone TODO: what is this

func (*Node) GetSib

func (n *Node) GetSib() *Node

GetSib returns the sibling of a node

func (Node) GetTipNames

func (n Node) GetTipNames() (tips []string)

GetTipNames returns a slice of tip names

func (Node) GetTips

func (n Node) GetTips() (tips []*Node)

GetTips returns a slice with node pointers

func (Node) Newick

func (n Node) Newick(bl bool) (ret string)

Newick returns a string newick

func (Node) NewickFData

func (n Node) NewickFData(bl bool, FD string) (ret string)

NewickFData returns a string newick

func (Node) NewickFloatBL

func (n Node) NewickFloatBL(fl string) (ret string)

NewickFloatBL returns a string newick with branch lengths of the data in FData[fl]

func (Node) NewickPaint

func (n Node) NewickPaint(bl bool, rid float64) (ret string)

NewickPaint returns a string newick

func (*Node) NumIntNodes

func (n *Node) NumIntNodes(count *int)

NumIntNodes is a helper method that will return the number of internal nodes descending from n (including n)

func (*Node) PostorderArray

func (n *Node) PostorderArray() (ret []*Node)

PostorderArray will return a postordered array of all the nodes starting at n
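Postorder (children before parents) is the order in which conditional likelihoods are computed. A minimal sketch of the traversal with a hypothetical stand-in node type (illustrative, not gophy's Node):

```go
package main

import (
	"fmt"
	"strings"
)

// node is a minimal stand-in for a tree node: a name and children.
type node struct {
	name string
	chs  []*node
}

// postorder appends n and its descendants to out in postorder:
// all children are visited before the node itself.
func postorder(n *node, out *[]*node) {
	for _, c := range n.chs {
		postorder(c, out)
	}
	*out = append(*out, n)
}

func main() {
	// ((A,B)AB,C)root
	a, b, c := &node{name: "A"}, &node{name: "B"}, &node{name: "C"}
	ab := &node{name: "AB", chs: []*node{a, b}}
	root := &node{name: "root", chs: []*node{ab, c}}

	var out []*node
	postorder(root, &out)
	names := []string{}
	for _, x := range out {
		names = append(names, x.name)
	}
	fmt.Println(strings.Join(names, " ")) // A B AB C root
}
```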

func (*Node) PostorderArrayExcl

func (n *Node) PostorderArrayExcl(x *Node) (ret []*Node)

PostorderArrayExcl will return a postordered array of all the nodes starting at n

excluding node x

func (*Node) PreorderArray

func (n *Node) PreorderArray() (ret []*Node)

PreorderArray will return a preordered array of all the nodes in a tree

func (*Node) Reroot

func (n *Node) Reroot(oldroot *Node) *Node

Reroot reroots all the nodes represented in a graph on n

func (*Node) RerootBM

func (n *Node) RerootBM(oldroot *Node) *Node

RerootBM reroots and moves the BMLen

func (Node) String

func (n Node) String() string

type NodeStack

type NodeStack struct {
	// contains filtered or unexported fields
}

NodeStack is a node stack for pushing and popping.

func NewNodeStack

func NewNodeStack() *NodeStack

NewNodeStack returns a pointer to a node stack

func (*NodeStack) Empty

func (s *NodeStack) Empty() (ret bool)

Empty reports whether the node stack is empty

func (*NodeStack) Pop

func (s *NodeStack) Pop() (*Node, error)

Pop pops a node off the stack

func (*NodeStack) Push

func (s *NodeStack) Push(v *Node)

Push pushes the node onto the stack
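A minimal sketch of the same push/pop/empty interface (strings stand in for *Node here; illustrative, not gophy's implementation):

```go
package main

import (
	"errors"
	"fmt"
)

// nodeStack is a simple LIFO stack, sketching NodeStack's interface.
type nodeStack struct {
	items []string
}

// Push pushes a value onto the stack.
func (s *nodeStack) Push(v string) { s.items = append(s.items, v) }

// Pop removes and returns the most recently pushed value,
// or an error when the stack is empty.
func (s *nodeStack) Pop() (string, error) {
	if len(s.items) == 0 {
		return "", errors.New("stack is empty")
	}
	v := s.items[len(s.items)-1]
	s.items = s.items[:len(s.items)-1]
	return v, nil
}

// Empty reports whether the stack holds no values.
func (s *nodeStack) Empty() bool { return len(s.items) == 0 }

func main() {
	s := &nodeStack{}
	s.Push("root")
	s.Push("left child")
	for !s.Empty() {
		v, _ := s.Pop()
		fmt.Println(v)
	}
}
```

Such a stack gives an iterative (non-recursive) way to traverse large trees without risking deep recursion.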

type PLObj

type PLObj struct {
	Rates                 []float64
	Means                 []float64
	Stds                  []float64
	Durations             []float64
	Dates                 []float64
	Mins                  []float64
	PenMins               []float64
	PenMaxs               []float64
	Maxs                  []float64
	CharDurations         []float64
	CharDurationsM        [][]float64
	LogFactCharDurations  []float64
	LogFactCharDurationsM [][]float64
	GMMWeights            [][]float64
	GMMMix                []float64
	GMMMeans              []float64
	GMMStds               []float64
	CharMLogFactM         bool
	ParentsNdsInts        []int
	ChildrenVec           [][]int
	FreeNodes             []int
	FreeNodesM            map[int]bool
	NodesMap              map[int]*Node
	NumNodes              int
	LogPen                bool
	PenaltyBoundary       float64
	Smoothing             float64
	RateGroups            map[int]int //[nodenum]rategroup
	RateGroupsN           []int
	NumRateGroups         int
	CVNode                int
	Tree                  Tree
	Logs                  map[float64]float64
	Disp                  float64 // dispersion for neg bin
}

PLObj holds the state for penalized likelihood (PL)

func (*PLObj) CalcLF

func (p *PLObj) CalcLF(params []float64, free bool) float64

CalcLF ...

func (*PLObj) CalcLNorm

func (p *PLObj) CalcLNorm(params []float64, free bool) float64

CalcLNorm calculates the likelihood with a normal distribution for rates

func (*PLObj) CalcLNormClade

func (p *PLObj) CalcLNormClade(params []float64, group int, root *Node, free bool) float64

func (*PLObj) CalcLNormGMM

func (p *PLObj) CalcLNormGMM(params []float64, free bool) float64

CalcLNormGMM calculates the likelihood with a normal distribution for rates (GMM version)

func (*PLObj) CalcMultLF

func (p *PLObj) CalcMultLF(params []float64, free bool) float64

CalcMultLF ...

func (*PLObj) CalcMultPL

func (p *PLObj) CalcMultPL(params []float64, free bool) float64

CalcMultPL ...

func (*PLObj) CalcNB

func (p *PLObj) CalcNB(params []float64, free bool) float64

CalcNB ...

func (*PLObj) CalcNormRateLogLike

func (p *PLObj) CalcNormRateLogLike() (ll float64)

func (*PLObj) CalcNormRateLogLikeMixed

func (p *PLObj) CalcNormRateLogLikeMixed() (ll float64)

CalcNormRateLogLikeMixed for the gmm

func (*PLObj) CalcPL

func (p *PLObj) CalcPL(params []float64, free bool) float64

CalcPL ...

func (*PLObj) CalcPLJustLike

func (p *PLObj) CalcPLJustLike(params []float64, free bool) float64

func (*PLObj) CalcPenalty

func (p *PLObj) CalcPenalty() (rkt float64)

CalcPenalty ...

func (*PLObj) CalcRateGroupMean

func (p *PLObj) CalcRateGroupMean(group int) float64

func (*PLObj) CalcRateLogLike

func (p *PLObj) CalcRateLogLike() (ll float64)

CalcRateLogLike ...

func (*PLObj) CalcRateLogLikeMT

func (p *PLObj) CalcRateLogLikeMT() (ll float64)

CalcRateLogLikeMT computes the rate log-likelihood over multiple trees

func (*PLObj) CalcRateLogLikeNegBin

func (p *PLObj) CalcRateLogLikeNegBin() (ll float64)

CalcRateLogLikeNegBin ...

func (*PLObj) CalcRateLogLikeNegBinMT

func (p *PLObj) CalcRateLogLikeNegBinMT() (ll float64)

func (*PLObj) CalcRateLogLikeP

func (p *PLObj) CalcRateLogLikeP() (ll float64)

CalcRateLogLikeP ... NOT FASTER

func (*PLObj) CalcRoughnessPenalty

func (p *PLObj) CalcRoughnessPenalty() (su float64)

CalcRoughnessPenalty ...

func (*PLObj) CalcRoughnessPenaltyMult

func (p *PLObj) CalcRoughnessPenaltyMult() (su float64)

CalcRoughnessPenaltyMult ...

func (*PLObj) DivTimeGmm

func (p *PLObj) DivTimeGmm()

func (*PLObj) DivTimeGmmEStep

func (p *PLObj) DivTimeGmmEStep() float64

func (*PLObj) DivTimeGmmMStep

func (p *PLObj) DivTimeGmmMStep()

func (*PLObj) OptimizePreOrder

func (p *PLObj) OptimizePreOrder(params []float64, likeFunc func([]float64, bool) float64,
	LF bool, PL bool, alg int, verbose bool) float64

OptimizePreOrder moves and optimizes in preorder to get around constraint issues

func (*PLObj) OptimizeRD

func (p *PLObj) OptimizeRD(params []float64, likeFunc func([]float64, bool) float64,
	LF bool, PL bool, MULT bool) *optimize.Result

OptimizeRD optimization

func (*PLObj) OptimizeRDBOUNDED

func (p *PLObj) OptimizeRDBOUNDED(params []float64, likeFunc func([]float64, bool) float64,
	LF bool, PL bool, alg int, verbose bool) ([]float64, float64, error)

OptimizeRDBOUNDED NLOPT

func (*PLObj) OptimizeRDBOUNDEDIE

func (p *PLObj) OptimizeRDBOUNDEDIE(params []float64, likeFunc func([]float64, bool) float64,
	LF bool, PL bool, alg int, verbose bool, paramnodemap map[int]int) ([]float64, float64, error)

OptimizeRDBOUNDEDIE NLOPT with inequality constraints

func (*PLObj) OptimizeRDN

func (p *PLObj) OptimizeRDN(params []float64, alg int,
	verbose bool, paramnodemap map[int]int) ([]float64, float64, error)

OptimizeRDN optimization

func (*PLObj) OptimizeRDNClade

func (p *PLObj) OptimizeRDNClade(params []float64, alg int, group int, root *Node,
	verbose bool) ([]float64, float64, error)

func (*PLObj) OptimizeRDNGMM

func (p *PLObj) OptimizeRDNGMM(params []float64, alg int,
	verbose bool, paramnodemap map[int]int) ([]float64, float64, error)

OptimizeRDNGMM optimization

func (*PLObj) PCalcRateLogLike

func (p *PLObj) PCalcRateLogLike(jobs <-chan int, results chan<- float64)

func (*PLObj) PrintNewickDurations

func (p *PLObj) PrintNewickDurations(t Tree) string

PrintNewickDurations uses the NewickFloatBL

func (*PLObj) PrintNewickRates

func (p *PLObj) PrintNewickRates(t Tree) string

func (*PLObj) RunCV

func (p *PLObj) RunCV(params []float64, verbose bool, likeFunc func([]float64, bool) float64)

RunCV runs leave-one-out cross-validation (LOOCV)

func (*PLObj) RunLF

func (p *PLObj) RunLF(startRate float64, verbose bool) *optimize.Result

RunLF runs Langley-Fitch with a starting rate, probably numsites/20.

func (*PLObj) RunLNorm

func (p *PLObj) RunLNorm(startRate float64, verbose bool) (float64, []float64)

RunLNorm runs the normal-distribution rate likelihood

func (*PLObj) RunLNormClade

func (p *PLObj) RunLNormClade(startRate float64, group int, root *Node, verbose bool) (float64, []float64)

RunLNormClade runs the normal-distribution rate likelihood for a clade

func (*PLObj) RunLNormGMM

func (p *PLObj) RunLNormGMM(verbose bool) (float64, []float64)

RunLNormGMM the gmm optimization algorithm

func (*PLObj) RunMLF

func (p *PLObj) RunMLF(startRate float64, mrcagroups []*Node, t Tree,
	verbose bool) *optimize.Result

RunMLF runs multiple-rate Langley-Fitch with a starting rate, probably x.X[0] from LF, and the mrcagroups

func (*PLObj) RunMPL

func (p *PLObj) RunMPL(mrcagroups []*Node, t Tree, verbose bool) *optimize.Result

RunMPL runs penalized likelihood with a starting float from x.X[0] from LF and the mrcagroups, assuming p.RateGroups has already been set up

func (*PLObj) RunNB

func (p *PLObj) RunNB(startRate float64, verbose bool) *optimize.Result

RunNB penalized likelihood run with a starting float, probably x.X[0] from LF

func (*PLObj) RunPL

func (p *PLObj) RunPL(startRate float64, verbose bool) *optimize.Result

RunPL runs penalized likelihood with a starting rate, probably x.X[0] from LF

func (*PLObj) SetDurations

func (p *PLObj) SetDurations() (success bool)

SetDurations for nodes

func (*PLObj) SetMultTreeValues

func (p *PLObj) SetMultTreeValues(t Tree, numsites []float64)

SetMultTreeValues assumes you have already done SetValues; this will populate the chardurations and logfact from the node ContData, where each index is a datapoint

func (*PLObj) SetRateGroups

func (p *PLObj) SetRateGroups(tree Tree, mrcagroups []*Node)

SetRateGroups takes the mrcagroups that identify the rate shifts. These go in preorder, so put the deepest first if they are nested

func (*PLObj) SetValues

func (p *PLObj) SetValues(t Tree, numsites float64, minmap map[*Node]float64,
	maxmap map[*Node]float64, verbose bool)

SetValues ...

func (*PLObj) SetupGMM

func (p *PLObj) SetupGMM(startmeans []float64)

type ParsResult

type ParsResult struct {
	// contains filtered or unexported fields
}

ParsResult gives the parsimony score (value) for a site

type ProteinModel

type ProteinModel struct {
	M DiscreteModel
}

ProteinModel protein model struct

func NewProteinModel

func NewProteinModel() *ProteinModel

NewProteinModel get new PROTModel pointer

func (*ProteinModel) DeepCopyProteinModel

func (d *ProteinModel) DeepCopyProteinModel() *ProteinModel

DeepCopyProteinModel ...

func (*ProteinModel) SetRateMatrixJTT

func (d *ProteinModel) SetRateMatrixJTT()

SetRateMatrixJTT set up JTT exchangeabilities

func (*ProteinModel) SetRateMatrixLG

func (d *ProteinModel) SetRateMatrixLG()

SetRateMatrixLG set up LG exchangeabilities

func (*ProteinModel) SetRateMatrixWAG

func (d *ProteinModel) SetRateMatrixWAG()

SetRateMatrixWAG set up WAG exchangeabilities

type Quartet

type Quartet struct {
	Lt    map[int]bool
	Rt    map[int]bool
	Lts   []map[int]bool
	Rts   []map[int]bool
	Len   float64
	Index int
}

Quartets are represented as map[int]bool sets, one for the left and one for the right

func GetQuartet

func GetQuartet(nd *Node, tree Tree, maptips map[string]int) (Quartet, error)

GetQuartet for node

func GetQuartets

func GetQuartets(tree Tree, maptips map[string]int) (bps []Quartet)

GetQuartets return slice of quartets

func (Quartet) ConflictsWith

func (b Quartet) ConflictsWith(ib Quartet) (con bool)

ConflictsWith checks whether two quartets conflict

func (Quartet) Match

func (b Quartet) Match(inq Quartet) (mat bool)

Match for matching quartets

func (Quartet) StringWithNames

func (b Quartet) StringWithNames(nmmap map[int]string) (ret string)

StringWithNames converts the ints to the strings from nmmap

type Rfwresult

type Rfwresult struct {
	Tree1  int
	Tree2  int
	Weight float64
	MaxDev float64
}

Rfwresult struct for holding info from rfwp analysis

type Seq

type Seq struct {
	NM string
	SQ string
}

Seq minimal seq struct

func ReadSeqsFromFile

func ReadSeqsFromFile(filen string) (seqs []Seq)

ReadSeqsFromFile reads fasta sequences from a file

func (Seq) GetFasta

func (s Seq) GetFasta() (ret string)

GetFasta return a fasta string

type SortedIntIdxSlice

type SortedIntIdxSlice struct {
	sort.IntSlice
	Idx []int
}

SortedIntIdxSlice for sorting indices of ints

func NewSortedIdxSlice

func NewSortedIdxSlice(n []int) *SortedIntIdxSlice

NewSortedIdxSlice builds a SortedIntIdxSlice from an int slice

func NewSortedIdxSliceD

func NewSortedIdxSliceD(n ...int) *SortedIntIdxSlice

NewSortedIdxSliceD builds a SortedIntIdxSlice from variadic ints. Usage:

s := NewSortedIdxSliceD(1, 25, 3, 5, 4)

sort.Sort(s) will give s.IntSlice = [1 3 4 5 25] and s.Idx = [0 2 4 3 1]

func (SortedIntIdxSlice) Swap

func (s SortedIntIdxSlice) Swap(i, j int)

Swap for sorting indices

type SupFlo

type SupFlo struct {
	// contains filtered or unexported fields
}

SupFlo is a "super float" stored as a mantissa and an exponent

func CalcSupLikeOneSite

func CalcSupLikeOneSite(t *Tree, x *DiscreteModel, site int) *SupFlo

func NewSupFlo

func NewSupFlo(m float64, e int) *SupFlo

NewSupFlo gets a new SupFlo from mantissa m and exponent e

func (*SupFlo) Abs

func (s *SupFlo) Abs() *SupFlo

Abs absolute value

func (*SupFlo) Add

func (s *SupFlo) Add(x *SupFlo) *SupFlo

Add s+x return result

func (*SupFlo) AddEq

func (s *SupFlo) AddEq(x *SupFlo)

AddEq +=

func (*SupFlo) Div

func (s *SupFlo) Div(x *SupFlo) *SupFlo

Div s/x return sup

func (*SupFlo) DivEq

func (s *SupFlo) DivEq(x *SupFlo)

DivEq /=

func (*SupFlo) Float64

func (s *SupFlo) Float64() float64

Float64 returns the value as a float64

func (*SupFlo) GetExp

func (s *SupFlo) GetExp() int

GetExp returns the exponent

func (*SupFlo) GetLn

func (s *SupFlo) GetLn() *SupFlo

GetLn returns the natural log

func (*SupFlo) GetMant

func (s *SupFlo) GetMant() float64

GetMant returns the mantissa

func (SupFlo) Greater

func (s SupFlo) Greater(x SupFlo) bool

Greater >

func (SupFlo) GreaterEq

func (s SupFlo) GreaterEq(x SupFlo) bool

GreaterEq >=

func (SupFlo) Less

func (s SupFlo) Less(x SupFlo) bool

Less <

func (SupFlo) LessEq

func (s SupFlo) LessEq(x SupFlo) bool

LessEq <=

func (*SupFlo) MinMin

func (s *SupFlo) MinMin()

MinMin --

func (*SupFlo) Mul

func (s *SupFlo) Mul(x *SupFlo) *SupFlo

Mul s * x return result

func (*SupFlo) MulEq

func (s *SupFlo) MulEq(x *SupFlo)

MulEq *=

func (*SupFlo) MulEqFloat

func (s *SupFlo) MulEqFloat(x float64)

MulEqFloat *=

func (*SupFlo) MulFloat

func (s *SupFlo) MulFloat(x float64) *SupFlo

MulFloat s * x return result and x is a float64

func (*SupFlo) PlusPlus

func (s *SupFlo) PlusPlus()

PlusPlus ++

func (*SupFlo) SetFloat64

func (s *SupFlo) SetFloat64(m float64)

SetFloat64 set it to a float64 value

func (*SupFlo) SetMantExp

func (s *SupFlo) SetMantExp(mant float64, exp int)

func (*SupFlo) Sub

func (s *SupFlo) Sub(x *SupFlo) *SupFlo

Sub s-x return result

func (*SupFlo) SubEq

func (s *SupFlo) SubEq(x SupFlo)

SubEq -=

func (*SupFlo) SwitchSign

func (s *SupFlo) SwitchSign()

SwitchSign change the sign

type Tree

type Tree struct {
	Rt    *Node
	Post  []*Node
	Pre   []*Node
	Tips  []*Node
	Index int // if there is an identifying index
}

Tree is a minimal phylogenetic tree struct. You don't need it (you can just use the root node), but it's here if you want the convenience functions below

func NewTree

func NewTree() *Tree

NewTree return a tree

func ReadTreeFromFile

func ReadTreeFromFile(tfn string) (tree *Tree)

ReadTreeFromFile read a single tree from a file

func ReadTreesFromFile

func ReadTreesFromFile(tfn string) (trees []*Tree)

ReadTreesFromFile reads multiple trees from a file

func (*Tree) GetTipByName

func (t *Tree) GetTipByName(name string) (*Node, error)

GetTipByName gets a tip node by name

func (*Tree) Instantiate

func (t *Tree) Instantiate(rt *Node)

Instantiate sets the root to rt and fills the preorder and postorder arrays

Directories

Path Synopsis
bp is a tool for analyzing bipartitions from trees.
calcmsbl will calculate the branch lengths for multistate characters
parsbl is a tool to apply parsimony branch lengths to a tree.
phytest is just for testing different functions in the phylogenetics stuff
