crawler

package
v0.0.0-...-7c12c5a
Published: Oct 21, 2015 License: GPL-2.0, GPL-3.0 Imports: 7 Imported by: 0

Documentation

Index

Constants

const (
	CheckQueueBufferSize  = 100 // buffer on the checkQueue channel
	NodeQueueBufferSize   = 100 // buffer on the queue of nodes to visit
	GetPeersTickerSeconds = 5   // seconds between polling peers for their peer lists
)

Variables

This section is empty.

Functions

This section is empty.

Types

type Crawler

type Crawler struct {
	QuitService
	// contains filtered or unexported fields
}

A crawler has a local node, a set of potential nodes in the nodePool, and a set of connected nodes. The maps are only accessed by one goroutine, mediated by the checkQueue.

func NewCrawler

func NewCrawler(host string, port uint16) *Crawler

Create a new Crawler using the local RPC server at host:port.

func (*Crawler) OnStart

func (c *Crawler) OnStart() error
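
A minimal usage sketch, assuming a local node with its RPC server on 127.0.0.1:46657 (the historical Tendermint default); the import path and the direct call to OnStart are illustrative assumptions, since with the embedded QuitService the conventional entry point may be Start instead:

package main

import (
	"log"

	"github.com/tendermint/netmon/crawler" // hypothetical import path
)

func main() {
	// Point the crawler at the local node's RPC server.
	c := crawler.NewCrawler("127.0.0.1", 46657)

	// Begin crawling from the local node.
	if err := c.OnStart(); err != nil {
		log.Fatalln("crawler failed to start:", err)
	}
}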

type Node

type Node struct {
	Host    string
	P2PPort uint16
	RPCPort uint16

	LastSeen     time.Time
	ChainID      string
	BlockHeight  int
	BlockHistory map[int]time.Time // when we saw each block
	NetInfo      *ctypes.ResultNetInfo

	Validator bool
	// contains filtered or unexported fields
}

A Node is a peer on the network.

func (*Node) Address

func (n *Node) Address() string

func (*Node) SetInfo

func (n *Node) SetInfo(status *ctypes.ResultStatus, netinfo *ctypes.ResultNetInfo)

Set the basic status and chain_id info for a node from RPC responses.
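
A short sketch of filling in a Node from RPC responses. The host and ports are assumed defaults, status and netinfo are assumed to have been fetched from the node's RPC endpoint beforehand (the fetch itself is elided), and fmt plus the package's ctypes import are presumed in scope:

// describeNode is a hypothetical helper showing how SetInfo and
// Address fit together once the RPC responses are in hand.
func describeNode(status *ctypes.ResultStatus, netinfo *ctypes.ResultNetInfo) string {
	n := &Node{
		Host:    "127.0.0.1", // hypothetical peer
		P2PPort: 46656,       // historical Tendermint defaults
		RPCPort: 46657,
	}
	n.SetInfo(status, netinfo)
	return fmt.Sprintf("%s: chain %s at height %d", n.Address(), n.ChainID, n.BlockHeight)
}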

type NodeClient

type NodeClient struct {
	// contains filtered or unexported fields
}

A NodeClient is used to talk to a node over RPC and websockets.

func NewNodeClient

func NewNodeClient(addr string) *NodeClient

Create a new client for the node at the given addr.
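
Construction is a one-liner; the address below is a guess at a typical local RPC endpoint, since the client's fields and methods are unexported in this listing:

nc := NewNodeClient("127.0.0.1:46657") // hypothetical local RPC address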
