cache: github.com/lrita/cache

package cache

import "github.com/lrita/cache"

Index

Package Files

bufcache.go cache.go nocopy.go pprof.go

func SetProfileFraction

func SetProfileFraction(rate int) int

SetProfileFraction controls the fraction of cache Get miss events that are reported in the "github.com/lrita/cache" profile. On average 1/rate events are reported. The previous rate is returned.

To turn off profiling entirely, pass rate 0. To just read the current rate, pass rate < 0. (For n>1 the details of sampling may change.)
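
A minimal sketch of turning the profile on and dumping it, assuming the profile is registered under the name "github.com/lrita/cache" as described above; the rate of 1 and the debug level passed to WriteTo are only illustrative:

    package main

    import (
        "os"
        "runtime/pprof"

        "github.com/lrita/cache"
    )

    func main() {
        // Report every Get miss; pass 0 to turn profiling off again.
        prev := cache.SetProfileFraction(1)
        defer cache.SetProfileFraction(prev)

        // ... exercise code that uses cache.Cache or cache.BufCache ...

        // Assumption: the miss profile is looked up by the name above.
        if p := pprof.Lookup("github.com/lrita/cache"); p != nil {
            p.WriteTo(os.Stdout, 1)
        }
    }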

type BufCache

type BufCache struct {

    // New optionally specifies a function to generate
    // a value when Get would otherwise return nil.
    // It may not be changed concurrently with calls to Get.
    New func() []byte
    // Size optionally specifies the maximum number of items in the per-P local lists.
    Size int64
    // contains filtered or unexported fields
}

BufCache is a set of temporary byte buffers that may be individually saved and retrieved.

A BufCache is safe for use by multiple goroutines simultaneously.

BufCache's purpose is to cache allocated but unused items for later reuse, relieving pressure on the garbage collector. That is, it makes it easy to build efficient, thread-safe free lists. However, it is not suitable for all free lists.

An appropriate use of a BufCache is to manage a group of temporary items silently shared among and potentially reused by concurrent independent clients of a package. BufCache provides a way to amortize allocation overhead across many clients.

The difference from the standard library's sync.Pool is that items in a BufCache are not deallocated by the GC, and storage is split into multiple slots, both per-P and per-NUMA-node. The free lists in a BufCache are maintained as part of a long-lived object and are intended for long-running process logic. Users can tune the per-NUMA free-list size (BufCache.Size), guided by the profile, to minimize allocations.

A BufCache must not be copied after first use.

Assigning a byte slice to an interface{} causes an allocation, so BufCache is a specialized implementation derived from Cache.
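
A minimal usage sketch, assuming fixed-size 4 KiB buffers; the buffer length and the Size value are illustrative, not recommendations:

    var bufcache = cache.BufCache{
        // New is called when Get would otherwise return nil.
        New: func() []byte { return make([]byte, 4096) },
        // Size caps the per-P local lists; tune it from the profile.
        Size: 64,
    }

    func handle(req []byte) {
        buf := bufcache.Get()
        defer bufcache.Put(buf)
        // ... use buf as scratch space while handling req ...
    }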

func (*BufCache) Get

func (c *BufCache) Get() (x []byte)

Get selects an arbitrary item from the BufCache, removes it from the BufCache, and returns it to the caller. Get may choose to ignore the pool and treat it as empty. Callers should not assume any relation between values passed to Put and the values returned by Get.

If Get would otherwise return nil and c.New is non-nil, Get returns the result of calling c.New.

func (*BufCache) Put

func (c *BufCache) Put(x []byte)

Put adds x to the BufCache.

type Cache

type Cache struct {

    // New optionally specifies a function to generate
    // a value when Get would otherwise return nil.
    // It may not be changed concurrently with calls to Get.
    New func() interface{}
    // Size optionally specifies the maximum number of items in the per-NUMA-node lists.
    Size int64
    // contains filtered or unexported fields
}

Cache is a set of temporary objects that may be individually saved and retrieved.

A Cache is safe for use by multiple goroutines simultaneously.

Cache's purpose is to cache allocated but unused items for later reuse, relieving pressure on the garbage collector. That is, it makes it easy to build efficient, thread-safe free lists. However, it is not suitable for all free lists.

An appropriate use of a Cache is to manage a group of temporary items silently shared among and potentially reused by concurrent independent clients of a package. Cache provides a way to amortize allocation overhead across many clients.

The difference from the standard library's sync.Pool is that items in a Cache are not deallocated by the GC, and storage is split into multiple slots, both per-P and per-NUMA-node. The free lists in a Cache are maintained as part of a long-lived object and are intended for long-running process logic. Users can tune the per-NUMA-node size (Cache.Size), guided by the profile, to minimize allocations.

A Cache must not be copied after first use.
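
A minimal configuration sketch, assuming the cached items are pointers (a pointer stored in an interface{} avoids the extra allocation that boxing a plain value would incur, the same concern that motivates BufCache); the scratch type and the Size value are illustrative:

    type scratch struct {
        data [1 << 10]byte
    }

    var scratchCache = cache.Cache{
        // New is called when Get would otherwise return nil.
        New: func() interface{} { return new(scratch) },
        // Size caps the per-NUMA-node lists; tune it from the profile.
        Size: 128,
    }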

func (*Cache) Get

func (c *Cache) Get() (x interface{})

Get selects an arbitrary item from the Cache, removes it from the Cache, and returns it to the caller. Get may choose to ignore the pool and treat it as empty. Callers should not assume any relation between values passed to Put and the values returned by Get.

If Get would otherwise return nil and c.New is non-nil, Get returns the result of calling c.New.

func (*Cache) Put

func (c *Cache) Put(x interface{})

Put adds x to the Cache.
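
A short Get/Put round trip using the hypothetical scratchCache above; the type assertion is needed because Get returns an interface{}:

    func process() {
        s := scratchCache.Get().(*scratch)
        defer scratchCache.Put(s)
        // ... use s.data as temporary working memory ...
    }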

Directories

Path        Synopsis
race

Package cache imports 12 packages and is imported by 1 package. Updated 2019-03-16.