lrucache

Published: Nov 30, 2020 License: AGPL-3.0 Imports: 2 Imported by: 5

README

LRUCACHE

Lrucache is a powerful key/value store for Go.

You can use it almost the same way as an in-memory dictionary (map[string]interface{}) but there are many important differences in implementation.

Features

                  map[string]interface{}   lrucache
thread-safe       no                       yes
maximum size      no                       yes
OnMiss handler    no                       yes
  • purges least recently used element when full
  • elements can report their own size
  • everything is cacheable (interface{})
  • is a front for your persistent storage (S3, disk, ...) by using OnMiss hooks

Examples and API are on godoc:

http://godoc.org/github.com/hraban/lrucache

The licensing terms are described in the file LICENSE.

Copyright © 2012-2020 Hraban Luyat hraban@0brg.net

Documentation

Overview

Light-weight in-memory LRU (object) cache library for Go.

To use this library, first create a cache:

c := lrucache.New(1234)

Then, optionally, define a type that implements some of the interfaces:

type cacheableInt int

func (i cacheableInt) OnPurge(why lrucache.PurgeReason) {
    fmt.Printf("Purging %d\n", i)
}

Finally:

for i := 0; i < 2000; i++ {
    c.Set(strconv.Itoa(i), cacheableInt(i))
}

This will generate the following output:

Purging 0
Purging 1
...
Purging 764
Purging 765

Note:

* The unit of item sizes is not defined; whatever it is, once the sum exceeds the maximum cache size, elements start getting purged until it drops below the threshold again.

* These integers are passed by value. Caching pointers is, of course, okay, but be careful when caching a memory location that holds different values at different points in time: updating the value behind a pointer after caching it will change the cached value.

Index

Constants

This section is empty.

Variables

View Source
var ErrNotFound = errors.New("Key not found in cache")

Functions

func Delete

func Delete(id string)

Delete an item from the shared cache.

func MaxSize

func MaxSize(size int64)

A shared cache is available immediately for all users of this library. By default, there is no size limit. Use this function to change that.

func Set

func Set(id string, c Cacheable)

Put an object in the shared cache (requires no configuration).

Types

type Cache

type Cache struct {
	// contains filtered or unexported fields
}

Cache is a single object containing the full state of a cache.

All access to this object through its public methods is gated through a single mutex. It is, therefore, safe for concurrent use, although it will not actually offer any parallel performance benefits.

func New

func New(maxsize int64) *Cache

Create and initialize a new cache, ready for use.

func (*Cache) Close

func (c *Cache) Close() error

Close is an obsolete explicit closer method.

Kept around for backwards compatibility, but not necessary anymore.

func (*Cache) Delete

func (c *Cache) Delete(id string)

func (*Cache) Get

func (c *Cache) Get(id string) (Cacheable, error)

Get fetches an element from the cache.

Updates the cache to mark this element as most recently used. If no element is found for this id, the registered OnMiss handler, if any, will be called.

func (*Cache) Init

func (c *Cache) Init(maxsize int64)

func (*Cache) MaxSize

func (c *Cache) MaxSize(i int64)

MaxSize updates the maximum size of all cached elements.

The size of the cache is the sum of calling .Size() on every individual element. If an element has no such method, its size defaults to 1. Notably, the size has nothing to do with bytes in memory (unless the individual cached entries have a .Size method which returns their size in bytes).

If (roughly) all cached items are going to be (roughly) the same size, it makes sense to set maxSize to the maximum number of elements you want to allow in the cache.

To remove the limit altogether, set a maximum size of 0: no elements will be purged with reason CACHEFULL until the next call to MaxSize.

Can be changed at any point during the cache's lifetime.

func (*Cache) OnMiss

func (c *Cache) OnMiss(f OnMissHandler)

OnMiss stores a callback for handling Gets to unknown keys.

Say you're looking for entry "bob", but there is no such entry in your cache. Do you always handle that in the same way? Get "bob" from disk or S3? Then you could register an OnMiss handler which does this for you. Make this your "persistent storage lookup" function, hook it up to your cache right here, and it will be called automatically the next time you're looking for bob. The advantage is that you can expect Get() calls to resolve.

The Get() call invoking this OnMiss will always return whatever value is returned from the OnMiss handler, error or not.

If the function returns a non-nil error, that error is directly returned from the Get() call that caused it to be invoked. Otherwise, if the function return value is not nil, it is stored in cache.

To remove a previously set OnMiss handler, call OnMiss(nil).

Return (nil, nil) to indicate the specific key could not be found. It will be treated as a Get() to an unknown key without an OnMiss handler set.

The synchronization lock which controls access to the entire cache is released before calling this function. The benefit is that a long running OnMiss handler call won't block the cache. The downside is that calling Get concurrently with the same key, before the first OnMiss call returns, will invoke another OnMiss call; the last one to return will have its value stored in the cache. To avoid this, wrap the OnMiss handler in a NoConcurrentDupes.

func (*Cache) Set

func (c *Cache) Set(id string, p Cacheable)

Set stores an item in the cache. It panics if the Cacheable is nil, although an interface value wrapping a nil pointer is allowed. TODO: write a test for the above.

func (*Cache) Size

func (c *Cache) Size() int64

type Cacheable

type Cacheable interface{}

Anything can be cached!

func Get

func Get(id string) (Cacheable, error)

Get an element from the shared cache.

type NotifyPurge

type NotifyPurge interface {
	// Called once when the element is purged from cache. The argument
	// indicates why.
	//
	// Example use-case: a session cache where sessions are not stored in a
	// database until they are purged from the memory cache. As long as the
	// memory cache is large enough to hold all of them, they expire before the
	// cache grows too large and no database connection is ever needed. This
	// OnPurge implementation would store items to a database iff reason ==
	// CACHEFULL.
	//
	// Called from within a private goroutine, but never called concurrently
	// with other elements' OnPurge(). The entire cache is blocked until this
	// function returns. By all means, feel free to launch a fresh goroutine
	// and return immediately.
	OnPurge(why PurgeReason)
}

Optional interface for cached objects

type OnMissHandler

type OnMissHandler func(string) (Cacheable, error)

A function that generates a fresh entry on "cache miss". See the Cache.OnMiss method.

func NoConcurrentDupes

func NoConcurrentDupes(f OnMissHandler) (OnMissHandler, chan<- bool)

Concurrent duplicate calls (same arg) are unified into one call. The result is returned to all callers by the wrapper. Intended for wrapping OnMiss handlers.

The second return value is the quit channel. Send any value down that channel to stop the wrapper. Operations already in flight will complete, but it is an error to invoke the wrapper after that (it returns an error rather than panicking).

func ThrottleConcurrency

func ThrottleConcurrency(f OnMissHandler, maxconcurrent uint) OnMissHandler

Wrapper function that limits the number of concurrent calls to f. Intended for wrapping OnMiss handlers.

type PurgeReason

type PurgeReason int

Reasons for a cached element to be deleted from the cache

const (
	// Cache is growing too large and this is the least used item
	CACHEFULL PurgeReason = iota
	// This item was explicitly deleted using Cache.Delete(id)
	EXPLICITDELETE
	// A new element with the same key is stored (usually indicates an update)
	KEYCOLLISION
)

type SizeAware

type SizeAware interface {
	// See Cache.MaxSize() for an explanation of the semantics. Please report a
	// constant size; the cache does not expect objects to change size while
	// they are cached. Items are trusted to report their own size accurately.
	Size() int64
}

Optional interface for cached objects. If this interface is not implemented, an element is assumed to have size 1.
