lru

v1.0.0
Published: Jul 20, 2022 License: LGPL-3.0 Imports: 1 Imported by: 3

README

lru

A Go implementation of a least-recently-used cache.

Use lru.New(size) to create a new LRU cache that will hold up to size entries. Note that the LRU object is not thread safe, so you will need to add a mutex if it is accessed from multiple goroutines.
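
A minimal sketch of that advice, guarding the cache with a sync.Mutex; the import path below is a placeholder for the module's real path:

package main

import (
	"fmt"
	"sync"

	"example.com/lru" // placeholder import path for this module
)

// safeLRU wraps the non-thread-safe LRU with a mutex so it can be
// shared between goroutines.
type safeLRU struct {
	mu    sync.Mutex
	cache *lru.LRU
}

func (s *safeLRU) Add(key, value interface{}) {
	s.mu.Lock()
	defer s.mu.Unlock()
	s.cache.Add(key, value)
}

func (s *safeLRU) Get(key interface{}) (interface{}, bool) {
	s.mu.Lock()
	defer s.mu.Unlock()
	return s.cache.Get(key)
}

func main() {
	c := &safeLRU{cache: lru.New(128)} // holds up to 128 entries
	c.Add("greeting", "hello")
	if v, ok := c.Get("greeting"); ok {
		fmt.Println(v)
	}
}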

Documentation

Overview

LRU implements a least-recently-used cache. It tracks which items have been accessed and, when adding a new item would exceed the cache's size, evicts the item that has gone unused the longest.
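
A short sketch of this behaviour, assuming the standard LRU eviction order (the least recently used entry is dropped when a new item would exceed the cache size); the import path is a placeholder:

package main

import (
	"fmt"

	"example.com/lru" // placeholder import path
)

func main() {
	c := lru.New(2) // cache holds at most 2 entries
	c.Add("a", 1)
	c.Add("b", 2)
	c.Get("a")    // "a" is now the most recently used entry
	c.Add("c", 3) // cache is full, so the least recently used entry ("b") should be evicted
	_, ok := c.Get("b")
	fmt.Println(c.Len(), ok) // expected: 2 false
}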

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type HitCounts

type HitCounts struct {
	Hit, Miss int64
}

HitCounts is used to track how well this cache is working.

type LRU

type LRU struct {
	// contains filtered or unexported fields
}

LRU implements a least-recently-used cache, evicting items from the cache if they have not been accessed in a while.

func New

func New(size int) *LRU

New creates a new LRU cache that will hold no more than the given number of items.

func (*LRU) Add

func (lru *LRU) Add(key, value interface{})

Add adds a new entry to the LRU cache.

func (*LRU) Get

func (lru *LRU) Get(key interface{}) (interface{}, bool)

Get returns the value associated with key, and a boolean indicating whether the key actually exists in the cache. If it does exist in the cache, the entry is treated as recently accessed.
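
The boolean result distinguishes a hit from a miss, which supports the usual get-or-compute pattern; getOrCompute below is a hypothetical helper, not part of this package, and the import path is a placeholder:

package lrudemo

import "example.com/lru" // placeholder import path

// getOrCompute returns the cached value for key; on a miss it calls compute,
// stores the result in the cache, and returns it.
func getOrCompute(cache *lru.LRU, key string, compute func(string) interface{}) interface{} {
	if v, ok := cache.Get(key); ok {
		return v // hit: Get also refreshes the entry's recency
	}
	v := compute(key) // miss: build the value and cache it
	cache.Add(key, v)
	return v
}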

func (*LRU) Len

func (lru *LRU) Len() int

Len returns the number of items in the cache.

func (*LRU) Peek

func (lru *LRU) Peek(key interface{}) (interface{}, bool)

Peek is just like Get(), except it does not mark the entry as recently accessed.
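
A sketch contrasting Peek with Get, under the same assumptions as above (standard LRU eviction order, placeholder import path):

package main

import (
	"fmt"

	"example.com/lru" // placeholder import path
)

func main() {
	c := lru.New(2)
	c.Add("a", 1)
	c.Add("b", 2)
	c.Peek("a")   // reads "a" without refreshing its recency
	c.Add("c", 3) // "a" is still the least recently used entry, so it should be evicted
	_, ok := c.Peek("a")
	fmt.Println(ok) // expected: false (Get("a") above would instead have kept "a" and evicted "b")
}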

type StringCache

type StringCache struct {
	// contains filtered or unexported fields
}

StringCache tracks a limited number of strings. Use Intern() to get a saved version of the string, such that

x := cache.Intern(s1)
y := cache.Intern(s2)

Now x and y will use the same underlying memory if s1 == s2. The cache keeps a map into a doubly linked list, moving accessed (or recently added) strings to the front of the list and evicting from the end of the list to maintain the size of the cache. Note that StringCache is *not* thread safe; some form of mutex is necessary if you want to access it from multiple threads.
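
A runnable version of the snippet above, with a placeholder import path; Contains and Len are used only to show that a single copy ends up cached:

package main

import (
	"fmt"

	"example.com/lru" // placeholder import path
)

func main() {
	cache := lru.NewStringCache(1000) // hold up to 1000 distinct strings

	// Two strings with equal content but separately allocated backing arrays.
	s1 := "user-" + fmt.Sprint(42)
	s2 := "user-" + fmt.Sprint(42)

	x := cache.Intern(s1)
	y := cache.Intern(s2)

	fmt.Println(x == y)                         // true: same content, shared cached copy
	fmt.Println(cache.Len(), cache.Contains(x)) // expected: 1 true (only one copy is cached)
}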

func NewStringCache

func NewStringCache(size int) *StringCache

NewStringCache creates a cache for string objects that will hold no more than 'size' strings.

func (*StringCache) Contains

func (sc *StringCache) Contains(v string) bool

Contains returns true if the string is in the cache. It does not update the recently-used information.

func (*StringCache) HitCounts

func (sc *StringCache) HitCounts() HitCounts

HitCounts gives information about accesses to the cache. The total number of calls to Intern can be computed by adding Hit and Miss.
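
For example, a hit rate can be derived from the two counters; hitRate below is a hypothetical helper, not part of this package, and the import path is a placeholder:

package lrudemo

import "example.com/lru" // placeholder import path

// hitRate returns the fraction of Intern calls that were served from the
// cache, or 0 if Intern has not been called yet.
func hitRate(sc *lru.StringCache) float64 {
	counts := sc.HitCounts()
	total := counts.Hit + counts.Miss // total number of Intern calls
	if total == 0 {
		return 0
	}
	return float64(counts.Hit) / float64(total)
}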

func (*StringCache) Intern

func (sc *StringCache) Intern(v string) string

Intern takes a string and returns either the cached copy of the string, or caches the string and returns it. It also updates how recently the string was seen, so that strings aren't cached forever.

func (*StringCache) Len

func (sc *StringCache) Len() int

Len returns how many strings are currently cached.

func (*StringCache) Prealloc

func (sc *StringCache) Prealloc()

Prealloc allocates a maxSize buffer immediately, rather than slowly growing the buffer to maxSize. If you know that you need the full buffer size, this can make initial loading of the buffer 2-3x faster.
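
A sketch of the intended use, calling Prealloc once before bulk-loading; the import path is a placeholder and words is a hypothetical input slice:

package main

import (
	"fmt"

	"example.com/lru" // placeholder import path
)

func main() {
	words := []string{"alpha", "beta", "gamma"} // hypothetical bulk input

	cache := lru.NewStringCache(100000)
	cache.Prealloc() // allocate the full buffer up front before bulk loading

	for _, w := range words {
		cache.Intern(w)
	}
	fmt.Println(cache.Len())
}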

func (*StringCache) Validate

func (sc *StringCache) Validate() error

Validate checks invariants to make sure the double-linked list is properly linked, and that the values map to the correct element.
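
Validate is most useful in tests; the sketch below is a hypothetical test, with a placeholder import path:

package lrudemo

import (
	"testing"

	"example.com/lru" // placeholder import path
)

// TestStringCacheInvariants exercises the cache and then checks its
// internal invariants with Validate.
func TestStringCacheInvariants(t *testing.T) {
	cache := lru.NewStringCache(3)
	for _, s := range []string{"a", "b", "c", "d", "b"} {
		cache.Intern(s)
	}
	if err := cache.Validate(); err != nil {
		t.Fatalf("cache invariants violated: %v", err)
	}
}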
