loadingcache

package
v0.0.11 Latest
Published: Mar 7, 2024 License: Apache-2.0 Imports: 7 Imported by: 0

Documentation

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type LoadFunc

type LoadFunc func(context.Context, *LoadOpts) error

LoadFunc computes a value. It should respect cancellation: when its context is canceled, it should return promptly with the cancellation error.
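For illustration, a LoadFunc that respects cancellation might look like the following sketch. fetchValue is a hypothetical context-aware helper (not part of this package) that returns promptly with ctx.Err() when ctx is canceled.

var result string
var load loadingcache.LoadFunc = func(ctx context.Context, opts *loadingcache.LoadOpts) error {
	v, err := fetchValue(ctx) // hypothetical helper; honors ctx cancellation
	if err != nil {
		return err // on cancellation this is (or wraps) ctx.Err()
	}
	result = v
	opts.CacheFor(time.Minute) // optional; without it the result is not cached
	return nil
}

load and &result would then be passed together to (*Value).GetOrLoad, as shown under GetOrLoad below.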

type LoadOpts

type LoadOpts struct {
	// contains filtered or unexported fields
}

LoadOpts configures how long a LoadFunc result should be cached. Cache settings overwrite each other: the last write wins (see the sketch after CacheForever below). The default is to not cache at all. If a load function calls these methods from multiple goroutines (which is not expected), it must synchronize those calls itself.

func (*LoadOpts) CacheFor

func (o *LoadOpts) CacheFor(d time.Duration)

func (*LoadOpts) CacheForever

func (o *LoadOpts) CacheForever()
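As a minimal sketch of the last-write-wins behavior described under LoadOpts, inside a LoadFunc body (the durations are arbitrary):

opts.CacheFor(time.Minute)    // would cache for a minute...
opts.CacheFor(24 * time.Hour) // ...then for a day...
opts.CacheForever()           // ...the last call wins: cache indefinitely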

type Map

type Map struct {
	// contains filtered or unexported fields
}

Map is a keyed collection of Values. Map{} is ready to use. (*Map)(nil) is valid and never caches or shares results for any key. Maps are concurrency-safe. They must not be copied.

Implementation notes:

Compared to sync.Map, this Map is not optimized for high concurrency across disjoint sets of keys; it could probably be improved.

Loading on-demand, without repeated value computation, is reminiscent of Guava's LoadingCache: https://github.com/google/guava/wiki/CachesExplained

func (*Map) DeleteAll

func (m *Map) DeleteAll()

func (*Map) GetOrCreate

func (m *Map) GetOrCreate(key interface{}) *Value

GetOrCreate returns the existing or a newly created Value associated with key. Note: if m == nil, it returns nil, which is a valid, never-caching Value.
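A hedged sketch of per-key caching with Map: User, loadUser, and getUser are hypothetical; only Map, Value, and LoadOpts come from this package.

var users loadingcache.Map // the zero Map is ready to use

func getUser(ctx context.Context, id string) (User, error) {
	var u User
	err := users.GetOrCreate(id).GetOrLoad(ctx, &u, func(ctx context.Context, opts *loadingcache.LoadOpts) error {
		var err error
		u, err = loadUser(ctx, id) // hypothetical context-aware lookup
		opts.CacheFor(time.Hour)   // cache each user for an hour
		return err
	})
	return u, err
}

Each key gets its own Value, so concurrent requests for the same key share a single in-flight load.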

type Value

type Value struct {
	// contains filtered or unexported fields
}

Value manages the loading (calculation) and storing of a cache value. It's designed for use cases where loading is slow. Concurrency is well-supported:

  1. Only one load is in progress at a time, even if concurrent callers request the value.
  2. Cancellation is respected for loading: a caller's load function is invoked with that caller's context. If the load function respects cancellation and returns promptly with an error, the caller's GetOrLoad returns promptly, too.
  3. Cancellation is respected for waiting: if a caller's context is canceled while they're waiting for another in-progress load (not their own), the caller's GetOrLoad returns immediately with the cancellation error.

Simpler mechanisms (like just locking a sync.Mutex when starting computation) don't achieve all of these (in the mutex example, cancellation is not respected while waiting on Lock()).

The original use case was reading rarely-changing data via RPC while letting users cancel the operation (Ctrl-C in their terminal). Very different uses (very fast computations or extremely high concurrency) may not work as well; at minimum, they are untested. Memory overhead may be large relative to the data when small values are cached.

Value{} is ready to use. (*Value)(nil) is valid and just never caches or shares a result (every get loads). Value must not be copied.

Time-based expiration is optional. See LoadFunc and LoadOpts.
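A hedged sketch of the concurrency behavior described above; fetchReport is a hypothetical slow, context-aware RPC, not part of this package:

ctx := context.Background()
var v loadingcache.Value // the zero Value is ready to use
var wg sync.WaitGroup
for i := 0; i < 3; i++ {
	wg.Add(1)
	go func() {
		defer wg.Done()
		var report string
		err := v.GetOrLoad(ctx, &report, func(ctx context.Context, opts *loadingcache.LoadOpts) error {
			var err error
			report, err = fetchReport(ctx) // hypothetical slow RPC
			opts.CacheForever()
			return err
		})
		_ = err // handle err; on success, report holds the shared result
	}()
}
wg.Wait()

Only one fetchReport call is in flight at a time; once a load succeeds, the remaining callers receive the cached copy.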

func (*Value) GetOrLoad

func (v *Value) GetOrLoad(ctx context.Context, dataPtr interface{}, load LoadFunc) error

GetOrLoad either copies a cached value into dataPtr or runs load and then copies the value dataPtr points to into the cache. A properly written load therefore writes the loaded value through dataPtr. Example:

var result string
err := value.GetOrLoad(ctx, &result, func(ctx context.Context, opts *loadingcache.LoadOpts) error {
	var err error
	result, err = doExpensiveThing(ctx)
	opts.CacheFor(time.Hour)
	return err
})

dataPtr must be a pointer to a copyable value (a slice, an int, a struct without a Mutex, etc.).

Value does not cache errors. If caching errors is desired, consider caching a value that contains an error, such as struct{ result int; err error }.
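A hedged sketch of that approach; computeThing is a hypothetical computation, and value and ctx are as in the example above:

type cachedResult struct {
	result int
	err    error
}

var cr cachedResult
err := value.GetOrLoad(ctx, &cr, func(ctx context.Context, opts *loadingcache.LoadOpts) error {
	cr.result, cr.err = computeThing(ctx) // hypothetical computation whose error should be remembered
	opts.CacheFor(time.Hour)              // cr is cached even when cr.err != nil
	return nil                            // returning nil lets the struct, error included, be cached
})
// err reports a failure of GetOrLoad itself (e.g. cancellation);
// cr.err is the cached computation error, if any.

A real load might still return cancellation errors instead of caching them.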

