stampede

package module
v0.6.0
Published: Feb 12, 2024 License: MIT Imports: 11 Imported by: 7

README

Stampede

Prevents a cache stampede (https://en.wikipedia.org/wiki/Cache_stampede) by running only a single data fetch operation per expired or missing key, regardless of the number of requests for that key.

Example 1: HTTP Middleware

import (
	"net/http"
	"time"

	"github.com/go-chi/chi/v5"
	"github.com/go-chi/chi/v5/middleware"
	"github.com/go-chi/stampede"
)

func main() {
	r := chi.NewRouter()
	r.Use(middleware.Logger)
	r.Use(middleware.Recoverer)

	r.Get("/", func(w http.ResponseWriter, r *http.Request) {
		w.Write([]byte("index"))
	})

	// cache responses for up to 1 second, coalescing concurrent requests per key
	cached := stampede.Handler(512, 1 * time.Second)

	r.With(cached).Get("/cached", func(w http.ResponseWriter, r *http.Request) {
		// processing..
		time.Sleep(1 * time.Second)

		w.WriteHeader(200)
		w.Write([]byte("...hi"))
	})

	http.ListenAndServe(":3333", r)
}

Example 2: Raw

import (
	"context"
	"net/http"
	"time"

	"github.com/go-chi/stampede"
)

var (
	// 512-entry cache; values are fresh for 5s and expire after 10s
	reqCache = stampede.NewCache(512, 5*time.Second, 10*time.Second)
)

func handler(w http.ResponseWriter, r *http.Request) {
	data, err := reqCache.Get(r.Context(), r.URL.Path, fetchData)
	if err != nil {
		w.WriteHeader(503)
		return
	}

	w.Write(data.([]byte))
}

func fetchData(ctx context.Context) (interface{}, error) {
	// fetch from remote source.. or compute/render..
	data := []byte("some response data")

	return data, nil
}

Notes

  • Requests passing through the stampede handler are coalesced into a single request when there are parallel requests for the same endpoint/resource. This is also known as request coalescing.
  • Parallel requests for the same endpoint/resource result in just a single handler call; the remaining requests receive the response from that first handler.
  • The response payload for the endpoint/resource is then cached for up to the ttl duration, so subsequent requests are served from the cache as well. You may also use a ttl of 0 if you want responses to be as fresh as possible while still preventing a stampede on your handler.
  • Security note: response headers will be the same for all requests, so make sure not to include anything sensitive or user-specific. If you do need user-specific stampede handlers, use stampede.HandlerWithKey with a custom keyFunc and split the cache by an account's id, as shown in the sketch below.
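
A minimal sketch of a per-user handler using HandlerWithKey and StringToHash from the index below. The use of the Authorization header as the account identifier here is only an illustrative assumption; substitute whatever identifies the caller in your application.

import (
	"net/http"
	"time"

	"github.com/go-chi/chi/v5"
	"github.com/go-chi/stampede"
)

func main() {
	r := chi.NewRouter()

	// key the cache by path plus caller identity so users never share cached responses
	perUser := stampede.HandlerWithKey(512, 1*time.Second, func(r *http.Request) uint64 {
		return stampede.StringToHash(r.URL.Path, r.Header.Get("Authorization"))
	})

	r.With(perUser).Get("/profile", func(w http.ResponseWriter, r *http.Request) {
		w.Write([]byte("per-user data"))
	})

	http.ListenAndServe(":3333", r)
}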

See the example directory for a variety of usage examples.

LICENSE

MIT

Documentation

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

func BytesToHash

func BytesToHash(b ...[]byte) uint64

func Handler

func Handler(cacheSize int, ttl time.Duration, paths ...string) func(next http.Handler) http.Handler

func HandlerWithKey

func HandlerWithKey(cacheSize int, ttl time.Duration, keyFunc func(r *http.Request) uint64, paths ...string) func(next http.Handler) http.Handler

func StringToHash

func StringToHash(s ...string) uint64
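
StringToHash and BytesToHash are convenience helpers for producing the uint64 keys expected by HandlerWithKey's keyFunc. A small sketch; the choice of request method and path as key inputs is an assumption for illustration, not a library convention:

keyFunc := func(r *http.Request) uint64 {
	// hash the request method and path into a single cache key
	return stampede.StringToHash(r.Method, r.URL.Path)
}
cached := stampede.HandlerWithKey(512, 1*time.Second, keyFunc)

BytesToHash works the same way over raw byte slices.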

Types

type Cache

type Cache[K comparable, V any] struct {
	// contains filtered or unexported fields
}

func NewCache

func NewCache(size int, freshFor, ttl time.Duration) *Cache[any, any]

func NewCacheKV added in v0.6.0

func NewCacheKV[K comparable, V any](size int, freshFor, ttl time.Duration) *Cache[K, V]

func (*Cache[K, V]) Get

func (c *Cache[K, V]) Get(ctx context.Context, key K, fn singleflight.DoFunc[V]) (V, error)

func (*Cache[K, V]) GetFresh

func (c *Cache[K, V]) GetFresh(ctx context.Context, key K, fn singleflight.DoFunc[V]) (V, error)

func (*Cache[K, V]) Set

func (c *Cache[K, V]) Set(ctx context.Context, key K, fn singleflight.DoFunc[V]) (V, bool, error)
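
NewCacheKV and the generic Cache type (added in v0.6.0) keep the key and value types, avoiding the type assertions needed with NewCache. A minimal sketch, assuming singleflight.DoFunc[V] accepts an ordinary func(context.Context) (V, error) as in the untyped README example above; the userCache and loadUser names are illustrative:

import (
	"context"
	"time"

	"github.com/go-chi/stampede"
)

// 512-entry cache keyed by user ID; values are fresh for 5s and expire after 10s
var userCache = stampede.NewCacheKV[string, []byte](512, 5*time.Second, 10*time.Second)

func loadUser(ctx context.Context, id string) ([]byte, error) {
	// concurrent calls for the same id are coalesced into a single fetch
	return userCache.Get(ctx, id, func(ctx context.Context) ([]byte, error) {
		return []byte("user:" + id), nil
	})
}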
