care

package module
v0.0.0-...-df197c9
Published: Mar 18, 2024 License: MIT Imports: 15 Imported by: 0

README


go-care

A library for gRPC response memoization in Go that aims to improve the overall performance of a project by lowering the cost of real response computation.

go-care is an abbreviation of 'Cached Response in Go'.

Caching can be an almost codeless solution.

Adding a couple of gRPC interceptors to the project enables caching on both sides (client and server) out of the box, so you can pay more attention to your business logic and less to secondary concerns such as caching.

A few examples will make the start easier.

Introduction

This library was written after running into the lack of memoization in gRPC. Perhaps a solution already existed, but I had not searched for one well enough. A tricky question... Nonetheless, here is a bit about the go-care package.

The package aims to improve performance by caching the response for the respective request parameters. Additional information from the incoming data can be included to make the key for the cached response more unique.

The essential gist of the approach is an interceptor that computes a key (a hash of the incoming data) and memorizes/restores the respective response, lowering the overall computation cost by decreasing the number of real computations.
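
For illustration, here is a deliberately simplified conceptual sketch of such an interceptor. It is not go-care's actual implementation: there is no TTL, no robust key, and no header filtering, and the key is just a hash of a JSON serialization of the request.

package sketch

import (
	"context"
	"crypto/sha256"
	"encoding/hex"
	"encoding/json"
	"sync"

	"google.golang.org/grpc"
)

// Memoize returns a toy unary server interceptor that caches responses
// by a hash of the full method name and the serialized request.
func Memoize() grpc.UnaryServerInterceptor {
	var mu sync.RWMutex
	cache := map[string]interface{}{}

	return func(ctx context.Context, req interface{}, info *grpc.UnaryServerInfo,
		handler grpc.UnaryHandler) (interface{}, error) {
		raw, err := json.Marshal(req)
		if err != nil {
			return handler(ctx, req) // cannot build a key, fall through
		}
		sum := sha256.Sum256(append([]byte(info.FullMethod), raw...))
		key := hex.EncodeToString(sum[:])

		mu.RLock()
		resp, ok := cache[key]
		mu.RUnlock()
		if ok {
			return resp, nil // cache hit: restore the memoized response
		}

		resp, err = handler(ctx, req) // cache miss: real computation
		if err == nil {
			mu.Lock()
			cache[key] = resp
			mu.Unlock()
		}
		return resp, err
	}
}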

The package can be used with the built-in LRU cache or with an external cache such as Redis, Memcached, or something else. To use an external cache, you only need to implement the caching interface.

Features

  • server- and client-side response memoization
  • flexibility and room for customization
  • loosely coupled architecture
  • thread safety
  • robust hash/key (any order of the same request data won't affect the hash/key)
  • easy to use :)

Note

  • 'Robust key' means that if a request contains a sequence (such as an array, slice, or map), the key is not affected by the order of the items within the sequence. All requests with the same data, regardless of order, are cached with the same response, and the real response is computed only once, for the first request.
  • The built-in in-memory cache implementation does not support TTL-based eviction. It was developed only for demos and small MVPs. In production, you can use go-care with Redis, Memcached, or another cache by implementing the 'Cache' interface and providing it via 'Options'.

Usage

  1. Add the package to the project
  2. On the server side, it might look like the following (pseudocode)
package main

import (
	// ...
	"time"

	"github.com/pantheon-lab/go-care"
	"google.golang.org/grpc"
	// ...
)

// Your server implementation
// type server struct {
//   api.UnimplementedYourServerServiceServer
// }

// ...

func main() {
	// Your other code
	// ...

	// Create the options.
	opts := care.NewOptions()

	// Add methods for memoization.
	// You need to define the pool of methods
	// whose responses should be memoized.
	opts.Methods.Add("/api.GreeterService/SayHello", time.Second*60)

	// Other customization can be done via the opts variable.
	// See the examples.

	// Create the server-side interceptor.
	unary := care.NewServerUnaryInterceptor(opts)
	// Apply the interceptor to the server.
	grpcsrv := grpc.NewServer(grpc.UnaryInterceptor(unary))

	// Your other code
	// ...
}
  3. On the client side, it is done in a similar way
package main

import (
	// ...
	"fmt"
	"log"
	"time"

	"github.com/pantheon-lab/go-care"
	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
)

func main() {
	// ...

	opts := care.NewOptions()
	opts.Methods.Add("/api.GreeterService/SayHello", time.Second*60)

	unary := care.NewClientUnaryInterceptor(opts)

	conn, err := grpc.Dial(
		fmt.Sprintf("%s:%d", *host, *port),
		grpc.WithTransportCredentials(insecure.NewCredentials()),
		grpc.WithUnaryInterceptor(unary),
	)
	if err != nil {
		log.Fatalf("failed to dial: %v", err)
	}
	defer conn.Close()

	// ...
}

You will find more details in the examples.

Examples

The examples demonstrate the go-care package's features. By running the server and client with different parameter sets, you can try out all of the features.

  • 'Greeter' is close to the canonical 'Hello World' example, demonstrating all basic features.
  • 'Redis Greeter' is the same, but uses Redis as an external cache.

Compiler and OS

The package has been developed and tested with Go 1.19 on Ubuntu 20.04. Other OS and compiler versions are expected to work as well.

Documentation

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

func NewClientDialOption

func NewClientDialOption(opts *Options) grpc.DialOption

func NewClientUnaryInterceptor

func NewClientUnaryInterceptor(opts *Options) grpc.UnaryClientInterceptor

NewClientUnaryInterceptor makes a new unary client interceptor. It panics if options is a nil pointer.

func NewServerOption

func NewServerOption(opts *Options) grpc.ServerOption

func NewServerUnaryInterceptor

func NewServerUnaryInterceptor(opts *Options) grpc.UnaryServerInterceptor

NewServerUnaryInterceptor - makes a new unary server interceptor. It panics if options is a nil pointer.
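
NewClientDialOption and NewServerOption return a grpc.DialOption and a grpc.ServerOption built from the same Options, presumably as convenience wrappers around the interceptors above. A hedged sketch of how they might be wired (the function name, address, and method are placeholders; the usual grpc, insecure, time, and care imports are assumed):

func newMemoizedServerAndConn() (*grpc.Server, *grpc.ClientConn, error) {
	opts := care.NewOptions()
	opts.Methods.Add("/api.GreeterService/SayHello", time.Minute)

	// Server side: a ready-made grpc.ServerOption.
	srv := grpc.NewServer(care.NewServerOption(opts))

	// Client side: a ready-made grpc.DialOption.
	conn, err := grpc.Dial(
		"localhost:50051", // placeholder address
		grpc.WithTransportCredentials(insecure.NewCredentials()),
		care.NewClientDialOption(opts),
	)
	return srv, conn, err
}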

Types

type Cache

type Cache interface {
	// Put data into the cache by key.
	Put(ctx context.Context, key string, val []byte, ttl time.Duration) error
	// Get data from the cache by key.
	Get(ctx context.Context, key string) ([]byte, error)
}

Cache represents a common interface for response caching. It can be implemented for many caches such as Redis, Memcached, etc.
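
For example, a minimal Redis-backed implementation might look like the following sketch (assuming the go-redis v9 client; the package, type, and constructor names are made up for illustration). Note that go-redis returns redis.Nil as the error on a cache miss; how go-care expects a miss to be signalled is not specified here, so check the 'Redis Greeter' example before copying this.

package rediscache

import (
	"context"
	"time"

	"github.com/redis/go-redis/v9"
)

// RedisCache adapts a go-redis client to the Cache interface.
type RedisCache struct {
	client *redis.Client
}

func New(client *redis.Client) *RedisCache {
	return &RedisCache{client: client}
}

// Put stores the serialized response under the key with the given TTL.
func (c *RedisCache) Put(ctx context.Context, key string, val []byte, ttl time.Duration) error {
	return c.client.Set(ctx, key, val, ttl).Err()
}

// Get returns the serialized response stored under the key.
func (c *RedisCache) Get(ctx context.Context, key string) ([]byte, error) {
	return c.client.Get(ctx, key).Bytes()
}

It could then be plugged in via opts.Cache = rediscache.New(rdb).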

func NewInMemoryCache

func NewInMemoryCache(capacity uint) Cache

NewInMemoryCache - makes a built-in LRU in-memory cache implementation aimed at small projects, MVPs, etc.

type Hash

type Hash interface {
	Calc(key string) (string, error)
}
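
Hash is the hasher used for the key computation; a custom implementation can be supplied via Options. A hypothetical SHA-256 based implementation (the type name is made up):

import (
	"crypto/sha256"
	"encoding/hex"
)

// sha256Hash is a hypothetical Hash implementation.
type sha256Hash struct{}

// Calc returns the hex-encoded SHA-256 digest of the key material.
func (sha256Hash) Calc(key string) (string, error) {
	sum := sha256.Sum256([]byte(key))
	return hex.EncodeToString(sum[:]), nil
}

// Usage: opts.Hash = sha256Hash{}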

type Headers

type Headers struct {
	// Headers for the key computation.
	Allowed []string
	// Omitted headers from the key computation.
	Disallowed []string
}

Headers is a pool of header names used for filtering.

type MetaFilter

type MetaFilter interface {
	// Allowed returns true if the header should be included in
	// the key computation, otherwise it returns false.
	Allowed(key string, val []string) bool
}

MetaFilter represents an interface for filtering the headers before including them in the key computation.

It can be useful if you need to pick only a few headers to include in the key, making it more unique while filtering out the noise. For instance, request-id, trace-id, etc. are noise, whereas jwt-token (and others, according to your application logic) is an important header.

Having implemented your own version, you can control the headers which will be involved in the key computation process.
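
For the common case, the built-in NewMetaFilter with a Headers pool is enough; for full control you can implement the interface yourself. A hypothetical filter that keys only on the authorization header:

import "strings"

// authOnlyFilter includes only the authorization header in the key
// computation (hypothetical example type).
type authOnlyFilter struct{}

func (authOnlyFilter) Allowed(key string, val []string) bool {
	return strings.EqualFold(key, "authorization")
}

// Wiring it up, or using the built-in filter instead:
//
//	opts.MetaFilter = authOnlyFilter{}
//	opts.MetaFilter = care.NewMetaFilter(care.Headers{Allowed: []string{"authorization"}})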

func NewMetaFilter

func NewMetaFilter(headers Headers) MetaFilter

func NewZeroMetaFilter

func NewZeroMetaFilter(allowed bool) MetaFilter

NewZeroMetaFilter - makes a zero-filter implementation. It can be used if you don't have any rules for header filtering.

allowed - defines the common behaviour for any header

true - allows all headers to be included in the key computation.
false - disallows all headers from being included in the key computation.

type Methods

type Methods interface {
	// Cacheable
	// method - full method name
	//
	// Returns true and caching timeout if the method is found,
	// otherwise false and timeout in this case does not matter.
	Cacheable(method string) (bool, time.Duration)

	// Add - adds a method for caching.
	// It returns the 'Methods' so that
	// adding methods can be chained conveniently.
	Add(method string, ttl time.Duration) Methods
	// Remove the method from allowed to be cached.
	Remove(method string)
	// Clean - removes all methods.
	Clean()
}

Methods represents an interface that allows you to define the service methods whose responses you want to cache.
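
Since Add returns the Methods value, registrations can be chained. A short sketch (the second method name is a placeholder; the time and care imports are assumed):

// Placeholder method names; Add returns Methods, so calls can be chained.
opts := care.NewOptions()
opts.Methods.
	Add("/api.GreeterService/SayHello", 30*time.Second).
	Add("/api.GreeterService/SayGoodbye", 5*time.Minute)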

type Options

type Options struct {
	// The memoization feature is turned on/off.
	Switch Switch
	// A pool of methods for memorizing responses.
	Methods Methods
	// A filter that selects which headers are included in the key computation.
	MetaFilter MetaFilter
	// Used for the key computation.
	Hash Hash
	// A cache
	Cache Cache
}

Options - memoization options. There are some options you can redefine with your own implementations in order to have more flexibility; 'Options' gives enough room to do that.

func NewOptions

func NewOptions() *Options

NewOptions - makes a default options set, with all items filled by thread-safe implementations.
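
A typical customization starts from the defaults and replaces individual items. A sketch (the capacity and method name are arbitrary; the time and care imports are assumed):

opts := care.NewOptions()

// Replace the cache with a larger built-in LRU instance
// (or with your own Cache implementation).
opts.Cache = care.NewInMemoryCache(1024)

// Exclude all headers from the key computation.
opts.MetaFilter = care.NewZeroMetaFilter(false)

// Register the methods whose responses should be memoized.
opts.Methods.Add("/api.GreeterService/SayHello", time.Minute)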

type Switch

type Switch interface {
	// TurnOn - turns on the feature.
	TurnOn()
	// TurnOff - turns off the feature.
	TurnOff()

	// IsTurnedOn - returns a state of the feature (turned on/off).
	IsTurnedOn() bool
}

Switch turns the memoization feature on and off.
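
This makes it possible to toggle memoization at runtime, for example behind a feature flag. A sketch (assuming opts is an *Options and the log package is imported):

// Temporarily disable memoization, e.g. while investigating an issue.
opts.Switch.TurnOff()

if !opts.Switch.IsTurnedOn() {
	log.Println("go-care memoization is currently disabled")
}

// Re-enable it later.
opts.Switch.TurnOn()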
