A simple generic in-memory caching layer

sc


sc is a simple in-memory caching layer for Go.

Usage

Wrap your function with sc, and it will automatically cache values for the specified amount of time, with minimal overhead.

package main

import (
	"context"
	"fmt"
	"time"

	"github.com/motoki317/sc"
)

type HeavyData struct {
	Data string
	// and all the gazillion fields you may have in your data
}

func retrieveHeavyData(_ context.Context, name string) (*HeavyData, error) {
	// Query to database or something...
	return &HeavyData{
		Data: "my-data-" + name,
	}, nil
}

func main() {
	// Wrap your data retrieval function.
	cache, _ := sc.New[string, *HeavyData](retrieveHeavyData, 1*time.Minute, 2*time.Minute, sc.WithLRUBackend(500))
	// Get will automatically call the wrapped function if the value is missing.
	foo, _ := cache.Get(context.Background(), "foo")
	fmt.Println(foo.Data)
}

For a more detailed guide, see the reference documentation.

Notable Features

  • Simple to use: wrap your function with New() and just call Get().
    • There is no Set() method. Calling Get() will automatically retrieve the value for you.
    • This prevents cache stampede problem idiomatically (see below).
  • Supports Go 1.18 generics - both key and value are generic.
    • No interface{} or any used other than in type parameters, even in internal implementations.
  • All methods are safe to be called from multiple goroutines.
  • Allows 'graceful cache replacement' (if freshFor < ttl) - a single goroutine is launched in the background to re-fetch a fresh value while stale values are served to readers.
  • Allows strict request coalescing (EnableStrictCoalescing() option) - ensures that all returned values are fresh (a niche use-case).

Supported cache backends (cache replacement policy)

The default backend is the built-in map. This is ultra-lightweight, but it does not evict items. Use the built-in map backend only if your keys' cardinality is finite and you are comfortable holding all values in memory.

Otherwise, use the LRU or 2Q backend, which automatically evicts items once the cache exceeds its capacity.

  • Built-in map (default)
  • LRU (Least Recently Used)
  • 2Q (Two Queue Cache)

The design

Why is there no Set() method? / Why can't I dynamically provide a load function to the Get() method?

Short answer: sc is designed as a foolproof 'cache layer', not an overly complicated 'cache library'.

Long answer:

sc is designed as a simple, foolproof 'cache layer'. Users of sc simply wrap data-retrieving functions and retrieve values via the cache. By doing so, sc automatically reuses retrieved values and minimizes load on your data-store.

Now, let's imagine how users would use a more conventional cache library with a Set() method. One could use the Get() and Set() methods to build the following logic:

  1. Get() from the cache.
  2. If the value is not in the cache, retrieve it from the source.
  3. Set() the value.

This is probably the most common use-case, and it is fine for most applications. But if it is not written carefully, the following problems may occur:

  • Under heavy traffic, a cache stampede might occur.
  • Accidentally using different keys for Get() and Set().
  • Over-caching or under-caching by using inappropriate keys.

sc solves the problems mentioned above by acting as a 'cache layer'.

  • sc will manage the requests for you - no risk of accidentally writing a bad caching logic and overloading your data-store with cache stampede.
  • No manual Set() needed - no risk of accidentally using different keys.
  • Only the cache key is passed to the pre-provided replacement function - no risk of over-caching or under-caching.

This is why sc does not have a Set() method and instead requires you to provide the replacement function at setup. This way, there is no risk of a cache stampede or the bugs described above - sc handles it for you.

But I still want to manually Set() value on update!

By the nature of its design, sc is a no-write-allocate type cache. You update the value in the data-store, then call Forget() to clear the value from the cache. sc will automatically load the value the next time Get() is called.

One could design another cache layer library with a Set() method that automatically calls a pre-provided update function to update the data-store and then updates the value in the cache. But that would add a whole other level of complexity - sc aims to be a simple cache layer.

