go-wlru

Thread-safe LRU cache with permanency and context-based expiration

Operational Complexity (Time)

Operation   Best   Average   Worst
Access      Θ(1)   Θ(1)      O(1)
Search      Θ(1)   Θ(1)      O(n)
Insertion   Θ(1)   Θ(1)      O(n)
Deletion    Θ(1)   Θ(1)      O(n)
Snapshot    Θ(n)   Θ(n)      Θ(n)

Operational Complexity (Space)

Complexity   Value
Best         Ω(2n)
Average      Ω(2n)
Worst        Ω(n + n log(n))
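
Constant-time bounds like these are characteristic of an LRU built from a hash map paired with a doubly linked list: the map locates an entry directly, and promoting that entry to "most recently used" is a pointer update. The following is a minimal, non-concurrent sketch of that idea for illustration only; go-wlru's actual implementation (built on sync.Map, with permanency and context-based expiration) is more involved.

```go
package lru

import "container/list"

// entry is the value stored in each list element.
type entry struct {
	key   string
	value any
}

// Cache is a minimal, non-concurrent LRU: a map for O(1) lookup and a
// doubly linked list for O(1) recency updates and eviction.
type Cache struct {
	cap   int
	items map[string]*list.Element
	order *list.List // front = most recently used
}

func New(capacity int) *Cache {
	return &Cache{cap: capacity, items: make(map[string]*list.Element), order: list.New()}
}

// Get returns the value and marks the key as most recently used.
func (c *Cache) Get(key string) (any, bool) {
	el, ok := c.items[key]
	if !ok {
		return nil, false
	}
	c.order.MoveToFront(el) // O(1) pointer update
	return el.Value.(*entry).value, true
}

// Set inserts or updates a key, evicting the least recently used entry
// when the capacity is exceeded.
func (c *Cache) Set(key string, value any) {
	if el, ok := c.items[key]; ok {
		el.Value.(*entry).value = value
		c.order.MoveToFront(el)
		return
	}
	c.items[key] = c.order.PushFront(&entry{key: key, value: value})
	if c.order.Len() > c.cap {
		oldest := c.order.Back()
		c.order.Remove(oldest)
		delete(c.items, oldest.Value.(*entry).key)
	}
}
```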

Usage

This is a simple example of an LRU cache structure, built with API request lookup caching in mind. If you decide to use it, do so at your own peril.
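
go-wlru's own types and method signatures are not reproduced here, so the sketch below uses a throwaway sync.Map-backed stand-in to show the cache-aside pattern the package is aimed at: check the cache before making the API request, then store the response for the next caller. Swap in the real cache's lookup and store calls.

```go
package main

import (
	"fmt"
	"io"
	"net/http"
	"sync"
)

// cache is a stand-in for whatever LRU you use; the real cache's methods
// would replace this. A sync.Map keeps the example self-contained.
type cache struct{ m sync.Map }

func (c *cache) get(key string) (string, bool) {
	v, ok := c.m.Load(key)
	if !ok {
		return "", false
	}
	return v.(string), true
}

func (c *cache) set(key, value string) { c.m.Store(key, value) }

// cachedLookup applies the cache-aside pattern: check the cache first,
// fall back to the real API request, then store the response.
func cachedLookup(c *cache, url string) (string, error) {
	if body, ok := c.get(url); ok {
		return body, nil
	}
	resp, err := http.Get(url)
	if err != nil {
		return "", err
	}
	defer resp.Body.Close()
	b, err := io.ReadAll(resp.Body)
	if err != nil {
		return "", err
	}
	c.set(url, string(b))
	return string(b), nil
}

func main() {
	var c cache
	body, err := cachedLookup(&c, "https://example.com")
	if err != nil {
		fmt.Println("lookup failed:", err)
		return
	}
	fmt.Printf("fetched %d bytes (cached for next time)\n", len(body))
}
```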

Thread Safety

All operations should be safe for concurrent use from multiple goroutines.
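
One way to build confidence in a claim like that is to hammer the cache from many goroutines under the race detector (go test -race). The test below is a generic sketch of that shape, again using a sync.Map stand-in rather than go-wlru's API.

```go
package cache

import (
	"fmt"
	"sync"
	"testing"
)

// TestConcurrentAccess exercises a shared store from many goroutines at
// once. Run it with `go test -race` so the race detector can flag
// unsynchronized access; the same shape of test applies to any cache
// that claims to be goroutine safe.
func TestConcurrentAccess(t *testing.T) {
	var store sync.Map
	var wg sync.WaitGroup

	for g := 0; g < 16; g++ {
		wg.Add(1)
		go func(g int) {
			defer wg.Done()
			for i := 0; i < 1000; i++ {
				// Keys only this goroutine writes: safe to assert on.
				own := fmt.Sprintf("g%d-%d", g, i)
				store.Store(own, i)
				if _, ok := store.Load(own); !ok {
					t.Errorf("key %s missing after store", own)
				}
				// Shared keys: contended stores, loads, and deletes.
				shared := fmt.Sprintf("shared-%d", i%100)
				store.Store(shared, g)
				store.Load(shared)
				store.Delete(shared)
			}
		}(g)
	}
	wg.Wait()
}
```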

Benchmarks

BenchmarkSet100k

Running tool: /usr/local/go/bin/go test -benchmem -run=^$ -bench ^BenchmarkSet100k$ github.com/heucuva/go-wlru

goos: linux
goarch: amd64
pkg: github.com/heucuva/go-wlru
cpu: Intel(R) Core(TM) i7-10710U CPU @ 1.10GHz
BenchmarkSet100k-4   	1000000000	         0.05225 ns/op	       0 B/op	       0 allocs/op
PASS
ok  	github.com/heucuva/go-wlru	0.436s

As long as the data set stays relatively small (fewer than 131072 items), performance remains fairly good. Once that size is exceeded, the time to perform access/search/insert/delete operations increases significantly. This is a consequence of the underlying sync.Map implementation, as shown below:

BenchmarkSet1M

Running tool: /usr/local/go/bin/go test -benchmem -run=^$ -bench ^BenchmarkSet1M$ github.com/heucuva/go-wlru

goos: linux
goarch: amd64
pkg: github.com/heucuva/go-wlru
cpu: Intel(R) Core(TM) i7-10710U CPU @ 1.10GHz
BenchmarkSet1M-4   	1000000000	         0.7867 ns/op	       0 B/op	       0 allocs/op
PASS
ok  	github.com/heucuva/go-wlru	42.616s
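
That behaviour can be probed against sync.Map directly. The benchmark below is a rough, illustrative sketch (not the package's own benchmark) for comparing the per-iteration cost of populating a sync.Map below and above that size.

```go
package cache

import (
	"sync"
	"testing"
)

// benchmarkSyncMap stores n distinct keys into a fresh sync.Map per
// iteration, so the cost at different sizes can be compared.
func benchmarkSyncMap(b *testing.B, n int) {
	for i := 0; i < b.N; i++ {
		var m sync.Map
		for k := 0; k < n; k++ {
			m.Store(k, k)
		}
	}
}

// Compare the cost below and above the ~131072-item mark, e.g.:
//   go test -benchmem -run='^$' -bench 'SyncMap'
func BenchmarkSyncMap100k(b *testing.B) { benchmarkSyncMap(b, 100_000) }
func BenchmarkSyncMap1M(b *testing.B)   { benchmarkSyncMap(b, 1_000_000) }
```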

Similar Resources

Solution for Leetcode problem 146: LRU Cache (Jan 30, 2022)
A high performance gin middleware to cache http responses; compared to gin-contrib/cache, it has a huge performance improvement (Dec 28, 2022)
Package cache is a middleware that provides cache management for Flamego (Nov 9, 2022)
A memory cache built on another populator cache, adding lazy load (serving expired data while reloading asynchronously) and singleflight (Oct 28, 2021)
Cache - a simple in-memory LRU cache implementation (Jan 25, 2022)
Gin-cache - Gin cache middleware for Go (Nov 28, 2022)
🧩 Redify - an optimized key-value proxy for quick access to and caching of any other database through Redis and/or the HTTP protocol (Sep 25, 2022)
groupcache - a distributed caching and cache-filling library, intended as a replacement for a pool of memcached nodes in many cases (Dec 31, 2022)
Related tags

Lru - a simple LRU cache using Go generics (Nov 9, 2022)
lru - the most concise and efficient LRU algorithm based on golang (Dec 27, 2021)
GCache - a cache library for Go supporting expirable caches and LFU, LRU and ARC policies; goroutine safe (Dec 30, 2022)
Least-recently-used-LRU- - a CacheEvictionPolicy design with two strategies, including LRU (least recently used) (Jan 4, 2022)
fastcache - fast thread-safe in-memory cache for a large number of entries in Go, minimizing GC overhead (Dec 27, 2022)
go-cache - lightweight thread-safe LRU cache for Go (Dec 12, 2021)
cache2go - concurrency-safe Go caching library with expiration capabilities and access counters (Jan 1, 2023)
cache - an LRU-based cache package written in vanilla Go with no package dependencies (Sep 8, 2022)
LevelDB-style LRU cache for Go, with support for non-GC objects (Jul 5, 2020)
GCache - an in-memory cache library for Go supporting multiple eviction policies: LRU, LFU and ARC (May 31, 2021)