Python in-memory cache with time to live
- Louise McMahon
- 2015-08-02 11:12
- 6
I have multiple threads running the same process that need to be able to notify each other that something should not be worked on for the next n seconds; it's not the end of the world if they do, however.
My aim is to be able to pass a string and a TTL to the cache and be able to fetch all the strings that are in the cache as a list. The cache can live in memory and the TTLs will be no more than 20 seconds.

Does anyone have any suggestions for how this can be accomplished?
6 Answers
The OP is using Python 2.7, but if you're using Python 3, ExpiringDict mentioned in the accepted answer is currently, well, expired. The last commit to the GitHub repo was June 17, 2017, and there is an open issue that it doesn't work with Python 3.5.

There is a more recently maintained project, cachetools (last commit Jun 14, 2018):
pip install cachetools
from cachetools import TTLCache

cache = TTLCache(maxsize=10, ttl=360)
cache['apple'] = 'top dog'
...
>>> cache['apple']
'top dog'
... after 360 seconds ...
>>> cache['apple']
KeyError exception thrown
ttl is the time to live in seconds.
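To match the original question (store strings, then fetch the unexpired ones as a list), a minimal sketch on top of TTLCache might look like the following. Note that cachetools caches are not synchronized, so a Lock is added for the multi-threaded use case; the names mark and active_strings are made up for illustration:

```python
from threading import Lock
from cachetools import TTLCache

cache = TTLCache(maxsize=1000, ttl=20)  # strings live at most 20 seconds
lock = Lock()

def mark(string):
    """Record that `string` should not be worked on for the next 20 seconds."""
    with lock:
        cache[string] = True

def active_strings():
    """Return all strings currently in the cache; expired entries are skipped."""
    with lock:
        return list(cache.keys())
```

Any thread can call mark() and any other thread will see the string via active_strings() until the TTL elapses.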
User
2018-09-01 13:06
In case you don't want to use any third-party libraries, you can add one more parameter to your expensive function: ttl_hash=None. This new parameter is a so-called "time-sensitive hash"; its only purpose is to affect lru_cache.
For example:
from functools import lru_cache
import time


@lru_cache()
def my_expensive_function(a, b, ttl_hash=None):
    del ttl_hash  # to emphasize we don't use it and to shut pylint up
    return a + b  # horrible CPU load...


def get_ttl_hash(seconds=3600):
    """Return the same value within `seconds` time period"""
    return round(time.time() / seconds)


# somewhere in your code...
res = my_expensive_function(2, 2, ttl_hash=get_ttl_hash())
# cache will be updated once in an hour
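One caveat worth knowing about this trick: the hash flips at fixed wall-clock boundaries, not `seconds` after each call, so an entry cached just before a boundary is invalidated almost immediately. A small sketch with made-up timestamps (instead of the real clock) shows two calls only 0.2 seconds apart landing in different windows:

```python
def get_ttl_hash_at(now, seconds=3600):
    """Same rounding as the helper above, but with an explicit timestamp."""
    return round(now / seconds)

seconds = 10
t_before = 104.9  # 104.9 / 10 = 10.49 -> rounds to 10
t_after = 105.1   # 105.1 / 10 = 10.51 -> rounds to 11

# The two hashes differ, so lru_cache would recompute even though far
# less than `seconds` elapsed between the calls:
assert get_ttl_hash_at(t_before, seconds) != get_ttl_hash_at(t_after, seconds)
```

So the effective lifetime of a cached value varies between 0 and `seconds`; if you need a guaranteed minimum lifetime, this approach alone is not enough.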
iutinvg
2019-11-05 13:29
Regarding a general-purpose expiring in-memory cache, the common design pattern is not a dictionary but a function or method decorator, with the cache dictionary managed behind the scenes. As such, this answer somewhat complements the answer by User, which uses a dictionary rather than a decorator.
The ttl_cache decorator in cachetools==3.1.0 works a lot like functools.lru_cache, but with a time to live.
import cachetools.func


@cachetools.func.ttl_cache(maxsize=128, ttl=10 * 60)
def example_function(key):
    return get_expensively_computed_value(key)


class ExampleClass:
    EXP = 2

    @classmethod
    @cachetools.func.ttl_cache()
    def example_classmethod(cls, i):
        return i * cls.EXP

    @staticmethod
    @cachetools.func.ttl_cache()
    def example_staticmethod(i):
        return i * 3
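To see the TTL behaviour concretely, here is a small self-contained sketch (the call-counting list and the short 0.2-second TTL are made up for demonstration): repeated calls within the TTL are served from the cache, and the function is recomputed only after the TTL elapses.

```python
import time
import cachetools.func

calls = []  # records every real invocation of the underlying function

@cachetools.func.ttl_cache(maxsize=128, ttl=0.2)
def square(x):
    calls.append(x)
    return x * x

assert square(3) == 9
assert square(3) == 9     # served from the cache: no new entry in `calls`
assert calls == [3]

time.sleep(0.3)           # wait for the entry to expire

assert square(3) == 9     # recomputed after the TTL elapsed
assert calls == [3, 3]
```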
Acumenus
2019-04-29 16:27
I absolutely love the idea from @iutinvg; I just wanted to take it a little further: decouple it from having to know to pass the ttl, and just make it a decorator so you don't have to think about it. If you have django and py3 and don't feel like pip-installing any dependencies, try this out.
import time
from functools import lru_cache, partial, update_wrapper

from django.utils.functional import lazy


def lru_cache_time(seconds, maxsize=None):
    """Adds time-aware caching to lru_cache."""
    def wrapper(func):
        # Lazy value that makes sure the lru_cache() invalidates after X secs
        ttl_hash = lazy(lambda: round(time.time() / seconds), int)()

        @lru_cache(maxsize)
        def time_aware(__ttl, *args, **kwargs):
            """
            Main wrapper; note that the first argument ttl is not passed down.
            This is because no wrapped function should need to know it is here.
            """
            def wrapping(*args, **kwargs):
                return func(*args, **kwargs)
            return wrapping(*args, **kwargs)

        return update_wrapper(partial(time_aware, ttl_hash), func)
    return wrapper


@lru_cache_time(seconds=10)
def meaning_of_life():
    """This message should show up if you call help()."""
    print('this better only show up once!')
    return 42


@lru_cache_time(seconds=10)
def multiply(a, b):
    """This message should show up if you call help()."""
    print('this better only show up once!')
    return a * b


# This is a test; it prints a `.` every second, and there should be 10s
# between each "this better only show up once!" (x2 because of the two
# functions).
for _ in range(20):
    meaning_of_life()
    multiply(50, 99991)
    print('.')
    time.sleep(1)
Javier Buzzi
2019-08-01 00:14
Something like this?
from time import time, sleep
import itertools
from threading import Thread, RLock
import signal


class CacheEntry():
    def __init__(self, string, ttl=20):
        self.string = string
        self.expires_at = time() + ttl
        self._expired = False

    def expired(self):
        if self._expired is False:
            return (self.expires_at < time())
        else:
            return self._expired


class CacheList():
    def __init__(self):
        self.entries = []
        self.lock = RLock()

    def add_entry(self, string, ttl=20):
        with self.lock:
            self.entries.append(CacheEntry(string, ttl))

    def read_entries(self):
        with self.lock:
            self.entries = list(itertools.dropwhile(lambda x: x.expired(), self.entries))
            return self.entries


def read_entries(name, slp, cachelist):
    while True:
        print("{}: {}".format(name, ",".join(map(lambda x: x.string, cachelist.read_entries()))))
        sleep(slp)


def add_entries(name, ttl, cachelist):
    s = 'A'
    while True:
        cachelist.add_entry(s, ttl)
        print("Added ({}): {}".format(name, s))
        sleep(1)
        s += 'A'


if __name__ == "__main__":
    signal.signal(signal.SIGINT, signal.SIG_DFL)

    cl = CacheList()

    print_threads = []
    print_threads.append(Thread(None, read_entries, args=('t1', 1, cl)))
    # print_threads.append(Thread(None, read_entries, args=('t2', 2, cl)))
    # print_threads.append(Thread(None, read_entries, args=('t3', 3, cl)))

    adder_thread = Thread(None, add_entries, args=('a1', 2, cl))
    adder_thread.start()

    for t in print_threads:
        t.start()

    for t in print_threads:
        t.join()

    adder_thread.join()
Dawid Gosławski
2015-08-02 12:57
You can use the expiringdict module:

    The core of the library is ExpiringDict class which is an ordered
    dictionary with auto-expiring values for caching purposes.

In the description they do not talk about multithreading, so in order not to mess up, use a Lock.
enrico.bacis
2019-01-23 21:43