Cachetools module in Python

Cachetools is a Python module that provides various memoizing collections and decorators, including variants of the functools @lru_cache decorator. To use it, we first need to install it using pip.

pip install cachetools

Cachetools provides five main functions and classes.

  • cached
  • LRUCache
  • TTLCache
  • LFUCache
  • RRCache

Let’s look at each of these in detail with examples.

Cached

cached is used as a decorator. When the decorated function is called, its result is stored in the cache and reused on later calls with the same arguments. By default, a plain dictionary is used as the cache.

Syntax:

@cached(cache={})
def some_fun():
    pass

Example: Let’s see it in action. We will use the time module to measure how much caching speeds up the function.




from cachetools import cached
import time


# Without cached
def fib(n):
    return n if n < 2 else fib(n-1) + fib(n-2)

s = time.time()
print(fib(35))
print("Time Taken: ", time.time() - s)

# Now using cached
s = time.time()

# Use this decorator to enable caching
@cached(cache={})
def fib(n):
    return n if n < 2 else fib(n-1) + fib(n-2)

print(fib(35))
print("Time Taken(cached): ", time.time() - s)


Output:

9227465
Time Taken:  4.553245782852173
9227465
Time Taken(cached):  0.0003821849822998047
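
The cache passed to @cached is an ordinary mapping, so you can keep a reference to it and inspect or clear it yourself. A minimal sketch, assuming an LRUCache of arbitrary size (the names fib_cache and fib are just illustrative):

from cachetools import cached, LRUCache

fib_cache = LRUCache(maxsize=128)

@cached(cache=fib_cache)
def fib(n):
    return n if n < 2 else fib(n-1) + fib(n-2)

fib(35)

# How many results are currently stored
print(fib_cache.currsize)

# Drop all cached results
fib_cache.clear()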

LRUCache

LRUCache is used inside the cached decorator. LRU stands for “Least Recently Used”: when the cache is full, the entry that has not been used for the longest time is discarded. It takes a parameter “maxsize”, which sets the maximum number of results the cache can hold.

Syntax:

@cached(cache=LRUCache(maxsize=3))
def some_fun():
    pass

Example:




from cachetools import cached, LRUCache
import time
  
  
# cache using LRUCache
@cached(cache=LRUCache(maxsize=3))
def myfun(n):
      
    # This delay simulates some task
    s = time.time()
    time.sleep(n)
    print("\nTime Taken: ", time.time() - s)
    return (f"I am executed: {n}")
  
  
# Takes 3 seconds
print(myfun(3))
  
# Takes no time
print(myfun(3))
  
# Takes 2 seconds
print(myfun(2))
  
# Takes 1 second
print(myfun(1))
  
# Takes 4 seconds
print(myfun(4))
  
# Takes no time
print(myfun(1))
  
# Takes 3 seconds again because maxsize = 3
# and the 3 most recently used arguments
# were 2, 4 and 1, so 3 was evicted
print(myfun(3))


Output:

Time Taken:  3.0030977725982666
I am executed: 3
I am executed: 3

Time Taken:  2.002072334289551
I am executed: 2

Time Taken:  1.001115083694458
I am executed: 1

Time Taken:  4.001702070236206
I am executed: 4
I am executed: 1

Time Taken:  3.0030171871185303
I am executed: 3
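
Since LRUCache is itself a dictionary-like object, you can also use one directly to watch the eviction happen. A small sketch (the keys and values here are arbitrary):

from cachetools import LRUCache

cache = LRUCache(maxsize=3)
cache['a'] = 1
cache['b'] = 2
cache['c'] = 3

# Touch 'a' so it becomes the most recently used key
cache['a']

# The cache is full, so the least recently
# used key ('b') is evicted
cache['d'] = 4

print(sorted(cache))   # ['a', 'c', 'd']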

Note: An LRU cache is also available in the standard library’s functools module as lru_cache. It can be imported and used as follows:

from functools import lru_cache
@lru_cache
def myfunc():
    pass
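
functools.lru_cache also exposes a cache_info() helper on the wrapped function, which is handy for checking hits and misses. A quick sketch:

from functools import lru_cache

@lru_cache(maxsize=128)
def fib(n):
    return n if n < 2 else fib(n-1) + fib(n-2)

fib(35)

# CacheInfo(hits=..., misses=..., maxsize=128, currsize=...)
print(fib.cache_info())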

TTLCache

TTLCache or “Time To Live” cache is the third class included in the cachetools module. It takes two parameters, “maxsize” and “ttl”. “maxsize” works the same as in LRUCache, while “ttl” specifies how long (in seconds) a cached entry remains valid before it expires.

Syntax:

@cached(cache=TTLCache(maxsize=33, ttl=600))
def some_fun():
    pass

Example:




from cachetools import cached, TTLCache
import time
  
# Here the 32 most recent results
# will be stored for 1 minute
@cached(cache=TTLCache(maxsize=32, ttl=60))
def myfun(n):
      
    # This delay simulates some task
    s = time.time()
    time.sleep(n)
    print("\nTime Taken: ", time.time() - s)
    return (f"I am executed: {n}")
  
print(myfun(3))
print(myfun(3))
time.sleep(61)
print(myfun(3))


Output:

Time Taken:  3.0031025409698486
I am executed: 3
I am executed: 3

Time Taken:  3.0029332637786865
I am executed: 3
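
A TTLCache can also be created and used directly like a dictionary; entries simply disappear once their ttl has passed. A small sketch (the maxsize and ttl values are arbitrary):

from cachetools import TTLCache
import time

# Entries live for 2 seconds
cache = TTLCache(maxsize=10, ttl=2)

cache['answer'] = 42
print(cache.get('answer'))   # 42

time.sleep(3)
print(cache.get('answer'))   # None, the entry has expired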

LFUCache

LFUCache or “Least Frequently Used” cache is another caching technique that keeps track of how often each item is accessed. It discards the least often used items to make space when necessary. It takes one parameter, “maxsize”, which works the same as in LRUCache.

Syntax:

@cached(cache=LFUCache(maxsize=33))
def some_fun():
    pass

Example:




from cachetools import cached, LFUCache
import time
  
# The cache holds at most 5 items; when it
# is full, the least frequently used item
# is discarded to make room
@cached(cache=LFUCache(maxsize=5))
def myfun(n):
      
    # This delay simulates some task
    s = time.time()
    time.sleep(n)
    print("\nTime Taken: ", time.time() - s)
    return (f"I am executed: {n}")
  
print(myfun(3))
print(myfun(3))
print(myfun(2))
print(myfun(4))
print(myfun(1))
print(myfun(1))
print(myfun(3))
print(myfun(3))
print(myfun(4))


Output:


Time Taken:  3.002413272857666
I am executed: 3
I am executed: 3

Time Taken:  2.002107620239258
I am executed: 2

Time Taken:  4.003819465637207
I am executed: 4

Time Taken:  1.0010886192321777
I am executed: 1
I am executed: 1
I am executed: 3
I am executed: 3
I am executed: 4
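
To make the “least frequently used” policy easier to see, an LFUCache can also be used directly as a mapping, where every lookup counts as a use. A small sketch (the keys and values here are arbitrary):

from cachetools import LFUCache

cache = LFUCache(maxsize=2)
cache['a'] = 1
cache['b'] = 2

# 'a' has now been used more often than 'b'
cache['a']

# The cache is full, so the least frequently
# used key ('b') is evicted
cache['c'] = 3

print(sorted(cache))   # ['a', 'c']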

RRCache

RRCache or “Random Replacement” cache is another caching technique that randomly chooses items in the cache and discards them to free up space when necessary. It takes one parameter, “maxsize”, which works the same as in LRUCache. It also accepts a parameter “choice”, which defaults to random.choice and decides which key to evict.

Syntax:

@cached(cache=RRCache(maxsize=33))
def some_fun():
    pass

Example:




from cachetools import cached, RRCache
import time
  
# The cache holds at most 5 items; when it
# is full, a randomly chosen item is
# discarded to make room
@cached(cache=RRCache(maxsize=5))
def myfun(n):
      
    # This delay simulates some task
    s = time.time()
    time.sleep(n)
    print("\nTime Taken: ", time.time() - s)
    return (f"I am executed: {n}")
  
print(myfun(3))
print(myfun(3))
print(myfun(2))
print(myfun(4))
print(myfun(1))
print(myfun(1))
print(myfun(3))
print(myfun(2))
print(myfun(3))


Output:

Time Taken:  3.003124713897705
I am executed: 3
I am executed: 3

Time Taken:  2.0021231174468994
I am executed: 2

Time Taken:  4.004120588302612
I am executed: 4

Time Taken:  1.0011250972747803
I am executed: 1
I am executed: 1
I am executed: 3
I am executed: 2
I am executed: 3
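
Because replacement is random, repeated runs of this example can evict different keys each time. If you need reproducible evictions, the “choice” parameter accepts any callable that picks one key from the sequence of cached keys; a seeded random generator is one possible (purely illustrative) choice:

import random
from cachetools import cached, RRCache

# A seeded generator makes the "random" evictions
# repeatable between runs
rng = random.Random(42)

@cached(cache=RRCache(maxsize=5, choice=rng.choice))
def myfun(n):
    return f"I am executed: {n}"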
