A dictionary often serves as a cache: with a key, we retrieve a stored value. In Python we can implement caching with a plain dictionary or with functools.
In a method call, we can turn the method parameters into a key, and store and retrieve computed values in a dictionary. This technique is called memoization.
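As a quick sketch (the function and its arguments here are hypothetical, not part of the examples below), a function with several parameters can combine them into a tuple and use that tuple as the dictionary key:

_cache = {}

def area(width, height):
    # Build the key from the arguments.
    key = (width, height)
    # Return the stored value when the key is already present.
    if key in _cache:
        return _cache[key]
    # Otherwise compute, store, and return the result.
    result = width * height
    _cache[key] = result
    return result

print(area(2, 3))
print(area(2, 3))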
Dictionary example
This program introduces compute(). This method receives one number and returns another, and it could be called many times with identical arguments. Without the cache, compute() must repeat its slow computations on every call. With the cache, these computations are done only once per key.

def compute(n):
    # Check the cache.
    if n in cache:
        return cache[n]
    # Do a computation.
    if n > 10:
        n = 10
    v = n ** n
    if v > 1000:
        v /= 2
    # Store the result.
    cache[n] = v
    return v
# Cache is a dictionary.
cache = {}
# Test.
print(compute(1))
print(compute(2))
print(compute(3))
print(compute(4))
print(compute(5))
print(compute(6))
print(compute(7))
print(compute(8))
print(compute(9))
print(compute(10))
# These use the cache.
print(compute(1))
print(compute(2))
print(compute(3))
print(compute(4))
print(compute(5))

Output

1
4
27
256
1562.5
23328.0
411771.5
8388608.0
193710244.5
5000000000.0
1
4
27
256
1562.5

Lru_cache
We can use functools to implement caching (memoization) in a more compact and easier-to-maintain way. The lru_cache decorator discards the least recently used entries, and we can specify the maximum number of cache entries with the maxsize argument.
import functools
@functools.lru_cache(maxsize=12)
def compute(n):
    # A print statement here would show when the body runs (a cache miss).
    if n > 10:
        n = 10
    v = n ** n
    if v > 1000:
        v /= 2
    return v
# Fill up the cache.
print(compute(1))
print(compute(2))
print(compute(3))
print(compute(4))
print(compute(5))
# Cache is hit now.
print(compute(1))
print(compute(2))
print(compute(3))
print(compute(4))
print(compute(5))

Output

1
4
27
256
1562.5
1
4
27
256
1562.5

Methods that are computationally intensive, take few distinct arguments, and receive many identical calls often become faster with memoization.
Memoization is sometimes a useful optimization. At other times it leads to excess memory use and can even make a program slower.
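To judge whether memoization is paying off, a function decorated with functools.lru_cache exposes cache_info() and cache_clear(). This short sketch assumes the compute() from the Lru_cache example above has already been defined and called as shown:

# Inspect hit and miss counts after the calls above.
print(compute.cache_info())
# After 5 distinct calls repeated twice, this reports something like:
# CacheInfo(hits=5, misses=5, maxsize=12, currsize=5)

# Clear the cache if memory use becomes a concern.
compute.cache_clear()
print(compute.cache_info())
# CacheInfo(hits=0, misses=0, maxsize=12, currsize=0)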