6.7.1 Cache sizing formula

The size of the memory cache, in terms of the number of entries, should be chosen based on how much memory is available for caching.

The average memory, in bytes, that is used by the system to reference a cached object with its dependency IDs can be computed as:

size = o + c + (k * (dp + tm + 128))

o = the average size of the object, in bytes
c = the average size of the cache ID, in bytes
k = 4 on 32-bit platforms and 8 on 64-bit platforms
dp = the number of dependency IDs that are associated with this object
tm = the number of templates
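As a worked example, the formula can be evaluated to estimate the total memory needed for a cache of a given size. All input values below are illustrative assumptions, not measurements:

```python
# Illustrative sketch of the sizing formula above; every input value
# is an assumption chosen for the example, not a measured figure.

def entry_size(o, c, k, dp, tm):
    """Average bytes used per cached entry: size = o + c + (k * (dp + tm + 128))."""
    return o + c + (k * (dp + tm + 128))

# Assumed averages: 4 KB object, 64-byte cache ID, 64-bit platform (k = 8),
# 3 dependency IDs and 2 templates per entry.
size = entry_size(o=4096, c=64, k=8, dp=3, tm=2)
print(size)           # 5224 bytes per entry
print(2000 * size)    # 10448000 bytes (~10 MB) for a 2000-entry cache
```

Working backwards the same way, dividing the memory budget available for caching by the per-entry size gives a first estimate of the number of entries to configure.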

The number of entries specified should be large enough to hold the cache entries associated with the popular or more frequently used categories. The memory cache, and therefore the memory dedicated to the cache, should be large enough not only to hold content belonging to categories with higher business value, but also enough additional entries to form a working set, minimizing thrashing due to Least Recently Used (LRU) eviction.
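The thrashing effect is easy to demonstrate with a generic LRU cache (a sketch built on Python's OrderedDict, not DynaCache's implementation): when capacity is even one entry smaller than the working set, a cyclic access pattern evicts each entry just before it is needed again, so every access misses.

```python
from collections import OrderedDict

class LRUCache:
    """Minimal LRU cache: evicts the least recently used entry when full."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()
        self.hits = self.misses = 0

    def get_or_load(self, key, loader):
        if key in self.data:
            self.hits += 1
            self.data.move_to_end(key)         # mark as recently used
        else:
            self.misses += 1
            if len(self.data) >= self.capacity:
                self.data.popitem(last=False)  # evict the LRU entry
            self.data[key] = loader(key)
        return self.data[key]

# A working set of 4 keys accessed cyclically against a 3-entry cache
# thrashes: each miss evicts the key that is needed next.
cache = LRUCache(capacity=3)
for _ in range(10):
    for key in ("a", "b", "c", "d"):
        cache.get_or_load(key, lambda k: k.upper())
print(cache.hits, cache.misses)   # 0 40
```

With capacity raised to 4 (the full working set), the same loop would hit on every access after the first pass.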

The Java Virtual Machine (JVM) heap settings should also be tuned. The recommended target is 40% free heap after caching. Reaching this target involves either increasing the JVM heap size or reducing the size of the in-memory cache (or caching objects that require less memory). There are trade-offs: a larger heap, for example, causes longer garbage collection pauses. It is a fine balance that can only be determined with proper testing.
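The 40% free-heap target translates into a quick sizing check: the cache plus all other live heap data must fit in the remaining 60%. A sketch, where the cache footprint and the non-cache heap usage are assumed figures for illustration:

```python
# Sketch: smallest heap that keeps the recommended fraction free after caching.
# All inputs are illustrative assumptions, not measured values.

def min_heap_mb(cache_mb, other_live_mb, free_fraction=0.40):
    """Smallest heap (MB) where cache + other live data fit in (1 - free_fraction) of it."""
    return (cache_mb + other_live_mb) / (1.0 - free_fraction)

# An assumed 600 MB cache plus 300 MB of other live objects needs a
# 1500 MB heap to leave 40% of the heap free.
print(min_heap_mb(600, 300))   # 1500.0
```

If the resulting heap is larger than the deployment can afford (in memory or in garbage collection pause time), the calculation works in reverse: fix the heap size and solve for the cache footprint instead.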

DynaCache cleans expired entries from the memory cache in the background. The daemon responsible for this cleanup wakes every five seconds, which is sufficient for most deployments. For deployments that invalidate infrequently, perhaps only once a day, this interval can be set higher. For deployments with a lot of automated or trigger-driven invalidation, the cleanup interval should be set lower.
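The mechanism can be pictured as a periodic sweep over the cache. The sketch below is a generic illustration of that pattern, analogous in spirit to the cleanup daemon described above but not DynaCache's actual implementation; the class and parameter names are invented for the example:

```python
import threading
import time

class ExpiringCache:
    """Generic sketch: a cache whose expired entries are removed by a
    background daemon thread that wakes at a fixed cleanup interval."""
    def __init__(self, cleanup_interval=5.0):
        self.data = {}                     # key -> (value, expiry timestamp)
        self.lock = threading.Lock()
        self.cleanup_interval = cleanup_interval
        threading.Thread(target=self._sweep, daemon=True).start()

    def put(self, key, value, ttl):
        with self.lock:
            self.data[key] = (value, time.monotonic() + ttl)

    def _sweep(self):
        # Wake periodically and drop entries whose TTL has elapsed.
        while True:
            time.sleep(self.cleanup_interval)
            now = time.monotonic()
            with self.lock:
                for k in [k for k, (_, exp) in self.data.items() if exp <= now]:
                    del self.data[k]
```

The trade-off in the text falls out of this structure: a shorter interval reclaims memory from invalidated entries sooner at the cost of more frequent wake-ups, while a longer interval leaves expired entries occupying memory between sweeps.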
