Welcome to Part 2 of Effectively Using Caching While Developing APIs in Mule!
There are multiple approaches to implementing caching for APIs in Mule, depending on what data you decide to cache.
The two prominent approaches are:
- Caching the whole API response using the “HTTP Caching” policy in API Manager
- Caching a specific back-end response within the API code using the “Cache Scope” (the focus of this post!)
Approach #1 was discussed in Part 1 of this series; you can read it here. Below, we'll be diving into Approach #2:
2. Caching within the Mule code using Cache Scope:
When it is not feasible to cache the whole API response, you can often still improve overall API response time by applying caching at appropriate places within the processing flows. The goal is to cache (safely, without serving stale data) the output of an expensive, time-consuming code snippet. Note that to realize the benefits of caching, the normal processing time of the snippet (incurred on a “cache miss”) has to be substantially larger than the cost of caching (the time the Mule runtime takes to retrieve the cached data from the underlying object store).
Mule provides the “Cache Scope” as one of its core components for caching frequently used data within flow processing.
Here is the link to the detailed MuleSoft documentation for this core component.
The Mule runtime engine caches the output of the code snippet encapsulated within the Cache Scope, stored against a configurable cache key. For any subsequent call with the same input (the same cache key), the runtime simply skips execution of the snippet and returns the previously cached output. This can result in substantial execution time savings if the code to be cache-scoped is chosen correctly.
Note that this caching works only for non-consumable payloads.
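To make this concrete, below is a minimal sketch of what a Cache Scope configuration can look like in Mule 4 XML, caching an expensive back-end lookup keyed on a query parameter. The flow, configuration, and store names (and the back-end path) are illustrative rather than from any real project; the elements follow the Mule 4 Cache Scope and Object Store documentation, with namespace declarations omitted for brevity.

```xml
<!-- Backing object store: a finite TTL keeps entries from going stale,
     and maxEntries caps memory so only the most-used keys stay cached. -->
<os:object-store name="lookupStore"
                 entryTtl="12" entryTtlUnit="HOURS"
                 maxEntries="500"
                 persistent="false"/>

<!-- Caching strategy: the key expression defines what counts as "the same input". -->
<ee:object-store-caching-strategy name="lookupCachingStrategy"
                                  objectStore="lookupStore"
                                  keyGenerationExpression="#[attributes.queryParams.stateId]"/>

<flow name="get-cities-flow">
    <http:listener config-ref="httpListenerConfig" path="/cities"/>

    <!-- Everything inside the Cache scope runs only on a cache miss;
         on a hit, the previously cached payload is returned instead. -->
    <ee:cache cachingStrategy-ref="lookupCachingStrategy">
        <http:request config-ref="masterDataConfig" method="GET"
                      path="#['/states/' ++ attributes.queryParams.stateId ++ '/cities']"/>
    </ee:cache>
</flow>
```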
When would you use the Cache Scope in your Mule code?
Some of the obvious candidates for caching are file operations and external system interactions (databases, web services, etc.). However, sometimes it is also beneficial to apply caching to other code elements that involve complex internal processing.
In general, the main factors to consider when identifying caching candidates are:
- The code to be cached should generate the same output for a given set of inputs (or cache key) over a predictable time interval.
- The execution time and resource consumption of the code should be much more expensive than the cost of caching (which includes the memory overhead of storing an in-memory cache and/or the processing time for retrieving cached data from persistent storage).
- The code to be cached should be high-usage code, meaning it gets invoked frequently in the live environment.
- It should be possible to define a cache key that results in substantially more cache hits than cache misses in the live environment. Very frequent cache misses in the live environment indicate a poor or infeasible cache implementation.
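To make the cost trade-off concrete, consider a back-of-the-envelope illustration (the numbers are invented for this example): suppose the back-end call inside the scope takes ~900 ms, a cache retrieval takes ~10 ms, and storing a new entry adds ~10 ms on a miss. With a 90% hit rate, the average time for that snippet is roughly 0.9 × 10 + 0.1 × (900 + 10) = 100 ms, versus 900 ms uncached. With only a 10% hit rate, it is about 0.1 × 10 + 0.9 × 910 ≈ 820 ms, barely better than no cache at all.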
Here are some of the common use cases where the Cache Scope can be used quite effectively:
- Static data lookups: Let's say you are developing an API that needs to look up the names of cities in a given state from a back-end master data system via web services (the configuration sketch above follows this pattern). Since this data rarely changes in the back-end system, caching the lookup output saves a good amount of API processing time. Further, using the state name/id as the cache key and setting a finite (but considerably long) TTL helps store only the data for the most frequently requested states, thus minimizing the cost of the caching.
- Back-end data retrievals: When your API needs to invoke external systems to read data that changes rarely, or only after a predictable time interval, the data retrieval operation can be placed inside a Cache Scope. This scenario includes any time-consuming back-end calls such as web services, REST services, file or database access, etc.
- Caching short-lived data such as access tokens: Access tokens typically have a predefined lifespan. When invoking a back-end system that requires its consumers to request an access token (e.g., OAuth2) before making the actual transaction call, it may be beneficial to cache the access token retrieved from the provider and reuse it until it expires (see the sketch after this list). Many third-party systems (such as Salesforce and Cvent) limit the number of access tokens that can be generated for a client id in a day; this caching technique can help greatly when interfacing with such back-end systems.
- Caching embedded complex business logic/rules: Sometimes APIs use complex business logic that generates the same output for a given set of inputs, yet takes a considerable amount of time and CPU resources due to its processing complexity. Such code is an ideal candidate for caching when its resource consumption is much more expensive than the cost of caching.
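As an illustration of the access token use case above, here is a sketch that caches a token for slightly less than its advertised lifetime, so a fresh one is fetched just before the cached copy would expire. All configuration names, paths, and property keys are hypothetical, and the token request follows a generic OAuth2 client-credentials shape rather than any specific provider's API.

```xml
<!-- If the token is valid for 60 minutes, cache it for 50 as a safety margin. -->
<os:object-store name="tokenStore"
                 entryTtl="50" entryTtlUnit="MINUTES"
                 persistent="false"/>

<!-- A constant key works here: there is a single token per client id. -->
<ee:object-store-caching-strategy name="tokenCachingStrategy"
                                  objectStore="tokenStore"
                                  keyGenerationExpression="#['accessToken']"/>

<flow name="get-flight-history-flow">
    <!-- On a miss, request a new token; on a hit, reuse the cached one. -->
    <ee:cache cachingStrategy-ref="tokenCachingStrategy">
        <http:request config-ref="authServerConfig" method="POST" path="/oauth/token">
            <http:body>#[output application/x-www-form-urlencoded ---
                {
                    grant_type: 'client_credentials',
                    client_id: p('client.id'),
                    client_secret: p('client.secret')
                }]</http:body>
        </http:request>
        <!-- Cache only the token string, not the whole HTTP response. -->
        <set-payload value="#[payload.access_token]"/>
    </ee:cache>
    <set-variable variableName="accessToken" value="#[payload]"/>

    <!-- The actual transaction call, reusing the cached token. -->
    <http:request config-ref="backendConfig" method="GET" path="/flightHistory">
        <http:headers>#[{'Authorization': 'Bearer ' ++ vars.accessToken}]</http:headers>
    </http:request>
</flow>
```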
Lastly, though caching is a great way to achieve better API response times and improve performance, overdoing caching across an overall solution landscape should be avoided, as it may result in redundant and wasted resources.
Here is a brief example:
Suppose you are developing an API ecosystem for your organization where you built a system API (say, GET flightHistory) and found that it’s a good candidate for the HTTP Caching policy.
Now, suppose other developers in your organization are building a few process APIs that need to consume this system API, GET flightHistory. Hoping to improve their process API performance, the developers encapsulate the HTTP call to the system API in a Cache Scope.
Assuming both the process APIs and the system API are in the same runtime environment, this two-level caching provides little benefit over single-level caching (at the system API). Rather, the cost of maintaining the second cache will exceed its benefits, making the process-level cache redundant in this case.
In summary, caching techniques can play a significant role in improving the latency and performance of API implementations in Mule, provided they are used in the correct way.