Moleculer has a built-in caching solution to cache responses of service actions. To enable it, set a cacher type in the broker options and set `cache: true` in the definition of every action you want to cache.
Cached action example
```js
const { ServiceBroker } = require("moleculer");

const broker = new ServiceBroker({
    cacher: "Memory"
});

broker.createService({
    name: "users",
    actions: {
        list: {
            // Enable caching for this action
            cache: true,
            handler(ctx) {
                this.logger.info("Handler called!");
                return [
                    { id: 1, name: "John" },
                    { id: 2, name: "Jane" }
                ];
            }
        }
    }
});

broker.start()
    .then(() => broker.call("users.list")) // The handler is called, because the cache is empty
    .then(() => broker.call("users.list")); // Returned from the cache, the handler is not called
```
Console messages:
```
[2017-08-18T13:04:33.845Z] INFO  dev-pc/BROKER: Broker started.
[2017-08-18T13:04:33.853Z] INFO  dev-pc/USERS: Handler called!
```
As you can see, the Handler called message appears only once, because the response of the second request is returned from the cache.
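The mechanic above can be illustrated without Moleculer at all. The following is a minimal sketch, in which `cachedCall` is a hypothetical helper (not a Moleculer API) that stands in for the broker's cache lookup:

```javascript
// Minimal sketch of action-response caching, independent of Moleculer.
// `cachedCall` and `handler` are hypothetical names for illustration only.
const cache = new Map();
let handlerCalls = 0;

function handler(params) {
    handlerCalls++; // counts real executions of the action handler
    return { greeting: `Hello ${params.name}` };
}

function cachedCall(action, params) {
    const key = `${action}:${JSON.stringify(params)}`;
    if (cache.has(key)) return cache.get(key); // cache hit: handler is skipped
    const res = handler(params);
    cache.set(key, res);
    return res;
}

const first = cachedCall("greeter.hello", { name: "Moleculer" }); // handler executes
const second = cachedCall("greeter.hello", { name: "Moleculer" }); // served from cache
```

After both calls, `handlerCalls` is still `1` and both calls returned the same cached object.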
Cache keys
The cacher generates a key from the service name, the action name and the params of the context.
The syntax of key is:
```
<serviceName>.<actionName>:<parameters or hash of parameters>
```
So if you call the `posts.find` action with params `{ limit: 5, offset: 20 }`, the cacher calculates a hash from the params. The next time you call this action with the same params, it will find the entry in the cache by this key.
Example hashed cache key for “posts.find” action
```
posts.find:limit|5|offset|20
```
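The documented key format can be reproduced with a short sketch. The `makeCacheKey` function below is hypothetical (the real generator also handles nested objects and hashing), but it yields the same result for flat params:

```javascript
// Sketch of the default key format: <action>:<key|value pairs joined by "|">.
// `makeCacheKey` is a simplified, hypothetical stand-in for the real generator.
function makeCacheKey(actionName, params) {
    const parts = [];
    for (const [k, v] of Object.entries(params)) {
        parts.push(k, v);
    }
    return `${actionName}:${parts.join("|")}`;
}

const key = makeCacheKey("posts.find", { limit: 5, offset: 20 });
// → "posts.find:limit|5|offset|20"
```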
However, the params can contain properties that are not relevant for the cache key. It can also cause performance issues if the key is too long. Therefore we recommend setting an object for the `cache` property which contains a list of essential parameter names under the `keys` property.
Restrict the list of params & meta properties for key generation
```js
{
    cache: {
        // Generate the cache key from the "limit" & "offset" params and the "user.id" meta
        keys: ["limit", "offset", "#user.id"]
    }
}

// If params is { limit: 10, offset: 30 } and meta is { user: { id: 123 } },
// the cache key will be:
//   <actionName>:10|30|123
```
**Performance:** This solution is pretty fast, so we recommend using it in production.
Cache meta keys
To use meta keys in cache keys, use the `#` prefix.
```js
broker.createService({
    name: "posts",
    actions: {
        list: {
            cache: {
                // Generate the cache key from the "limit" & "offset" params and the "user.id" meta
                keys: ["limit", "offset", "#user.id"]
            },
            handler(ctx) {
                // ...
            }
        }
    }
});
```
Limiting cache key length
Occasionally, the key can be very long, which can cause performance issues. To avoid it, limit the length of the concatenated params in the key with the `maxParamsLength` cacher option. When the key is longer than the configured limit, the cacher calculates a hash (SHA256) from the full key and adds it to the end of the key.

The minimum of `maxParamsLength` is `44` (the SHA256 hash length in Base64). To disable this feature, set it to `0` or `null`.
Generate a full key from the whole params without limit
```js
cacher.getCacheKey("posts.find", { id: 2, title: "New post", content: "It can be very very looooooooooooooooooong content. So this key will also be too long" });
// Key: 'posts.find:id|2|title|New post|content|It can be very very looooooooooooooooooong content. So this key will also be too long'
```
Generate a limited-length key
```js
const broker = new ServiceBroker({
    cacher: {
        type: "Memory",
        options: {
            maxParamsLength: 60
        }
    }
});
```
Conditional caching
Conditional caching allows you to bypass the cached response and execute the action in order to obtain "fresh" data.
To bypass the cache, set `ctx.meta.$cache` to `false` before calling the action.
Example of turning off the caching for the greeter.hello action
```js
broker.call("greeter.hello", { name: "Moleculer" }, { meta: { $cache: false } });
```
Alternatively, a custom function can be implemented to enable bypassing the cache. The custom function accepts the context (`ctx`) instance as an argument, therefore it has access to any params or meta data. This allows the bypass flag to be passed within the request.
Example of a custom conditional caching function
```js
// greeter.service.js
module.exports = {
    name: "greeter",
    actions: {
        hello: {
            cache: {
                // Skip caching if the "noCache" param is true
                enabled: ctx => ctx.params.noCache !== true,
                keys: ["name"]
            },
            handler(ctx) {
                return `Hello ${ctx.params.name}`;
            }
        }
    }
};

// Use the "noCache" param to turn off caching for this request
broker.call("greeter.hello", { name: "Moleculer", noCache: true });
```
TTL
The default TTL setting can be overridden in the action definition.
```js
const broker = new ServiceBroker({
    cacher: {
        type: "Memory",
        options: {
            ttl: 30 // 30 seconds
        }
    }
});

broker.createService({
    name: "posts",
    actions: {
        list: {
            cache: {
                // These cache entries expire after 5 seconds instead of 30.
                ttl: 5
            },
            handler(ctx) {
                // ...
            }
        }
    }
});
```
Custom key-generator
To overwrite the built-in cacher key generator, set your own function as keygen in cacher options.
```js
const broker = new ServiceBroker({
    cacher: {
        type: "Memory",
        options: {
            keygen(name, params, meta, keys) {
                // name - action name
                // params - ctx.params
                // meta - ctx.meta
                // keys - cache keys defined in the action
                return `${name}:${JSON.stringify(params)}`;
            }
        }
    }
});
```
Manual caching
The cacher module can also be used manually. Just call the `get`, `set`, `del` and `clean` methods of `broker.cacher`.
```js
// Save to cache
broker.cacher.set("mykey.a", { a: 5 });

// Get from cache (async)
const obj = await broker.cacher.get("mykey.a");

// Remove an entry from cache
await broker.cacher.del("mykey.a");

// Clean all 'mykey' entries
await broker.cacher.clean("mykey.**");

// Clean all entries
await broker.cacher.clean();
```
Additionally, the complete ioredis client API is available at broker.cacher.client when using the built-in Redis cacher:
```js
// Create an ioredis pipeline
const pipeline = broker.cacher.client.pipeline();
// Set values in cache
pipeline.set("mykey.a", "myvalue.a");
pipeline.set("mykey.b", "myvalue.b");
// Execute the pipeline
pipeline.exec();
```
Clear cache
When you create a new model in your service, sometimes you have to clear the old cached model entries.
Example to clean the cache inside actions
```js
{
    name: "users",
    actions: {
        create(ctx) {
            // Create a new user entity
            const user = new User(ctx.params);

            // Clear all cache entries
            this.broker.cacher.clean();

            // Clear all cache entries whose keys start with "users."
            this.broker.cacher.clean("users.**");

            // Clear multiple cache entries
            this.broker.cacher.clean(["users.**", "posts.**"]);

            // Delete an entry
            this.broker.cacher.del("users.list");

            // Delete multiple entries
            this.broker.cacher.del(["users.model:5", "users.model:8"]);
        }
    }
}
```
Clear cache among multiple service instances
The best practice to clear cache entries among multiple service instances is to use broadcast events.
Example
```js
module.exports = {
    name: "users",
    actions: {
        create(ctx) {
            // Create a new user entity
            const user = new User(ctx.params);

            // Clear the cache
            this.cleanCache();

            return user;
        }
    },

    methods: {
        cleanCache() {
            // Broadcast the event, so all service instances receive it (including this instance).
            this.broker.broadcast("cache.clean.users");
        }
    },

    events: {
        "cache.clean.users"() {
            if (this.broker.cacher) {
                this.broker.cacher.clean("users.**");
            }
        }
    }
};
```
Clear cache among different services
A common case is that your service depends on other services. E.g. the posts service stores information from the users service in cached entries (in case of populating).
Example cache entry in posts service
```js
{
    _id: 1,
    title: "My post",
    content: "Some content",
    author: {
        _id: 130,
        fullName: "John Doe",
        avatar: "https://..."
    },
    createdAt: 1519729167666
}
```
The `author` field is received from the users service. So if the users service clears its cache entries, the posts service has to clear its own cache entries as well. Therefore you should also subscribe to the `cache.clean.users` event in the posts service.
To make it easier, create a `CacheCleaner` mixin and define the dependent services in its constructor.
cache.cleaner.mixin.js
```js
module.exports = function(serviceNames) {
    const events = {};

    serviceNames.forEach(name => {
        events[`cache.clean.${name}`] = function() {
            if (this.broker.cacher) {
                this.logger.debug(`Clear local '${this.name}' cache`);
                this.broker.cacher.clean(`${this.name}.*`);
            }
        };
    });

    return {
        events
    };
};
```
posts.service.js
```js
const CacheCleaner = require("./cache.cleaner.mixin");

module.exports = {
    name: "posts",
    mixins: [CacheCleaner([
        "users",
        "posts"
    ])],

    actions: {
        // ...
    }
};
```
With this solution, if the users service emits a `cache.clean.users` event, the posts service will also clear its own cache entries.
Cache locking
Moleculer also supports a cache locking feature. For detailed info, check the PR.
Enable Lock
```js
const broker = new ServiceBroker({
    cacher: {
        ttl: 60,
        lock: true // Set to true to enable cache locks. Default: false
    }
});
```
Enable with TTL
```js
const broker = new ServiceBroker({
    cacher: {
        ttl: 60,
        lock: {
            ttl: 15, // The maximum amount of time the resource stays locked, in seconds
            staleTime: 10 // If the remaining TTL is less than this number, the resource is considered stale
        }
    }
});
```
Disable Lock
```js
const broker = new ServiceBroker({
    cacher: {
        ttl: 60,
        lock: {
            enabled: false, // Set to false to disable locking
            ttl: 15,
            staleTime: 10
        }
    }
});
```
Example for Redis cacher with redlock library
```js
const broker = new ServiceBroker({
    cacher: {
        type: "Redis",
        options: {
            // Prefix for keys
            prefix: "MOL",
            // Set Time-to-live to 30sec.
            ttl: 30,
            lock: {
                ttl: 15, // The maximum amount of time the resource stays locked, in seconds
                staleTime: 10 // If the remaining TTL is less than this number, the resource is considered stale
            },
            // Redlock settings
            redlock: {
                // Redis clients. Supports node-redis or ioredis. By default it uses the local client.
                clients: [client1, client2, client3],
                // The expected clock drift; for more details see http://redis.io/topics/distlock
                driftFactor: 0.01, // multiplied by lock ttl to determine drift time
                // The max number of times Redlock will attempt to lock a resource before erroring
                retryCount: 10,
                // The time in ms between attempts
                retryDelay: 200,
                // The max time in ms randomly added to retries to improve performance under high contention
                retryJitter: 200
            }
        }
    }
});
```
Built-in cachers
Memory cacher
MemoryCacher is a built-in memory cache module. It stores entries in the heap memory.
Enable memory cacher
```js
const broker = new ServiceBroker({
    cacher: "Memory"
});
```
Or
```js
const broker = new ServiceBroker({
    cacher: true
});
```
Enable with options
```js
const broker = new ServiceBroker({
    cacher: {
        type: "Memory",
        options: {
            ttl: 30, // Set Time-to-live to 30sec. Disabled: 0 or null
            clone: true // Deep-clone the returned value
        }
    }
});
```
Options
| Name | Type | Default | Description |
|---|---|---|---|
| `ttl` | `Number` | `null` | Time-to-live in seconds. |
| `clone` | `Boolean` or `Function` | `false` | Clone the cached data when returning it. |
| `keygen` | `Function` | `null` | Custom cache key generator function. |
| `maxParamsLength` | `Number` | `null` | Maximum length of params in generated keys. |
Cloning
The cacher uses the lodash _.cloneDeep method for cloning. To change it, set a Function to the clone option instead of a Boolean.
Custom clone function with JSON parse & stringify
```js
const broker = new ServiceBroker({
    cacher: {
        type: "Memory",
        options: {
            clone: data => JSON.parse(JSON.stringify(data))
        }
    }
});
```
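To see why cloning matters, consider what happens when cloning is disabled: every caller receives a reference to the same cached object, so mutating a returned value silently corrupts the cache. The following is a framework-free sketch with a plain `Map` standing in for the cacher:

```javascript
// Sketch: a plain Map stands in for the cacher; keys are made up for illustration.
const cache = new Map();

// Without cloning: the caller mutates the shared cached object.
cache.set("users.model:1", { name: "John" });
const direct = cache.get("users.model:1"); // same reference as the cached object
direct.name = "Jane";
const corrupted = cache.get("users.model:1").name; // "Jane" - the cache changed!

// With a JSON-based clone (as in the custom clone option above):
cache.set("users.model:2", { name: "John" });
const clone = data => JSON.parse(JSON.stringify(data));
const copy = clone(cache.get("users.model:2")); // independent copy
copy.name = "Jane";
const safe = cache.get("users.model:2").name; // still "John"
```

Note that the JSON round-trip drops `undefined` values, functions and `Date` instances, which is why `_.cloneDeep` is the default.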
Redis cacher
RedisCacher is a built-in Redis based distributed cache module. It uses ioredis library.
Use it if you have multiple service instances, because if one instance stores some data in the cache, the other instances will find it.
Enable Redis cacher
```js
const broker = new ServiceBroker({
    cacher: "Redis"
});
```
With connection string
```js
const broker = new ServiceBroker({
    cacher: "redis://redis-server:6379"
});
```
With options
```js
const broker = new ServiceBroker({
    cacher: {
        type: "Redis",
        options: {
            // Prefix for keys
            prefix: "MOL",
            // Set Time-to-live to 30sec.
            ttl: 30,
            // Turns Redis client monitoring on.
            monitor: false,
            // Redis settings
            redis: {
                host: "redis-server",
                port: 6379,
                password: "1234",
                db: 0
            }
        }
    }
});
```
With MessagePack serializer
```js
const broker = new ServiceBroker({
    cacher: {
        type: "Redis",
        options: {
            ttl: 30,

            // Use the MessagePack serializer to store the data.
            serializer: "MsgPack",

            redis: {
                host: "my-redis"
            }
        }
    }
});
```
With Redis Cluster Client
```js
const broker = new ServiceBroker({
    cacher: {
        type: "Redis",
        options: {
            ttl: 30,

            cluster: {
                nodes: [
                    { port: 6380, host: "127.0.0.1" },
                    { port: 6381, host: "127.0.0.1" },
                    { port: 6382, host: "127.0.0.1" }
                ],
                // Options passed to the ioredis Cluster client
                options: { /* ... */ }
            }
        }
    }
});
```
Options
| Name | Type | Default | Description |
|---|---|---|---|
| `prefix` | `String` | `null` | Prefix for generated keys. |
| `ttl` | `Number` | `null` | Time-to-live in seconds. Disabled: `0` or `null`. |
| `monitor` | `Boolean` | `false` | Enable the Redis client monitoring feature. If enabled, every client operation is logged (on debug level). |
| `redis` | `Object` | `null` | Custom Redis options. Will be passed to the `new Redis()` constructor. |
| `keygen` | `Function` | `null` | Custom cache key generator function. |
| `maxParamsLength` | `Number` | `null` | Maximum length of params in generated keys. |
| `serializer` | `String` | `null` | Name of a built-in serializer. Default: `"JSON"`. |
| `cluster` | `Object` | `null` | Redis Cluster client configuration. |
**Dependencies:** To be able to use this cacher, install the `ioredis` module with the `npm install ioredis --save` command.
LRU memory cacher
LRU memory cacher is a built-in LRU cache module. It deletes the least-recently-used items.
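The "least-recently-used" policy can be sketched in a few lines using a `Map`'s insertion order. The `TinyLRU` class below is an illustration of the eviction rule, not the actual `lru-cache` implementation:

```javascript
// Sketch of LRU eviction using Map insertion order (illustrative only,
// not the lru-cache module used by the real cacher).
class TinyLRU {
    constructor(max) {
        this.max = max;
        this.map = new Map();
    }
    get(key) {
        if (!this.map.has(key)) return undefined;
        const value = this.map.get(key);
        // Re-insert to mark the entry as most recently used
        this.map.delete(key);
        this.map.set(key, value);
        return value;
    }
    set(key, value) {
        if (this.map.has(key)) this.map.delete(key);
        else if (this.map.size >= this.max) {
            // Evict the least-recently-used entry (the first key in the Map)
            this.map.delete(this.map.keys().next().value);
        }
        this.map.set(key, value);
    }
}

const lru = new TinyLRU(2);
lru.set("a", 1);
lru.set("b", 2);
lru.get("a");    // "a" becomes the most recently used entry
lru.set("c", 3); // capacity exceeded: "b" (least recently used) is evicted
const hasA = lru.map.has("a"); // true
const hasB = lru.map.has("b"); // false
```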
Enable LRU cacher
```js
const broker = new ServiceBroker({
    cacher: "LRU"
});
```
With options
```js
let broker = new ServiceBroker({
    cacher: {
        type: "LRU",
        options: {
            // Maximum number of items
            max: 100,
            // Time-to-Live in seconds
            ttl: 3
        }
    }
});
```
**Dependencies:** To be able to use this cacher, install the `lru-cache` module with the `npm install lru-cache --save` command.
Custom cacher
A custom cache module can be created. We recommend copying the source of MemoryCacher or RedisCacher and implementing the `get`, `set`, `del` and `clean` methods.
Create custom cacher
```js
const BaseCacher = require("moleculer").Cachers.Base;

class MyCacher extends BaseCacher {
    async get(key) { /*...*/ }
    async set(key, data, ttl) { /*...*/ }
    async del(key) { /*...*/ }
    async clean(match = "**") { /*...*/ }
}
```
Use custom cacher
```js
const { ServiceBroker } = require("moleculer");
const MyCacher = require("./my-cacher");

const broker = new ServiceBroker({
    cacher: new MyCacher()
});
```