Moleculer has a built-in caching solution to cache responses of service actions. To enable it, set a cacher type in the broker options and set `cache: true` in the definition of the actions you want to cache.
Cached action example
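A minimal sketch of a cached action (the service and action names are illustrative):

```js
const { ServiceBroker } = require("moleculer");

const broker = new ServiceBroker({
    cacher: "Memory"
});

broker.createService({
    name: "users",
    actions: {
        list: {
            // Enable caching for this action
            cache: true,
            handler(ctx) {
                this.logger.info("Handler called!");
                return [
                    { id: 1, name: "John" },
                    { id: 2, name: "Jane" }
                ];
            }
        }
    }
});

broker.start()
    .then(() => broker.call("users.list"))  // Handler is executed, response is cached
    .then(() => broker.call("users.list")); // Response is returned from the cache
```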
Console messages:
[2017-08-18T13:04:33.845Z] INFO dev-pc/BROKER: Broker started.
As you can see, the `Handler called` message appears only once, because the response of the second request is returned from the cache.
Cache keys
The cacher generates a key from the service name, the action name and the params of the context.
The syntax of the key is:
`<serviceName>.<actionName>:<parameters or hash of parameters>`
So if you call the `posts.list` action with params `{ limit: 5, offset: 20 }`, the cacher calculates a hash from the params. The next time you call this action with the same params, it will find the entry in the cache by that key.
Example hashed cache key for “posts.find” action
`posts.find:limit|5|offset|20`
The params object can contain properties that are not relevant for the cache key, and a long key can also cause performance issues. Therefore it is recommended to set an object for the `cache` property which contains a list of the essential parameter names under the `keys` property.
To use meta keys in cache keys, use the `#` prefix.
Restrict the list of `params` & `meta` properties used for key generation
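A sketch of such a definition (the `posts.list` action and its parameters are illustrative):

```js
// posts.service.js
module.exports = {
    name: "posts",
    actions: {
        list: {
            cache: {
                // Generate the cache key from the "limit" & "offset" params
                // and from the "user.id" meta value
                keys: ["limit", "offset", "#user.id"]
            },
            handler(ctx) {
                // ...
            }
        }
    }
};
```

With this definition, a call with `{ limit: 10, offset: 30 }` and `meta.user.id = 123` produces a key like `posts.list:10|30|123`.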
Performance tip: this solution is pretty fast, so we recommend using it in production.
Limiting cache key length
Occasionally the key can be very long, which can cause performance issues. To avoid this, limit the length of the concatenated params in the key with the `maxParamsLength` cacher option. When the key is longer than the configured limit, the cacher calculates a hash (SHA256) of the full key and appends it to the end of the key.
The minimum value of `maxParamsLength` is `44` (the length of a SHA256 hash in Base64). To disable this feature, set it to `0` or `null`.
Generate a full key from the whole params without limit
cacher.getCacheKey("posts.find", { id: 2, title: "New post", content: "It can be very very looooooooooooooooooong content. So this key will also be too long" }); |
Generate a limited-length key
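A sketch of the configuration (the limit value is illustrative):

```js
const broker = new ServiceBroker({
    cacher: {
        type: "Memory",
        options: {
            maxParamsLength: 60
        }
    }
});

// The generated key is truncated and its end is replaced
// with a SHA256 hash of the full key.
broker.cacher.getCacheKey("posts.find", { id: 2, title: "New post", content: "It can be very very looooooooooooooooooong content. So this key will also be too long" });
```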
Conditional caching
Conditional caching allows you to bypass the cached response and execute the action in order to obtain "fresh" data.
To bypass the cache, set `ctx.meta.$cache` to `false` before calling the action.
Example of turning off the caching for the `greeter.hello` action
broker.call("greeter.hello", { name: "Moleculer" }, { meta: { $cache: false }})) |
As an alternative, a custom function can be implemented to enable bypassing the cache. The custom function accepts the context (`ctx`) instance as an argument, therefore it has access to any params or meta data. This allows the bypass flag to be passed within the request.
Example of a custom conditional caching function
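A sketch of such a service, assuming the action's `cache` object accepts an `enabled` function (the `noCache` param is an illustrative flag):

```js
// greeter.service.js
module.exports = {
    name: "greeter",
    actions: {
        hello: {
            cache: {
                // Bypass the cache when the request contains `noCache: true`
                enabled: ctx => ctx.params.noCache !== true,
                keys: ["name"]
            },
            handler(ctx) {
                this.logger.debug("Handler called!");
                return `Hello ${ctx.params.name}`;
            }
        }
    }
};
```

Calling it with `broker.call("greeter.hello", { name: "Moleculer", noCache: true })` skips the cache and always executes the handler.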
TTL
The default TTL setting can be overridden in the action definition.
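For example (the service is illustrative; the cacher-level `ttl` and the action-level `cache.ttl` are the options described on this page):

```js
const broker = new ServiceBroker({
    cacher: {
        type: "Memory",
        options: {
            ttl: 30 // 30 seconds default TTL
        }
    }
});

broker.createService({
    name: "posts",
    actions: {
        list: {
            cache: {
                // These cache entries expire after 5 seconds instead of the default 30.
                ttl: 5
            },
            handler(ctx) {
                // ...
            }
        }
    }
});
```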
Custom key-generator
To overwrite the built-in cacher key generator, set your own function as `keygen` in the cacher options.
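A sketch of a custom generator; the argument list shown here is an assumption modelled on the built-in generator (action name, params, meta and the configured `keys`):

```js
const broker = new ServiceBroker({
    cacher: {
        type: "Memory",
        options: {
            keygen(name, params, meta, keys) {
                // Build a cache key from the action name and the params
                return `${name}:${JSON.stringify(params)}`;
            }
        }
    }
});
```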
Manual caching
The cacher module can be used manually. Just call the `get`, `set`, `del` methods of `broker.cacher`.
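For example (the keys are illustrative; run inside an async function):

```js
// Save to cache
await broker.cacher.set("mykey.a", { a: 5 });

// Get from cache
const obj = await broker.cacher.get("mykey.a");

// Remove an entry from the cache
await broker.cacher.del("mykey.a");

// Clean all 'mykey' entries
await broker.cacher.clean("mykey.**");

// Clean all entries
await broker.cacher.clean();
```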
Additionally, the complete ioredis client API is available at `broker.cacher.client` when using the built-in Redis cacher:
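A sketch using the standard ioredis pipeline API (the keys and values are illustrative):

```js
// Create an ioredis pipeline, queue some commands and execute them in one round trip
const pipeline = broker.cacher.client.pipeline();
pipeline.set("mykey.a", "myvalue");
pipeline.get("mykey.a");
pipeline.exec((err, results) => {
    // `results` is an array of [err, result] pairs, one per queued command
});
```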
Clear cache
When you create a new model in your service, you have to clear the old cached model entries.
Example to clean the cache inside actions
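A sketch of an action that clears entries after creating a new entity (the `User` model is illustrative):

```js
{
    name: "users",
    actions: {
        create(ctx) {
            // Create a new user entity (illustrative model)
            const user = new User(ctx.params);

            // Clear all cache entries
            this.broker.cacher.clean();

            // Clear all cache entries whose keys start with `users.`
            this.broker.cacher.clean("users.**");

            // Clear multiple cache entries
            this.broker.cacher.clean(["users.**", "posts.**"]);

            // Delete an entry
            this.broker.cacher.del("users.list");

            // Delete multiple entries
            this.broker.cacher.del(["users.model:5", "users.model:8"]);

            return user;
        }
    }
}
```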
Clear cache among multiple service instances
The best practice to clear cache entries among multiple service instances is to use broadcast events. Note that this is only required for non-centralized cachers like `Memory` or `MemoryLRU`.
Example
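A sketch of the pattern (the `users` service and the `User` model are illustrative):

```js
module.exports = {
    name: "users",
    actions: {
        async create(ctx) {
            // Create a new user entity (illustrative model)
            const user = new User(ctx.params);

            // Clear the cache on every instance
            await this.cleanCache();

            return user;
        }
    },

    methods: {
        cleanCache() {
            // Broadcast the event, so all service instances receive it (including this one)
            return this.broker.broadcast("cache.clean.users");
        }
    },

    events: {
        "cache.clean.users"() {
            if (this.broker.cacher) {
                this.broker.cacher.clean("users.**");
            }
        }
    }
};
```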
Clear cache among different services
Service dependency is a common situation. E.g. the `posts` service stores information from the `users` service in cached entries (in case of populating).
Example cache entry in the `posts` service
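An illustrative cached entry, where the `author` field is populated from the `users` service:

```js
{
    _id: 1,
    title: "My post",
    content: "Some content",
    author: {
        _id: 130,
        fullName: "John Doe",
        avatar: "https://..."
    },
    createdAt: 1519729167666
}
```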
The `author` field is received from the `users` service. So if the `users` service clears its cache entries, the `posts` service has to clear its own cache entries as well. Therefore you should also subscribe to the `cache.clean.users` event in the `posts` service.
To make it easier, create a `CacheCleaner` mixin and use it in the schema of the dependent services.
cache.cleaner.mixin.js
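A sketch of such a mixin: it subscribes to a `cache.clean.<serviceName>` event for every listed service and clears the local cache entries of the current service:

```js
module.exports = function(serviceNames) {
    const events = {};

    serviceNames.forEach(name => {
        events[`cache.clean.${name}`] = function() {
            if (this.broker.cacher) {
                this.logger.debug(`Clear local '${this.name}' cache`);
                this.broker.cacher.clean(`${this.name}.*`);
            }
        };
    });

    return {
        events
    };
};
```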
posts.service.js
const CacheCleaner = require("./cache.cleaner.mixin"); |
With this solution, if the `users` service emits a `cache.clean.users` event, the `posts` service will also clear its own cache entries.
Cache locking
Moleculer also supports a cache locking feature. For detailed info check this PR.
Enable Lock
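A sketch using the `lock` cacher option listed in the option tables below (the Redis cacher and TTL value are illustrative):

```js
const broker = new ServiceBroker({
    cacher: {
        type: "Redis",
        options: {
            ttl: 60,
            lock: true // Enable cache locks. Disabled by default.
        }
    }
});
```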
Enable with TTL
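A sketch, assuming the `lock` object accepts a `ttl` (maximum lock time in seconds) and a `staleTime` option:

```js
const broker = new ServiceBroker({
    cacher: {
        type: "Redis",
        options: {
            ttl: 60,
            lock: {
                ttl: 15,       // Maximum amount of time the resource stays locked, in seconds
                staleTime: 10  // If the remaining TTL is below this value, the entry is considered stale
            }
        }
    }
});
```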
Disable Lock
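Since `lock` accepts a Boolean (see the option tables below), the simplest way to keep it disabled explicitly:

```js
const broker = new ServiceBroker({
    cacher: {
        type: "Redis",
        options: {
            ttl: 60,
            lock: false // Locking disabled (this is also the default)
        }
    }
});
```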
Example for Redis cacher with the `redlock` library
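A sketch, assuming the Redis cacher accepts a `redlock` option whose settings are passed to the redlock library (`driftFactor`, `retryCount`, `retryDelay` and `retryJitter` are standard redlock settings; the hosts are illustrative):

```js
const Redis = require("ioredis");

// Separate Redis connections used by redlock (illustrative hosts)
const client1 = new Redis("redis://redis-1:6379");
const client2 = new Redis("redis://redis-2:6379");
const client3 = new Redis("redis://redis-3:6379");

const broker = new ServiceBroker({
    cacher: {
        type: "Redis",
        options: {
            prefix: "MOL",
            ttl: 30,
            redis: {
                host: "redis-server",
                port: 6379,
                db: 0
            },
            lock: true,
            // Redlock settings
            redlock: {
                // Redis clients used for the distributed lock
                clients: [client1, client2, client3],
                // Expected clock drift factor
                driftFactor: 0.01,
                // Max number of lock attempts before erroring
                retryCount: 10,
                // Time in ms between attempts
                retryDelay: 200,
                // Max random time in ms added to retries
                retryJitter: 200
            }
        }
    }
});
```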
Built-in cachers
Memory cacher
MemoryCacher
is a built-in memory cache module. It stores entries in the heap memory.
Enable memory cacher
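For example:

```js
const broker = new ServiceBroker({
    cacher: "Memory"
});
```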
Or
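Setting `cacher: true` enables the default memory cacher as well:

```js
const broker = new ServiceBroker({
    cacher: true
});
```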
Enable with options
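A sketch using the options from the table below (values are illustrative):

```js
const broker = new ServiceBroker({
    cacher: {
        type: "Memory",
        options: {
            ttl: 30,    // 30 seconds time-to-live
            clone: true // Deep-clone the returned cached data
        }
    }
});
```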
Options
Name | Type | Default | Description |
---|---|---|---|
`ttl` | `Number` | `null` | Time-to-live in seconds. |
`clone` | `Boolean` or `Function` | `false` | Clone the cached data when returning it. |
`keygen` | `Function` | `null` | Custom cache key generator function. |
`maxParamsLength` | `Number` | `null` | Maximum length of params in generated keys. |
`lock` | `Boolean` or `Object` | `null` | Enable lock feature. |
Cloning
The cacher uses the lodash `_.cloneDeep` method for cloning. To change it, set a `Function` to the `clone` option instead of a `Boolean`.
Custom clone function with JSON parse & stringify
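A minimal sketch:

```js
const broker = new ServiceBroker({
    cacher: {
        type: "Memory",
        options: {
            clone: data => JSON.parse(JSON.stringify(data))
        }
    }
});
```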
LRU memory cacher
LRU memory cacher
is a built-in LRU cache module. It deletes the least-recently-used items.
Enable LRU cacher
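Using the `MemoryLRU` cacher name:

```js
const broker = new ServiceBroker({
    cacher: "MemoryLRU"
});
```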
With options
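A sketch with the options from the table below (values are illustrative):

```js
let broker = new ServiceBroker({
    cacher: {
        type: "MemoryLRU",
        options: {
            // Maximum number of items in the cache
            max: 100,
            // Time-to-live in seconds
            ttl: 3
        }
    }
});
```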
Options
Name | Type | Default | Description |
---|---|---|---|
`ttl` | `Number` | `null` | Time-to-live in seconds. |
`max` | `Number` | `null` | Maximum items in the cache. |
`clone` | `Boolean` or `Function` | `false` | Clone the cached data when returning it. |
`keygen` | `Function` | `null` | Custom cache key generator function. |
`maxParamsLength` | `Number` | `null` | Maximum length of params in generated keys. |
`lock` | `Boolean` or `Object` | `null` | Enable lock feature. |
Dependencies: to be able to use this cacher, install the `lru-cache` module with the `npm install lru-cache --save` command.
Redis cacher
`RedisCacher` is a built-in Redis-based distributed cache module. It uses the `ioredis` library.
Use it if you have multiple instances of services, because if one instance stores some data in the cache, the other instances will find it.
Enable Redis cacher
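The simplest form:

```js
const broker = new ServiceBroker({
    cacher: "Redis"
});
```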
With connection string
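The connection string is illustrative:

```js
const broker = new ServiceBroker({
    cacher: "redis://redis-server:6379"
});
```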
With options
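A sketch using the options from the table below (host, password and values are illustrative):

```js
const broker = new ServiceBroker({
    cacher: {
        type: "Redis",
        options: {
            // Prefix for generated keys
            prefix: "MOL",
            // Time-to-live in seconds. Disabled: 0 or null
            ttl: 30,
            // Turns Redis client monitoring on
            monitor: false,
            // Redis settings, passed to the `new Redis()` constructor
            redis: {
                host: "redis-server",
                port: 6379,
                password: "1234",
                db: 0
            }
        }
    }
});
```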
With MessagePack serializer
You can define a serializer for Redis Cacher. By default, it uses the JSON serializer.
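A sketch, assuming the built-in MessagePack serializer is registered under the name `"MsgPack"` (the Redis host is illustrative):

```js
const broker = new ServiceBroker({
    cacher: {
        type: "Redis",
        options: {
            ttl: 30,
            // Use the MessagePack serializer instead of the default JSON serializer
            serializer: "MsgPack",
            redis: {
                host: "my-redis"
            }
        }
    }
});
```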
With Redis Cluster Client
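A sketch using the `cluster` option from the table below (node addresses are illustrative):

```js
const broker = new ServiceBroker({
    cacher: {
        type: "Redis",
        options: {
            ttl: 30,
            cluster: {
                // Redis Cluster nodes, passed to the ioredis Cluster client
                nodes: [
                    { host: "127.0.0.1", port: 6380 },
                    { host: "127.0.0.1", port: 6381 },
                    { host: "127.0.0.1", port: 6382 }
                ],
                // Additional ioredis Cluster options
                options: {}
            }
        }
    }
});
```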
Options
Name | Type | Default | Description |
---|---|---|---|
`prefix` | `String` | `null` | Prefix for generated keys. |
`ttl` | `Number` | `null` | Time-to-live in seconds. Disabled: `0` or `null` |
`monitor` | `Boolean` | `false` | Enable Redis client monitoring feature. If enabled, every client operation will be logged (on debug level) |
`redis` | `Object` | `null` | Custom Redis options. Will be passed to the `new Redis()` constructor. Read more. |
`keygen` | `Function` | `null` | Custom cache key generator function. |
`maxParamsLength` | `Number` | `null` | Maximum length of params in generated keys. |
`serializer` | `String` | `"JSON"` | Name of a built-in serializer. |
`cluster` | `Object` | `null` | Redis Cluster client configuration. More information |
`lock` | `Boolean` or `Object` | `null` | Enable lock feature. |
`pingInterval` | `Number` | `null` | Emit a Redis PING command every `pingInterval` milliseconds. Can be used to keep connections alive which may have idle timeouts. |
Dependencies: to be able to use this cacher, install the `ioredis` module with the `npm install ioredis --save` command.
Custom cacher
A custom cache module can be created. We recommend copying the source of MemoryCacher or RedisCacher and implementing the `get`, `set`, `del` and `clean` methods.
Create custom cacher
const BaseCacher = require("moleculer").Cachers.Base; |
Use custom cacher
const { ServiceBroker } = require("moleculer"); |