# lru-cache

A cache object that deletes the least-recently-used items.

Specify a max number of the most recently used items that you
want to keep, and this cache will keep that many of the most
recently accessed items.

This is not primarily a TTL cache, and does not make strong TTL
guarantees. There is no preemptive pruning of expired items by
default, but you _may_ set a TTL on the cache or on a single
`set`. If you do so, it will treat expired items as missing, and
delete them when fetched. If you are more interested in TTL
caching than LRU caching, check out
[@isaacs/ttlcache](http://npm.im/@isaacs/ttlcache).

As of version 7, this is one of the most performant LRU
implementations available in JavaScript, and supports a wide
diversity of use cases. However, note that using some of the
features will necessarily impact performance, by causing the
cache to have to do more work. See the "Performance" section
below.

## Installation

```bash
npm install lru-cache --save
```

## Usage

```js
// hybrid module, either works
import { LRUCache } from 'lru-cache'
// or:
const { LRUCache } = require('lru-cache')
// or in minified form for web browsers:
import { LRUCache } from 'http://unpkg.com/lru-cache@9/dist/mjs/index.min.mjs'

// At least one of 'max', 'ttl', or 'maxSize' is required, to prevent
// unsafe unbounded storage.
//
// In most cases, it's best to specify a max for performance, so all
// the required memory allocation is done up-front.
//
// All the other options are optional, see the sections below for
// documentation on what each one does. Most of them can be
// overridden for specific items in get()/set()
const options = {
  max: 500,

  // for use with tracking overall storage size
  maxSize: 5000,
  sizeCalculation: (value, key) => {
    return 1
  },

  // for use when you need to clean up something when objects
  // are evicted from the cache
  dispose: (value, key) => {
    freeFromMemoryOrWhatever(value)
  },

  // how long to live in ms
  ttl: 1000 * 60 * 5,

  // return stale items before removing from cache?
  allowStale: false,

  updateAgeOnGet: false,
  updateAgeOnHas: false,

  // async method to use for cache.fetch(), for
  // stale-while-revalidate type of behavior
  fetchMethod: async (
    key,
    staleValue,
    { options, signal, context }
  ) => {},
}

const cache = new LRUCache(options)

cache.set('key', 'value')
cache.get('key') // "value"

// non-string keys ARE fully supported
// but note that it must be THE SAME object, not
// just a JSON-equivalent object.
var someObject = { a: 1 }
cache.set(someObject, 'a value')
// Object keys are not toString()-ed
cache.set('[object Object]', 'a different value')
assert.equal(cache.get(someObject), 'a value')
// A similar object with same keys/values won't work,
// because it's a different object identity
assert.equal(cache.get({ a: 1 }), undefined)

cache.clear() // empty the cache
```

If you put more stuff in the cache, then less recently used items
will fall out. That's what an LRU cache is.

## `class LRUCache<K, V, FC = unknown>(options)`

Create a new `LRUCache` object.

When using TypeScript, set the `K` and `V` types to the `key` and
`value` types, respectively.

The `FC` ("fetch context") generic type defaults to `unknown`.
If set to a value other than `void` or `undefined`, then any
calls to `cache.fetch()` _must_ provide a `context` option
matching the `FC` type. If `FC` is set to `void` or `undefined`,
then `cache.fetch()` _must not_ provide a `context` option. See
the documentation on `async fetch()` below.

## Options

All options are available on the LRUCache instance, making it
safe to pass an LRUCache instance as the options argument to make
another empty cache of the same type.

Some options are marked read-only because changing them after
instantiation is not safe. Changing any of the other options
will of course only have an effect on subsequent method calls.

### `max` (read only)

The maximum number of items that remain in the cache (assuming no
TTL pruning or explicit deletions). Note that fewer items may be
stored if size calculation is used, and `maxSize` is exceeded.
This must be a positive finite integer.

At least one of `max`, `maxSize`, or `ttl` is required.

**It is strongly recommended to set a `max` to prevent unbounded
growth of the cache.** See "Storage Bounds Safety" below.

### `maxSize` (read only)

Set to a positive integer to track the sizes of items added to
the cache, and automatically evict items in order to stay below
this size. Note that this may result in fewer than `max` items
being stored.

Attempting to add an item to the cache whose calculated size is
greater than this amount will be a no-op. The item will not be
cached, and no other items will be evicted.

Optional, must be a positive integer if provided.

Sets `maxEntrySize` to the same value, unless a different value
is provided for `maxEntrySize`.

At least one of `max`, `maxSize`, or `ttl` is required.

Even if size tracking is enabled, **it is strongly recommended to
set a `max` to prevent unbounded growth of the cache.** See
"Storage Bounds Safety" below.

### `maxEntrySize`

Set to a positive integer to track the sizes of items added to
the cache, and prevent caching any item over a given size.
Attempting to add an item whose calculated size is greater than
this amount will be a no-op. The item will not be cached, and no
other items will be evicted.

Optional, must be a positive integer if provided. Defaults to
the value of `maxSize` if provided.

### `sizeCalculation`

Function used to calculate the size of stored items. If you're
storing strings or buffers, then you probably want to do
something like `n => n.length`. The item is passed as the first
argument, and the key is passed as the second argument.

This may be overridden by passing an options object to
`cache.set()`.

Requires `maxSize` to be set.

If the `size` (or return value of `sizeCalculation`) for a given
entry is greater than `maxEntrySize`, then the item will not be
added to the cache.

### `fetchMethod` (read only)

Function that is used to make background asynchronous fetches.
Called with `fetchMethod(key, staleValue, { signal, options,
context })`. May return a Promise.

If `fetchMethod` is not provided, then `cache.fetch(key)` is
equivalent to `Promise.resolve(cache.get(key))`.

If at any time `signal.aborted` becomes `true`, or the signal
dispatches an `'abort'` event (which you can listen for via
`signal.onabort` or `addEventListener`), then the fetch should be
abandoned. The signal may be passed along to async functions
aware of AbortController/AbortSignal behavior.

The `fetchMethod` should **only** return `undefined` or a Promise
resolving to `undefined` if the AbortController signaled an
`abort` event. In all other cases, it should return or resolve
to a value suitable for adding to the cache.

The `options` object is a union of the options that may be
provided to `set()` and `get()`. If they are modified, then that
will modify the options passed to `cache.set()` when the value is
resolved, and in the case of `noDeleteOnFetchRejection` and
`allowStaleOnFetchRejection`, the handling of `fetchMethod`
failures.

For example, a DNS cache may update the TTL based on the value
returned from a remote DNS server by changing `options.ttl` in
the `fetchMethod`.

### `noDeleteOnFetchRejection`

If a `fetchMethod` throws an error or returns a rejected promise,
then by default, any existing stale value will be removed from
the cache.

If `noDeleteOnFetchRejection` is set to `true`, then this
behavior is suppressed, and the stale value remains in the cache
in the case of a rejected `fetchMethod`.

This is important in cases where a `fetchMethod` is _only_ called
as a background update while the stale value is returned, when
`allowStale` is used.

This is implicitly in effect when `allowStaleOnFetchRejection` is
set.

This may be set in calls to `fetch()`, or defaulted on the
constructor, or overridden by modifying the options object in the
`fetchMethod`.

### `allowStaleOnFetchRejection`

Set to true to return a stale value from the cache when a
`fetchMethod` throws an error or returns a rejected Promise.

If a `fetchMethod` fails, and there is no stale value available,
the `fetch()` will resolve to `undefined`. Ie, all `fetchMethod`
errors are suppressed.

Implies `noDeleteOnFetchRejection`.

This may be set in calls to `fetch()`, or defaulted on the
constructor, or overridden by modifying the options object in the
`fetchMethod`.

### `allowStaleOnFetchAbort`

Set to true to return a stale value from the cache when the
`AbortSignal` passed to the `fetchMethod` dispatches an `'abort'`
event, whether user-triggered, or due to internal cache behavior.

Unless `ignoreFetchAbort` is also set, the underlying
`fetchMethod` will still be considered canceled, and any value
it returns will be ignored and not cached.

Caveat: since fetches are aborted when a new value is explicitly
set in the cache, this can lead to fetch returning a stale value,
since that was the fallback value _at the moment the `fetch()` was
initiated_, even though the new updated value is now present in
the cache.

For example:

```ts
const cache = new LRUCache<string, any>({
  ttl: 100,
  fetchMethod: async (url, oldValue, { signal }) => {
    const res = await fetch(url, { signal })
    return await res.json()
  }
})
cache.set('https://example.com/', { some: 'data' })
// 100ms go by...
const result = cache.fetch('https://example.com/')
cache.set('https://example.com/', { other: 'thing' })
console.log(await result) // { some: 'data' }
console.log(cache.get('https://example.com/')) // { other: 'thing' }
```

### `ignoreFetchAbort`

Set to true to ignore the `abort` event emitted by the
`AbortSignal` object passed to `fetchMethod`, and still cache the
resulting resolution value, as long as it is not `undefined`.

When used on its own, this means aborted `fetch()` calls are not
immediately resolved or rejected when they are aborted, and
instead take the full time to await.

When used with `allowStaleOnFetchAbort`, aborted `fetch()` calls
will resolve immediately to their stale cached value or
`undefined`, and will continue to process and eventually update
the cache when they resolve, as long as the resulting value is
not `undefined`, thus supporting a "return stale on timeout while
refreshing" mechanism by passing `AbortSignal.timeout(n)` as the
signal.

For example:

```js
const c = new LRUCache({
  ttl: 100,
  ignoreFetchAbort: true,
  allowStaleOnFetchAbort: true,
  fetchMethod: async (key, oldValue, { signal }) => {
    // note: do NOT pass the signal to fetch()!
    // let's say this fetch can take a long time.
    const res = await fetch(`https://slow-backend-server/${key}`)
    return await res.json()
  },
})

// this will return the stale value after 100ms, while still
// updating in the background for next time.
const val = await c.fetch('key', { signal: AbortSignal.timeout(100) })
```

**Note**: regardless of this setting, an `abort` event _is still
emitted on the `AbortSignal` object_, so it may cause invalid
results when passed to other underlying APIs that use
AbortSignals.

This may be overridden on the `fetch()` call or in the
`fetchMethod` itself.

### `dispose` (read only)

Function that is called on items when they are dropped from the
cache, as `this.dispose(value, key, reason)`.

This can be handy if you want to close file descriptors or do
other cleanup tasks when items are no longer stored in the cache.

**NOTE**: It is called _before_ the item has been fully removed
from the cache, so if you want to put it right back in, you need
to wait until the next tick. If you try to add it back in during
the `dispose()` function call, it will break things in subtle and
weird ways.

Unlike several other options, this may _not_ be overridden by
passing an option to `set()`, for performance reasons.

The `reason` will be one of the following strings, corresponding
to the reason for the item's deletion:

- `evict` Item was evicted to make space for a new addition
- `set` Item was overwritten by a new value
- `delete` Item was removed by explicit `cache.delete(key)` or by
  calling `cache.clear()`, which deletes everything.

The `dispose()` method is _not_ called for canceled calls to
`fetchMethod()`. If you wish to handle evictions, overwrites,
and deletes of in-flight asynchronous fetches, you must use the
`AbortSignal` provided.

Optional, must be a function.

### `disposeAfter` (read only)

The same as `dispose`, but called _after_ the entry is completely
removed and the cache is once again in a clean state.

It is safe to add an item right back into the cache at this
point. However, note that it is _very_ easy to inadvertently
create infinite recursion this way.

The `disposeAfter()` method is _not_ called for canceled calls to
`fetchMethod()`. If you wish to handle evictions, overwrites,
and deletes of in-flight asynchronous fetches, you must use the
`AbortSignal` provided.

### `noDisposeOnSet`

Set to `true` to suppress calling the `dispose()` function if the
entry key is still accessible within the cache.

This may be overridden by passing an options object to
`cache.set()`.

Boolean, default `false`. Only relevant if `dispose` or
`disposeAfter` options are set.

### `ttl`

Max time to live for items before they are considered stale.
Note that stale items are NOT preemptively removed by default,
and MAY live in the cache, contributing to its LRU max, long
after they have expired.

Also, as this cache is optimized for LRU/MRU operations, some of
the staleness/TTL checks will reduce performance.

This is not primarily a TTL cache, and does not make strong TTL
guarantees. There is no preemptive pruning of expired items, but
you _may_ set a TTL on the cache, and it will treat expired items
as missing when they are fetched, and delete them.

Optional, but must be a positive integer in ms if specified.

This may be overridden by passing an options object to
`cache.set()`.

At least one of `max`, `maxSize`, or `ttl` is required.

Even if ttl tracking is enabled, **it is strongly recommended to
set a `max` to prevent unbounded growth of the cache.** See
"Storage Bounds Safety" below.

If ttl tracking is enabled, and `max` and `maxSize` are not set,
and `ttlAutopurge` is not set, then a warning will be emitted
cautioning about the potential for unbounded memory consumption.
(The TypeScript definitions will also discourage this.)

### `noUpdateTTL`

Boolean flag to tell the cache to not update the TTL when setting
a new value for an existing key (ie, when updating a value rather
than inserting a new value). Note that the TTL value is _always_
set (if provided) when adding a new entry into the cache.

This may be passed as an option to `cache.set()`.

Boolean, default false.

### `ttlResolution`

Minimum amount of time in ms in which to check for staleness.
Defaults to `1`, which means that the current time is checked at
most once per millisecond.

Set to `0` to check the current time every time staleness is
tested.

Note that setting this to a higher value _will_ improve
performance somewhat while using ttl tracking, albeit at the
expense of keeping stale items around a bit longer than intended.

### `ttlAutopurge`

Preemptively remove stale items from the cache.

Note that this may _significantly_ degrade performance,
especially if the cache is storing a large number of items. It
is almost always best to just leave the stale items in the cache,
and let them fall out as new items are added.

Note that this means that `allowStale` is a bit pointless, as
stale items will be deleted almost as soon as they expire.

Use with caution!

Boolean, default `false`.

### `allowStale`

By default, if you set `ttl`, it'll only delete stale items from
the cache when you `get(key)`. That is, it's not preemptively
pruning items.

If you set `allowStale:true`, it'll return the stale value as
well as deleting it. If you don't set this, then it'll return
`undefined` when you try to get a stale entry.

Note that when a stale entry is fetched, _even if it is returned
due to `allowStale` being set_, it is removed from the cache
immediately. You can immediately put it back in the cache if you
wish, thus resetting the TTL.

This may be overridden by passing an options object to
`cache.get()`. The `cache.has()` method will always return
`false` for stale items.

Boolean, default false, only relevant if `ttl` is set.

### `noDeleteOnStaleGet`

When using time-expiring entries with `ttl`, by default stale
items will be removed from the cache when the key is accessed
with `cache.get()`.

Setting `noDeleteOnStaleGet` to `true` will cause stale items to
remain in the cache, until they are explicitly deleted with
`cache.delete(key)`, or retrieved with `noDeleteOnStaleGet` set
to `false`.

This may be overridden by passing an options object to
`cache.get()`.

Boolean, default false, only relevant if `ttl` is set.

### `updateAgeOnGet`

When using time-expiring entries with `ttl`, setting this to
`true` will make each item's age reset to 0 whenever it is
retrieved from cache with `get()`, causing it to not expire. (It
can still fall out of cache based on recency of use, of course.)

This may be overridden by passing an options object to
`cache.get()`.

Boolean, default false, only relevant if `ttl` is set.

### `updateAgeOnHas`

When using time-expiring entries with `ttl`, setting this to
`true` will make each item's age reset to 0 whenever its presence
in the cache is checked with `has()`, causing it to not expire.
(It can still fall out of cache based on recency of use, of
course.)

This may be overridden by passing an options object to
`cache.has()`.

Boolean, default false, only relevant if `ttl` is set.
## API

### `new LRUCache<K, V, FC = unknown>(options)`

Create a new LRUCache. All options are documented above, and are
on the cache as public members.

The `K` and `V` types define the key and value types,
respectively. The optional `FC` type defines the type of the
`context` object passed to `cache.fetch()`.

Keys and values **must not** be `null` or `undefined`.

### `cache.max`, `cache.maxSize`, `cache.allowStale`, `cache.noDisposeOnSet`, `cache.sizeCalculation`, `cache.dispose`, `cache.ttl`, `cache.updateAgeOnGet`, `cache.updateAgeOnHas`

All option names are exposed as public members on the cache
object.

These are intended for read access only. Changing them during
program operation can cause undefined behavior.

### `cache.size`

The total number of items held in the cache at the current
moment.

### `cache.calculatedSize`

The total size of items in cache when using size tracking.

### `set(key, value, [{ size, sizeCalculation, ttl, noDisposeOnSet, start, status }])`

Add a value to the cache.

Optional options object may contain `ttl` and `sizeCalculation`
as described above, which default to the settings on the cache
object.

If `start` is provided, then that will set the effective start
time for the TTL calculation. Note that this must be a previous
value of `performance.now()` if supported, or a previous value of
`Date.now()` if not.

Options object may also include `size`, which will prevent
calling the `sizeCalculation` function and just use the specified
number if it is a positive integer, and `noDisposeOnSet` which
will prevent calling a `dispose` function in the case of
overwrites.

If the `size` (or return value of `sizeCalculation`) for a given
entry is greater than `maxEntrySize`, then the item will not be
added to the cache.

Will update the recency of the entry.

Returns the cache object.

For the usage of the `status` option, see **Status Tracking**
below.

If the value is `undefined`, then this is an alias for
`cache.delete(key)`. `undefined` is never stored in the cache.
See **Storing Undefined Values** below.

### `get(key, { updateAgeOnGet, allowStale, status } = {}) => value`

Return a value from the cache.

Will update the recency of the cache entry found.

If the key is not found, `get()` will return `undefined`.

For the usage of the `status` option, see **Status Tracking**
below.

### `async fetch(key, options = {}) => Promise`

The following options are supported:

- `updateAgeOnGet`
- `allowStale`
- `size`
- `sizeCalculation`
- `ttl`
- `noDisposeOnSet`
- `forceRefresh`
- `status` - See **Status Tracking** below.
- `signal` - AbortSignal can be used to cancel the `fetch()`.
  Note that the `signal` option provided to the `fetchMethod` is
  a different object, because it must also respond to internal
  cache state changes, but aborting this signal will abort the
  one passed to `fetchMethod` as well.
- `context` - sets the `context` option passed to the underlying
  `fetchMethod`.

If the value is in the cache and not stale, then the returned
Promise resolves to the value.

If not in the cache, or beyond its TTL staleness, then
`fetchMethod(key, staleValue, { options, signal, context })` is
called, and the value returned will be added to the cache once
resolved.

If called with `allowStale`, and an asynchronous fetch is
currently in progress to reload a stale value, then the former
stale value will be returned.

If called with `forceRefresh`, then the cached item will be
re-fetched, even if it is not stale. However, if `allowStale` is
set, then the old value will still be returned. This is useful
in cases where you want to force a reload of a cached value. If
a background fetch is already in progress, then `forceRefresh`
has no effect.

Multiple fetches for the same `key` will only call `fetchMethod`
a single time, and all will be resolved when the value is
resolved, even if different options are used.

If `fetchMethod` is not specified, then this is effectively an
alias for `Promise.resolve(cache.get(key))`.

When the fetch method resolves to a value, if the fetch has not
been aborted due to deletion, eviction, or being overwritten,
then it is added to the cache using the options provided.

If the key is evicted or deleted before the `fetchMethod`
resolves, then the AbortSignal passed to the `fetchMethod` will
receive an `abort` event, and the promise returned by `fetch()`
will reject with the reason for the abort.

If a `signal` is passed to the `fetch()` call, then aborting the
signal will abort the fetch and cause the `fetch()` promise to
reject with the reason provided.

#### Setting `context`

If the `FC` type is set to a type other than `unknown`, `void`, or
`undefined` in the LRUCache constructor, then all
calls to `cache.fetch()` _must_ provide a `context` option. If
set to `undefined` or `void`, then calls to fetch _must not_
provide a `context` option.

The `context` param allows you to provide arbitrary data that
might be relevant in the course of fetching the data. It is only
relevant for the course of a single `fetch()` operation, and
discarded afterwards.

#### Note: `fetch()` calls are inflight-unique

If you call `fetch()` multiple times with the same key value,
then every call after the first will resolve on the same
promise<sup>1</sup>,
_even if they have different settings that would otherwise change
the behavior of the fetch_, such as `noDeleteOnFetchRejection`
or `ignoreFetchAbort`.

In most cases, this is not a problem (in fact, only fetching
something once is what you probably want, if you're caching in
the first place). If you are changing the fetch() options
dramatically between runs, there's a good chance that you might
be trying to fit divergent semantics into a single object, and
would be better off with multiple cache instances.

**1**: Ie, they're not the "same Promise", but they resolve at
the same time, because they're both waiting on the same
underlying fetchMethod response.

### `peek(key, { allowStale } = {}) => value`

Like `get()` but doesn't update recency or delete stale items.

Returns `undefined` if the item is stale, unless `allowStale` is
set either on the cache or in the options object.

### `has(key, { updateAgeOnHas, status } = {}) => Boolean`

Check if a key is in the cache, without updating the recency of
use. Age is updated if `updateAgeOnHas` is set to `true` in
either the options or the constructor.

Will return `false` if the item is stale, even though it is
technically in the cache. The difference can be determined (if
it matters) by using a `status` argument, and inspecting the
`has` field.

For the usage of the `status` option, see **Status Tracking**
below.

### `delete(key)`

Deletes a key out of the cache.

Returns `true` if the key was deleted, `false` otherwise.

### `clear()`

Clear the cache entirely, throwing away all values.

### `keys()`

Return a generator yielding the keys in the cache, in order from
most recently used to least recently used.

### `rkeys()`

Return a generator yielding the keys in the cache, in order from
least recently used to most recently used.

### `values()`

Return a generator yielding the values in the cache, in order
from most recently used to least recently used.

### `rvalues()`

Return a generator yielding the values in the cache, in order
from least recently used to most recently used.

### `entries()`

Return a generator yielding `[key, value]` pairs, in order from
most recently used to least recently used.

### `rentries()`

Return a generator yielding `[key, value]` pairs, in order from
least recently used to most recently used.

### `find(fn, [getOptions])`

Find a value for which the supplied `fn` method returns a truthy
value, similar to `Array.find()`.

`fn` is called as `fn(value, key, cache)`.

The optional `getOptions` are applied to the resulting `get()` of
the item found.

### `dump()`

Return an array of `[key, entry]` objects which can be passed to
`cache.load()`.

The `start` fields are calculated relative to a portable
`Date.now()` timestamp, even if `performance.now()` is available.

Stale entries are always included in the `dump`, even if
`allowStale` is false.

Note: this returns an actual array, not a generator, so it can be
more easily passed around.

### `load(entries)`

Reset the cache and load in the items in `entries` in the order
listed. Note that the shape of the resulting cache may be
different if the same options are not used in both caches.

The `start` fields are assumed to be calculated relative to a
portable `Date.now()` timestamp, even if `performance.now()` is
available.

### `purgeStale()`

Delete any stale entries. Returns `true` if anything was
removed, `false` otherwise.

### `getRemainingTTL(key)`

Return the number of ms left in the item's TTL. If item is not
in cache, returns `0`. Returns `Infinity` if item is in cache
without a defined TTL.

### `forEach(fn, [thisp])`

Call the `fn` function with each set of `fn(value, key, cache)`
in the LRU cache, from most recent to least recently used.

Does not affect recency of use.

If `thisp` is provided, function will be called in the
`this`-context of the provided object.

### `rforEach(fn, [thisp])`

Same as `cache.forEach(fn, thisp)`, but in order from least
recently used to most recently used.

### `pop()`

Evict the least recently used item, returning its value.

Returns `undefined` if cache is empty.

## Status Tracking

Occasionally, it may be useful to track the internal behavior of
the cache, particularly for logging, debugging, or for behavior
within the `fetchMethod`. To do this, you can pass a `status`
object to the `get()`, `set()`, `has()`, and `fetch()` methods.

The `status` option should be a plain JavaScript object.

The following fields will be set appropriately:

```ts
interface Status<V> {
  /**
   * The status of a set() operation.
   *
   * - add: the item was not found in the cache, and was added
   * - update: the item was in the cache, with the same value provided
   * - replace: the item was in the cache, and replaced
   * - miss: the item was not added to the cache for some reason
   */
  set?: 'add' | 'update' | 'replace' | 'miss'

  /**
   * the ttl stored for the item, or undefined if ttls are not used.
   */
  ttl?: LRUMilliseconds

  /**
   * the start time for the item, or undefined if ttls are not used.
   */
  start?: LRUMilliseconds

  /**
   * The timestamp used for TTL calculation
   */
  now?: LRUMilliseconds

  /**
   * the remaining ttl for the item, or undefined if ttls are not used.
   */
  remainingTTL?: LRUMilliseconds

  /**
   * The calculated size for the item, if sizes are used.
   */
  size?: LRUSize

  /**
   * A flag indicating that the item was not stored, due to exceeding the
   * {@link maxEntrySize}
   */
  maxEntrySizeExceeded?: true

  /**
   * The old value, specified in the case of `set:'update'` or
   * `set:'replace'`
   */
  oldValue?: V

  /**
   * The results of a {@link has} operation
   *
   * - hit: the item was found in the cache
   * - stale: the item was found in the cache, but is stale
   * - miss: the item was not found in the cache
   */
  has?: 'hit' | 'stale' | 'miss'

  /**
   * The status of a {@link fetch} operation.
   * Note that this can change as the underlying fetch() moves through
   * various states.
   *
   * - inflight: there is another fetch() for this key which is in process
   * - get: there is no fetchMethod, so {@link get} was called.
   * - miss: the item is not in cache, and will be fetched.
   * - hit: the item is in the cache, and was resolved immediately.
   * - stale: the item is in the cache, but stale.
   * - refresh: the item is in the cache, and not stale, but
   *   {@link forceRefresh} was specified.
   */
  fetch?: 'get' | 'inflight' | 'miss' | 'hit' | 'stale' | 'refresh'

  /**
   * The {@link fetchMethod} was called
   */
  fetchDispatched?: true

  /**
   * The cached value was updated after a successful call to the
   * fetchMethod, ie the results of the fetchMethod promise were
   * stored in the cache
   */
  fetchUpdated?: true

  /**
   * The reason for a fetch() rejection. Either the error raised by the
   * {@link fetchMethod}, or the reason for an AbortSignal.
   */
  fetchError?: Error

  /**
   * The fetch received an abort signal
   */
  fetchAborted?: true

  /**
   * The abort signal received was ignored, and the fetch was allowed to
   * continue.
   */
  fetchAbortIgnored?: true

  /**
   * The fetchMethod promise resolved successfully
   */
  fetchResolved?: true

  /**
   * The fetchMethod promise was rejected
   */
  fetchRejected?: true

  /**
   * The status of a {@link get} operation.
   *
   * - fetching: The item is currently being fetched. If a previous value
   *   is present and allowed, that will be returned.
   * - stale: The item is in the cache, and is stale.
   * - hit: the item is in the cache
   * - miss: the item is not in the cache
   */
  get?: 'stale' | 'hit' | 'miss'

  /**
   * A fetch or get operation returned a stale value.
   */
  returnedStale?: true
}
```

## Storage Bounds Safety

This implementation aims to be as flexible as possible, within
the limits of safe memory consumption and optimal performance.

At initial object creation, storage is allocated for `max` items.
If `max` is set to zero, then some performance is lost, and item
count is unbounded. Either `maxSize` or `ttl` _must_ be set if
`max` is not specified.

If `maxSize` is set, then this creates a safe limit on the
maximum storage consumed, but without the performance benefits of
pre-allocation. When `maxSize` is set, every item _must_ provide
a size, either via the `sizeCalculation` method provided to the
constructor, or via a `size` or `sizeCalculation` option provided
to `cache.set()`. The size of every item _must_ be a positive
integer.

If neither `max` nor `maxSize` is set, then `ttl` tracking must
be enabled. Note that, even when tracking item `ttl`, items are
_not_ preemptively deleted when they become stale, unless
`ttlAutopurge` is enabled. Instead, they are only purged the
next time the key is requested. Thus, if `ttlAutopurge`, `max`,
and `maxSize` are all unset, then the cache will potentially
grow unbounded.

In this case, a warning is printed to standard error. Future
versions may require the use of `ttlAutopurge` if `max` and
`maxSize` are not specified.

If you truly wish to use a cache that is bound _only_ by TTL
expiration, consider using a `Map` object, and calling
`setTimeout` to delete entries when they expire. It will perform
much better than an LRU cache.

Here is an implementation you may use, under the same
[license](./LICENSE) as this package:
```js
// a storage-unbounded ttl cache that is not an lru-cache
const cache = {
  data: new Map(),
  timers: new Map(),
  set: (k, v, ttl) => {
    if (cache.timers.has(k)) {
      clearTimeout(cache.timers.get(k))
    }
    cache.timers.set(
      k,
      setTimeout(() => cache.delete(k), ttl)
    )
    cache.data.set(k, v)
  },
  get: k => cache.data.get(k),
  has: k => cache.data.has(k),
  delete: k => {
    if (cache.timers.has(k)) {
      clearTimeout(cache.timers.get(k))
    }
    cache.timers.delete(k)
    return cache.data.delete(k)
  },
  clear: () => {
    cache.data.clear()
    for (const v of cache.timers.values()) {
      clearTimeout(v)
    }
    cache.timers.clear()
  },
}
```

If that isn't to your liking, check out
[@isaacs/ttlcache](http://npm.im/@isaacs/ttlcache).

## Storing Undefined Values

This cache never stores undefined values, as `undefined` is used
internally in a few places to indicate that a key is not in the
cache.

You may call `cache.set(key, undefined)`, but this is just an
alias for `cache.delete(key)`. Note that this has the effect
that `cache.has(key)` will return _false_ after setting it to
undefined.
```js
cache.set(myKey, undefined)
cache.has(myKey) // false!
```

If you need to track `undefined` values, and still note that the
key is in the cache, an easy workaround is to use a sigil object
of your own.

```js
import { LRUCache } from 'lru-cache'
const undefinedValue = Symbol('undefined')
const cache = new LRUCache(...)
const mySet = (key, value) =>
  cache.set(key, value === undefined ? undefinedValue : value)
const myGet = key => {
  const v = cache.get(key)
  return v === undefinedValue ? undefined : v
}
```

## Performance

As of January 2022, version 7 of this library is one of the most
performant LRU cache implementations in JavaScript.

Benchmarks can be extremely difficult to get right. In
particular, the performance of set/get/delete operations on
objects will vary _wildly_ depending on the type of key used. V8
is highly optimized for objects with keys that are short strings,
especially integer numeric strings. Thus any benchmark which
tests _solely_ using numbers as keys will tend to find that an
object-based approach performs the best.

Note that coercing _anything_ to strings to use as object keys is
unsafe, unless you can be 100% certain that no other type of
value will be used. For example:

```js
const myCache = {}
const set = (k, v) => (myCache[k] = v)
const get = k => myCache[k]

set({}, 'please hang onto this for me')
set('[object Object]', 'oopsie')
```
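
To make the hazard concrete, the following sketch shows the
collision, and how a `Map` (which keys by identity rather than by
string coercion) keeps the entries distinct:

```js
// object property keys are coerced to strings, so distinct
// objects collide on the same '[object Object]' key
const objCache = {}
objCache[{}] = 'first'
objCache[{ a: 1 }] = 'second' // overwrites the previous entry
console.log(objCache['[object Object]']) // 'second'

// a Map keys by identity, so both entries survive
const mapCache = new Map()
const k1 = {}
const k2 = { a: 1 }
mapCache.set(k1, 'first')
mapCache.set(k2, 'second')
console.log(mapCache.get(k1)) // 'first'
console.log(mapCache.size) // 2
```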

Also beware of "Just So" stories regarding performance. Garbage
collection of large (especially: deep) object graphs can be
incredibly costly, with several "tipping points" where it
increases exponentially. As a result, putting that off until
later can make it much worse, and less predictable. If a library
performs well, but only in a scenario where the object graph is
kept shallow, then that won't help you if you are using large
objects as keys.

In general, when attempting to use a library to improve
performance (such as a cache like this one), it's best to choose
an option that will perform well in the sorts of scenarios where
you'll actually use it.

This library is optimized for repeated gets and minimizing
eviction time, since that is the expected need of an LRU. Set
operations are somewhat slower on average than a few other
options, in part because of that optimization. It is assumed
that you'll be caching some costly operation, ideally as rarely
as possible, so optimizing set over get would be unwise.

If performance matters to you:

1. If it's at all possible to use small integer values as keys,
   and you can guarantee that no other types of values will be
   used as keys, then do that, and use a cache such as
   [lru-fast](https://npmjs.com/package/lru-fast), or
   [mnemonist's
   LRUCache](https://yomguithereal.github.io/mnemonist/lru-cache)
   which uses an Object as its data store.

2. Failing that, if at all possible, use short non-numeric
   strings (i.e., fewer than 256 characters) as your keys, and
   use [mnemonist's
   LRUCache](https://yomguithereal.github.io/mnemonist/lru-cache).

3. If the types of your keys will be anything else, especially
   long strings, strings that look like floats, objects, or some
   mix of types, or if you aren't sure, then this library will
   work well for you.

   If you do not need the features that this library provides
   (like asynchronous fetching, a variety of TTL staleness
   options, and so on), then [mnemonist's
   LRUMap](https://yomguithereal.github.io/mnemonist/lru-map) is
   a very good option, and just slightly faster than this module
   (since it does considerably less).

4. Do not use a `dispose` function, size tracking, or especially
   ttl behavior, unless absolutely needed. These features are
   convenient, and necessary in some use cases, and every attempt
   has been made to make the performance impact minimal, but it
   isn't nothing.

## Breaking Changes in Version 7

This library changed to a different algorithm and internal data
structure in version 7, yielding significantly better
performance, albeit with some subtle changes as a result.

If you were relying on the internals of LRUCache in version 6 or
before, it probably will not work in version 7 and above.

## Breaking Changes in Version 8

- The `fetchContext` option was renamed to `context`, and may no
  longer be set on the cache instance itself.
- Rewritten in TypeScript, so pretty much all the types moved
  around a lot.
- The AbortController/AbortSignal polyfill was removed. For this
  reason, **Node version 16.14.0 or higher is now required**.
- Internal properties were moved to actual private class
  properties.
- Keys and values must not be `null` or `undefined`.
- Minified export available at `'lru-cache/min'`, for both CJS
  and MJS builds.

## Changes in Version 9

- Named export only, no default export.
- AbortController polyfill returned, albeit with a warning when
  used.

For more info, see the [change log](CHANGELOG.md).