[![GitHub Workflow](https://github.com/WesVanVugt/promise-batcher/actions/workflows/main.yml/badge.svg)](https://github.com/WesVanVugt/promise-batcher/actions/workflows/main.yml)
[![Install Size](https://packagephobia.now.sh/badge?p=promise-batcher)](https://packagephobia.now.sh/result?p=promise-batcher)

# promise-batcher

A module for batching individual promises to improve their collective efficiency.

For release notes, see the [changelog](https://github.com/WesVanVugt/promise-batcher/blob/master/changelog.md).

## Install

    npm install promise-batcher

## Examples

Promise batcher is useful for batching requests made from branching execution paths.

### Basic Example

This example demonstrates a batcher which takes numbers as inputs, adds 1 to each, and returns the results.
Note that while the getResult function is called multiple times, the batching function is only run once.
If the send method is not called, the batch will still be processed when Node.js idles.

```javascript
const { Batcher } = require("promise-batcher");
let runCount = 0;
const batcher = new Batcher({
    batchingFunction: (nums) => {
        runCount++;
        return Promise.resolve(
            nums.map((n) => {
                return n + 1;
            }),
        );
    },
});
// Send the batch of requests. This step is optional.
batcher.send();
const inputs = [1, 3, 5, 7];
// Start a series of individual requests
const promises = inputs.map((input) => batcher.getResult(input));
// Wait for all the requests to complete
Promise.all(promises).then((results) => {
    console.log(results); // [ 2, 4, 6, 8 ]
    // The requests were still done in a single run
    console.log(runCount); // 1
});
```

### Database Example

This example shows how a batcher can combine individual requests made to a database. This is especially useful when
the requests originate from independent code paths, allowing them to be made individually yet processed as batches.
Note that in a real-world application, it would be best to implement exponential backoff for retried requests by
using a delayFunction.

```javascript
const { Batcher, BATCHER_RETRY_TOKEN } = require("promise-batcher");
const batcher = new Batcher({
    batchingFunction: (recordIds) => {
        // Perform a batched request to the database with the inputs from getResult()
        return database.batchGet(recordIds).then((results) => {
            // Interpret the results from the request, returning an array of result values
            return results.map((result) => {
                if (result.error) {
                    if (result.error.retryable) {
                        // Retry the individual request (e.g. request throttled)
                        return BATCHER_RETRY_TOKEN;
                    } else {
                        // Reject the individual request (e.g. record not found)
                        return new Error(result.error.message);
                    }
                } else {
                    // Resolve the individual request
                    return result.data;
                }
            });
        });
    },
});
// Send the batch of requests. This step is optional.
batcher.send();
const recordIds = ["ABC123", "DEF456", "HIJ789"];
// Start a series of individual requests
const promises = recordIds.map((recordId) => batcher.getResult(recordId));
// Wait for all the requests to complete
Promise.all(promises).then((results) => {
    // Use the results
});
```

## API Reference

### Object: Batcher

A tool for combining requests.

#### Constructor

**new Batcher(options)** - Creates a new batcher.

- options.**batchingFunction**(inputArray) - A function which is passed an array of request values, returning a
  promise which resolves to an array of response values. The request and response arrays must be of equal length. To
  reject an individual request, return an Error object (or an instance of a class extending Error) at the
  corresponding position in the response array. To retry an individual request, return BATCHER_RETRY_TOKEN at the
  corresponding position.
- options.**delayFunction**() - A function which can delay a batch by returning a promise which resolves when the
  batch should be run. If the function does not return a promise, no delay is applied _(optional)_.
- options.**maxBatchSize** - The maximum number of requests that can be combined in a single batch _(optional)_.
- options.**queuingDelay** - The number of milliseconds to wait before running a batch of requests, allowing time
  for requests to queue up. Defaults to 1ms. This delay does not apply if the limit set by options.maxBatchSize is
  reached, or if batcher.send() is called. Note that since the batcher uses setTimeout to perform this delay,
  batches delayed this way only run when Node.js is idle, even if that means a longer delay _(optional)_.
- options.**queuingThresholds** - An array containing the number of requests that must be queued in order to trigger
  a batch request at each level of concurrency. For example, [1, 5] would require at least 1 queued request when no
  batch requests are active, and 5 queued requests when 1 (or more) batch requests are active. Defaults to [1]. Note
  that the delay imposed by options.queuingDelay still applies when a batch request is triggered _(optional)_.

#### Methods

- batcher.**getResult**(input) - Returns a promise which resolves or rejects with the individual result returned from
  the batching function.
- batcher.**idlePromise**() - Returns a promise which resolves when there are no pending batches.
- batcher.**send**() - Bypasses any queuingDelay set, while respecting all other limits imposed. If no other limits
  are set, this results in the batchingFunction being run immediately. Note that batches will still be run even if
  this method is not called, once the queuingDelay elapses or maxBatchSize is reached.

#### Properties

- batcher.**idling** - `true` when there are no pending batches, and `false` otherwise.

## License

[MIT](https://github.com/WesVanVugt/promise-batcher/blob/master/license)