_**Micri** — Asynchronous HTTP microservices_

> micri is an archaic non-SI decimal metric prefix for 10⁻¹⁴. Its symbol was mc.

[Wikipedia - Micri-](https://en.wikipedia.org/wiki/Micri-)

[![npm version](https://badge.fury.io/js/micri.svg)](https://badge.fury.io/js/micri)
[![Install Size](https://packagephobia.now.sh/badge?p=micri)](https://packagephobia.now.sh/result?p=micri)

## Features

* **Easy**: Designed for usage with `async` and `await` ([more](https://zeit.co/blog/async-and-await))
* **Fast**: Ultra-high performance (even JSON parsing is opt-in)
* **Micri**: The whole project is ~500 lines of code
* **Agile**: Super easy deployment and containerization
* **Simple**: Oriented for single-purpose modules (one function)
* **Standard**: Just HTTP!
* **Explicit**: No middleware - modules declare all [dependencies](https://github.com/amio/awesome-micro)
* **Lightweight**: [![Install Size](https://packagephobia.now.sh/badge?p=micri)](https://packagephobia.now.sh/result?p=micri)


## Usage

```js
const { serve } = require('micri')

const sleep = (ms) => new Promise((r) => setTimeout(r, ms));

const server = serve(async (req, res) => {
  await sleep(500)
  return 'Hello world'
})

server.listen(3000)
```

And go to this URL: `http://localhost:3000` - 🎉
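
To try it out, you could hit the endpoint with `curl` (or any other HTTP client):

```
curl http://localhost:3000
```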

### `async` & `await`

<p><details>
  <summary><b>Examples</b></summary>
  <ul><li><a href="./examples/external-api-call">Fetch external api</a></li></ul>
</details></p>

Micri is built for usage with async/await. You can read more about async/await [here](https://zeit.co/blog/async-and-await).

```js
const sleep = (ms) => new Promise((r) => setTimeout(r, ms));

module.exports = async (req, res) => {
  await sleep(500);
  return 'Ready!';
};
```

### Body parsing

<p id="body-parsing-examples"><details>
  <summary><b>Examples</b></summary>
  <ul>
    <li><a href="./examples/json-body-parsing">Parse JSON</a></li>
    <li><a href="./examples/urlencoded-body-parsing">Parse urlencoded form (html `form` tag)</a></li>
  </ul>
</details></p>

For parsing the incoming request body, the async functions `buffer`, `text`, and `json` are included.

```js
const { buffer, text, json } = require('micri')

module.exports = async (req, res) => {
  const buf = await buffer(req)
  console.log(buf)
  // <Buffer 7b 22 70 72 69 63 65 22 3a 20 39 2e 39 39 7d>
  const txt = await text(req)
  console.log(txt)
  // '{"price": 9.99}'
  const js = await json(req)
  console.log(js.price)
  // 9.99
  return ''
}
```
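
If you need another format, such as an urlencoded form body, one possible approach (a rough sketch; see the linked urlencoded example for the full version) is to combine `text()` with Node's built-in `querystring` module:

```js
const { text } = require('micri')
const { parse } = require('querystring')

module.exports = async (req, res) => {
  // Read the raw body as a string, then decode the key=value pairs
  const data = parse(await text(req))
  return data
}
```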

### Routing

Micri has a simple built-in function router. The idea is fairly simple: you can
use it as a wrapper virtually anywhere, since it is called with
`(req, res, optionalArgs)` and can return a promise as a response to `micri()`.

First you create a router by calling the `router(...)` function. The router
function takes routes as arguments. Routes are created by calling the functions
under the `on` map, which are organized by HTTP method name. These functions in
turn take two arguments: a predicate function and a request handler function.

A predicate function gets the usual arguments `(req, res, opts?)`. It should
return a truthy value if the handler function should take care of the request,
and a falsy value if it should not.
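
For example, predicates can be plain functions over the request. The helpers below are purely illustrative:

```js
// Truthy when the request targets the root path
const isRoot = (req) => req.url === '/';

// Truthy when the client sent a JSON body
const isJson = (req) => (req.headers['content-type'] || '').includes('application/json');
```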

Multiple predicates can be combined with `Router.everyPredicate(...)`, which
takes predicate functions as arguments. The combined predicate returns true only
if every predicate given as an argument returns true.
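
For instance, the two illustrative predicates from the previous snippet could be required at once (a sketch):

```js
const { json, Router: { on, everyPredicate } } = require('micri');

// Matches only JSON POST requests to the root path
const route = on.post(everyPredicate(isRoot, isJson), (req) => json(req));
```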

The order of the route arguments marks the priority order of the routes.
Therefore, if two routes would match a request, the one that was passed earlier
in the arguments list to the `router()` function will handle the request.

`otherwise()` is a special route function that always matches and can thus be
used as the last route rule for sending an error, avoiding an exception being
thrown in case no other route predicate matches.

```js
const micri = require('micri');
const { send, text, Router: { router, on, otherwise } } = micri;

micri(router(
  on.get((req) => req.url === '/', (req, _res) => ({ message: 'Hello world!' })),
  on.post((req) => req.url === '/', (req) => text(req)),
  otherwise((req, res) => send(res, 400, 'Method Not Accepted'))))
  .listen(3000);
```
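
With those routes in place, the endpoints could be exercised like this (sample commands only):

```
curl http://localhost:3000/
curl -X POST -d '{"price": 9.99}' http://localhost:3000/
```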
123
124
125### Worker Threads
126
127Micri supports offloading computationally heavy request handlers to worker
128threads seamlessly. The offloading is configured per handler by wrapping the
129handler function with `withWorker()`. It works directly at the top-level or per
130route when using the router. See [with-workerthreads](examples/with-worker) for
131a couple of examples how to use it.
132
133```js
134micri(withWorker(() => doSomethingCPUHeavy))
135```
136
137Offloading requests to a worker may improve the responsiveness of a busy API
138significantly, as it removes almost all blocking from the main thread. In the
139following examples we first try to find prime numbers and finally return one
140as a response. In both cases we do two concurrent HTTP `GET` requests using
141`curl`.
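
The handlers behind these measurements are not shown here; a rough sketch of such a setup might look like the following. The `findPrime` helper and the route paths are illustrative only, the exact wiring may differ from the linked examples, and the timings will vary:

```js
const micri = require('micri');
const { withWorker, Router: { router, on } } = micri;

// Deliberately heavy: scan every number below the limit with naive trial division
const findPrime = () => {
  let last = 2;
  for (let n = 2; n < 300000; n++) {
    let prime = true;
    for (let i = 2; i < n; i++) {
      if (n % i === 0) {
        prime = false;
        break;
      }
    }
    if (prime) last = n;
  }
  return String(last);
};

micri(router(
  on.get((req) => req.url === '/main', () => findPrime()),
  on.get((req) => req.url === '/worker', withWorker(findPrime))
)).listen(3000);
```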

Finding prime numbers using the main thread:

```
~% time curl 127.0.0.1:3000/main
299993curl 127.0.0.1:3000/main 0.01s user 0.00s system 0% cpu 8.791 total
~% time curl 127.0.0.1:3000/main
299993curl 127.0.0.1:3000/main 0.00s user 0.00s system 0% cpu 16.547 total
```

Notice that the second curl needs to wait until the first request finishes.

Finding prime numbers using a worker thread:

```
~% time curl 127.0.0.1:3000/worker
299993curl 127.0.0.1:3000/worker 0.00s user 0.00s system 0% cpu 9.025 total
~% time curl 127.0.0.1:3000/worker
299993curl 127.0.0.1:3000/worker 0.00s user 0.00s system 0% cpu 9.026 total
```

Note how both concurrently executed requests took the same time to finish.


## API

##### `buffer(req, { limit = '1mb', encoding = 'utf8' })`
##### `text(req, { limit = '1mb', encoding = 'utf8' })`
##### `json(req, { limit = '1mb', encoding = 'utf8' })`

- Buffers and parses the incoming body and returns it.
- Exposes an `async` function that can be run with `await`.
- Can be called multiple times, as the raw request body is cached the first time.
- `limit` is the maximum amount of data that is aggregated before parsing; if it is exceeded, an `Error` is thrown with `statusCode` set to `413` (see [Error Handling](#error-handling)). It can be a `Number` of bytes or [a string](https://www.npmjs.com/package/bytes) like `'1mb'` (see the sketch below).
- If JSON parsing fails, an `Error` is thrown with `statusCode` set to `400` (see [Error Handling](#error-handling)).

For other types of data, check the [examples](#body-parsing-examples).
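
For instance, the default limit could be raised and the parsing errors surfaced explicitly (a sketch; the numbers are arbitrary):

```js
const { json, send } = require('micri')

module.exports = async (req, res) => {
  try {
    return await json(req, { limit: '8mb' })
  } catch (err) {
    // statusCode is 413 when the body exceeded `limit`, 400 when the JSON was invalid
    send(res, err.statusCode || 500, { error: err.message })
  }
}
```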

### Sending a different status code

So far we have used `return` to send data to the client. `return 'Hello World'` is the equivalent of `send(res, 200, 'Hello World')`.

```js
const { send } = require('micri')

module.exports = async (req, res) => {
  const statusCode = 400
  const data = { error: 'Custom error message' }

  send(res, statusCode, data)
}
```

##### `send(res, statusCode, data = null)`

- Use `require('micri').send`.
- `statusCode` is a `Number` with the HTTP status code, and must always be supplied.
- If `data` is supplied it is sent in the response. Different input types are processed appropriately, and `Content-Type` and `Content-Length` are automatically set.
  - `Stream`: `data` is piped as an `octet-stream`. Note: it is _your_ responsibility to handle the `error` event in this case (usually, simply logging the error and aborting the response is enough). See the sketch after this list.
  - `Buffer`: `data` is written as an `octet-stream`.
  - `object`: `data` is serialized as JSON.
  - `string`: `data` is written as-is.
- If JSON serialization fails (for example, if a cyclical reference is found), a `400` error is thrown. See [Error Handling](#error-handling).
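
For example, a file could be streamed back like this (a sketch; the file path is just an illustration, and remember to handle the stream's `error` event as noted above):

```js
const fs = require('fs')
const { send } = require('micri')

module.exports = async (req, res) => {
  const stream = fs.createReadStream('./report.pdf')

  // Log and abort the response if reading the file fails
  stream.on('error', (err) => {
    console.error(err)
    res.destroy()
  })

  send(res, 200, stream)
}
```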

##### `micri(fn)`

- This function is exposed as the `default` export.
- Use `require('micri')`.
- Returns a [`http.Server`](https://nodejs.org/dist/latest-v6.x/docs/api/http.html#http_class_http_server) that uses the provided `function` as the request handler.
- The supplied function is run with `await`, so it can be `async`.

##### `sendError(req, res, error)`

- Use `require('micri').sendError`.
- Used as the default handler for thrown errors.
- Automatically sets the status code of the response based on `error.statusCode`.
- Sends the `error.message` as the body.
- Stacks are printed out with `console.error` and, during development (when `NODE_ENV` is set to `'development'`), also sent in responses.
- Usually, you don't need to invoke this method yourself, as you can use the [built-in error handling](#error-handling) flow with `throw`. It can, however, be called from a custom wrapper, as sketched below.
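
A minimal sketch of such a wrapper, falling back to Micri's default error formatting (the `withLogging` name is just illustrative):

```js
const micri = require('micri')
const { sendError } = micri

const withLogging = (fn) => async (req, res) => {
  try {
    return await fn(req, res)
  } catch (err) {
    console.error('request failed:', err)
    // Let Micri's default error formatting produce the response
    sendError(req, res, err)
  }
}

micri(withLogging(async (req, res) => {
  throw new Error('What happened here?')
})).listen(3000)
```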

## Error Handling

Micri allows you to write robust microservices. This is accomplished primarily
by bringing sanity back to error handling and avoiding callback soup.

If an error is thrown and not caught by you, the response will automatically be
`500`. **Important:** Error stacks will be printed with `console.error` and, during
development mode (if the env variable `NODE_ENV` is `'development'`), they will
also be included in the responses.

If the thrown error object is an instance of `MicriError`, the `message`,
`statusCode`, and `code` properties of the object are used for the HTTP response.

Let's say you want to write a rate limiting module:

```js
const rateLimit = require('my-rate-limit')

micri(async (req, res) => {
  await rateLimit(req);
  // ... your code
}).listen(3000);
```

If the API endpoint is abused, it can throw a `MicriError` like so:

```js
const { MicriError } = require('micri')

if (tooMany) {
  throw new MicriError(429, 'rate_limited', 'Rate limit exceeded');
}
```

The nice thing about this model is that the `statusCode` is merely a suggestion.
The user can override it:

```js
try {
  await rateLimit(req)
} catch (err) {
  if (err.statusCode === 429) {
    // perhaps send 500 instead?
    send(res, 500);
  }
}
```

If the error is based on another error that **Micri** caught, like a `JSON.parse`
exception, then `originalError` will point to it. If a generic error is caught,
the status will be set to `500`.
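
For example, a custom error handler could log the underlying cause when it is present (a sketch; `originalError` is only set on errors that Micri wrapped itself):

```js
let body
try {
  body = await json(req)
} catch (err) {
  if (err.originalError) {
    // e.g. the underlying SyntaxError from JSON.parse
    console.error('caused by:', err.originalError)
  }
  throw err
}
```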

In order to set up your own error handling mechanism, you can use composition in
your handler:

```js
const micri = require('micri');
const { send } = micri;

const handleErrors = (fn) => async (req, res) => {
  try {
    return await fn(req, res)
  } catch (err) {
    console.error(err.stack)
    send(res, 500, 'My custom error!')
  }
}

micri(handleErrors(async (req, res) => {
  throw new Error('What happened here?')
})).listen(3000);
```

## Contributing

1. [Fork](https://help.github.com/articles/fork-a-repo/) this repository to your own GitHub account and then [clone](https://help.github.com/articles/cloning-a-repository/) it to your local device
2. Link the package to the global module directory: `npm link`
3. Within the module in which you want to test your local development instance of Micri, link it to your dependencies: `npm link micri`. Node will now use your clone of Micri instead of the version from npm!