_**Micri** — Asynchronous HTTP microservices_

> micri is an archaic non-SI decimal metric prefix for 10⁻¹⁴. Its symbol was mc.

[Wikipedia - Micri-](https://en.wikipedia.org/wiki/Micri-)

[![Install Size](https://packagephobia.now.sh/badge?p=micri)](https://packagephobia.now.sh/result?p=micri)

## Features

* **Easy**: Designed for usage with `async` and `await` ([more](https://zeit.co/blog/async-and-await))
* **Fast**: Ultra-high performance (even JSON parsing is opt-in)
* **Micri**: The whole project is ~260 lines of code
* **Agile**: Super easy deployment and containerization
* **Simple**: Oriented for single-purpose modules (functions)
* **Standard**: Just HTTP!
* **Explicit**: No middleware - modules declare all [dependencies](https://github.com/amio/awesome-micro)
* **Lightweight**: With all dependencies, the package weighs less than a megabyte

## Usage

Create an `index.js` file and export a function that accepts the standard [http.IncomingMessage](https://nodejs.org/api/http.html#http_class_http_incomingmessage) and [http.ServerResponse](https://nodejs.org/api/http.html#http_class_http_serverresponse) objects:

```js
module.exports = (req, res) => {
  res.end('Welcome to Micri')
}
```

Micri also handles return values, so the same service can be written even shorter:

```js
module.exports = () => 'Welcome to Micri'
```
Next, ensure that the `main` property inside `package.json` points to your microservice (which is inside `index.js` in this example case) and add a `start` script:

```json
{
  "main": "index.js",
  "scripts": {
    "start": "micri"
  }
}
```

Once all of that is done, the server can be started like this:

```bash
npm start
```

And go to this URL: `http://localhost:3000` - 🎉
### Command line

```
  micri - Asynchronous HTTP microservices

  USAGE

      $ micri --help
      $ micri --version
      $ micri [-l listen_uri [-l ...]] [entry_point.js]

      By default micri will listen on 0.0.0.0:3000 and will look first
      for the "main" property in package.json and subsequently for index.js
      as the default entry_point.

      Specifying a single --listen argument will overwrite the default, not supplement it.

  OPTIONS

      --help                shows this help message

      -v, --version         displays the current version of micri

      -l, --listen listen_uri
                            specify a URI endpoint on which to listen (see below) -
                            more than one may be specified to listen in multiple places

  ENDPOINTS

      Listen endpoints (specified by the --listen or -l options above) instruct micri
      to listen on one or more interfaces/ports, UNIX domain sockets, or Windows named pipes.

      For TCP (traditional host/port) endpoints:

          $ micri -l tcp://hostname:1234

      For UNIX domain socket endpoints:

          $ micri -l unix:/path/to/socket.sock

      For Windows named pipe endpoints:

          $ micri -l pipe:\\.\pipe\PipeName
```

### `async` & `await`

<p><details>
  <summary><b>Examples</b></summary>
  <ul><li><a href="./examples/external-api-call">Fetch external api</a></li></ul>
</details></p>

Micri is built for usage with async/await. You can read more about async/await [here](https://zeit.co/blog/async-and-await).

```js
const sleep = (ms) => new Promise((r) => setTimeout(r, ms));

module.exports = async (req, res) => {
  await sleep(500);
  return 'Ready!';
}
```

### Transpilation

Micri takes advantage of native support for `async` and `await`, which is available as of **Node.js 8.0.0**. We therefore suggest using at least this version in both development and production (if possible), or transpiling the code with [async-to-gen](https://github.com/leebyron/async-to-gen) if you can't use a recent Node.js version.

To do that, first install it:

```bash
npm install --save async-to-gen
```

And then add the transpilation command to the `scripts.build` property inside `package.json`:

```json
{
  "scripts": {
    "build": "async-to-gen input.js > output.js"
  }
}
```

Once these two steps are done, you can transpile the code by running this command:

```bash
npm run build
```

That's all it takes to transpile by yourself. But just to be clear: **only do this if you can't use Node.js 8.0.0!** If you can, `async` and `await` will just work right out of the box.

### Port Based on Environment Variable

When you want to set the port using an environment variable you can use:

```
micri -l tcp://0.0.0.0:$PORT
```

Optionally you can add a default if it suits your use case:

```
micri -l tcp://0.0.0.0:${PORT-3000}
```

`${PORT-3000}` will allow a fallback to port `3000` when `$PORT` is not defined.

Note that this only works in shells that support parameter expansion, such as Bash.

### Body parsing

<p id="body-parsing-examples"><details>
  <summary><b>Examples</b></summary>
  <ul>
    <li><a href="./examples/json-body-parsing">Parse JSON</a></li>
    <li><a href="./examples/urlencoded-body-parsing">Parse urlencoded form (html `form` tag)</a></li>
  </ul>
</details></p>

For parsing the incoming request body, Micri includes the async functions `buffer`, `text` and `json`:

```js
const {buffer, text, json} = require('micri')

module.exports = async (req, res) => {
  const buf = await buffer(req)
  console.log(buf)
  // <Buffer 7b 22 70 72 69 63 65 22 3a 20 39 2e 39 39 7d>
  const txt = await text(req)
  console.log(txt)
  // '{"price": 9.99}'
  const js = await json(req)
  console.log(js.price)
  // 9.99
  return ''
}
```

### API

##### `buffer(req, { limit = '1mb', encoding = 'utf8' })`
##### `text(req, { limit = '1mb', encoding = 'utf8' })`
##### `json(req, { limit = '1mb', encoding = 'utf8' })`

- Buffers and parses the incoming body and returns it.
- Exposes an `async` function that can be run with `await`.
- Can be called multiple times, as it caches the raw request body the first time.
- `limit` is the maximum amount of data that may be aggregated before parsing; if the body exceeds it, an `Error` is thrown with `statusCode` set to `413` (see [Error Handling](#error-handling)). It can be a `Number` of bytes or [a string](https://www.npmjs.com/package/bytes) like `'1mb'`.
- If JSON parsing fails, an `Error` is thrown with `statusCode` set to `400` (see [Error Handling](#error-handling)).

For other types of data check the [examples](#body-parsing-examples).

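If larger payloads need to be accepted, the default `'1mb'` limit can be raised. A minimal sketch (the 8 MB value and the response shape are only illustrative):

```js
const {json, send} = require('micri')

module.exports = async (req, res) => {
  // Accept request bodies of up to 8 MB instead of the default 1 MB
  const body = await json(req, {limit: '8mb'})

  send(res, 200, {fields: Object.keys(body).length})
}
```
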
### Sending a different status code

So far we have used `return` to send data to the client. `return 'Hello World'` is the equivalent of `send(res, 200, 'Hello World')`.

```js
const {send} = require('micri')

module.exports = async (req, res) => {
  const statusCode = 400
  const data = { error: 'Custom error message' }

  send(res, statusCode, data)
}
```

##### `send(res, statusCode, data = null)`

- Use `require('micri').send`.
- `statusCode` is a `Number` with the HTTP status code, and must always be supplied.
- If `data` is supplied it is sent in the response. Different input types are processed appropriately, and `Content-Type` and `Content-Length` are automatically set.
  - `Stream`: `data` is piped as an `octet-stream`. Note: it is _your_ responsibility to handle the `error` event in this case (usually, simply logging the error and aborting the response is enough).
  - `Buffer`: `data` is written as an `octet-stream`.
  - `object`: `data` is serialized as JSON.
  - `string`: `data` is written as-is.
- If JSON serialization fails (for example, if a cyclical reference is found), a `400` error is thrown. See [Error Handling](#error-handling).

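For example, a file could be streamed back with `send`. A minimal sketch, assuming a hypothetical `./report.csv` file, with the stream's `error` event handled as noted above:

```js
const fs = require('fs')
const {send} = require('micri')

module.exports = async (req, res) => {
  const stream = fs.createReadStream('./report.csv') // hypothetical file

  // When sending a Stream, handling its 'error' event is our responsibility:
  // log it and abort the response
  stream.on('error', (err) => {
    console.error(err)
    res.destroy()
  })

  send(res, 200, stream)
}
```
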
### Programmatic use

You can use Micri programmatically by requiring Micri directly:

```js
const micri = require('micri')

const sleep = (ms) => new Promise((r) => setTimeout(r, ms));

const server = micri(async (req, res) => {
  await sleep(500)
  return 'Hello world'
})

server.listen(3000)
```

##### micri(fn)

- This function is exposed as the `default` export.
- Use `require('micri')`.
- Returns a [`http.Server`](https://nodejs.org/dist/latest-v6.x/docs/api/http.html#http_class_http_server) that uses the provided `function` as the request handler.
- The supplied function is run with `await`, so it can be `async`.

##### sendError(req, res, error)

- Use `require('micri').sendError`.
- Used as the default handler for errors thrown.
- Automatically sets the status code of the response based on `error.statusCode`.
- Sends the `error.message` as the body.
- Stacks are printed out with `console.error` and during development (when `NODE_ENV` is set to `'development'`) also sent in responses.
- Usually, you don't need to invoke this method yourself, as you can use the [built-in error handling](#error-handling) flow with `throw`.

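If you do wrap handlers yourself, you can still fall back to the default behaviour. A minimal sketch, where `handle` is a hypothetical handler function:

```js
const micri = require('micri')
const {sendError} = micri

const server = micri(async (req, res) => {
  try {
    return await handle(req, res) // hypothetical handler
  } catch (err) {
    // Log, then let Micri produce its default error response
    console.error(err)
    sendError(req, res, err)
  }
})

server.listen(3000)
```
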
## Error Handling

Micri allows you to write robust microservices. This is accomplished primarily
by bringing sanity back to error handling and avoiding callback soup.

If an error is thrown and not caught by you, the response will automatically be
`500`. **Important:** Error stacks will be printed with `console.error` and during
development mode (if the env variable `NODE_ENV` is `'development'`), they will
also be included in the responses.

If the error object thrown is an instance of `MicriError`, the `message`,
`statusCode` and `code` properties of the object are used for the HTTP response.

Let's say you want to write a rate limiting module:

```js
const rateLimit = require('my-rate-limit')

module.exports = async (req, res) => {
  await rateLimit(req)
  // ... your code
}
```

If the API endpoint is abused, it can throw a `MicriError` like so:

```js
const {MicriError} = require('micri')

if (tooMany) {
  throw new MicriError(429, 'rate_limited', 'Rate limit exceeded')
}
```

The nice thing about this model is that the `statusCode` is merely a suggestion.
The user can override it:

```js
try {
  await rateLimit(req)
} catch (err) {
  if (err.statusCode === 429) {
    // perhaps send 500 instead?
    send(res, 500)
  }
}
```

If the error is based on another error that **Micri** caught, like a `JSON.parse`
exception, then `originalError` will point to it. If a generic error is caught,
the status will be set to `500`.

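For instance, a handler could inspect `originalError` when body parsing fails. A minimal sketch:

```js
const {json, send} = require('micri')

module.exports = async (req, res) => {
  try {
    return await json(req)
  } catch (err) {
    // Malformed JSON throws with statusCode 400; originalError holds the JSON.parse error
    console.error(err.originalError || err)
    send(res, 400, {error: 'invalid_json'})
  }
}
```
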
In order to set up your own error handling mechanism, you can use composition in
your handler:

```js
const {send} = require('micri')

const handleErrors = fn => async (req, res) => {
  try {
    return await fn(req, res)
  } catch (err) {
    console.log(err.stack)
    send(res, 500, 'My custom error!')
  }
}

module.exports = handleErrors(async (req, res) => {
  throw new Error('What happened here?')
})
```

## Contributing

1. [Fork](https://help.github.com/articles/fork-a-repo/) this repository to your own GitHub account and then [clone](https://help.github.com/articles/cloning-a-repository/) it to your local device
2. Link the package to the global module directory: `npm link`
3. Within the module in which you want to test your local development instance of Micri, link it to the dependencies: `npm link micri`. Node will now use your clone of Micri instead of the version installed from npm!

As always, you can run the [AVA](https://github.com/sindresorhus/ava) and [ESLint](http://eslint.org) tests using: `npm test`