# Transports

A "transport" for Pino is a supplementary tool that consumes Pino logs.

Consider the following example:

```js
const split = require('split2')
const pump = require('pump')
const through = require('through2')

const myTransport = through.obj(function (chunk, enc, cb) {
  // do the necessary
  console.log(chunk)
  cb()
})

pump(process.stdin, split(JSON.parse), myTransport)
```

The above defines our "transport" as the file `my-transport-process.js`.

Logs can now be consumed using shell piping:

```sh
node my-app-which-logs-stuff-to-stdout.js | node my-transport-process.js
```

Ideally, a transport should consume logs in a separate process from the application.
Using transports in the same process causes unnecessary load and slows down
Node's single-threaded event loop.

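For illustration, the same idea can be sketched without any third-party dependencies. This is a simplified sketch, not a drop-in replacement for the `split2`/`pump` pipeline above; the `parseLogLine` and `handleLine` helpers are made up for this example, with the stdin wiring shown in a comment:

```javascript
// A minimal, dependency-free transport sketch. In a real transport process
// the lines would arrive on stdin, e.g.:
//   require('readline').createInterface({ input: process.stdin })
//     .on('line', (line) => console.log(handleLine(line)))

// Parse one NDJSON log line; returns null for lines that are not valid JSON.
function parseLogLine (line) {
  try {
    return JSON.parse(line)
  } catch (err) {
    return null
  }
}

// "Do the necessary" for one line: here, reformat level and message.
function handleLine (line) {
  const log = parseLogLine(line)
  return log === null ? null : `[${log.level}] ${log.msg}`
}

// A sample line as a pino-using application would emit it:
console.log(handleLine('{"level":30,"msg":"hello world"}')) // → [30] hello world
```
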
## In-process transports

> **Pino *does not* natively support in-process transports.**

Pino does not support in-process transports because Node processes are
single-threaded (ignoring some technical details). Given this
restriction, one of the methods Pino employs to achieve its speed is to
purposefully offload the handling of logs, and their ultimate destination, to
external processes so that the threading capabilities of the OS (or other
CPUs) can be used.

One consequence of this methodology is that "error" logs do not get written to
`stderr`. However, since Pino logs are in a parseable format, it is possible to
use tools like [pino-tee][pino-tee] or [jq][jq] to work with the logs. For
example, to view only logs marked as "error" logs:

```sh
$ node an-app.js | jq 'select(.level == 50)'
```

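The numeric comparison works because Pino's default levels map label names to fixed numbers: `trace` 10, `debug` 20, `info` 30, `warn` 40, `error` 50, `fatal` 60. A quick sketch of the same filter in plain Node (the `isErrorLine` helper is illustrative only, not part of any Pino API):

```javascript
// Pino's default numeric log levels.
const levels = { trace: 10, debug: 20, info: 30, warn: 40, error: 50, fatal: 60 }

// Equivalent of `jq 'select(.level == 50)'`: keep only "error" lines.
function isErrorLine (line) {
  return JSON.parse(line).level === levels.error
}

console.log(isErrorLine('{"level":50,"msg":"boom"}'))  // → true
console.log(isErrorLine('{"level":30,"msg":"hello"}')) // → false
```
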
In short, the way Pino generates logs:

1. Reduces the impact of logging on an application to the absolute minimum.
2. Gives greater flexibility in how logs are processed and stored.

Given all of the above, Pino recommends out-of-process log processing.

However, it is possible to wrap Pino and perform processing in-process.
For an example of this, see [pino-multi-stream][pinoms].

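The core idea behind such a wrapper can be sketched as follows. Note this is a simplified illustration of level-based fan-out, not pino-multi-stream's actual API; the `multistream` and `makeSink` names are invented for this example:

```javascript
// Simplified sketch of in-process multi-stream dispatch: each log line is
// written to every registered stream whose minimum level it meets.
function multistream (streams) {
  return {
    write (line) {
      const level = JSON.parse(line).level
      for (const { minLevel, stream } of streams) {
        if (level >= minLevel) stream.write(line)
      }
    }
  }
}

// Demo with simple in-memory "streams":
const makeSink = () => ({ lines: [], write (line) { this.lines.push(line) } })
const all = makeSink()    // receives info (30) and above
const errors = makeSink() // receives only error (50) and above

const dest = multistream([
  { minLevel: 30, stream: all },
  { minLevel: 50, stream: errors }
])

dest.write('{"level":30,"msg":"hello"}')
dest.write('{"level":50,"msg":"boom"}')

console.log(all.lines.length)    // → 2
console.log(errors.lines.length) // → 1
```
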
[pino-tee]: https://npm.im/pino-tee
[jq]: https://stedolan.github.io/jq/
[pinoms]: https://npm.im/pino-multi-stream

## Known Transports

PRs to this document are welcome for any new transports!

+ [pino-applicationinsights](#pino-applicationinsights)
+ [pino-azuretable](#pino-azuretable)
+ [pino-cloudwatch](#pino-cloudwatch)
+ [pino-couch](#pino-couch)
+ [pino-datadog](#pino-datadog)
+ [pino-elasticsearch](#pino-elasticsearch)
+ [pino-http-send](#pino-http-send)
+ [pino-logflare](#pino-logflare)
+ [pino-mq](#pino-mq)
+ [pino-mysql](#pino-mysql)
+ [pino-papertrail](#pino-papertrail)
+ [pino-pg](#pino-pg)
+ [pino-redis](#pino-redis)
+ [pino-sentry](#pino-sentry)
+ [pino-socket](#pino-socket)
+ [pino-stackdriver](#pino-stackdriver)
+ [pino-syslog](#pino-syslog)
+ [pino-websocket](#pino-websocket)

<a id="pino-applicationinsights"></a>
### pino-applicationinsights

The [pino-applicationinsights](https://www.npmjs.com/package/pino-applicationinsights) module is a transport that will forward logs to [Azure Application Insights](https://docs.microsoft.com/en-us/azure/azure-monitor/app/app-insights-overview).

Given an application `foo` that logs via pino, you would use `pino-applicationinsights` like so:

```sh
$ node foo | pino-applicationinsights --key blablabla
```

For full documentation of command line switches read the [readme](https://github.com/ovhemert/pino-applicationinsights#readme).

<a id="pino-azuretable"></a>
### pino-azuretable

The [pino-azuretable](https://www.npmjs.com/package/pino-azuretable) module is a transport that will forward logs to [Azure Table Storage](https://azure.microsoft.com/en-us/services/storage/tables/).

Given an application `foo` that logs via pino, you would use `pino-azuretable` like so:

```sh
$ node foo | pino-azuretable --account storageaccount --key blablabla
```

For full documentation of command line switches read the [readme](https://github.com/ovhemert/pino-azuretable#readme).

<a id="pino-cloudwatch"></a>
### pino-cloudwatch

[pino-cloudwatch][pino-cloudwatch] is a transport that buffers and forwards logs to [Amazon CloudWatch][].

```sh
$ node app.js | pino-cloudwatch --group my-log-group
```

[pino-cloudwatch]: https://github.com/dbhowell/pino-cloudwatch
[Amazon CloudWatch]: https://aws.amazon.com/cloudwatch/

<a id="pino-couch"></a>
### pino-couch

[pino-couch][pino-couch] uploads each log line as a [CouchDB][CouchDB] document.

```sh
$ node app.js | pino-couch -U https://couch-server -d mylogs
```

[pino-couch]: https://github.com/IBM/pino-couch
[CouchDB]: https://couchdb.apache.org

<a id="pino-datadog"></a>
### pino-datadog

The [pino-datadog](https://www.npmjs.com/package/pino-datadog) module is a transport that will forward logs to [DataDog](https://www.datadoghq.com/) through its API.

Given an application `foo` that logs via pino, you would use `pino-datadog` like so:

```sh
$ node foo | pino-datadog --key blablabla
```

For full documentation of command line switches read the [readme](https://github.com/ovhemert/pino-datadog#readme).

<a id="pino-elasticsearch"></a>
### pino-elasticsearch

[pino-elasticsearch][pino-elasticsearch] uploads the log lines in bulk
to [Elasticsearch][elasticsearch], to be displayed in [Kibana][kibana].

It is extremely simple to use and set up:

```sh
$ node app.js | pino-elasticsearch
```

This assumes Elasticsearch is running on localhost.

To connect to an external Elasticsearch instance (recommended for production):

* Check that `network.host` is defined in the `elasticsearch.yml` configuration file. See the [Elasticsearch network settings documentation](https://www.elastic.co/guide/en/elasticsearch/reference/current/modules-network.html#common-network-settings) for more details.
* Launch:

```sh
$ node app.js | pino-elasticsearch --node http://192.168.1.42:9200
```

This assumes Elasticsearch is running on `192.168.1.42`.

To connect to AWS Elasticsearch:

```sh
$ node app.js | pino-elasticsearch --node https://es-url.us-east-1.es.amazonaws.com --es-version 6
```

Then [create an index pattern](https://www.elastic.co/guide/en/kibana/current/setup.html) on `'pino'` (the default index key for `pino-elasticsearch`) on the Kibana instance.

[pino-elasticsearch]: https://github.com/pinojs/pino-elasticsearch
[elasticsearch]: https://www.elastic.co/products/elasticsearch
[kibana]: https://www.elastic.co/products/kibana

<a id="pino-http-send"></a>
### pino-http-send

[pino-http-send](https://npmjs.com/package/pino-http-send) is a configurable, low-overhead
transport that batches logs and sends them to a specified URL.

```sh
$ node app.js | pino-http-send -u http://localhost:8080/logs
```

<a id="pino-logflare"></a>
### pino-logflare

[pino-logflare](https://github.com/Logflare/pino-logflare) is a transport that sends logs to a [Logflare](https://logflare.app) `source`.

```sh
$ node index.js | pino-logflare --key YOUR_KEY --source YOUR_SOURCE
```

<a id="pino-mq"></a>
### pino-mq

The `pino-mq` transport will take all messages received on `process.stdin` and send them over a message bus using JSON serialization.

This is useful for:

* moving backpressure from the application to the broker
* offloading message transformation to another component

```sh
node app.js | pino-mq -u "amqp://guest:guest@localhost/" -q "pino-logs"
```

Alternatively a configuration file can be used:

```sh
node app.js | pino-mq -c pino-mq.json
```

A base configuration file can be initialized with:

```sh
pino-mq -g
```

For full documentation of command line switches and configuration see [the `pino-mq` readme](https://github.com/itavy/pino-mq#readme).

<a id="pino-papertrail"></a>
### pino-papertrail

pino-papertrail is a transport that will forward logs to the [Papertrail](https://papertrailapp.com) log service through a UDPv4 socket.

Given an application `foo` that logs via pino, and a Papertrail destination that collects logs on UDP port `12345` at address `bar.papertrailapp.com`, you would use `pino-papertrail` like so:

```sh
node yourapp.js | pino-papertrail --host bar.papertrailapp.com --port 12345 --appname foo
```

For full documentation of command line switches read the [readme](https://github.com/ovhemert/pino-papertrail#readme).

<a id="pino-pg"></a>
### pino-pg

[pino-pg](https://www.npmjs.com/package/pino-pg) stores logs into PostgreSQL.
Full documentation in the [readme](https://github.com/Xstoudi/pino-pg).

<a id="pino-mysql"></a>
### pino-mysql

[pino-mysql][pino-mysql] loads pino logs into [MySQL][MySQL] and [MariaDB][MariaDB].

```sh
$ node app.js | pino-mysql -c db-configuration.json
```

`pino-mysql` can extract and save log fields into corresponding database fields
and/or save the entire log stream as a [JSON Data Type][JSONDT].

For full documentation and command line switches read the [readme][pino-mysql].

[pino-mysql]: https://www.npmjs.com/package/pino-mysql
[MySQL]: https://www.mysql.com/
[MariaDB]: https://mariadb.org/
[JSONDT]: https://dev.mysql.com/doc/refman/8.0/en/json.html

<a id="pino-redis"></a>
### pino-redis

[pino-redis][pino-redis] loads pino logs into [Redis][Redis].

```sh
$ node app.js | pino-redis -U redis://username:password@localhost:6379
```

[pino-redis]: https://github.com/buianhthang/pino-redis
[Redis]: https://redis.io/

<a id="pino-sentry"></a>
### pino-sentry

[pino-sentry][pino-sentry] loads pino logs into [Sentry][Sentry].

```sh
$ node app.js | pino-sentry --dsn=https://******@sentry.io/12345
```

For full documentation of command line switches see the [pino-sentry readme](https://github.com/aandrewww/pino-sentry/blob/master/README.md).

[pino-sentry]: https://www.npmjs.com/package/pino-sentry
[Sentry]: https://sentry.io/

<a id="pino-socket"></a>
### pino-socket

[pino-socket][pino-socket] is a transport that will forward logs to an IPv4
UDP or TCP socket.

As an example, use `socat` to fake a listener:

```sh
$ socat -v udp4-recvfrom:6000,fork exec:'/bin/cat'
```

Then run an application that uses `pino` for logging:

```sh
$ node app.js | pino-socket -p 6000
```

Logs from the application should be observed on both consoles.

[pino-socket]: https://www.npmjs.com/package/pino-socket

#### Logstash

The [pino-socket][pino-socket] module can also be used to upload logs to
[Logstash][logstash] via:

```sh
$ node app.js | pino-socket -a 127.0.0.1 -p 5000 -m tcp
```

This assumes Logstash is running on the same host and configured as follows:

```
input {
  tcp {
    port => 5000
  }
}

filter {
  json {
    source => "message"
  }
}

output {
  elasticsearch {
    hosts => "127.0.0.1:9200"
  }
}
```

See <https://www.elastic.co/guide/en/kibana/current/setup.html> to learn
how to set up [Kibana][kibana].

For Docker users, see
<https://github.com/deviantony/docker-elk> to set up an ELK stack.

<a id="pino-stackdriver"></a>
### pino-stackdriver

The [pino-stackdriver](https://www.npmjs.com/package/pino-stackdriver) module is a transport that will forward logs to the [Google Stackdriver](https://cloud.google.com/logging/) log service through its API.

Given an application `foo` that logs via pino, a Stackdriver log project `bar` and credentials in the file `/credentials.json`, you would use `pino-stackdriver` like so:

```sh
$ node foo | pino-stackdriver --project bar --credentials /credentials.json
```

For full documentation of command line switches read the [readme](https://github.com/ovhemert/pino-stackdriver#readme).

<a id="pino-syslog"></a>
### pino-syslog

[pino-syslog][pino-syslog] is a transforming transport that converts
`pino` NDJSON logs to [RFC3164][rfc3164] compatible log messages. The `pino-syslog` module does not
forward the logs anywhere; it merely rewrites the messages to `stdout`. But
when used in combination with `pino-socket` the log messages can be relayed to a syslog server:

```sh
$ node app.js | pino-syslog | pino-socket -a syslog.example.com
```

Example output for the "hello world" log:

```
<134>Apr 1 16:44:58 MacBook-Pro-3 none[94473]: {"pid":94473,"hostname":"MacBook-Pro-3","level":30,"msg":"hello world","time":1459529098958}
```

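The leading `<134>` in that output is the RFC3164 priority value, computed as `facility × 8 + severity`; 134 corresponds to facility 16 (`local0`) and severity 6 (informational), the values implied by this sample rather than a statement about pino-syslog's configurable defaults. A quick check of the arithmetic:

```javascript
// RFC3164 syslog priority: PRI = facility * 8 + severity.
function pri (facility, severity) {
  return facility * 8 + severity
}

// facility 16 (local0), severity 6 (informational):
console.log(pri(16, 6)) // → 134
```
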
[pino-syslog]: https://www.npmjs.com/package/pino-syslog
[rfc3164]: https://tools.ietf.org/html/rfc3164
[logstash]: https://www.elastic.co/products/logstash

<a id="pino-websocket"></a>
### pino-websocket

[pino-websocket](https://www.npmjs.com/package/@abeai/pino-websocket) is a transport that will forward each log line to a websocket server.

```sh
$ node app.js | pino-websocket -a my-websocket-server.example.com -p 3004
```

For full documentation of command line switches read the [readme](https://github.com/abeai/pino-websocket#readme).