# gulp-awspublish

[![NPM version][npm-image]][npm-url] [![Build Status][travis-image]][travis-url] [![Dependency Status][depstat-image]][depstat-url] [![Install size][packagephobia-image]][packagephobia-url]

> awspublish plugin for [gulp](https://github.com/wearefractal/gulp)

## Usage

First, install `gulp-awspublish` as a development dependency:

```shell
npm install --save-dev gulp-awspublish
```

Then, add it to your `gulpfile.js`:

```javascript
var gulp = require("gulp");
var awspublish = require("gulp-awspublish");

gulp.task("publish", function() {
  // create a new publisher using S3 options
  // http://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/S3.html#constructor-property
  var publisher = awspublish.create(
    {
      region: "your-region-id",
      params: {
        Bucket: "..."
      }
    },
    {
      cacheFileName: "your-cache-location"
    }
  );

  // define custom headers
  var headers = {
    "Cache-Control": "max-age=315360000, no-transform, public"
    // ...
  };

  return (
    gulp
      .src("./public/*.js")
      // gzip, set the Content-Encoding header and add a .gz extension
      .pipe(awspublish.gzip({ ext: ".gz" }))

      // publisher will add Content-Length, Content-Type and the headers specified above
      // if not specified, it will set x-amz-acl to public-read by default
      .pipe(publisher.publish(headers))

      // create a cache file to speed up consecutive uploads
      .pipe(publisher.cache())

      // print upload updates to console
      .pipe(awspublish.reporter())
  );
});

// output
// [gulp] [create] file1.js.gz
// [gulp] [create] file2.js.gz
// [gulp] [update] file3.js.gz
// [gulp] [cache] file3.js.gz
// ...
```

- Note: If you follow the [aws-sdk suggestions](http://docs.aws.amazon.com/AWSJavaScriptSDK/guide/node-configuring.html) for providing your credentials, you don't need to pass them in when creating the publisher.

- Note: In order for publish to work on S3, your policy has to allow the following S3 actions:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket"],
      "Resource": ["arn:aws:s3:::BUCKETNAME"]
    },
    {
      "Effect": "Allow",
      "Action": [
        "s3:PutObject",
        "s3:PutObjectAcl",
        "s3:GetObject",
        "s3:GetObjectAcl",
        "s3:DeleteObject",
        "s3:ListMultipartUploadParts",
        "s3:AbortMultipartUpload"
      ],
      "Resource": ["arn:aws:s3:::BUCKETNAME/*"]
    }
  ]
}
```

### Bucket permissions

By default, the plugin only works when public access to the bucket is **not blocked**:

- Block all public access: **Off**
  - Block public access to buckets and objects granted through new access control lists (ACLs): Off
  - Block public access to buckets and objects granted through any access control lists (ACLs): Off
  - Block public access to buckets and objects granted through new public bucket policies: Off
  - Block public and cross-account access to buckets and objects through any public bucket policies: Off

When dealing with a private bucket, make sure to pass either the option `{ noAcl: true }` or a value for the `x-amz-acl` header:

```js
publisher.publish({}, { noAcl: true });
publisher.publish({ "x-amz-acl": "something" });
```

## Testing

1. Create an S3 bucket to be used for the tests. Optionally create an IAM user for running the tests.
2. Set the bucket's permissions so that it can be edited by the IAM user who will run the tests.
3. Add an `aws-credentials.json` file to the project directory with the name of your testing bucket and the credentials of the user who will run the tests (see the format below).
4. Run `npm test`.

```json
{
  "params": {
    "Bucket": "<test-bucket-name>"
  },
  "credentials": {
    "accessKeyId": "<your-access-key-id>",
    "secretAccessKey": "<your-secret-access-key>",
    "signatureVersion": "v3"
  }
}
```

## API

### awspublish.gzip(options)

Create a through stream that gzips files and adds a Content-Encoding header.

- Note: Node version 0.12.x or later is required in order to use `awspublish.gzip`. If you need gzipping to work with an older Node engine, you can use [v2.0.2](https://github.com/pgherveou/gulp-awspublish/tree/v2.0.2).

Available options:

- ext: file extension to add to the gzipped file (e.g. `{ ext: '.gz' }`)
- smaller: gzip files only when the result is smaller
- Any options that can be passed to [zlib.gzip](https://nodejs.org/api/zlib.html#zlib_options)

### awspublish.create(AWSConfig, cacheOptions)

Create a Publisher.
The AWSConfig object is used to create an `aws-sdk` S3 client. At a minimum you must pass a `Bucket` key to define the target bucket. You can find all available options in the [AWS SDK documentation](http://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/S3.html#constructor-property).

The cacheOptions object allows you to define the location of the cached hash digests. By default, they will be saved in your project's root folder in a hidden file called `'.awspublish-' + 'name-of-your-bucket'`.

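A quick sketch of how that default file name is derived (illustrative, not the plugin's exact code; the bucket name is a placeholder):

```javascript
// The default cache file name is the ".awspublish-" prefix
// followed by the bucket name.
function defaultCacheFileName(bucket) {
  return ".awspublish-" + bucket;
}

var name = defaultCacheFileName("my-site-bucket"); // ".awspublish-my-site-bucket"
```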
#### Adjusting upload timeout

The AWS client has a default timeout which may be too low when pushing large files (> 50 MB).
To adjust the timeout, add `httpOptions: { timeout: 300000 }` to the AWSConfig object.

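For example, the AWSConfig passed to `awspublish.create` might look like this (region and bucket are placeholders; the timeout is in milliseconds, so 300000 is 5 minutes):

```javascript
// Placeholder region/bucket; httpOptions.timeout is in milliseconds.
var awsConfig = {
  region: "your-region-id",
  params: { Bucket: "your-bucket" },
  httpOptions: { timeout: 300000 } // 5 minutes
};
// var publisher = awspublish.create(awsConfig);
```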
#### Credentials

By default, gulp-awspublish uses the credential chain specified in the AWS [docs](http://docs.aws.amazon.com/AWSJavaScriptSDK/guide/node-configuring.html).

Here are some example credential configurations:

Hardcoded credentials (**Note**: We recommend that you **not** hard-code credentials inside an application. Use this method only for small personal scripts or for testing purposes.):

```javascript
var publisher = awspublish.create({
  region: "your-region-id",
  params: {
    Bucket: "..."
  },
  accessKeyId: "akid",
  secretAccessKey: "secret"
});
```

Using a profile by name from `~/.aws/credentials`:

```javascript
var AWS = require("aws-sdk");

var publisher = awspublish.create({
  region: "your-region-id",
  params: {
    Bucket: "..."
  },
  credentials: new AWS.SharedIniFileCredentials({ profile: "myprofile" })
});
```

Instead of putting anything in the configuration object, you can also provide the following environment variables: `AWS_ACCESS_KEY_ID`, `AWS_SECRET_ACCESS_KEY`, `AWS_SESSION_TOKEN`, `AWS_PROFILE`. You can also define a `[default]` profile in `~/.aws/credentials`, which the SDK will use transparently without needing to set anything.

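Setting those variables from Node itself is possible too, for instance in a test setup (the values here are placeholders, not real credentials; in practice they come from your shell or CI secrets):

```javascript
// Placeholders only -- never commit real credentials.
process.env.AWS_ACCESS_KEY_ID = "your-access-key-id";
process.env.AWS_SECRET_ACCESS_KEY = "your-secret-access-key";

// With these set, awspublish.create({ region: "...", params: { Bucket: "..." } })
// needs no explicit credentials in the configuration object.
```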
#### Publisher.publish([headers], [options])

Create a through stream that pushes files to S3.

- headers: hash of headers to add to, or override on, the existing S3 headers.
- options: optional additional publishing options
  - force: bypass the cache / skip
  - noAcl: do not set x-amz-acl by default
  - simulate: debugging option to simulate an S3 upload
  - createOnly: skip file updates

Files that go through the stream receive extra properties:

- s3.path: S3 path
- s3.etag: file etag
- s3.date: file last modified date
- s3.state: publication state (create, update, delete, cache or skip)
- s3.headers: S3 headers for this file. Default headers are:
  - x-amz-acl: public-read
  - Content-Type
  - Content-Length

> Note: `publish` will never delete files remotely. To clean up unused remote files, use `sync`.

#### publisher.cache()

Create a through stream that creates or updates a cache file using each file's S3 path and etag.
Consecutive runs of publish will use this file to avoid re-uploading identical files.

The cache file is saved in the current working directory and is named `.awspublish-<bucket>`. The cache file is flushed to disk every 10 files, just to be safe.

#### Publisher.sync([prefix], [whitelistedFiles])

Create a transform stream that deletes old files from the bucket.

- prefix: prefix to sync a specific directory
- whitelistedFiles: array that can contain regular expressions or strings matching filenames that should never be deleted from the bucket.

e.g.

```js
// only the directory bar will be synced
// files in folder /foo/bar and the file baz.txt will not be removed from the bucket despite not being in your local folder
gulp
  .src("./public/*")
  .pipe(publisher.publish())
  .pipe(publisher.sync("bar", [/^foo\/bar/, "baz.txt"]))
  .pipe(awspublish.reporter());
```
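
The whitelist above mixes a regular expression and a plain string. A sketch of how such an entry could be checked against an S3 key (illustrative, not the plugin's exact implementation):

```javascript
// Illustrative sketch: strings must match the key exactly,
// regular expressions are tested against it.
function isWhitelisted(key, whitelist) {
  return whitelist.some(function(entry) {
    return entry instanceof RegExp ? entry.test(key) : entry === key;
  });
}

var whitelist = [/^foo\/bar/, "baz.txt"];
// isWhitelisted("foo/bar/a.js", whitelist) -> true (regex match)
// isWhitelisted("baz.txt", whitelist)      -> true (exact string match)
// isWhitelisted("old/file.js", whitelist)  -> false (eligible for deletion)
```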

> **Warning:** `sync` will delete files in your bucket that are not in your local folder, unless they're whitelisted.

```js
// this will publish and sync bucket files with the ones in your public directory
gulp
  .src("./public/*")
  .pipe(publisher.publish())
  .pipe(publisher.sync())
  .pipe(awspublish.reporter());

// output
// [gulp] [create] file1.js
// [gulp] [update] file2.js
// [gulp] [delete] file3.js
// ...
```

#### Publisher.client

The `aws-sdk` S3 client is exposed to let you perform other S3 operations.

### awspublish.reporter([options])

Create a reporter that logs s3.path and s3.state (delete, create, update, cache, skip).

Available options:

- states: list of states to log (defaults to all)

```js
// this will publish and sync bucket files, and print created, updated and deleted files
gulp
  .src("./public/*")
  .pipe(publisher.publish())
  .pipe(publisher.sync())
  .pipe(
    awspublish.reporter({
      states: ["create", "update", "delete"]
    })
  );
```

## Examples

### [Rename files & directories](examples/rename.js)

You can use `gulp-rename` to rename your files on S3:

```js
// see examples/rename.js

gulp
  .src("examples/fixtures/*.js")
  .pipe(
    rename(function(path) {
      path.dirname += "/s3-examples";
      path.basename += "-s3";
    })
  )
  .pipe(publisher.publish())
  .pipe(awspublish.reporter());

// output
// [gulp] [create] s3-examples/bar-s3.js
// [gulp] [create] s3-examples/foo-s3.js
```

### [Upload files in parallel](examples/concurrent.js)

You can use `concurrent-transform` to upload files in parallel to your Amazon bucket:

```js
var parallelize = require("concurrent-transform");

gulp
  .src("examples/fixtures/*.js")
  .pipe(parallelize(publisher.publish(), 10))
  .pipe(awspublish.reporter());
```

### Upload both gzipped and plain files in one stream

You can use the [`merge-stream`](https://github.com/grncdr/merge-stream) plugin to upload two streams in parallel, allowing `sync` to work with mixed file types:

```js
var merge = require("merge-stream");
var gzip = gulp.src("public/**/*.js").pipe(awspublish.gzip());
var plain = gulp.src(["public/**/*", "!public/**/*.js"]);

merge(gzip, plain)
  .pipe(publisher.publish())
  .pipe(publisher.sync())
  .pipe(awspublish.reporter());
```

344
## Plugins

### gulp-awspublish-router

A router for defining file-specific rules:
https://www.npmjs.org/package/gulp-awspublish-router

### gulp-cloudfront-invalidate-aws-publish

Invalidates the CloudFront cache based on the output from awspublish:
https://www.npmjs.com/package/gulp-cloudfront-invalidate-aws-publish

## License

[MIT License](http://en.wikipedia.org/wiki/MIT_License)

[npm-url]: https://npmjs.org/package/gulp-awspublish
[npm-image]: https://badge.fury.io/js/gulp-awspublish.svg
[depstat-url]: https://david-dm.org/pgherveou/gulp-awspublish
[depstat-image]: https://david-dm.org/pgherveou/gulp-awspublish.svg
[travis-url]: https://www.travis-ci.org/pgherveou/gulp-awspublish
[travis-image]: https://www.travis-ci.org/pgherveou/gulp-awspublish.svg?branch=master
[packagephobia-image]: https://packagephobia.now.sh/badge?p=gulp-awspublish
[packagephobia-url]: https://packagephobia.now.sh/result?p=gulp-awspublish