# Sitemap Generator CLI

[![Travis](https://img.shields.io/travis/lgraubner/sitemap-generator-cli.svg)](https://travis-ci.org/lgraubner/sitemap-generator-cli) [![David](https://img.shields.io/david/lgraubner/sitemap-generator-cli.svg)](https://david-dm.org/lgraubner/sitemap-generator-cli) [![npm](https://img.shields.io/npm/v/sitemap-generator-cli.svg)](https://www.npmjs.com/package/sitemap-generator-cli)

> Create XML sitemaps from the command line.

Generates a sitemap by crawling your site. Uses streams to write the sitemap to your drive efficiently and is capable of creating multiple sitemaps once a threshold is reached. Respects robots.txt and meta tags.

## Table of contents

- [Install](#install)
- [Usage](#usage)
- [Options](#options)
- [License](#license)

## Install

This module is available on [npm](https://www.npmjs.com/).

```BASH
$ npm install -g sitemap-generator-cli
```

## Usage

The crawler will fetch all folder URL pages and the file types [parsed by Google](https://support.google.com/webmasters/answer/35287?hl=en). If present, the `robots.txt` is taken into account and its rules are applied to each URL to decide whether it should be added to the sitemap. The crawler will also not fetch URLs from a page if a robots meta tag with the value `nofollow` is present, and will ignore a page completely if a `noindex` rule is present. The crawler is also able to apply the `base` value to found links.

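For illustration, the snippet below is a hypothetical page head (made up for this example, not output of this package) showing the tags the crawler reacts to: `noindex` keeps the page out of the sitemap, `nofollow` prevents its links from being fetched, and relative links are resolved against the `base` href.

```HTML
<head>
  <!-- noindex: page is excluded from the sitemap; nofollow: its links are not fetched -->
  <meta name="robots" content="noindex,nofollow">
  <!-- relative links found on this page are resolved against this URL -->
  <base href="http://example.com/blog/">
</head>
```
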
```BASH
$ sitemap-generator [options] <url> <filepath>
```

When the crawler has finished, the XML sitemap is built and saved to the specified filepath. If more than 50,000 pages were fetched, the output is split into several sitemap files and a sitemap index file is created, as Google does not allow more than 50,000 items in a single sitemap.

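A sitemap index file follows the [sitemaps.org](https://www.sitemaps.org/protocol.html) protocol and simply points to the individual sitemap files. Roughly, it looks like the following (the file names here are placeholders, not necessarily the exact names this tool produces):

```XML
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one <sitemap> entry per generated sitemap part -->
  <sitemap>
    <loc>http://example.com/sitemap_part1.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://example.com/sitemap_part2.xml</loc>
  </sitemap>
</sitemapindex>
```
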
Example:

```BASH
$ sitemap-generator http://example.com some/path/sitemap.xml
```

## Options

```BASH
$ sitemap-generator --help

  Usage: cli [options] <url> <filepath>

  Options:

    -h, --help     output usage information
    -V, --version  output the version number
    -q, --query    consider query string
    -v, --verbose  print details when crawling
```

### query

Consider URLs with query strings like `http://www.example.com/?foo=bar` as individual pages and add them to the sitemap.

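For example, with this flag `http://example.com/` and `http://example.com/?foo=bar` end up as two separate entries in the sitemap:

```BASH
$ sitemap-generator --query http://example.com some/path/sitemap.xml
```
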
### verbose

Print debug messages during the crawling process. Also prints a summary when finished.

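For example, to watch the crawl progress and get a summary at the end:

```BASH
$ sitemap-generator --verbose http://example.com some/path/sitemap.xml
```
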
## License

[MIT](https://github.com/lgraubner/sitemap-generator/blob/master/LICENSE) © [Lars Graubner](https://larsgraubner.com)