# MindConnect-NodeJS

## nodejs library for the MindConnect API (V3)

![mindconnect-nodejs](images/mc3.png)

The mindconnect-nodejs library enables you to upload time series data, files, and events to the Siemens MindSphere platform.

This project started as a community effort at Siemens AG and is now available for general use.

[![Build Status](https://jenkins.mindconnect.rocks/buildStatus/icon?job=mindconnect-nodejs/master)](https://jenkins.mindconnect.rocks/blue/organizations/jenkins/mindconnect-nodejs/activity/) [![The MIT License](https://img.shields.io/badge/license-MIT-009999.svg?style=flat)](./LICENSE.md)
[![npm](https://img.shields.io/npm/v/@mindconnect/mindconnect-nodejs/latest.svg?style=flat)](https://www.npmjs.com/package/@mindconnect/mindconnect-nodejs) ![downloads](https://img.shields.io/npm/dw/@mindconnect/mindconnect-nodejs.svg?colorB=009999)
[![Known Vulnerabilities](https://snyk.io/test/github/mindsphere/mindconnect-nodejs/badge.svg?targetFile=package.json)](https://snyk.io/test/github/mindsphere/mindconnect-nodejs?targetFile=package.json)
[![Language grade: JavaScript](https://img.shields.io/lgtm/grade/javascript/g/mindsphere/mindconnect-nodejs.svg?logo=lgtm&logoWidth=18)](https://lgtm.com/projects/g/mindsphere/mindconnect-nodejs/context:javascript) [![Greenkeeper badge](https://badges.greenkeeper.io/mindsphere/mindconnect-nodejs.svg)](https://greenkeeper.io/)
[![Documentation](https://img.shields.io/badge/mindsphere-documentation-%23009999.svg)](https://opensource.mindsphere.io/docs/mindconnect-nodejs/index.html)
[![Forum](https://img.shields.io/badge/mindsphere-community-%23009999.svg)](https://community.plm.automation.siemens.com/t5/Developer-Space/bd-p/MindSphere-platform-forum)

## Full documentation

The full documentation can be found at [https://opensource.mindsphere.io/docs/mindconnect-nodejs/index.html](https://opensource.mindsphere.io/docs/mindconnect-nodejs/index.html)

## Installing the library

There are several ways to install the library. The most common one is via the npm registry:

```bash
# install the latest stable version from the npm registry
npm install @mindconnect/mindconnect-nodejs

# install the latest alpha version from the npm registry
npm install @mindconnect/mindconnect-nodejs@alpha
```

As an alternative, you can clone the repository, pack it, and install it from a local file:

```bash
# clone the repository and run in the library directory
npm install
npm pack

# this creates a mindconnect-....tgz file

# in your project directory run
npm install mindconnect-...tgz --save
```

## Installing the Command Line Interface

The library comes with a command line interface which can also be installed globally. You can use the command line mode to upload time series and files and to create events in MindSphere.

```bash
# install the library globally if you want to use its command line interface
npm install -g @mindconnect/mindconnect-nodejs
```

## Using CLI

The CLI can be used to create starter projects, upload time series, events and files, read agent diagnostics etc.

```bash
# run mc --help to see the full help information
mc --help
```

Here are some examples of how to use the CLI:

![image](images/full.gif)

## Using CLI to generate starter projects

```bash
# install the library globally if you want to use its command line interface
npm install -g @mindconnect/mindconnect-nodejs

# for a typescript nodejs project run
mc starter-ts

# for a javascript nodejs project run
mc starter-js

# This creates a folder starterts (or starterjs) which you can use as a starting point for your agent.
# Don't forget to run npm install there.

# for full help run
mc starter-ts --help # or
mc starter-js --help
```

## How to create a nodejs MindSphere agent

The following steps describe the easiest way to test the library. You can of course also create the required dependencies programmatically via API calls.

### Step 0: Create an asset type and aspect types

The MindSphere V3 IoT model requires that you create an asset type and aspect types to describe your assets. For this example we will create an asset type named Engine with two aspect types: Environment and Vibration. (Note that the tenant in the screenshot is called castidev; you will have to use your own tenant name.)

![assetype](images/types.png)

More information about the [MindSphere Data Model](http://bit.ly/2IgVB9T).

### Step 1: Create an asset

Create an asset of type Engine (in this example it is called **AcmeMotor**) in Asset Manager for your data.

### Step 2: Create an agent of type MindConnectLib in MindSphere

Create an agent of type core.MindConnectLib in Asset Manager, create the initial JSON token, and store it in a file (e.g. agentconfig.json):

```json
{
    "content": {
        "baseUrl": "https://southgate.eu1.mindsphere.io",
        "iat": "<yourtokenishere>",
        "clientCredentialProfile": [
            "SHARED_SECRET"
        ],
        "clientId": "a3ac5ae889544717b02fa8282a30d1b4",
        "tenant": "<yourtenantishere>"
    },
    "expiration": "2018-04-06T00:47:39.000Z"
}
```

More information about [core.MindConnectLib](http://bit.ly/2HZ2ehE) configuration.

### Step 3: Create an agent

Read the initial configuration from the config file and create the agent.
If you are using the **SHARED_SECRET** profile there is no need to set up a local certificate for the communication (recommended for smaller devices).

```typescript
const configuration = require("../../agentconfig.json");
const agent = new MindConnectAgent(configuration);
```

If you want to use the **RSA_3072** profile you must also set up the agent certificate.

```typescript
// you can create the private.key for example using openssl:
// openssl genrsa -out private.key 3072

agent.SetupAgentCertificate(fs.readFileSync("private.key"));
```

### Step 4: Onboard the agent

The first operation is the onboarding of the agent. This creates a client secret which is used for the communication with MindSphere.

This data is stored by default in the .mc folder of your application, unless you change the base path in the constructor of the agent.

**Important**: Make sure that the folder with the configuration is not reachable from the internet, as it contains the client_secret for the authentication.

```typescript
if (!agent.IsOnBoarded()) {
    await agent.OnBoard();
}
```

### Step 5: Configure the data model and data mappings to asset variables (via UI)

In MindSphere version 3 you can also configure the data model and the mappings to aspect variables in the UI of the Asset Manager. Just go to the configuration of MindConnectLib and configure the data sources like this:

![datasources](images/datasources.png)

![datasources](images/mappings.png)

(It might be a bit tedious to click through all the mappings.)

After that you can pull the configuration from MindSphere.

```typescript
if (!agent.HasDataSourceConfiguration()) {
    await agent.GetDataSourceConfiguration();
}
```

### Step 6: Send the data

```typescript
for (let index = 0; index < 5; index++) {

    const values: DataPointValue[] = [
        { "dataPointId": "DP-Temperature", "qualityCode": "0", "value": (Math.sin(index) * (20 + index % 2) + 25).toString() },
        { "dataPointId": "DP-Pressure", "qualityCode": "0", "value": (Math.cos(index) * (20 + index % 25) + 25).toString() },
        { "dataPointId": "DP-Humidity", "qualityCode": "0", "value": ((index + 30) % 100).toString() },
        { "dataPointId": "DP-Acceleration", "qualityCode": "0", "value": (1000.0 + index).toString() },
        { "dataPointId": "DP-Frequency", "qualityCode": "0", "value": (60.0 + (index * 0.1)).toString() },
        { "dataPointId": "DP-Displacement", "qualityCode": "0", "value": (index % 10).toString() },
        { "dataPointId": "DP-Velocity", "qualityCode": "0", "value": (50.0 + index).toString() }
    ];

    // there is an optional timestamp parameter if you need to use something else instead of Date.now()
    const result = await agent.PostData(values);
}
```

(If you used the UI to configure the data mappings, you will have long integers instead of human-readable data point IDs.)
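
The optional timestamp parameter mentioned in the comment above can be used to backfill readings. As a sketch (the `backdatedTimestamps` helper below is illustrative and not part of the library), you could generate evenly spaced timestamps in the past and pass one to each `PostData` call:

```typescript
// Illustrative helper (not part of the library): generate n timestamps
// spaced intervalMs apart, oldest first, ending at the current time.
function backdatedTimestamps(n: number, intervalMs: number): Date[] {
    const now = Date.now();
    const stamps: Date[] = [];
    for (let i = 0; i < n; i++) {
        stamps.push(new Date(now - (n - 1 - i) * intervalMs));
    }
    return stamps;
}

// usage with the agent from the steps above (sketch):
// for (const ts of backdatedTimestamps(5, 60 * 1000)) {
//     await agent.PostData(values, ts); // one reading per minute
// }
```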

### Step 6.1: Using bulk upload

If you don't want to send the data points one by one, you can also use the `BulkPostData` method:

```typescript
const bulk: TimeStampedDataPoint[] =
    [{
        "timestamp": "2018-08-23T18:38:02.135Z",
        "values":
            [{ "dataPointId": "DP-Temperature", "qualityCode": "0", "value": "10" },
            { "dataPointId": "DP-Pressure", "qualityCode": "0", "value": "10" }]
    },
    {
        "timestamp": "2018-08-23T19:38:02.135Z",
        "values": [{ "dataPointId": "DP-Temperature", "qualityCode": "0", "value": "10" },
        { "dataPointId": "DP-Pressure", "qualityCode": "0", "value": "10" }]
    }];

await agent.BulkPostData(bulk);
```

## Events

Events can now be created with the library. You can create events for your agent or for your entities. In order to create an event for your entity you need to know the assetId of the asset.

```javascript
const configuration = require("../../agentconfig.json");
const agent = new MindConnectAgent(configuration);

if (!agent.IsOnBoarded()) {
    await agent.OnBoard();
}

const event: MindsphereStandardEvent = {
    "entityId": configuration.content.clientId, // use assetId if you don't want to store the event in the agent :)
    "sourceType": "Event",
    "sourceId": "application",
    "source": "Meowz",
    "severity": 20, // 0-99 : 20:error, 30:warning, 40: information
    "timestamp": new Date().toISOString(),
    "description": "Test"
};

// send event with current timestamp
await agent.PostEvent(event);
```
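
The severity values follow the convention noted in the comment above (20: error, 30: warning, 40: information). A small helper, purely illustrative and not part of the library, makes the mapping explicit:

```typescript
// Illustrative mapping (not part of the library) of the MindSphere
// standard event severity codes used in the example above.
function severityName(severity: number): string {
    switch (severity) {
        case 20: return "error";
        case 30: return "warning";
        case 40: return "information";
        default: return "unknown";
    }
}
```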

![events](images/events.png)

## File Upload

Files can now be uploaded via the library. You can upload files for your agent or for your entities. In order to upload a file for your entity you need to know the assetId of the asset.

Since version 3.5.1 the agents use the multipart upload API of MindSphere, which means that agents can also upload files larger than 8 MB. Multipart upload must be switched on (chunk: true) if you want to activate this behavior. The parallelUploads parameter determines the maximum number of parallel uploads. You can increase it on a powerful computer to speed up the upload, or decrease it to prevent network congestion.

```javascript
const configuration = require("../../agentconfig.json");
const agent = new MindConnectAgent(configuration);

const RETRYTIMES = 5; // number of times the upload is retried

if (!agent.IsOnBoarded()) {
    await agent.OnBoard();
}

await agent.UploadFile(agent.ClientId(), "custom/mindsphere/path/package.json", "package.json", {
    retry: RETRYTIMES,
    description: "File uploaded with MindConnect-NodeJS Library",
    parallelUploads: 5,
    chunk: true
});
```

![files](images/files.png)

## Full Agent

Here is a demo agent implementation.

[![mindsphere-agent](https://img.shields.io/badge/mindsphere-agent-green.svg)](src/demoagent/test-agent.ts)

## Making sure that the data arrives even with a flaky internet connection

You can wrap all asynchronous calls into the retry function, which will automatically retry the operation n times before throwing an exception.

```typescript
import { MindConnectAgent, MindsphereStandardEvent, retry, TimeStampedDataPoint } from "@mindconnect/mindconnect-nodejs";

// if you want to be more resilient you can wrap every async method
// in the retry function which will try to retry several times before throwing an exception

// onboarding over a flaky connection
await retry(5, () => agent.OnBoard());

// bulk upload with 5 retries
const bulk: TimeStampedDataPoint[] =
    [{
        "timestamp": "2018-08-23T18:38:02.135Z",
        "values":
            [{ "dataPointId": "DP-Temperature", "qualityCode": "0", "value": "10" },
            { "dataPointId": "DP-Pressure", "qualityCode": "0", "value": "10" }]
    },
    {
        "timestamp": "2018-08-23T19:38:02.135Z",
        "values": [{ "dataPointId": "DP-Temperature", "qualityCode": "0", "value": "10" },
        { "dataPointId": "DP-Pressure", "qualityCode": "0", "value": "10" }]
    }];

await retry(5, () => agent.BulkPostData(bulk));
```
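
Conceptually, the retry pattern behaves like the simplified sketch below (this is an illustration, not the library's actual implementation): it runs the operation up to n times and rethrows the last error only after all attempts have failed.

```typescript
// Simplified sketch of a retry helper: attempt the async operation up to
// n times; if every attempt fails, rethrow the last error.
async function simpleRetry<T>(n: number, operation: () => Promise<T>): Promise<T> {
    let lastError: unknown;
    for (let attempt = 1; attempt <= n; attempt++) {
        try {
            return await operation();
        } catch (err) {
            lastError = err; // remember the failure and try the next attempt
        }
    }
    throw lastError;
}
```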

## Generating the documentation

You can always generate the current HTML documentation by running the command below.

```bash
# this generates a docs/ folder with the full documentation of the library
npm run doc
```

## Proxy support

Set the http_proxy or HTTP_PROXY environment variable if you need to connect via a proxy.

```bash
# set the http proxy environment variable if you are using e.g. fiddler on localhost
export HTTP_PROXY=http://localhost:8888
```

## Data in MindSphere

Environment data:

![Environment Data](images/environmentdata.PNG)

Vibration data:

![Vibration Data](images/vibrationdata.PNG)

## Command Line Interface

The CLI commands should only be used **in secure environments!** (e.g. on your workstation, not on the agents). In the next major version the CLI will be moved to a separate package to reduce the size of the library.

Here is an overview of the CLI commands:

```bash
# run mc --help to get a full list of the commands
mc --help
```

![CLI](images/cli.png)

### Setup and diagnostic

The diagnostic endpoint gives information about possible problems which an agent might have on the cloud side. Note that these operations require service credentials, which should only be used for setup and diagnostic tasks in secure environments.

![setup](images/setup.gif)

### Bulk Import and Standard Import of historical data

The CLI provides commands to import historical data into the MindSphere IoT services.
The commands use the bulk import API for simulation assets and standard time series ingestion for performance assets.

If you use the standard import into the IoT services, the calls to the API will be **throttled** to match your throttling limits.
The number of records per message will be reduced to a maximum of 200. Using this feature has a direct impact on MindSphere resource consumption, and you might get a notice that you need to upgrade your account's data ingest rate.

The feature will be deprecated once bulk upload also works for performance assets.
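
The 200-records-per-message limit described above can be sketched as a simple chunking step (hypothetical helper, not the CLI's actual code): the records are split into messages of at most 200 entries before they are sent.

```typescript
// Illustrative sketch (not the CLI's actual implementation): split a list
// of time series records into messages of at most maxPerMessage entries.
function chunkRecords<T>(records: T[], maxPerMessage = 200): T[][] {
    const messages: T[][] = [];
    for (let i = 0; i < records.length; i += maxPerMessage) {
        messages.push(records.slice(i, i + maxPerMessage));
    }
    return messages;
}
```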

## Legal

This project has been released under an [Open Source license](./LICENSE.md). The release may include and/or use APIs to Siemens’ or third parties’ products or services. In no event shall the project’s Open Source license grant any rights in or to these APIs, products or services that would alter, expand, be inconsistent with, or supersede any terms of separate license agreements applicable to those APIs. “API” means application programming interfaces and their specifications and implementing code that allows other software to communicate with or call on Siemens’ or third parties’ products or services and may be made available through Siemens’ or third parties’ products, documentations or otherwise.