For a full example see [node-addon-examples's package.json](https://github.com/springmeyer/node-addon-example/blob/master/package.json).
The location your native module is placed after a build. This should be an empty directory.
Note: This property supports variables based on [Versioning](#versioning).
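For instance, a `module_path` built from Versioning variables might look like the following (the exact directory layout is illustrative):

```js
{
  "binary": {
    "module_path": "./lib/binding/{node_abi}-{platform}-{arch}"
  }
}
```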
###### host (and host.endpoint)
An object with at least one key, `endpoint`, defining the remote location where you've published tarball binaries (must be `https`, not `http`).
It is highly recommended that you use Amazon S3. The reasons are:
Why then not require S3? Because while some applications using node-pre-gyp need
It should also be mentioned that there is an optional and entirely separate npm module called [node-pre-gyp-github](https://github.com/bchr02/node-pre-gyp-github) which is intended to complement node-pre-gyp and be installed along with it. It provides the ability to store and publish your binaries within your repository's GitHub Releases if you would rather not use S3 directly. Installation and usage instructions can be found [here](https://github.com/bchr02/node-pre-gyp-github), but the basic premise is that instead of using the `node-pre-gyp publish` command you would use `node-pre-gyp-github publish`.
This looks like:
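A minimal sketch, with an illustrative bucket endpoint of your own in place of the URL below:

```js
{
  "binary": {
    "host": {
      "endpoint": "https://your_bucket.s3-us-west-1.amazonaws.com"
    }
  }
}
```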
##### The `host` object has other optional S3 properties
If you are not using a standard S3 path like `bucket_name.s3(.-)region.amazonaws.com`, you might get an error on `publish` because node-pre-gyp extracts the region and bucket from the `host` URL. For example, you may have an on-premises S3-compatible storage server, or may have configured a specific DNS entry redirecting to an S3 endpoint. In these cases, you can explicitly set the `region` and `bucket` properties to tell node-pre-gyp to use these values instead of guessing from the `host` property. The following values can be used in the `binary` section:
###### bucket
###### region

Your S3 server region.
###### s3ForcePathStyle

Set `s3ForcePathStyle` to true if the endpoint URL should not be prefixed with the bucket name. If false (default), the server endpoint would be constructed as `bucket_name.your_server.com`.
For example, using an alternate S3-compatible host:

```js
{
  "binary": {
    "host": {
      "endpoint": "https://play.min.io",
      "bucket": "node-pre-gyp-production",
      "region": "us-east-1",
      "s3ForcePathStyle": true
    }
  }
}
```
##### The `binary` object has optional properties
###### remote_path
If a binary was not available for a given platform and `--fallback-to-build` was
#### 9) One more option
It may be that you want to work with multiple S3 buckets: one for development, one for staging and one for production. Such an arrangement makes it less likely to accidentally overwrite a production binary. It also allows the production environment to have more restrictive permissions than development or staging while still enabling publishing when developing and testing.
To use this option, set `staging_host` and/or `development_host` using settings similar to those used for `host`.
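As a sketch, the alternate hosts take the same shape as `host` (the endpoint URLs below are illustrative):

```js
{
  "binary": {
    "host": { "endpoint": "https://your-production-bucket.s3.us-east-1.amazonaws.com" },
    "staging_host": { "endpoint": "https://your-staging-bucket.s3.us-east-1.amazonaws.com" },
    "development_host": { "endpoint": "https://your-development-bucket.s3.us-east-1.amazonaws.com" }
  }
}
```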
Once a development and/or staging host is defined, if the command being executed is "publish" or "unpublish" it will default to the lowest of the alternate hosts (development, or if not present, staging). If the command being executed is "install" or "info" it will default to the production host (specified by `host`).
To explicitly choose a host, use the command-line options `--s3_host=development`, `--s3_host=staging` or `--s3_host=production`, or set the environment variable `node_pre_gyp_s3_host` to `development`, `staging` or `production`. Note that the environment variable has priority over the command line.
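For instance (hypothetical invocations, assuming the package is configured with the alternate hosts above):

```sh
# publish to the staging bucket explicitly
node-pre-gyp publish --s3_host=staging

# the environment-variable form, which takes priority over the flag
node_pre_gyp_s3_host=production node-pre-gyp publish
```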
This setup allows installing from development or staging by specifying `--s3_host=development` or `--s3_host=staging`. And it requires specifying `--s3_host=production` in order to publish to, or unpublish from, production, making accidental errors less likely.