output.txt
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/test-bundle-[UNIQUE_NAME]/default/files...
Deploying resources...
Updating deployment state...
Deployment complete!
=== Modify jobs
Change max_concurrent_runs, name, and the default tag
=== Modify pipeline
Change edition and channel
=== Detect and save all changes
Detected changes in 2 resource(s):
Resource: resources.jobs.job1
max_concurrent_runs: add
name: add
tags['dev']: add
trigger.pause_status: add
Resource: resources.pipelines.pipeline1
channel: add
edition: add
=== Configuration changes
>>> diff.py databricks.yml.backup databricks.yml
--- databricks.yml.backup
+++ databricks.yml
@@ -13,4 +13,5 @@
interval: 1
unit: DAYS
+ pause_status: UNPAUSED
tasks:
- task_key: main
@@ -22,4 +23,8 @@
num_workers: 1
+ tags:
+ dev: default_tag_changed
+ max_concurrent_runs: 5
+ name: Custom Job Name
job2:
tasks:
@@ -38,2 +43,4 @@
- notebook:
path: /Users/{{workspace_user_name}}/notebook
+ channel: PREVIEW
+ edition: CORE
>>> [CLI] bundle destroy --auto-approve
The following resources will be deleted:
delete resources.jobs.job1
delete resources.jobs.job2
delete resources.pipelines.pipeline1
This action will result in the deletion or recreation of the following Lakeflow Spark Declarative Pipelines along with the Streaming Tables (STs) and Materialized Views (MVs) managed by them:
delete resources.pipelines.pipeline1
All files and directories at the following location will be deleted: /Workspace/Users/[USERNAME]/.bundle/test-bundle-[UNIQUE_NAME]/default
Deleting files...
Destroy complete!