This Universal Task allows Stonebranch users to perform end-to-end orchestration and automation of jobs and clusters in a Databricks environment, on either AWS or Azure.
This integration uses the Databricks URL and a user bearer token to connect to the Databricks environment. Users can perform the following actions on Databricks jobs:
Create and list jobs
Get job details
Run now jobs
Run submit jobs
Cancel run jobs
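Under the hood, the connection described above amounts to sending bearer-token-authenticated HTTPS requests to the workspace's REST API 2.0 endpoints. A minimal sketch in Python, where the workspace URL, token value, and helper name are illustrative placeholders rather than values from this task:

```python
import json
import urllib.request

def databricks_request(base_url, token, endpoint, payload=None):
    """Build an authenticated request for a Databricks REST API 2.0 endpoint.

    base_url and token stand in for the Databricks URL and user bearer
    token supplied to the Universal Task.
    """
    url = f"{base_url.rstrip('/')}/api/2.0/{endpoint}"
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }
    data = json.dumps(payload).encode() if payload is not None else None
    method = "POST" if data is not None else "GET"
    return urllib.request.Request(url, data=data, headers=headers, method=method)

# Example: a request that would list the jobs in a workspace.
req = databricks_request("https://adb-123.azuredatabricks.net", "dapiXXXX", "jobs/list")
```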
With respect to Databricks clusters, this integration can perform the following operations:
Create, start, and restart a cluster
Terminate a cluster
Get cluster information
With respect to Databricks DBFS, this Universal Task also provides the ability to upload large files.
Refer to the Changelog for version history information.
This integration requires a Universal Agent and a Python runtime to execute the Universal Task against a Databricks environment. This Universal Task has been tested with the Azure Databricks environment, API version 2.0.
Create a job in a Databricks environment from Universal Controller. A JSON definition for the job, as expected by Databricks, is supplied as input.
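The JSON input mentioned above follows the Jobs API 2.0 jobs/create schema. A minimal, hypothetical example — the job name, cluster sizing, and notebook path are illustrative only:

```python
import json

# Hypothetical job definition of the kind pasted into the task's JSON
# input; it is posted to /api/2.0/jobs/create, which returns the new job_id.
job_spec = {
    "name": "nightly-etl",
    "new_cluster": {
        "spark_version": "7.3.x-scala2.12",
        "node_type_id": "Standard_DS3_v2",
        "num_workers": 2,
    },
    "notebook_task": {"notebook_path": "/Shared/etl"},
}
body = json.dumps(job_spec)
```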
List the jobs available within the Databricks environment.
Get Job details
Retrieves the definition of an existing job in Databricks, given the job ID as input.
Run now Jobs
Runs an existing job in the Databricks environment using run-time input parameters supplied as JSON from the Universal Task. The Universal Controller monitors the execution of the job until it completes.
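The monitoring described above amounts to polling the run's life-cycle state until it reaches a terminal value. A sketch of that pattern, using a stand-in function in place of actual GET /api/2.0/jobs/runs/get calls (the task performs this polling internally; the state names are the documented Jobs API 2.0 life-cycle states):

```python
import time

def monitor_run(get_run_state, poll_interval=0.0):
    """Poll a run until it reaches a terminal life-cycle state, mirroring
    how the Universal Controller tracks a run-now execution."""
    terminal = {"TERMINATED", "SKIPPED", "INTERNAL_ERROR"}
    while True:
        state = get_run_state()  # stands in for GET /api/2.0/jobs/runs/get
        if state["life_cycle_state"] in terminal:
            return state
        time.sleep(poll_interval)

# Simulated responses: the run is pending, then running, then finished.
responses = iter([
    {"life_cycle_state": "PENDING"},
    {"life_cycle_state": "RUNNING"},
    {"life_cycle_state": "TERMINATED", "result_state": "SUCCESS"},
])
final = monitor_run(lambda: next(responses))
```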
Run Submit jobs
Runs a job in the Databricks environment whose definition is supplied dynamically as a JSON input parameter in the Universal Task. The Universal Controller monitors the execution of the job until it completes.
Cancel Run job
Cancel a job run that is currently executing in the Databricks environment.
Create a cluster in the Databricks environment. The cluster definition is provided as JSON in a script in this Universal Task.
List the clusters available in the Databricks environment.
Start a cluster that is in a stopped state in Databricks.
Restart a cluster in the Databricks environment.
Terminate a cluster in the Databricks environment by providing the cluster ID as input.
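The start, restart, and terminate operations above all take a cluster ID and map onto Clusters API 2.0 endpoints; note that termination goes through clusters/delete in API 2.0. A small hypothetical helper (the cluster ID is illustrative):

```python
import json

# start, restart, and terminate all take a cluster_id payload; create
# takes a full cluster spec instead. Termination uses clusters/delete.
CLUSTER_ENDPOINTS = {
    "start": "clusters/start",
    "restart": "clusters/restart",
    "terminate": "clusters/delete",
}

def cluster_action(action, cluster_id):
    """Return the API 2.0 endpoint and JSON body for a cluster action."""
    return CLUSTER_ENDPOINTS[action], json.dumps({"cluster_id": cluster_id})

endpoint, body = cluster_action("terminate", "0523-224852-abc123")
```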
Get Cluster Info
Returns the definition of an existing cluster in the Databricks environment as JSON.
Upload file to DBFS
Upload a file from the local server to the Databricks File System (DBFS).
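Large files cannot go through the single-call dbfs/put endpoint, whose base64 body is limited to 1 MB; the streaming sequence in API 2.0 is dbfs/create, repeated dbfs/add-block calls of at most 1 MB each, then dbfs/close. A sketch that generates this call sequence (the handle value is a stand-in for the one returned by dbfs/create, and the path is illustrative):

```python
import base64

CHUNK = 1024 * 1024  # the DBFS API accepts at most 1 MB of data per add-block

def dbfs_upload_calls(path, data, handle=1):
    """Yield the (endpoint, payload) sequence for streaming a file to DBFS.

    handle=1 is a placeholder; in practice the handle comes from the
    dbfs/create response.
    """
    yield "dbfs/create", {"path": path, "overwrite": True}
    for i in range(0, len(data), CHUNK):
        block = base64.b64encode(data[i:i + CHUNK]).decode()
        yield "dbfs/add-block", {"handle": handle, "data": block}
    yield "dbfs/close", {"handle": handle}

# A payload of 2 MB plus one byte produces three add-block calls.
calls = list(dbfs_upload_calls("/tmp/big.csv", b"x" * (2 * 1024 * 1024 + 1)))
```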
Import Universal Template
To use this downloadable Universal Template, you must first perform the following steps: