Problem
You are deploying a workflow via the Pulumi IaC tool when you encounter cluster issues.
- The workflow deployment is not allowed without one of the job_cluster_key, new_cluster, or existing_cluster_id parameters when you try to set it to run in serverless mode.
- Users are unable to manually update a job to use serverless. It presents a similar error.
Cause
The primary issue is a configuration problem with the Pulumi provider, which requires one of the parameters (job_cluster_key, new_cluster, or existing_cluster_id) to trigger a workflow. Pulumi can trigger workflows (serverless or multi-task jobs) with a REST activity.
The issue may be caused by an outdated version of the Pulumi provider or a configuration error in the Pulumi payload.
Solution
To resolve the primary issue, update your Pulumi provider to the latest version and check the Pulumi payload for configuration errors.
Make sure you remove the job cluster definition from your payload.
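For comparison, a payload that still pins the task to a cluster looks roughly like the following sketch. It is illustrative only: the <existing-cluster-id> value is an assumed placeholder, it uses the same placeholder convention as the example further below, and the same applies if the task sets new_cluster or job_cluster_key instead of existing_cluster_id.

# Illustrative "before" state: the task is pinned to an existing cluster.
# Removing existing_cluster_id (or new_cluster / job_cluster_key) from the
# task is what allows it to run on serverless compute.
from pulumi_databricks import Job, JobTaskArgs, JobNotebookTaskArgs

job_with_cluster = Job(
    resource_name = f"<resource-prefix>-job",
    name = f"<resource-prefix>-job",
    tasks = [
        JobTaskArgs(
            task_key = f"<resource-prefix>-task",
            existing_cluster_id = "<existing-cluster-id>",  # remove this argument for serverless
            notebook_task = JobNotebookTaskArgs(
                notebook_path = f"<user-home-path>/Pulumi/<resource-prefix>-notebook.py"
            )
        )
    ]
)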
Specify your payload as shown in the following example.
You need to specify the following values before using the example payload:
- <resource-prefix> - The resource prefix from the Pulumi configuration.
- <user-email-address> - The email address that you want to receive notifications.
- <user-home-path> - The home directory for your user in your Databricks workspace.
# Pulumi Databricks provider (Python); adjust the import to match your project setup.
from pulumi_databricks import Job, JobTaskArgs, JobNotebookTaskArgs, JobEmailNotificationsArgs

# No job_cluster_key, new_cluster, or existing_cluster_id is set, so the task runs serverless.
job = Job(
    resource_name = f"<resource-prefix>-job",
    name = f"<resource-prefix>-job",
    tasks = [
        JobTaskArgs(
            task_key = f"<resource-prefix>-task",
            notebook_task = JobNotebookTaskArgs(
                notebook_path = f"<user-home-path>/Pulumi/<resource-prefix>-notebook.py"
            )
        )
    ],
    email_notifications = JobEmailNotificationsArgs(
        on_successes = [ "<user-email-address>" ],
        on_failures = [ "<user-email-address>" ]
    )
)
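After you replace the placeholders, you can validate and deploy the change with the Pulumi CLI, for example pulumi preview followed by pulumi up. Optionally, you can export the job ID so it is easy to find after deployment; the export shown below is only a suggested sketch, not part of the required payload.

import pulumi

# Export the job ID so it appears in the stack outputs after deployment.
pulumi.export("job_id", job.id)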
If you are unable to manually update the job to use serverless, try the following steps:
- Open your workspace and click Workflows.
- Click Jobs & pipelines.
- Click the job that needs to be updated.
- Click Edit.
- Remove the jobClusters property from the job definition.
- Save the changes and deploy the workflow.