Cannot create cluster: spark conf: 'spark.databricks.cluster.profile' is not allowed when choosing an access mode

Use the is_single_node flag to create single node compute.

Written by parth.sundarka

Last published at: April 9th, 2025

Problem

When you try to create a single node cluster with the API or Terraform, the creation fails with the following error. 

“Cannot create cluster: spark conf: 'spark.databricks.cluster.profile' is not allowed when choosing an access mode”

Cause

The Apache Spark config spark.databricks.cluster.profile is explicitly set in your create cluster API request or Terraform cluster resource block. 
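
For example, a Terraform cluster resource block along the following lines triggers the error. This is an illustrative sketch of the legacy single node pattern, which sets the profile explicitly through spark_conf; the names and values other than the Spark config keys are placeholders.

resource "databricks_cluster" "legacy_single_node" {
  cluster_name  = "legacy-single-node"
  spark_version = "14.3.x-scala2.12"
  node_type_id  = "i3.xlarge"
  num_workers   = 0

  # Setting the profile explicitly causes the error
  # when an access mode is also chosen.
  spark_conf = {
    "spark.databricks.cluster.profile" = "singleNode"
    "spark.master"                     = "local[*]"
  }

  custom_tags = {
    "ResourceClass" = "SingleNode"
  }
}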

Solution

Use the is_single_node API flag instead of specifying a Spark config. When set to true, Databricks automatically sets the single node-related custom_tags, spark_conf, and num_workers values.

For more information, refer to the Create new cluster (AWS | GCP | Azure) API documentation.

The following example request body uses the API to create a single node Databricks Runtime 14.3 LTS compute resource. 

{
  "aws_attributes": {
    "availability": "SPOT_WITH_FALLBACK",
    "ebs_volume_count": 0,
    "first_on_demand": 1,
    "spot_bid_price_percent": 100,
    "zone_id": "auto"
  },
  "cluster_name": "single-node-with-kind-cluster",
  "is_single_node": true,
  "kind": "CLASSIC_PREVIEW",
  "node_type_id": "i3.xlarge",
  "spark_version": "14.3.x-scala2.12"
}
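
If you create compute with Terraform, the equivalent fix is to set is_single_node in the databricks_cluster resource block rather than setting the profile in spark_conf. The following is a minimal sketch, assuming a Databricks Terraform provider version recent enough to support the is_single_node and kind attributes.

resource "databricks_cluster" "single_node" {
  cluster_name  = "single-node-with-kind-cluster"
  spark_version = "14.3.x-scala2.12"
  node_type_id  = "i3.xlarge"
  kind          = "CLASSIC_PREVIEW"

  # The provider sets the single node-related custom_tags,
  # spark_conf, and num_workers automatically.
  is_single_node = true
}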