Problem
Your cluster’s Apache Spark UI is not available, and you see the following error message.
Could not find data to load UI for driver <driver-id> in cluster <cluster-id>
Cause
This message can appear when a custom Spark configuration, spark.extraListeners, overwrites the default Databricks daemon listener com.databricks.backend.daemon.driver.DBCEventLoggingListener.
Solution
- Open the affected cluster.
- Click the Edit button.
- Scroll to Advanced options and click to expand.
- Click the Spark option in the vertical menu.
- In the Spark config field, locate the spark.extraListeners setting.
- Append the default Databricks daemon listener to your custom listener. You can use the following example code.
<your-custom-spark-listener-reference>, com.databricks.backend.daemon.driver.DBCEventLoggingListener
If this setting does not appear in the Spark config field, check any init scripts attached to the cluster and adjust the setting in the init script instead. You can use the following example code.
"spark.extraListeners" = "<your-custom-spark-listener-reference>, com.databricks.backend.daemon.driver.DBCEventLoggingListener"
If reviewing the init scripts does not reveal any script setting this configuration, contact Databricks Support to determine alternative causes.