Spark UI is empty for job clusters after termination

For non-Spark tasks, the Spark UI is expected to be empty.

Written by kunal.jadhav

Last published at: April 17th, 2025

Problem

You have jobs executing Python/SQL commands, but you're unable to view the Apache Spark UI details on the associated clusters. Specifically, the Spark UI does not display any information under the Jobs, Stages, or Storage tabs.

 

Cause

This is the expected behavior when executing non-Spark operations or API-based calls that do not involve a Spark execution context.

The Spark UI only displays details when a Spark job—such as a PySpark or Spark SQL operation—is executed within an active Spark session. It helps you monitor, debug, and optimize Spark jobs by offering real-time and historical views of job execution.

 

Solution

The Spark UI provides insights into Spark job execution, including job progress, stages, and resource utilization. However, it only displays information when Spark operations are executed.

The Spark UI populates details when an active Spark session runs:

  • RDD transformations
  • Spark SQL queries
  • DataFrame operations

 

For example, the following PySpark code triggers a Spark job, which appears in the Spark UI.

%python

# Create a simple DataFrame
df = spark.createDataFrame([(1, "Alice"), (2, "Bob")], ["id", "name"])
df.show()

 

If no Spark jobs are triggered, the Spark UI does not display any details.

For example, the following pure Python code does not interact with Spark and does not appear in the UI:

%python

data = [(1, "Alice"), (2, "Bob")]
# Process data using pure Python
processed_data = [f"ID: {id}, Name: {name}" for id, name in data]

# Print the result
for item in processed_data:
    print(item)

 

Spark SQL queries remain visible in the SQL/DataFrame tab in the cluster’s Spark UI, even after the compute is terminated.

For example, this SQL code appears in the SQL tab after it has been run, even if the compute is terminated.

%sql

SELECT
   id,
   name,
   age,
   age + 5 AS age_after_5_years
FROM default.users;

 

For more information on Spark UI components, review the Spark Web UI documentation.