Problem
When an Apache Spark job that uses Unity Catalog tables fails, it throws an error message that identifies the Delta table only by its table ID, but you need the table name to identify the affected table.
Error in handleErrors(returnStatus, conn) :
  org.apache.spark.sql.AnalysisException: <>  detected when writing to the Delta table (Table ID: <table-id>).
Cause
Displaying only the Delta table ID in the error message is by design.
Solution
To find the table name corresponding to a table ID:
- Query the tables view in system.information_schema.
- Match the table ID from the Spark logs against the storage_sub_directory column, as shown in the query below.
%sql
SELECT table_catalog, table_schema, table_name
FROM system.information_schema.tables
WHERE contains(storage_sub_directory, '<table-id>')
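If you prefer to run the lookup programmatically, for example from a Python notebook cell or inside a job's error-handling code, the following sketch runs the same query through PySpark. This is a minimal illustration rather than part of the original solution; it assumes your cluster or warehouse can read system.information_schema, and the <table-id> value is a placeholder you replace with the ID from your error message.

%python
# Minimal PySpark sketch: look up the table name for a given Delta table ID.
# Replace <table-id> with the table ID from the Spark error message.
table_id = "<table-id>"

result = (
    spark.table("system.information_schema.tables")
         .where(f"contains(storage_sub_directory, '{table_id}')")
         .select("table_catalog", "table_schema", "table_name")
)
result.show(truncate=False)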