Using the Vertica Spark Connector to write a DataFrame to an external database fails with [TABLE_OR_VIEW_NOT_FOUND] error

Switch to a non-Unity Catalog cluster.

Written by shubham.bhusate

Last published at: June 9th, 2025

Problem

When you use the Vertica Spark Connector on a Unity Catalog-enabled cluster to write DataFrames to an external database, the write fails with the following error.

[TABLE_OR_VIEW_NOT_FOUND] The table or view `vertica`.`<table-name>` cannot be found. Verify the spelling and correctness of the schema and catalog. If you did not qualify the name with a schema, verify the current_schema() output, or qualify the name with the correct schema and catalog. To tolerate the error on drop use DROP VIEW IF EXISTS or DROP TABLE IF EXISTS. SQLSTATE: 42P01
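
For context, a typical write through the connector looks roughly like the following sketch, run from a Databricks Python notebook. The host, credentials, staging location, and table name are placeholders, and option names can vary by connector version.

# Minimal sketch of a DataFrame write through the Vertica Spark Connector.
# All connection values below are placeholders.
df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "val"])

(
    df.write.format("com.vertica.spark.datasource.VerticaSource")
    .option("host", "<vertica-host>")
    .option("db", "<vertica-database>")
    .option("user", "<vertica-user>")
    .option("password", "<vertica-password>")
    .option("table", "<table-name>")
    .option("staging_fs_url", "<staging-location>")
    .mode("overwrite")
    .save()
)

# On a Unity Catalog-enabled cluster, this write fails with the
# [TABLE_OR_VIEW_NOT_FOUND] error shown above.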


Cause

The Vertica Spark Connector, a third-party library, operates in a way that conflicts with Unity Catalog's (UC) three-level namespace (catalog.schema.table). The library cannot be used on a UC-enabled cluster.
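
As a rough illustration, identifiers on a UC-enabled cluster resolve against three levels (catalog.schema.table), so a two-part name such as `vertica`.`<table-name>` is looked up as a schema and table inside the current catalog rather than being handed off to the external database. The snippet below is only an illustration; the catalog, schema, and table names are placeholders.

# On a UC cluster, a fully qualified name has three parts: catalog.schema.table.
spark.sql("SELECT * FROM <catalog>.<schema>.<table>")

# A two-part name resolves as <schema>.<table> within the current catalog,
# so Spark searches Unity Catalog for a schema named `vertica` and reports
# [TABLE_OR_VIEW_NOT_FOUND] instead of reaching the external database.
spark.sql("SELECT * FROM vertica.`<table-name>`")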


Databricks does not guarantee that Unity Catalog supports third-party libraries. 


Solution

Switch to a non-UC cluster, such as one using the No isolation shared access mode.


  1. Navigate to the Compute tab in your Databricks workspace.
  2. Select the desired cluster from the list of available clusters.
  3. Click the Edit button on the cluster details page.
  4. Change the Access Mode by selecting an appropriate option (such as No isolation shared) from the dropdown.
  5. Click Save to apply the updates and restart the cluster.
  6. Verify the updated access mode on the cluster details page, or programmatically as shown in the sketch after these steps.
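
To check the access mode programmatically rather than in the UI, one option is the Databricks SDK for Python. This is a sketch under a few assumptions: the cluster ID is a placeholder, and No isolation shared clusters typically report a data_security_mode of NONE through the Clusters API.

# Sketch: confirm the cluster's access mode with the Databricks SDK for Python.
# <cluster-id> is a placeholder; authentication is taken from the environment
# or a configured Databricks profile.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()
cluster = w.clusters.get(cluster_id="<cluster-id>")

# Expect a non-Unity Catalog mode (for example, NONE for No isolation shared).
print(cluster.data_security_mode)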