Problem
When you use the Vertica Spark Connector to write DataFrames to an external database on a Unity Catalog-enabled cluster, the write fails with the following error.
```
[TABLE_OR_VIEW_NOT_FOUND] The table or view `vertica`.`<table-name>` cannot be found. Verify the spelling and correctness of the schema and catalog. If you did not qualify the name with a schema, verify the current_schema() output, or qualify the name with the correct schema and catalog. To tolerate the error on drop use DROP VIEW IF EXISTS or DROP TABLE IF EXISTS. SQLSTATE: 42P01
```
Cause
The Vertica Spark Connector, a third-party library, resolves table names in a way that conflicts with Unity Catalog's (UC) three-level (catalog.schema.table) namespace, so the library cannot be used on a UC-enabled cluster.

Databricks does not guarantee that third-party libraries are compatible with Unity Catalog.
Solution
Switch to a cluster that does not use Unity Catalog, such as one with the No isolation shared access mode.
- Navigate to the Compute tab in your Databricks workspace.
- Select the desired cluster from the list of available clusters.
- Click the Edit button on the cluster details page.
- Change the Access mode by selecting an option that does not enforce Unity Catalog (such as No isolation shared) from the dropdown.
- Click Confirm to apply the change, then restart the cluster.
- Verify the updated access mode on the cluster details page.
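If you manage clusters programmatically, you can make the same change through the Databricks Clusters API (`POST /api/2.1/clusters/edit`), where the `data_security_mode` field controls the access mode and the value `NONE` corresponds to No isolation shared in the UI. The sketch below is a minimal edit payload; the cluster ID, Spark version, node type, and worker count are placeholders you should replace with your cluster's actual values (the edit endpoint requires the full cluster specification, not just the changed field).

```json
{
  "cluster_id": "<cluster-id>",
  "cluster_name": "<cluster-name>",
  "spark_version": "<spark-version>",
  "node_type_id": "<node-type-id>",
  "num_workers": 2,
  "data_security_mode": "NONE"
}
```

After the edit is applied, the cluster restarts automatically if it was running; verify the new access mode on the cluster details page before rerunning the Vertica Spark Connector write.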