Resolve invalid cast input error on serverless compute

Set spark.sql.ansi.enabled to false to resolve casting errors on serverless compute.

Written by anudeep.konaboina

Last published at: April 16th, 2025

Problem

You are running jobs on serverless compute when you get an invalid cast input error message. When you run the same jobs on classic compute, the error does not occur.

Example error message

[CAST_INVALID_INPUT] The value '100.0' of the type "STRING" cannot be cast to "BIGINT" because it is malformed. Correct the value as per the syntax, or change its target type. Use 'try_cast' to tolerate malformed input and return NULL instead.
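
You can reproduce the failure with a minimal query built from the value in the error message, an illustrative sketch only; your job's actual query will differ.

-- On serverless compute, ANSI mode rejects this cast at runtime
-- and raises CAST_INVALID_INPUT.
SELECT CAST('100.0' AS BIGINT);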

Cause

On serverless compute, spark.sql.ansi.enabled is set to true by default, which enforces strict ANSI SQL compliance. On classic compute with Databricks Runtime, spark.sql.ansi.enabled is set to false by default. This difference in defaults causes the casting errors you see on serverless compute but not on classic compute.
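
You can verify the active setting on either compute type by running SET with the configuration key alone, which returns its current value:

-- Returns true on serverless compute and false on classic compute by default.
SET spark.sql.ansi.enabled;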

Solution

Set spark.sql.ansi.enabled to false before running queries on serverless compute. This disables strict ANSI SQL compliance.

  1. Run SET spark.sql.ansi.enabled = false; to disable strict ANSI SQL compliance for the session.
  2. Run your query again, as shown in the sketch below.
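
A minimal sketch of both steps in a single SQL session, using the cast from the error message above to stand in for your actual query:

-- Step 1: disable strict ANSI SQL compliance for this session.
SET spark.sql.ansi.enabled = false;

-- Step 2: rerun the failing query. With ANSI mode off, the cast no longer
-- raises CAST_INVALID_INPUT; malformed values yield NULL instead of an error.
SELECT CAST('100.0' AS BIGINT);

Alternatively, as the error message suggests, you can leave ANSI mode enabled and rewrite the cast as try_cast('100.0' AS BIGINT), which returns NULL for malformed input instead of failing.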