DB2 JDBC timestamp read error in Databricks Runtime 15.4 LTS

Set spark.sql.legacy.jdbc.useNullCalendar to false and define an Apache Spark session timezone.

Written by Amruth Ashoka

Last published at: October 29th, 2025

Problem

While using Databricks Runtime 15.4 LTS, you attempt to read a DB2 table with the DB2 JDBC driver and display a column containing timestamps. You receive the following error.

SqlSyntaxErrorException: [x][x][x][x][4.31.10] Invalid parameter calendar: Parameter cannot be null. ERRORCODE=-4461, SQLSTATE=42815


Cause

As of Databricks Runtime 15.2, JDBC driver calls to the stmt.setTimestamp() and rs.getTimestamp() methods explicitly pass a Calendar object. (In previous Databricks Runtime versions these methods were called without a Calendar object.)


By default, the calendar parameter in stmt.setTimestamp() is set to null. This change affects certain JDBC drivers, such as the DB2 JDBC driver, which cannot handle a null calendar: when no Calendar object is supplied, the stmt.setTimestamp() call throws the error shown above.


Syntax before Databricks Runtime 15.2

  • stmt.setTimestamp(position, value)
  • rs.getTimestamp(position)


Syntax in Databricks Runtime 15.2 and above

  • stmt.setTimestamp(position, value, calendar)
  • rs.getTimestamp(position, calendar)

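To make the change concrete, the following minimal Scala sketch shows both call styles against a hypothetical DB2 connection. The connection URL, table, and column names are illustrative placeholders, not values from this article.

import java.sql.{DriverManager, Timestamp}
import java.util.{Calendar, TimeZone}

val conn = DriverManager.getConnection("jdbc:db2://<host>:<port>/<database>", "<user>", "<password>")
val stmt = conn.prepareStatement("SELECT event_ts FROM events WHERE event_ts > ?")

// Before Databricks Runtime 15.2: no Calendar argument.
// stmt.setTimestamp(1, Timestamp.valueOf("2024-01-01 00:00:00"))

// As of Databricks Runtime 15.2: a Calendar is passed explicitly. With the
// default (null) calendar, the DB2 driver rejects the call with ERRORCODE=-4461.
val cal = Calendar.getInstance(TimeZone.getTimeZone("UTC"))
stmt.setTimestamp(1, Timestamp.valueOf("2024-01-01 00:00:00"), cal)

val rs = stmt.executeQuery()
while (rs.next()) {
  println(rs.getTimestamp(1, cal)) // Calendar-aware getter
}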

Solution

1. In your compute settings, set the Apache Spark config spark.sql.legacy.jdbc.useNullCalendar to false so that Spark uses spark.sql.session.timeZone to construct the calendar instead.


For details on how to apply Spark configs, refer to the “Spark configuration” section of the Compute configuration reference (AWS | Azure | GCP) documentation.
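
If the config can also be set at session scope, a notebook-level Scala equivalent would look like this sketch (the cluster-level compute setting above is the documented route):

// Session-level equivalent of the cluster Spark config (sketch):
spark.conf.set("spark.sql.legacy.jdbc.useNullCalendar", "false")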


2. Set the Spark session timezone using the command SET TIME ZONE '<zone-id>'. Refer to the TIMEZONE (AWS | Azure | GCP) documentation for more information.
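
For example, from a Scala notebook cell, where 'America/Los_Angeles' is an illustrative zone ID rather than one this article requires:

// Set the Spark session timezone; substitute your own zone ID.
spark.sql("SET TIME ZONE 'America/Los_Angeles'")
// Equivalent session config:
spark.conf.set("spark.sql.session.timeZone", "America/Los_Angeles")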