==============================================================================
Databricks JDBC Driver Release Notes 
==============================================================================

The release notes provide details of enhancements, features, known issues, and
workflow changes in Databricks JDBC Driver 2.7.5, as well as the version 
history. 


2.7.5 ========================================================================

Released 2025-10-09

Enhancements & New Features

 * [SPARKJ-1137] Kerberos proxy auth support

   The connector now supports Kerberos authentication with proxy connections.
   To enable Kerberos proxy authentication, set UseProxy=1 and ProxyAuth=2.
   You can use the ProxyHost, ProxyPort, ProxyKrbRealm, ProxyKrbFQDN, and
   ProxyKrbService properties to set the proxy details. For more details, see 
   the Installation and Configuration Guide.


Resolved Issues 
The following issues have been resolved in Databricks JDBC Driver 2.7.5.

 * [SPARKJ-737][SPARKJ-997]  In some cases, the connector fails to run complex
   queries that contain ? in native mode.

 * [SPARKJ-1139] In some cases, UC Volume Ingestion intermittently fails due
   to unexpected behavior in the connector.

 * [SPARKJ-1135] An assertion check fails in getColumns for tables with 
   columns of type Void or Variant when the JVM runs with the -ea flag.

Known Issues 
The following are known issues that you may encounter due to limitations in 
the data source, the driver, or an application. 
  
 * [SPARKJ-573] Issue when deserializing Apache Arrow data on JVMs running 
   Java 11 or higher, due to compatibility issues. 
   
   As a workaround, if you encounter the "Error occurred while deserializing 
   arrow data: sun.misc.Unsafe or java.nio.DirectByteBuffer.<init>(long, int) 
   not available" error, add the following JVM option:
   
   --add-opens java.base/java.nio=ALL-UNNAMED

   For more information, see the Installation and Configuration Guide.
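   For example, the option can be passed directly on the java command line;
   the jar and class names here are hypothetical, only the --add-opens flag
   is the relevant part:

   ```shell
   # Open java.nio to the unnamed module so Arrow can use direct buffers.
   java --add-opens java.base/java.nio=ALL-UNNAMED \
        -cp DatabricksJDBC42.jar:app.jar com.example.App
   ```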

 * [SPARKJ-330] Issue with dates and timestamps before the beginning of the 
   Gregorian calendar when connecting to Spark versions 2.4.4 and later but 
   earlier than 3.0, with Arrow result set serialization.
 
   When using Spark versions 2.4.4 and later but earlier than Spark 3.0, DATE 
   and TIMESTAMP data before October 15, 1582 may be returned incorrectly if 
   the server supports serializing query results using Apache Arrow. This 
   issue should not impact most distributions of Apache Spark.

   To confirm if your distribution of Spark 2.4.4 or later has been impacted 
   by this issue, you can execute the following query:

   SELECT DATE '1581-10-14'

   If the result returned by the connector is 1581-10-24, then you are 
   impacted by the issue. In this case, if your data set contains date and/or
   timestamp data earlier than October 15, 1582, you can work around this 
   issue by adding EnableArrow=0 in your connection URL to disable the Arrow
   result set serialization feature. 
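   As a sketch, the workaround amounts to appending the property to an 
   existing connection URL (the workspace URL below is a placeholder):

   ```java
   public class DisableArrowSketch {
       // Append EnableArrow=0 to disable Arrow result set serialization.
       static String disableArrow(String url) {
           return url + ";EnableArrow=0";
       }

       public static void main(String[] args) {
           // Hypothetical workspace URL, for illustration only.
           System.out.println(disableArrow(
               "jdbc:databricks://example.cloud.databricks.com:443/default"));
       }
   }
   ```

   Running the diagnostic query over a connection built from this URL should
   then return the date as written.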

 * [SPARKJ-267] The JDBC 4.0 version of the connector fails to connect to 
   servers that require encryption using TLS 1.1 or later.

   When you attempt to connect to the server, the connection fails and the
   connector returns an SSL handshake exception. This issue occurs only when
   you run the connector using Java Runtime Environment (JRE) 6.0. 

   As a workaround, run the connector using JRE 7.0 or 8.0.

 * When retrieving data from a BINARY column, a ClassCastException error 
   occurs.

   In Spark 1.6.3 or earlier, the server sometimes returns a 
   ClassCastException error when attempting to retrieve data from a BINARY 
   column.

   This issue is fixed as of Spark 2.0.0.

   For more information, see the JIRA issue posted by Apache named "When
   column type is binary, select occurs ClassCastException in Beeline" at
   https://issues.apache.org/jira/browse/SPARK-12143.


Workflow Changes =============================================================

The following changes may disrupt established workflows for the connector. 


2.7.4 ------------------------------------------------------------------------

 * [SPARKJ-806] Change in default length for Binary type

   Beginning with this release, the default length of the Binary type 
   reported by getColumns has changed from 1 to 32767.


2.7.1 -----------------------------------------------------------------------

 * [SPARKJ-885] Renamed username and password authentication

   Beginning with this release, AuthMech 3 uses PAT (Personal Access Token) 
   authentication. Previously, it was known as username and password 
   authentication. The UID property only accepts the value 'token'. For more 
   information, see the Installation and Configuration Guide.
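   A minimal sketch of PAT authentication under this naming; the token value
   below is a placeholder, not a real token:

   ```java
   import java.util.Properties;

   public class PatAuthSketch {
       // Build properties for PAT authentication (AuthMech 3): UID must be the
       // literal string "token" and PWD carries the personal access token.
       static Properties patProperties(String personalAccessToken) {
           Properties p = new Properties();
           p.setProperty("AuthMech", "3");
           p.setProperty("UID", "token");
           p.setProperty("PWD", personalAccessToken);
           return p;
       }

       public static void main(String[] args) {
           // "dapi-placeholder" is not a real token.
           System.out.println(patProperties("dapi-placeholder"));
       }
   }
   ```

   The Properties object can then be passed to DriverManager.getConnection
   along with the connection URL.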


2.6.38 -----------------------------------------------------------------------

 * [SPARKJ-825] Updated access token information

   The access token information has been updated in the Installation and 
   Configuration Guide.


2.6.33 -----------------------------------------------------------------------

 * [SPARKJ-646] Removed support for Java 7.0

   Beginning with this release, the driver no longer supports Java 7.0. For 
   a list of supported JDBC versions, see the Installation and Configuration 
   Guide.


2.6.29 -----------------------------------------------------------------------

 * [SPARKJ-618] Renamed jar files

   Beginning with this release, the following files have been renamed:
   - SparkJDBC41.jar is now DatabricksJDBC41.jar
   - SparkJDBC42.jar is now DatabricksJDBC42.jar


2.6.21 -----------------------------------------------------------------------

 * [SPARKJ-534] Renamed connection properties

   Beginning with this release, the following connection properties have been 
   renamed:
   - ClusterAutostartRetry is now TemporarilyUnavailableRetry
   - ClusterAutostartRetryTimeout is now TemporarilyUnavailableRetryTimeout
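   A small sketch for migrating older connection URLs. The longer Timeout 
   name is replaced first so that the shorter name does not partially 
   overwrite it:

   ```java
   public class RetryPropertyMigration {
       // Rewrite the pre-2.6.21 retry property names in a JDBC URL. Replace
       // ClusterAutostartRetryTimeout before ClusterAutostartRetry, since the
       // latter is a prefix of the former.
       static String migrate(String url) {
           return url
               .replace("ClusterAutostartRetryTimeout",
                        "TemporarilyUnavailableRetryTimeout")
               .replace("ClusterAutostartRetry",
                        "TemporarilyUnavailableRetry");
       }

       public static void main(String[] args) {
           // Hypothetical workspace URL and timeout value, for illustration.
           System.out.println(migrate(
               "jdbc:databricks://example.cloud.databricks.com:443/default"
               + ";ClusterAutostartRetry=1;ClusterAutostartRetryTimeout=120"));
       }
   }
   ```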


2.6.20 -----------------------------------------------------------------------

 * [SPARKJ-474] Updated catalog support 

   When connecting to a server that supports multiple catalogs, the connector
   no longer reports the catalog for schemas and tables as SPARK. Instead, the
   catalog is the one reported by the Spark server. For more information, see
   the Installation and Configuration Guide.
   
   
2.6.19 -----------------------------------------------------------------------

 * [SPARKJ-483] Removed third-party libraries

   Beginning with this release, the connector no longer includes the ZooKeeper
   and Jute libraries in the JAR file. 


2.6.18 -----------------------------------------------------------------------

 * [SPARKJ-296][SPARKJ-297] Removed support for 2.1

   Beginning with this release, the connector no longer supports servers that
   run Spark version 2.1. For information about the supported Spark versions,
   see the Installation and Configuration Guide.

 * [SPARKJ-288][SPARKJ-289] Removed support for JDBC 4.0 (Java 6)

   Beginning with this release, the connector no longer supports JDBC 4.0 
   (Java 6). For a list of supported JDBC versions, see the Installation and
   Configuration Guide.


2.6.11 -----------------------------------------------------------------------

 * [SPARKJ-301] Removed support for Spark 1.5.2 and earlier, as well as 2.0

   Beginning with this release, the driver no longer supports servers that run
   any of the following Spark versions:
   - Versions 1.5.2 and earlier
   - Version 2.0

   For information about the supported Spark versions, see the Installation 
   and Configuration Guide.

 * [SPARKJ-296][SPARKJ-298] Deprecated support for Spark 1.6 and 2.1

   Beginning with this release, support for Spark versions 1.6 and 2.1 has
   been deprecated. For information about the supported Spark versions, 
   see the Installation and Configuration Guide.

 * [SPARKJ-288] Deprecated support for JDBC 4.0 (Java 6)
 
   Beginning with this release, support for JDBC 4.0 (Java 6) has been
   deprecated. Support will be removed in a future release. For a list of
   supported JDBC versions, see the Installation and Configuration Guide.


Version History ==============================================================

2.7.4 ------------------------------------------------------------------------

Released 2025-08-21

Enhancements & New Features

 * [SPARKJ-1118] Removed third-party dependency

   The connector no longer uses the Apache Commons Lang library.

 * [SPARKJ-806] Updated SEN SDK support
   
   The connector now uses SEN SDK version 10.3. Previously, the connector 
   used SEN SDK version 10.1.
 
 * [SPARKJ-1032][SPARKJ-1117] Updated third-party libraries 

   The connector has been upgraded with the following libraries:
   - log4j 2.24.3 (previously 2.20.0)
   - nimbus-jose-jwt 10.4.1 (previously 9.37.2)

 * [SPARKJ-1100] Updated JRE support
   
   The connector now supports Java 21.0. For a list of supported JRE versions,
   see the Installation and Configuration Guide.

 * [SPARKJ-1107] Telemetry support

   The connector now supports the Telemetry feature. To enable this feature, 
   set the EnableTelemetry property to 1. For more information, see the 
   Installation and Configuration Guide.

 * [SPARKJ-638] CaseInsensitiveMetadataFilter support

   The connector now supports controlling case sensitivity in metadata API 
   calls. To do this, set the CaseInsensitiveMetadataFilter property to 0 or 
   1 in the connection string. For more information, see the Installation and 
   Configuration Guide.


Resolved Issues 
The following issues have been resolved in Databricks JDBC Driver 2.7.4.

 * [SPARKJ-902] The getTimestamp function does not parse certain Julian 
   timestamps generated by DBR14.

 * [SPARKJ-998] The connector error message for system truststore is incorrect.

 * [SPARKJ-1020] When a proxy is used, the DNS lookup returns an error.

 * [SPARKJ-1032] The connector does not display error messages related to
   issues in Arrow initialization as expected.

 * [SPARKJ-1067] In some cases, when using a parameterized insert query, the
   connector rounds off decimal values automatically.

 * [SPARKJ-1096] If a prepared statement is created and not executed, the  
   connector does not close the session when the prepared statement is closed.

 * [SPARKJ-1099] When the multi-catalog feature is turned off
   (enableMultipleCatalog=0), Metadata API calls with a catalog filter do not
   return any results.


2.7.3 ------------------------------------------------------------------------

Released 2025-04-23

Enhancements & New Features

 * [SPARKJ-687] OAuth 2.0 Azure Managed Identity authentication support

   The connector now supports Azure Managed Identity OAuth 2.0 authentication. 
   To do this, set the Auth_Flow property to 3. For more information, see the 
   Installation and Configuration Guide.

 * [SPARKJ-958] VOID data type support

   The connector now supports the Void data type in getColumns() and 
   getTypeInfo() API calls. For more details, see:
   https://docs.databricks.com/aws/en/sql/language-manual/data-types/null-type

 * [SPARKJ-995] Variant data type support

   The connector now supports the Variant data type in getColumns() and 
   getTypeInfo() API calls. For more details, see:
   https://docs.databricks.com/en/sql/language-manual/data-types/variant-type.html

 * [SPARKJ-996] OAuth Token exchange support

   The connector now supports the OAuth token exchange feature for identity 
   providers that differ from the host. In these cases, the OAuth access 
   token (including BYOT) is exchanged for a Databricks in-house access 
   token. For more information, see the Installation and Configuration Guide.

 * [SPARKJ-1002] Token cache support

   The OAuth Browser (Auth_flow=2) now offers token caching support for Linux
   and Mac operating systems.

 * [SPARKJ-1014] Updated Netty libraries 

   The connector has been upgraded with the following netty libraries:
   - netty-buffer 4.1.119 (previously 4.1.115)
   - netty-common 4.1.119 (previously 4.1.115)

 * [SPARKJ-1052] Unknown types handling support

   The connector now lists columns with unknown or unsupported types and maps
   them to SQL VARCHAR in the GetColumns() metadata API call.

 * [SPARKJ-1061] Databricks domains support

   The connector now supports cloud.databricks.us and cloud.databricks.mil 
   domains when connecting to Databricks using OAuth (AuthMech 11).

 * [SPARKJ-875] TIMESTAMP_NTZ data type support

   The connector now supports the TIMESTAMP_NTZ data type. For more details, 
   see: https://docs.databricks.com/aws/en/sql/language-manual/data-types/timestamp-ntz-type


Resolved Issues 
The following issues have been resolved in Databricks JDBC Driver 2.7.3.

 * [SPARKJ-926] The useCustomTimestampConverter connection property does not
   get passed to the server as an SSP property anymore.

 * [SPARKJ-1005] Corrected statement about HTTP proxy support in the 
   Installation and Configuration Guide.


2.7.1 ------------------------------------------------------------------------

Released 2024-12-05

Enhancements & New Features

 * [SPARKJ-807] Upgraded IP range support
  
   The OAuthEnabledIPAddressRanges setting now allows overriding the OAuth 
   private link. For more information, see the Installation and Configuration
   Guide.

 * [SPARKJ-942] Refresh token support

   Refresh token support is now available. This enables the driver to 
   automatically refresh authentication tokens using the Auth_RefreshToken 
   property. For more information, see the Installation and Configuration 
   Guide.

 * [SPARKJ-952] UseSystemTrustStore support

   The connector now supports the use of the system's trust store with the 
   new UseSystemTrustStore property. When enabled (UseSystemTrustStore=1), 
   the driver verifies connections using certificates from the system's 
   trust store.

 * [SPARKJ-952] UseServerSSLConfigsForOAuthEndPoint support

   The connector now supports the UseServerSSLConfigsForOAuthEndPoint 
   property. When enabled, clients share the driver's SSL configuration for 
   the OAuth endpoint.

 * [SPARKJ-951] Updated third-party libraries

   The connector has been upgraded with the following third-party libraries:
   - arrow-memory-core 17.0.0 (previously 14.0.2)
   - arrow-vector 17.0.0 (previously 14.0.2)
   - arrow-format 17.0.0 (previously 14.0.2)
   - arrow-memory-netty 17.0.0 (previously 14.0.2)
   - arrow-memory-unsafe 17.0.0 (previously 14.0.2)
   - commons-codec 1.17.0 (previously 1.15)
   - flatbuffers-java 24.3.25 (previously 23.5.26)
   - jackson-annotations 2.17.1 (previously 2.16.0)
   - jackson-core 2.17.1 (previously 2.16.0)
   - jackson-databind 2.17.1 (previously 2.16.0)
   - jackson-datatype-jsr310 2.17.1 (previously 2.16.0)
   - netty-buffer 4.1.115 (previously 4.1.100)
   - netty-common 4.1.115 (previously 4.1.100)
   

Resolved Issues
The following issues have been resolved in Databricks JDBC Driver 2.7.1.

 * [SPARKJ-642] When using the IBM JRE and the Arrow result set serialization 
   feature, the connector now handles Unicode characters correctly.

 * [SPARKJ-643] Complete error messages and causes for error code 401 are now
   returned.

 * [SPARKJ-713] Heartbeat threads no longer leak when connections are created
   using the DataSource class.

 * [SPARKJ-749] A change in cloud fetch request list has been applied to 
   manage memory usage better.

 * [SPARKJ-894] The translation issue with coalesce in the group by has been
   resolved.

 * [SPARKJ-940] A potential OAuth2Secret leak in the driver log has been 
   resolved.

 * [SPARKJ-971] Driver logs now contain query ID.

 * [SPARKJ-949] Tag mismatch error in OAuth U2M authentication (Auth_Flow=2)
   has been fixed.


2.6.40 -----------------------------------------------------------------------

Released 2024-08-16

Resolved Issues
The following issues have been resolved in Databricks JDBC Driver 2.6.40.

 * [SPARKJ-911] Security improvement (CVE-2024-49194)

 * [SPARKJ-892] Log improvement. Reduced the 'WARNING: Invalid cookie header'
   log noise.

 * [SPARKJ-933] Resolved issue with OAuth authorization URL check.


2.6.39 -----------------------------------------------------------------------

Released 2024-07-25

Enhancements & New Features
 
 * [SPARKJ-718] OIDC discovery endpoint support

   The connector can now use the OIDC discovery endpoint to fetch the token 
   and authorization endpoints. For more information, see the Installation 
   and Configuration Guide.

 * [SPARKJ-816] Updated Arrow support

   The connector now uses Apache Arrow version 14.0.2. Previously, the 
   connector used Apache Arrow version 9.0.0.
   
 * [SPARKJ-817] ProxyIgnoreList support
   
   The connector now supports the ProxyIgnoreList property when UseProxy is 
   set to 1. For more information, see the Installation and Configuration 
   Guide.

 * [SPARKJ-829] Refresh token support
   
   The connector now supports the optional refresh token. It saves the access
   token and reuses it for new connections as long as it is valid. If the 
   connector cannot renew the access token using the refresh token, it will 
   sign in again. For more information, see the Installation and Configuration
   Guide.

 * [SPARKJ-836] Query type support

   The connector now adds the query type to the query profile to help with 
   query type identification.

 * [SPARKJ-870] OAuth Logging improvements

   Enhancements to OAuth logging make resolving OAuth issues easier. 

 * [SPARKJ-877] Updated authentication support

   The connector now supports Browser Based (U2M) and Client Credentials (M2M)
   authentication on GCP cloud. 
   
   On Azure and GCP these are the new default values:
   - OAuth2ConnAuthAuthorizeEndpoint: $(host) +/oidc/oauth2/v2.0/authorize
   - OAuth2ConnAuthTokenEndpoint: $(host) +/oidc/oauth2/v2.0/token
   - OAuth2ClientId: databricks-sql-jdbc
   - OAuth2ConnAuthAuthscopeKey: sql offline_access
   
   On Azure, if the application specifies a client ID that is different from 
   the default value, the default scope is:
   - For U2M: 2ff814a6-3304-4ab8-85cb-cd0e6f879c1d/user_impersonation,
     offline_access
   - For M2M: 2ff814a6-3304-4ab8-85cb-cd0e6f879c1d/.default

   You can now configure the OAuth redirect port. To do this, set the 
   OAuth2RedirectUrlPort property to your port. For more information, see the 
   Installation and Configuration Guide.


Resolved Issues
The following issue has been resolved in Databricks JDBC Driver 2.6.39.

 * [SPARKJ-878] When a report with excessive log entries is generated, the
   connector returns a 'StatusLogger Unrecognized' error.


2.6.38 -----------------------------------------------------------------------

Released 2024-04-05

Resolved Issues
The following issues have been resolved in Databricks JDBC Driver 2.6.38.

 * [SPARKJ-824] In parameterized queries, decimal data type values are 
   truncated.


2.6.37 -----------------------------------------------------------------------

Released 2024-03-05

Enhancements & New Features
 
 * [SPARKJ-707] QueryProfile support

   IHadoopStatement now supports a QueryProfile object that provides 
   getQueryIds to retrieve a list of query IDs.
 
 * [SPARKJ-708] ASync bit support

   The connector now supports async operations for metadata Thrift calls if 
   the server uses SPARK_CLI_SERVICE_PROTOCOL_V9. To enable this, set the 
   EnableAsyncModeForMetadataOperation property to 1. For more information,
   see the Installation and Configuration Guide.

 * [SPARKJ-716] Parameterized Query support

   The connector now supports parameterized queries in native mode if the 
   server uses SPARK_CLI_SERVICE_PROTOCOL_V8.

 * [SPARKJ-733] JWT assertion support

   The connector now supports JWT assertion OAuth using client credentials. To
   do this, set the UseJWTAssertion property to 1. For more information, see 
   the Installation and Configuration Guide.

 * [SPARKJ-739] UC Volume ingestion support

   The connector now supports UC Volume ingestion commands. To do this, set 
   the UseNativeQuery property to 1. For more information, see the 
   Installation and Configuration Guide.

 * [SPARKJ-748] Updated Jackson libraries

   The connector now uses the following libraries for the Jackson JSON parser:
   - jackson-annotations 2.16.0 (previously 2.15.2)
   - jackson-core 2.16.0 (previously 2.15.2)
   - jackson-databind 2.16.0 (previously 2.15.2)


Resolved Issues
The following issues have been resolved in Databricks JDBC Driver 2.6.37.

 * [SPARKJ-714] The connector contains unshaded class files in the META-INF 
   directory. 

 * [SPARKJ-726] The connector returns an assert error in the getMessage() 
   method if the JVM enables assertions.


2.6.36 -----------------------------------------------------------------------

Released 2023-11-09

Enhancements & New Features

 * [SPARKJ-720] Token cache support

   The connector now supports disabling the refresh token cache. To do this, 
   set the EnableTokenCache property to 0. The TokenCachePassPhrase property
   must be set to a passphrase when using the token cache. For more 
   information, see the Installation and Configuration Guide.

Resolved Issues
The following issues have been resolved in Databricks JDBC Driver 2.6.36.

 * [SPARKJ-720] The driver had issues with the OAuth token cache expiring. 
   
 * [SPARKJ-724] A 'package org.apache.commons.lang does not exist' error 
   occurs.
 
 * [SPARKJ-725] The driver throws an exception when using OAuth if the HOST 
   string is fewer than 20 characters long.


2.6.35 -----------------------------------------------------------------------

Released 2023-09-19

Enhancements & New Features

 * [SPARKJ-634] OAuth 2.0 M2M based authentication support

   The connector now supports M2M based OAuth 2.0 authentication. To do this, 
   set the Auth_Flow property to 1. For more information, see the Installation
   and Configuration Guide.
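   A minimal sketch of the client-credentials setup, assuming AuthMech 11 
   selects OAuth as described elsewhere in these notes; the client ID and 
   secret values are placeholders:

   ```java
   import java.util.Properties;

   public class OAuthM2MSketch {
       // Properties for M2M OAuth 2.0: AuthMech 11 selects OAuth and
       // Auth_Flow 1 selects the client-credentials (M2M) flow.
       static Properties m2mProperties(String clientId, String clientSecret) {
           Properties p = new Properties();
           p.setProperty("AuthMech", "11");
           p.setProperty("Auth_Flow", "1");
           p.setProperty("OAuth2ClientId", clientId);   // placeholder value below
           p.setProperty("OAuth2Secret", clientSecret); // placeholder value below
           return p;
       }

       public static void main(String[] args) {
           System.out.println(m2mProperties("my-client-id", "my-client-secret"));
       }
   }
   ```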

 * [SPARKJ-640] Server-side encryption support

   The connector now supports server-side encryption with user-provided keys.

 * [SPARKJ-634] OAuth 2.0 browser based authentication support

   The connector now supports browser based OAuth 2.0 authentication. To do 
   this, set the Auth_Flow property to 2. For more information, see the 
   Installation and Configuration Guide.

 * [SPARKJ-634] OAuthWebServerTimeout support

   The connector now waits for the browser response during OAuth 2.0 
   authentication before timing out. For more information, see the 
   Installation and Configuration Guide. 

 * [SPARKJ-703] Improved security feature

   The connector has been updated with improved security features for 
   connection properties to prevent SSRF attacks.
 

Resolved Issues
The following issue has been resolved in Databricks JDBC Driver 2.6.35.

 * [SPARKJ-688] The connector turns on the socket timeout by default for HTTP
   connections and provides default values.


2.6.34 -----------------------------------------------------------------------

Released 2023-06-30

Enhancements & New Features

 * [SPARKJ-661][SPARKJ-693] Updated third-party library

   The connector has been upgraded with the following third-party libraries:
   - Apache Arrow 9.0.0 (previously 7.0.0)
   - Apache HttpClient 4.5.14 (previously 4.5.13)
   - Apache HttpCore 4.4.16 (previously 4.4.14)
   - Byte Buddy 1.14.5 (previously 1.14.0) 
   - flatbuffers 23.5.26 (previously 1.12.0) 
   - Google Guava 32.0.1 (previously 31.1)
   - jackson-annotations 2.15.2 (previously 2.13.4)
   - jackson-core 2.15.2 (previously 2.13.4)
   - jackson-databind 2.15.2 (previously 2.13.4.2)
   - log4j-api 2.20.0 (previously 2.17.1)
   - log4j-core 2.20.0 (previously 2.17.1)
   - log4j-slf4j-impl 2.20.0 (previously 2.17.1)
   - lz4 1.8.0 (previously 1.7.1)
   - netty-buffer 4.1.94.Final (previously 4.1.82.Final)
   - netty-common 4.1.94.Final (previously 4.1.82.Final)
   - slf4j 1.7.36 (previously 1.7.30)
   - thrift 0.17.0 (previously 0.13.0)
   
 * [SPARKJ-680] SonarCloud scan feature

   You can now configure the SonarCloud scan feature for the connector, which 
   identifies and logs potential security issues in the source code.


Resolved Issues
The following issues have been resolved in Databricks JDBC Connector 2.6.34.

 * [SPARKJ-622] The REMARKS column of the tables metadata does not populate 
   with comments. 

 * [SPARKJ-655] When a query fails to connect to the server, the connector 
   does not clean up the unused threads.

 * [SPARKJ-666] The connector shows the SQLState and the error messages
   incorrectly.

 * [SPARKJ-667] When a resultset closure operation returns an error, the 
   connector does not clean up the operation handle entries from the heartbeat
   thread.

 * [SPARKJ-676] The connector checks the server protocol version incorrectly.


============================================================================== 
