These release notes describe enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.
The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.
The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.
The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:
java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver
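For example, the following is a minimal sketch of retrieving the version through DatabaseMetaData.getDriverVersion(). The connection URL format, server name, port, data source name, and credentials shown here are placeholders assumed for illustration, not values taken from these notes.

import java.sql.Connection;
import java.sql.DatabaseMetaData;
import java.sql.DriverManager;

public class DriverVersionCheck {
    public static void main(String[] args) throws Exception {
        // Assumed/example URL: substitute your Hybrid Data Pipeline server,
        // port, and data source name.
        String url = "jdbc:datadirect:ddhybrid://myserver:8080;"
                   + "hybridDataPipelineDataSource=MyDataSource";
        try (Connection con = DriverManager.getConnection(url, "myuser", "mypassword")) {
            DatabaseMetaData md = con.getMetaData();
            // Prints the JDBC driver version string
            System.out.println("Driver version: " + md.getDriverVersion());
        }
    }
}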
For the ODBC driver, see Driver version string for details on obtaining the driver version number.
Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.
Idle Google BigQuery connections did not fully close, causing the Hybrid Data Pipeline server to reach the limit on open files.
After upgrading the On-Premises Connector, the connector received an HTTP 401 error from the Hybrid Data Pipeline server when attempting to connect. (On-Premises Connector 4.6.1.255)
When attempting to authenticate using SAML, Hybrid Data Pipeline returned the exception "Request header is too large."
After an account lockout occurred, OData queries continued to run successfully.
When using a third-party connector where the database credentials are not included in the Hybrid Data Pipeline data source, the user must supply credentials at connection time. In this scenario, Hybrid Data Pipeline returned the error message "user name is missing." (JDBC driver 4.6.1.77)
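As an illustration of this scenario, the sketch below shows the generic JDBC pattern of supplying the user name and password in a Properties object at connect time rather than storing them in the data source. The URL, data source name, and credential values are hypothetical; only the standard JDBC "user" and "password" property names are assumed here.

import java.sql.Connection;
import java.sql.DriverManager;
import java.util.Properties;

public class ThirdPartyConnectorLogin {
    public static void main(String[] args) throws Exception {
        // Hypothetical server and data source; the data source definition
        // does not store credentials, so they are passed in at connect time.
        String url = "jdbc:datadirect:ddhybrid://myserver:8080;"
                   + "hybridDataPipelineDataSource=ThirdPartyDS";
        Properties props = new Properties();
        props.setProperty("user", "myuser");        // supplied by the user at connect time
        props.setProperty("password", "mypassword");
        try (Connection con = DriverManager.getConnection(url, props)) {
            System.out.println("Connected: " + !con.isClosed());
        }
    }
}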
Hybrid Data Pipeline did not support Azure Database for PostgreSQL as an external database because Azure Database for PostgreSQL requires a unique user naming convention.
When attempting to query an Azure Synapse serverless instance, Hybrid Data Pipeline returned a java.io.IOException worker thread error.
See Hybrid Data Pipeline known issues for details.