Hybrid Data Pipeline Release Notes History

4.6.2.3430

November 17, 2025

Preface

These release notes provide enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

For On-Premises Connectors, administrators can obtain version information using the Administrator Connectors API. See Obtaining information about On-Premises Connectors for details. In addition, version information can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver
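
For option (1), the following minimal sketch shows the DatabaseMetaData.getDriverVersion() call over an open connection; the connection URL, data source name, and credentials are placeholders to be replaced with your own values:

import java.sql.Connection;
import java.sql.DatabaseMetaData;
import java.sql.DriverManager;

public class DriverVersionCheck {
    public static void main(String[] args) throws Exception {
        // Placeholder URL: substitute your server host, port, and data source name.
        String url = "jdbc:datadirect:ddhybrid://myserver:8080;hybridDataPipelineDataSource=MyDataSource";
        try (Connection con = DriverManager.getConnection(url, "myuser", "mypassword")) {
            DatabaseMetaData md = con.getMetaData();
            System.out.println("JDBC driver version: " + md.getDriverVersion());
        }
    }
}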

For the ODBC driver, see Driver version string for details on obtaining the driver version number.

Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

Changed Behavior

SpyAttributes update

The JDBC connection property SpyAttributes has been updated to remove support for the load=classname attribute, which was previously used to load the driver specified by the given class name. In addition, if the log file name specified in SpyAttributes does not include the .log extension, the driver now automatically appends it. (JDBC driver 4.6.2.1023)
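
The sketch below illustrates the updated behavior, assuming the standard DataDirect Spy attribute syntax; the URL, credentials, and log path are placeholders:

import java.sql.Connection;
import java.sql.DriverManager;
import java.util.Properties;

public class SpyAttributesExample {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.setProperty("user", "myuser");
        props.setProperty("password", "mypassword");
        // The load=classname attribute is no longer accepted. Because the file
        // name below omits the .log extension, the driver appends it (spy.log).
        props.setProperty("SpyAttributes", "(log=(file)/tmp/spy;timestamp=yes)");

        String url = "jdbc:datadirect:ddhybrid://myserver:8080;hybridDataPipelineDataSource=MyDataSource";
        try (Connection con = DriverManager.getConnection(url, props)) {
            System.out.println("Connected with Spy logging enabled");
        }
    }
}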

Resolved Issues

HDP-13329 Upgrade from 4.6.2.2978 to 4.6.2.3275 failed due to index name mismatch

When upgrading from Hybrid Data Pipeline version 4.6.2.2978 to version 4.6.2.3275, deployment failed because the schema update script assumed a specific index name that did not match the existing one.

HDP-13095 The validateservercertificate=false setting in the JDBC driver is not honored when using Java 17

When connecting to Hybrid Data Pipeline via JDBC with SSL enabled on Java 17, the validateservercertificate=false property was ignored, causing the SSL connection to fail due to server certificate validation. (JDBC driver 4.6.2.1023)
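
For reference, a minimal sketch of a connection that relies on this property; the URL and credentials are placeholders, and disabling certificate validation is intended for test environments only:

import java.sql.Connection;
import java.sql.DriverManager;
import java.util.Properties;

public class SslNoValidationExample {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.setProperty("user", "myuser");
        props.setProperty("password", "mypassword");
        props.setProperty("encryptionMethod", "SSL");
        // With the fix, this setting is honored on Java 17 and the server
        // certificate is no longer validated during the SSL handshake.
        props.setProperty("validateServerCertificate", "false");

        String url = "jdbc:datadirect:ddhybrid://myserver:8443;hybridDataPipelineDataSource=MyDataSource";
        try (Connection con = DriverManager.getConnection(url, props)) {
            System.out.println("SSL connection established");
        }
    }
}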

Known Issues

See Hybrid Data Pipeline known issues for details.

 

4.6.2.3316

September 17, 2025

Preface

These release notes provide enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

For On-Premises Connectors, administrators can obtain version information using the Administrator Connectors API. See Obtaining information about On-Premises Connectors for details. In addition, version information can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver

For the ODBC driver, see Driver version string for details on obtaining the driver version number.

Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

Resolved Issues

HDP-13315 StackOverflowError when connecting to Azure Synapse

When connecting to an Azure Synapse serverless pool, a stack overflow error was returned. (On-Premises Connector 4.6.2.1255)

Known Issues

See Hybrid Data Pipeline known issues for details.

 

4.6.2.3309

August 11, 2025

Preface

These release notes provide enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

For On-Premises Connectors, administrators can obtain version information using the Administrator Connectors API. See Obtaining information about On-Premises Connectors for details. In addition, version information can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver

For the ODBC driver, see Driver version string for details on obtaining the driver version number.

Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

Enhancement

Tomcat upgrade

The Hybrid Data Pipeline server and On-Premises Connector have been upgraded to install and use Tomcat 9.0.107 to address the third-party vulnerability issue described in CVE-2025-52520. (On-Premises Connector 4.6.2.1223)

Resolved Issues

HDP-12920 OData handles the data length of strings containing emojis incorrectly

A mismatch between PostgreSQL and OData in how the lengths of strings containing 4-byte characters, such as emojis, are calculated caused schema validation errors during data retrieval.
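
As a minimal illustration of how such length mismatches can arise (an assumption about the mechanism, not the product code): an emoji is a single character to PostgreSQL but two UTF-16 code units and four UTF-8 bytes elsewhere:

import java.nio.charset.StandardCharsets;

public class EmojiLengthDemo {
    public static void main(String[] args) {
        String s = "hi\uD83D\uDE00"; // "hi" followed by the grinning-face emoji (U+1F600)
        System.out.println(s.length());                                 // 4 UTF-16 code units
        System.out.println(s.codePointCount(0, s.length()));            // 3 characters (code points)
        System.out.println(s.getBytes(StandardCharsets.UTF_8).length);  // 6 UTF-8 bytes
    }
}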

HDP-12382 Server accepts client credentials from both HTTP headers and request parameters during OAuth handshake

The Hybrid Data Pipeline Server accepts OAuth client credentials from both HTTP headers and request parameters, which could allow attackers to combine credentials from different sources and potentially impersonate clients, leading to unauthorized access.
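
For context, a sketch of a client-credentials token request that supplies credentials from a single source, the Authorization header, which is the pattern the hardened handshake expects; the token endpoint path, client ID, and secret are placeholders:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class TokenRequestExample {
    public static void main(String[] args) throws Exception {
        // Placeholder token endpoint; use the URL documented for your deployment.
        String tokenEndpoint = "https://myserver:8443/oauth2/token";
        String basic = Base64.getEncoder()
                .encodeToString("myClientId:myClientSecret".getBytes(StandardCharsets.UTF_8));

        // Client credentials are sent once, in the Authorization header only,
        // and are not repeated in the request body or query string.
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(tokenEndpoint))
                .header("Authorization", "Basic " + basic)
                .header("Content-Type", "application/x-www-form-urlencoded")
                .POST(HttpRequest.BodyPublishers.ofString("grant_type=client_credentials"))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode());
        System.out.println(response.body());
    }
}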

Known Issues

See Hybrid Data Pipeline known issues for details.

 

4.6.2.3275

July 11, 2025

Preface

These release notes provide enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

For On-Premises Connectors, administrators can obtain version information using the Administrator Connectors API. See Obtaining information about On-Premises Connectors for details. In addition, version information can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver

For the ODBC driver, see Driver version string for details on obtaining the driver version number.

Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

Enhancements

Hybrid Data Pipeline Helm chart

A Hybrid Data Pipeline Helm chart is now available. The Hybrid Data Pipeline Helm chart may be used to deploy Hybrid Data Pipeline to an Azure Kubernetes Service (AKS) cluster with the Application Gateway Ingress Controller (AGIC). The Helm chart bootstraps Hybrid Data Pipeline to a Kubernetes cluster using the Helm package manager. For details, refer to the Hybrid Data Pipeline Kubernetes Guide and the Hybrid Data Pipeline Helm chart GitHub repository.

Tomcat upgrade

The Hybrid Data Pipeline server and On-Premises Connector have been upgraded to install and use Tomcat 9.0.106 to address third-party vulnerability issues described in the following Common Vulnerabilities and Exposures (CVEs):

(On-Premises Connector 4.6.2.1204)

BeanUtils and ESAPI libraries upgrade

The Hybrid Data Pipeline server has been upgraded to install and use commons-beanutils v1.11.0 and ESAPI v2.6.2.0 to address third-party vulnerability issues described in CVE-2025-48734.

FileUpload component upgrade

The Hybrid Data Pipeline server has been upgraded to install and use commons-fileupload v1.6 to address third-party vulnerability issues described in CVE-2025-48976.

Known Issues

See Hybrid Data Pipeline known issues for details.

 

4.6.2.3226

May 06, 2025

Preface

These release notes provide enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

For On-Premises Connectors, administrators can obtain version information using the Administrator Connectors API. See Obtaining information about On-Premises Connectors for details. In addition, version information can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver

For the ODBC driver, see Driver version string for details on obtaining the driver version number.

Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

Enhancements

Azure SQL system database failover

Failover is supported when Azure SQL Database is used as a system database. The primary database can be replicated as a failover database in another Azure region or to another logical server. For more information, refer to Azure SQL system database failover.

Tomcat upgrade

The Hybrid Data Pipeline server and On-Premises Connector have been upgraded to install and use Tomcat 9.0.102 to address third-party vulnerability issues described in CVE-2025-24813. (On-Premises Connector 4.6.2.1185)

Resolved Issues

HDP-12248 User validation fails when connecting to a PostgreSQL external system database

During the installation process, user validation failed when attempting to connect to a PostgreSQL external system database with SSL enabled. As a result, the installation failed, and a Hybrid Data Pipeline directory was not created.

HDP-11941 Tomcat and hdpui logs are not deleted per the log retention policy

Tomcat and hdpui logs were not deleted per the value set in the system-level configuration option LogRetentionDays in Log Management.

Known Issues

See Hybrid Data Pipeline known issues for details.

 

4.6.2.3113

February 18, 2025

Preface

These release notes provide enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

For On-Premises Connectors, administrators can obtain version information using the Administrator Connectors API. See Obtaining information about On-Premises Connectors for details. In addition, version information can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver

For the ODBC driver, see Driver version string for details on obtaining the driver version number.

Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

Enhancements

Snowflake key-pair authentication

Key-pair authentication is now supported for Snowflake data sources. For more information, refer to Snowflake parameters.

PostgreSQL external system database support for custom schema

When using PostgreSQL as a system database, a custom schema may be specified with the PostgreSQL search_path parameter. Refer to External system databases for details.

Docker image non-root user deployment

The Docker image has been enhanced to deploy Hybrid Data Pipeline containers as a non-root user when running Docker on a Linux host. The image now deploys and runs the container with a dedicated hdpuser account. For more information, refer to Docker deployment steps and Creating the shared file location on Linux.

Enhanced Security with Content Security Policy

A Content Security Policy (CSP) nonce has been added to mitigate security risks associated with inline scripting. By adding a nonce to the CSP header and to the inline scripts, only trusted scripts with the matching nonce are allowed to execute. This prevents unauthorized scripts from running.

Tomcat upgrade

The Hybrid Data Pipeline server and On-Premises Connector have been upgraded to install and use Tomcat 9.0.98. (On-Premises Connector version 4.6.2.1132)

Changed Behavior

On-Premises Connector Java requirements

For On-Premises Connector versions 4.6.2.1132 and later, a JRE is no longer bundled with the installer. Therefore, for successful installation and operation of the On-Premises Connector, a JRE must be installed on the host machine, and a JVM must be defined on the system path. Refer to JRE support and integration for details. (On-Premises Connector version 4.6.2.1132)

JDBC driver Java requirements

For JDBC driver versions 4.6.2.436 and later, a JRE is no longer bundled with the JDBC driver installer. Therefore, for successful installation and operation of the JDBC driver, a JRE must be installed on the host machine, and a JVM must be defined on the system path. Refer to JRE support and integration for details. (JDBC driver version 4.6.2.436)

ODBC driver Java requirements (installation only)

For ODBC driver versions 4.6.2.356 and later, a JRE is no longer bundled with the ODBC driver installer. For successful installation, a JRE must be installed on your operating system, and the JVM must be defined on your system path. Refer to JRE support and integration for details. (ODBC driver version 4.6.2.356)

Third-party driver requirements

The installation package no longer includes the json-smart and accessor-smart jar files. However, some third-party JDBC drivers, such as the Microsoft JDBC Driver for SQL Server, require these components. For these drivers, the json-smart and accessor-smart jar files must be copied to the drivers folder with the driver jar file.

 

Resolved Issues

HDP-11478 Caching Web UI credentials is allowed

The Web UI allowed the caching of account credentials through browser auto-completion.

HDP-11748 OData schema map not refreshing in the Web UI for third-party driver

When using a JDBC third-party connector, the OData schema map could not be refreshed using the Web UI.

HDP-11785 Database processes persist in an idle state for days after the OData session should be closed

Connections from an OData data source to a database server persisted in an idle state when they should have been closed.

HDP-11786 Docker deployment does not allow the license key to be set by an environment variable

When deploying Hybrid Data Pipeline as a Docker container, the license key could not be set using an environment variable. Hybrid Data Pipeline was deployed with the docker run command, but the update.properties file indicated that the instance was unlicensed.

Known Issues

See Hybrid Data Pipeline known issues for details.

 

4.6.1.2566

May 9, 2024

Preface

These release notes provide enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

For On-Premises Connectors, administrators can obtain version information using the Administrator Connectors API. See Obtaining information about On-Premises Connectors for details. In addition, version information can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver

For the ODBC driver, see Driver version string for details on obtaining the driver version number.

Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

Resolved Issues

HDP-9319 Manage Configuration page did not load after upgrade to build 4.6.1.2529

After upgrading to build 4.6.1.2529 of the Hybrid Data Pipeline server, the Manage Configuration page in the Web UI did not load.

Known Issues

See Hybrid Data Pipeline known issues for details.

 

4.6.2.2978

November 8, 2024

Preface

These release notes provide enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

For On-Premises Connectors, administrators can obtain version information using the Administrator Connectors API. See Obtaining information about On-Premises Connectors for details. In addition, version information can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver

For the ODBC driver, see Driver version string for details on obtaining the driver version number.

Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

Enhancements

Log management capabilities

Log management capabilities have been enhanced. Administrators may now specify a centralized location for Hybrid Data Pipeline logs. In addition, administrators may set logging levels for system services, including the web UI, the data access service, the notification server, and the Apache Tomcat server. For details, refer to Log management.

OData Version 4 Long Types

For OData Version 4, Hybrid Data Pipeline now supports long binary and long character types up to 1 MB. Supported long binary types include BLOB and LONGVARBINARY. Supported long character types include CLOB, LONGNVARCHAR, LONGVARCHAR, and NCLOB. Column sizes for long binary types may be managed with the limits ODataBinaryColumnSizeLimit and ODataIncludeBinaryLongData. Column sizes for long character types may be managed with the limits ODataCharacterColumnSizeLimit and ODataIncludeCharacterLongData. Refer to the following documentation resources for details: Entity Data Model (EDM) types for OData Version 4, Manage Limits view, and Limits API.

Tomcat upgrade

The Hybrid Data Pipeline server and On-Premises Connector have been upgraded to install and use Tomcat 9.0.95. (On-Premises Connector version 4.6.2.1046)

ODBC driver ICU library upgrade (74.1) for Linux

For Linux, the ICU library files that are installed with the driver have been upgraded to version 74.1. In addition, the ICU library file names have changed. For the 32-bit driver, the ICU file name has changed from libivicu28.so to libivicu.so. For the 64-bit driver, the ICU file name has changed from libddicu28.so to libddicu.so. (ODBC driver version 4.6.2.340, November 21, 2024)

ODBC driver ICU library upgrade (74.1) for Windows

For Windows, the ICU library files that are installed with the driver have been upgraded to version 74.1. As a part of this upgrade, the ICU library file names have changed. For the 32-bit driver, the ICU file name has changed from ivicu28.dll to ivicu.dll. For the 64-bit driver, the ICU file name has changed from ddicu28.dll to ddicu.dll. (ODBC driver version 4.6.2.340, November 21, 2024)

Changed Behavior

IP address whitelists

Hybrid Data Pipeline server behavior with respect to the handling of client and proxy IP addresses has been modified to mitigate IP spoofing and enhance security. When deploying or upgrading to Hybrid Data Pipeline 4.6.2.2978 or later, you may be required to run a silent installation and configure additional settings to use or continue to use IP address whitelists. Refer to IP address whitelist deployment configurations for details.

Google Analytics support

Google has ended support for Universal Analytics (also referred to as Google Analytics 3). Therefore, Google Analytics 3 support has been removed from Hybrid Data Pipeline. Google Analytics 4 support was added to Hybrid Data Pipeline with the 4.6.1.1854 release of the server. Google Analytics 4 continues to be supported and maintained as a Hybrid Data Pipeline data store. Refer to Google Analytics 4 parameters for details. (On-Premises Connector version 4.6.2.1046)

ODBC driver Windows runtime version upgrade

The driver is now compiled with an upgraded compiler for Windows platforms. As a result, you must have Microsoft Visual C/C++ runtime version 14.40.33810 or higher on your machine to run the driver. (ODBC driver version 4.6.2.340, November 21, 2024)

Resolved Issues

Issue HDP-9081 Error "ORA-03137: malformed TTC packet from client rejected" returned when query ended with semicolon

When executing a SQL query that ends with a semicolon against an Oracle data source, the error "ORA-03137: malformed TTC packet from client rejected" was returned. (JDBC driver version 4.6.2.403)

Issue HDP-10028 OData schema map does not refresh for third-party JDBC connectors

When using a JDBC third-party connector, the OData schema map could not be refreshed using the Web UI.

Issue HDP-10898 Resource leaks resulted in "Too many open files" exception and caused the server to fail

Resource leaks occurred with SSL connections to OpenEdge, MySQL, Sybase, Oracle Service Cloud, and Db2 data sources. These leaks resulted in a "Too many open files" exception and caused the server to fail.

Issue HDP-11034 SAP S/4HANA data source missing Connector ID parameter for On-Premises Connector

On the data source page for the SAP S/4HANA data store, the Connector ID parameter for the On-Premises Connector was missing.

Issue HDP-11081 SAP S/4HANA data source missing the "Extended Options" parameter

On the Advanced tab of the SAP S/4HANA data store page, the "Extended Options" parameter was not exposed.

Known Issues

See Hybrid Data Pipeline known issues for details.

 

4.6.1.2529

April 18, 2024

Preface

These release notes provide enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

For On-Premises Connectors, administrators can obtain version information using the Administrator Connectors API. See Obtaining information about On-Premises Connectors for details. In addition, version information can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver

For the ODBC driver, see Driver version string for details on obtaining the driver version number.

Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

Enhancements

Custom password policy

Hybrid Data Pipeline now supports the creation of a custom password policy. Administrators may set an expiration date for passwords and configure the minimum and maximum number of characters allowed in a password. A custom policy may also be configured to require upper case letters, lower case letters, numbers, and special characters. See Password policy for details.

Collect information about On-Premises Connectors

The new Administrator Connectors API allows administrators to retrieve information about On-Premises Connectors registered with Hybrid Data Pipeline. Administrators may obtain a full list of On-Premises Connectors with this API. They may also use it to filter the list for details such as version number, owner, and tenant. See Obtaining information about On-Premises Connectors for details.
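
A minimal sketch of calling such an endpoint with basic authentication follows; the endpoint path shown is a hypothetical placeholder, so use the URL given in Obtaining information about On-Premises Connectors:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class ListConnectorsExample {
    public static void main(String[] args) throws Exception {
        // Hypothetical endpoint path for illustration only.
        String url = "https://myserver:8443/api/admin/connectors";
        String basic = Base64.getEncoder()
                .encodeToString("adminuser:adminpassword".getBytes(StandardCharsets.UTF_8));

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(url))
                .header("Authorization", "Basic " + basic)
                .header("Accept", "application/json")
                .GET()
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body()); // JSON describing registered On-Premises Connectors
    }
}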

Download data source logs using the Web UI or the Hybrid Data Pipeline API

Hybrid Data Pipeline provides data source logging to record user activity against data sources. Data source logs may now be obtained from the Data Sources view in the Web UI or with the data source logs endpoint. In addition, data source logs may be retrieved by running the getdslogs.sh script on each node in the deployment. See Obtaining the logs for a data source for details.

Download system logs using the Web UI or the Hybrid Data Pipeline API

Hybrid Data Pipeline generates a number of log files to record events, activity, and other information. System logs may now be obtained through the System Configurations view in the Web UI or via the Nodes API. In addition, system logs may be retrieved by running the getlogs.sh script on each node in the deployment. See System logs for details.

Changed Behavior

MySQL Community Edition

For connectivity to MySQL CE, the MySQL CE Connector/J jar must be supplied during the deployment of Hybrid Data Pipeline. With this release, version 8.0 of the MySQL CE Connector/J jar has been certified with Hybrid Data Pipeline. For the latest data source and platform support information, refer to the Product Compatibility Guide.

Resolved Issues

Issue HDP-8498 MaxFetchRows limit not specifying the maximum rows allowed to be fetched

After setting the MaxFetchRows limit, it was observed that Hybrid Data Pipeline ignored the limit. In addition, the SQL Editor returned incorrect results.

Issue HDP-8677 Calls to api/mgmt/datastores endpoint returning invalid JSON

When querying the api/mgmt/datastores endpoint, Hybrid Data Pipeline returned invalid JSON in the response payload.

Issue HDP-8683 UserMeter table RemoteAddress field contains the Hybrid Data Pipeline server IP address instead of the client machine IP address for OData queries

When querying the UserMeter table for information about an OData query, the RemoteAddress field contained the Hybrid Data Pipeline server IP address instead of the IP address of the client machine.

Issue HDP-8690 Error 'Value must be a valid URL' returned when registering a SAML authentication service

When registering a SAML authentication service using Azure as the Identity Provider, Hybrid Data Pipeline returned the error "Value must be a valid URL" even though the IDP entity ID was valid.

Issue HDP-8710 Special characters not supported for external system database user passwords

When a special character was used for the user password of a MySQL system database, the Hybrid Data Pipeline server installation failed.

Issue HDP-8844 Worker thread error: java.io.EOFException with the ODBC driver

When specifying NULL for a SQL_DECIMAL parameter while inserting data with the ODBC driver, the error "[DataDirect][ODBC Hybrid driver][Service]Worker thread error: java.io.EOFException" was returned. (ODBC driver 4.6.1.268)

Issue HDP-9079 OAuth2 is not working against the Salesforce test instance

When attempting to connect to a Salesforce test instance using OAuth, Hybrid Data Pipeline returned the error "There is a problem connecting to the DataSource. REST Status 404 NOT Found returned for GET https://login.salesforce.com/services/oauth2/userinfo."

Known Issues

See Hybrid Data Pipeline known issues for details.

 

4.6.1.2057

December 5, 2023

Preface

These release notes provide enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver

For the ODBC driver, see Driver version string for details on obtaining the driver version number.

Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

Enhancements

Data Store page

The Data Store page has been enhanced. The Data Store page lists all supported data stores. It is the first stop when creating a data source, or connection, to a data store. The enhancements to the Data Store page include a new layout, search functionality, and links to documentation resources.

SSL certificate update script

The Hybrid Data Pipeline product package now includes the update_server_cert.sh shell script to simplify the process of updating SSL certificates in Linux deployments of Hybrid Data Pipeline. After you obtain a new CA certificate, you may run the script to configure the server to use the new certificate. Then, depending on your environment, certificate information must be updated for components such as the ODBC driver, JDBC driver, and On-Premises Connector. See Updating SSL certificates in the Deployment Guide for details.

curl Library Upgrade

The curl library files that are installed with the ODBC driver have been upgraded to version 8.4.0, which fixes a number of potential security vulnerabilities. For more information on the vulnerabilities resolved by this enhancement, refer to Vulnerabilities in the curl documentation. (ODBC driver 4.6.1.249)

Changed Behavior

Shutdown Port

The default value of the shutdown port has been changed from -1 to 8005.

Resolved Issues

Issue HDP-8191 Exit Code 1 returned when deploying the ODBC driver as a Docker container

When attempting to deploy the Hybrid Data Pipeline ODBC driver in a Docker container, Exit Code 1 was returned. (ODBC driver 4.6.1.249)

Issue HDP-8281 ODBC driver curl Library vulnerabilities (CVE-2023-38545 and CVE-2023-38546)

The curl library files that are installed with the ODBC driver have been upgraded to version 8.4.0 to address the curl Library vulnerabilities CVE-2023-38545 and CVE-2023-38546. (ODBC driver 4.6.1.249)

Issue HDP-8307 OPC not sending SNI extension on SSL handshake for websocket connections

When contacting the Hybrid Data Pipeline server to open a websocket connection, the On-Premises Connector was not providing the Server Name Indication (SNI) extension for the SSL handshake. (On-Premises Connector 4.6.1.758)

Issue HDP-8431 JDBC driver installation fails with "Invalid CEN header" error after upgrade to Java 11

After upgrading to Java 11.0.20 on Windows Server 2019, the installation of the JDBC driver failed with the error "java.util.zip.ZipException: Invalid CEN header (invalid extra data field size for tag: 0x3831 at 0)." (JDBC driver 4.6.1.271)

Known Issues

See Hybrid Data Pipeline known issues for details.

 

4.6.1.1930

October 26, 2023

Preface

These release notes provide enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver

For the ODBC driver, see Driver version string for details on obtaining the driver version number.

Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

Resolved Issues

Issue HDP-8003 Server unresponsive to Azure Application Gateway load balancer

Hybrid Data Pipeline became unresponsive to incoming queries (1) due to slow response times associated with queries sent to an on-premises data source and (2) because threads were not timing out as expected.

Issue HDP-8009 Azure CosmosDB and Mongo datasources in FIPS environment

An issue that prevented FIPS from being used with Azure CosmosDB and MongoDB connections has been resolved.

Issue HDP-8010 SAP HANA datasource in FIPS environment

An issue that prevented FIPS from being used with SAP HANA connections has been resolved.

Issue HDP-8183 Unable to connect to on-premises data source after upgrading to OPC version 4.6.1.676

After upgrading to version 4.6.1.676 of the On-Premises Connector, the Hybrid Data Pipeline server was unable to connect to the on-premises data source. (On-Premises Connector 4.6.1.709)

Known Issues

See Hybrid Data Pipeline known issues for details.

 

4.6.1.1128

January 25, 2023

Preface

These release notes provide enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver

For the ODBC driver, see Driver version string for details on obtaining the driver version number.

Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

Resolved Issues

Issue HDP-6464 Service returned only Date and Time values in UTC when fetching data from an OData-enabled Oracle database

When fetching data from an OData-enabled Oracle database, Hybrid Data Pipeline returned Date and Time values only in UTC.

Issue HDP-6539 SQL Editor unable to browse tables, views, or procedures under a schema name that has dot

When using the SQL Editor to query a SQL Server data source, the SQL Editor was unable to browse tables, views, and procedures under any schema name that included a dot.

Issue HDP-6623 HDP_DATABASE_ADVANCED_OPTIONS did not enable SSL against the system database

When deploying the server as a Docker container, using the HDP_DATABASE_ADVANCED_OPTIONS option to enable SSL (HDP_DATABASE_ADVANCED_OPTIONS=EncryptionMethod=SSL) failed to enable SSL against the system database.

Known Issues

See Hybrid Data Pipeline known issues for details.

 

4.6.1.0607

July 28, 2022

Preface

These release notes provide enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver

For the ODBC driver, see Driver version string for details on obtaining the driver version number.

Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

Resolved Issues

Issue HDP-5690 Hybrid Data Pipeline reached open files limit

Idle Google BigQuery connections did not fully close, causing the Hybrid Data Pipeline server to reach the limit on open files.

Issue HDP-5866 On-Premises Connector throwing HTTP 401 error during installation

After upgrading the On-Premises Connector, it received an HTTP 401 error from the Hybrid Data Pipeline server when attempting to connect. (On-Premises Connector 4.6.1.255)

Issue HDP-5938 "Request header is too large" exception with HDP SAML authentication

When attempting to authenticate using SAML, Hybrid Data Pipeline returned the exception "Request header is too large."

Issue HDP-6133 Account lockout not working as expected for ODBC and OData data access

After an account lockout occurred, OData queries continued to run successfully instead of being rejected.

Issue HDP-6152 Not passing user credentials when using third-party connector

When using a third-party connector where the database credentials are not included in the Hybrid Data Pipeline data source, the user is prompted to enter credentials at connection time. In this scenario, the credentials were not passed through, and Hybrid Data Pipeline returned the error message "user name is missing." (JDBC driver 4.6.1.77)

Issue HDP-6154 Unable to use Azure Database for PostgreSQL as an external database

Hybrid Data Pipeline was unable to support the Azure Database for PostgreSQL as an external database because Azure Database for PostgreSQL requires a unique user naming convention.

Issue HDP-6178 Worker thread error when connecting to Azure Synapse serverless instance

When attempting to query an Azure Synapse serverless instance, Hybrid Data Pipeline returned a java.io.IOException worker thread error.

Known Issues

See Hybrid Data Pipeline known issues for details.

 

4.6.1.1548

June 23, 2023

Preface

These release notes provide enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver

For the ODBC driver, see Driver version string for details on obtaining the driver version number.

Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

Enhancements

Snowflake support

Support for connectivity to Snowflake has been added to Hybrid Data Pipeline. Hybrid Data Pipeline can be used to create and manage data sources for JDBC, ODBC, and OData client application access to Snowflake.

Note: Hybrid Data Pipeline does not support FIPS for Snowflake connections. Refer to "FIPS mode" or "Snowflake" in Hybrid Data Pipeline known issues for details.

Changed Behavior

MySQL Community Edition

The MySQL CE data store icon no longer appears by default on the Data Stores page. The icon will only appear if the MySQL Connector/J driver jar has been provided during the deployment process.

Resolved Issues

Issue HDP-7219 ODBC Driver on Windows is not showing the version number

The ODBC driver did not include the version metadata required to display the driver version number in the ODBC Administrator and in the driver library properties. (ODBC driver 4.6.1.177)

Issue HDP-7510 Connection failed when using custom authentication to connect to a REST service

When using custom authentication to connect to a REST service with the Autonomous REST Connector, the connection failed after an initial connection because the Hybrid Data Pipeline server was not properly storing authentication parameters.

Issue HDP-7541 Address the Spring Security vulnerability (CVE-2023-20862)

Hybrid Data Pipeline has been updated to use Spring Security version 5.8.3 to address security vulnerabilities described in CVE-2023-20862. (Hybrid Data Pipeline server 4.6.1.1548, On-Premises Connector 4.6.1.570)

Issue HDP-7545 enable_ssl.sh does not throw an error when an argument is not supplied

When running enable_ssl.sh, the script did not throw an error when an argument was not supplied.

Issue HDP-7595 SQL Editor query to Azure SQL Data Warehouse failed when using ActiveDirectoryPassword authentication

When using the SQL Editor to query Azure SQL Data Warehouse with ActiveDirectoryPassword authentication, the error message "Catalog view 'dm_exec_sessions' is not supported in this version" was returned.

Issue HDP-7596 Server-side request forgery with Autonomous REST Connector

The Autonomous REST Connector was able to access the local file system of the server hosting Hybrid Data Pipeline.

Issue HDP-7597 No results returned for a query that attempted to use dynamic filtering on a date field

When using the Autonomous REST Connector to connect to a REST service, Hybrid Data Pipeline failed to return results for a query that attempted to use dynamic filtering on a date field.

Known Issues

See Hybrid Data Pipeline known issues for details.

 

4.6.1.0014

December 17, 2019

Preface

These release notes provide enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver

For the ODBC driver, see Driver version string for details on obtaining the driver version number.

Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

Enhancements

Entity case conversion feature

Hybrid Data Pipeline has been enhanced to support case conversion for entity types, entity sets, and properties. The owner of a data source can now change entity type, entity set, and property names to all uppercase or all lowercase on the OData tab in the Web UI or by using the Hybrid Data Pipeline Management API.

Web UI data source sharing feature

The Web UI has been enhanced to support data source sharing. The owner of a data source can now share access to a data store with Hybrid Data Pipeline users and tenants through the Data Sources view in the Web UI.

Web UI IP address whitelist feature

The Web UI has been enhanced to fully support the IP address whitelist feature. Administrators can secure access to Hybrid Data Pipeline resources by implementing IP address whitelists through the Web UI. The Web UI can be used to create IP address whitelists at the system level, tenant level, user level, or some combination of these levels.

Web UI navigation bar

The navigation bar can be expanded to show the names of the views supported in the Web UI. The icons in the navigation bar have been reordered and updated.

PostgreSQL OData Version 4 stored functions

Hybrid Data Pipeline supports exposing stored functions for OData Version 4 connectivity to PostgreSQL data sources. When configuring a PostgreSQL data source, the OData schema map can be configured to expose stored functions.

JDBC and ODBC throttling

A new throttling limit has been introduced in the System Limits view. The XdbcMaxResponse limit can be used to set the approximate maximum size of JDBC and ODBC HTTP result data.

ODBC driver branded installation

The ODBC driver installation program has been enhanced to support branded installations for OEM customers (available in the ODBC driver installer on November 18, 2019). The branded driver can then be distributed with OEM customer client applications. For the Hybrid Data Pipeline ODBC driver distribution guide, visit the Progress DataDirect Product Books page on the Progress PartnerLink website (login required).

Tomcat upgrade

The Hybrid Data Pipeline server and On-Premises Connector have been upgraded to install and use Tomcat 9.0.20. (On-Premises Connector version 4.6.1.7)

Known Issues

See Hybrid Data Pipeline known issues for details.

 

4.6.1.0372

March 9, 2022

Preface

These release notes provide enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver

For the ODBC driver, see Driver version string for details on obtaining the driver version number.

Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

Enhancements

PostgreSQL 14

Hybrid Data Pipeline now supports connectivity to PostgreSQL 14 databases. PostgreSQL 14 can also be used as a system database to store account and configuration information for a Hybrid Data Pipeline instance. This functionality is supported in the following component versions.

  • Hybrid Data Pipeline Server 4.6.1.372 and higher
  • On-Premises Connector 4.6.1.120 and higher

Changed Behavior

PostgreSQL call escape behavior

The default behavior for handling PostgreSQL call escape syntax has changed. Previously, Hybrid Data Pipeline only supported stored functions, and treated the non-standard escape syntax {call function()} the same as the standard escape syntax {? = call function()}. With this latest patch, Hybrid Data Pipeline supports stored functions and stored procedures for JDBC and ODBC connections. Now Hybrid Data Pipeline determines whether a function or procedure is being called based on the call escape syntax. If the return value parameter ?= is used, then the connectivity service calls a stored function. If the return value parameter is not used, then the connectivity service calls a stored procedure. You can change this default behavior by setting the CallEscapeBehavior option as an extended option under the Advanced tab. These are the valid values for the CallEscapeBehavior option:

  • If set to select, treats the object being called by both the {call...} syntax and the {? = call...} syntax as a stored function and makes the applicable native call to the PostgreSQL database.
  • If set to call, treats the object being called by both the {call...} syntax and the {? = call...} syntax as a stored procedure and makes the applicable native call to the PostgreSQL database.
  • If set to callIfNoReturn (the default), the service determines whether to call a function or stored procedure based on the call escape syntax. If the return value parameter ?= is used, the service calls a function. If not, the service calls a stored procedure.
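
The sketch below shows the two call escape forms from a JDBC client under the default callIfNoReturn behavior; the data source, function, and procedure names are hypothetical:

import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Types;

public class CallEscapeExample {
    public static void main(String[] args) throws Exception {
        // Illustrative data source name; CallEscapeBehavior can be set as an
        // extended option on the data source's Advanced tab.
        String url = "jdbc:datadirect:ddhybrid://myserver:8080;hybridDataPipelineDataSource=MyPostgresSource";
        try (Connection con = DriverManager.getConnection(url, "myuser", "mypassword")) {

            // With the default callIfNoReturn behavior, the return-value
            // parameter causes a stored function to be called...
            try (CallableStatement fn = con.prepareCall("{? = call get_order_total(?)}")) {
                fn.registerOutParameter(1, Types.NUMERIC);
                fn.setInt(2, 42);
                fn.execute();
                System.out.println("Function result: " + fn.getBigDecimal(1));
            }

            // ...while the plain call escape now invokes a stored procedure.
            try (CallableStatement proc = con.prepareCall("{call archive_order(?)}")) {
                proc.setInt(1, 42);
                proc.execute();
            }
        }
    }
}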

Resolved Issues

Issue HDP-5459 OData $expand query fails against OpenEdge data source

When using the OData $expand functionality to query an OpenEdge data source, the query failed and an error was returned.

Issue HDP-5605 SQL Editor not displaying values when two columns had the same name

When a SQL query included columns of the same name, the SQL Editor did not display the column values.

Issue HDP-5642 SQL Editor not displaying results

The SQL Editor did not display results as expected.

Known Issues

See Hybrid Data Pipeline known issues for details.

 

4.6.1.1854

September 12, 2023

Preface

These release notes provide enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver

For the ODBC driver, see Driver version string for details on obtaining the driver version number.

Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

Enhancements

Google Analytics 4 support

Support for connections to Google Analytics 4 (GA4) has been added to Hybrid Data Pipeline. Hybrid Data Pipeline can be used to create and manage data sources for JDBC, ODBC, and OData client application access to GA4. Refer to [GA4] Introducing the next generation of Analytics, Google Analytics 4 in Google's Analytics Help for information on GA4 and the retirement of Universal Analytics (also referred to as Google Analytics 3 or GA3). Refer to Google Analytics 4 parameters for details. (On-Premises Connector 4.6.1.676)

MongoDB support (including MongoDB Atlas and Azure CosmosDB for MongoDB)

Hybrid Data Pipeline now supports access to MongoDB and MongoDB-type data stores, such as MongoDB Atlas and Azure CosmosDB for MongoDB. Hybrid Data Pipeline can be used to create and manage data sources for JDBC, ODBC, and OData client application access to MongoDB and MongoDB-type data stores. Refer to MongoDB parameters for details. (On-Premises Connector 4.6.1.676)

Note:

  • In this release, Hybrid Data Pipeline does not support MongoDB-type data stores in FIPS environments.
  • The Kerberos authentication method is not supported for MongoDB in Hybrid Data Pipeline.

SAP HANA support

Hybrid Data Pipeline now supports access to SAP HANA data stores. Hybrid Data Pipeline can be used to create and manage data sources for JDBC, ODBC, and OData client application access to SAP HANA. Refer to SAP HANA parameters for details. (On-Premises Connector 4.6.1.676)

Note: In this release, Hybrid Data Pipeline does not support SAP HANA in FIPS environments.

SAP S/4HANA support (including SAP BW/4HANA and SAP NetWeaver)

Hybrid Data Pipeline now supports access to SAP S/4HANA and S/4HANA-type data stores, such as SAP BW/4HANA and SAP NetWeaver. Hybrid Data Pipeline can be used to create and manage data sources for JDBC, ODBC, and OData client application access to SAP S/4HANA and S/4HANA-type data stores. Refer to SAP S/4HANA parameters for details. (On-Premises Connector 4.6.1.676)

Note: The HTTP Header authentication method is not supported for SAP S/4HANA, SAP BW/4HANA, and SAP NetWeaver in Hybrid Data Pipeline.

OpenSSL 3.0 support

The ODBC driver has been updated to use OpenSSL 3.0 to implement TLS protocols for data encryption between client applications and Hybrid Data Pipeline. In addition, the driver supports the Federal Information Processing Standard or FIPS (140-2), regarding cryptographic module security requirements. (ODBC driver 4.6.1.239)

Changed Behavior

OpenSSL 3.0 support

The ODBC driver has been updated to use OpenSSL 3.0 to implement TLS protocols for data encryption between client applications and Hybrid Data Pipeline. This enhancement allows the driver to support the Federal Information Processing Standard or FIPS (140-2), regarding cryptographic module security requirements. To support OpenSSL 3.0 and FIPS, the Crypto Protocol Version and Enable FIPS connection options have been added to the driver. (ODBC driver 4.6.1.239)

SQL Editor

Previously, when an end user created and saved a Hybrid Data Pipeline data source without providing authentication credentials, the user would be prompted for credentials when using the SQL editor to query the data source. This is no longer the case. Now, when an end user attempts to use the SQL editor to query a data source for which credentials have not been saved, Hybrid Data Pipeline returns the error "INVALID_LOGIN: Invalid username, password, security token; or user locked out."

Resolved Issues

Issue HDP-7621 Address Apache Tomcat vulnerability CVE-2023-28709

Hybrid Data Pipeline has been updated to install and use Tomcat 9.0.75. This update addresses the security vulnerability in Tomcat 9.0.73 as described in CVE-2023-28709. (On-Premises Connector 4.6.1.676)

Issue HDP-7228 Values in the odbc.ini template not correct

After the installation of the ODBC driver on Linux, the default values in the odbc.ini template installed with the driver did not match the values in the hybridDefaults.properties file. (ODBC driver 4.6.1.239)

Known Issues

See Hybrid Data Pipeline known issues for details.

 

4.6.1.0023

May 19, 2020

Preface

These release notes provide enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver

For the ODBC driver, see Driver version string for details on obtaining the driver version number.

Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

Resolved Issues

Issue HDP-4490 ODataApplicationException returned when filtering on BIT/BOOLEAN field

When exposing a Microsoft SQL Server table via OData Version 4 and filtering on a BIT/BOOLEAN field, Hybrid Data Pipeline returned the ODataApplicationException "An expression of non-boolean type specified in a context where a condition is expected, near ')'."

Issue HDP-4480 Shutdown script not working

With some shell configurations, the Hybrid Data Pipeline shutdown script stop.sh was not shutting down Hybrid Data Pipeline server processes.

Known Issues

See Hybrid Data Pipeline known issues for details.

 

4.6.1.0558

July 6, 2022

Preface

These release notes provide enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver

For the ODBC driver, see Driver version string for details on obtaining the driver version number.

Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

Enhancements

Support for Google Analytics, Google BigQuery, Salesforce, and REST data store implementations of OAuth 2.0

Hybrid Data Pipeline has been enhanced to support Google Analytics, Google BigQuery, Salesforce, and REST data store implementations of OAuth 2.0. To integrate Hybrid Data Pipeline with an OAuth 2.0 authorization flow, Hybrid Data Pipeline must be registered as a client application with the given data store. Then, OAuth application and profile objects must be created to manage OAuth endpoints, properties, and tokens. For details, refer to Integrating Hybrid Data Pipeline as a client application with a data store OAuth 2.0 authorization flow. (On-Premises Connector 4.6.1.241)

Changed Behavior

Google Analytics OAuth 2.0 implementation

The procedures for integrating Hybrid Data Pipeline as a client application to enable access to Google Analytics include the ability to select or create an OAuth application in the Web UI. For details, refer to Google Analytics parameters.

Resolved Issues

Issue HDP-5804 Selecting a data source in the SQLEditor results in no suitable driver found error

When selecting a data source from the dropdown in the SQL Editor, the server returned the error "No suitable driver found."

Issue HDP-5805 Error on datetime column when using OData to connect to MySQL Community Edition

When performing an insert on an OData-enabled MySQL Community Edition data source, Hybrid Data Pipeline returned an error on a datetime column.

Issue HDP-5836 NPE after receiving a merge request with an empty payload

Performing a mergeEntity operation against an OData-enabled MySQL Community Edition data source resulted in a NullPointerException.

Issue HDP-5881 Unable to configure server-side SSL between HDP nodes

Server-side SSL could not be configured because the enable_ssl.sh script was not properly setting the truststore information from the Web UI.

Issue HDP-5924 Update the context.xml file to disable Session persistence in Tomcat

To mitigate the CVE-2022-23181 security vulnerability, the Tomcat context.xml file has been modified such that session persistence is disabled by default.

Known Issues

See Hybrid Data Pipeline known issues for details.

 

4.6.1.0092

December 15, 2020

Preface

These release notes provide enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver.

For the ODBC driver, see Driver version string for details on obtaining the driver version number.

Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

Enhancements

Aggregation support

Hybrid Data Pipeline has added support for a subset of the functionality defined by the OData Version 4 extension for data aggregation. Aggregation functionality is invoked with the $apply query parameter. See Aggregation support in the user's guide for details.
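
For example, assuming a hypothetical entity set Orders with Category and Amount properties, a client could request grouped totals by appending the $apply option to an OData request:

Orders?$apply=groupby((Category),aggregate(Amount with sum as TotalAmount))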

Windows Server 2019

The Hybrid Data Pipeline On-Premises Connector, ODBC driver, and JDBC driver have added support for Windows Server 2019. (On-Premises Connector version 4.6.1.48, ODBC driver 4.6.1.12, JDBC driver 4.6.1.9)

Resolved Issues

Issue HDP-4478 Unable to connect using TNS connection option for Oracle

The Hybrid Data Pipeline server required the specification of the Server Name parameter, even though Server Name is not required for a TNS connection. In addition, when Server Name was specified, the server returned an inaccurate error message.

Known Issues

See Hybrid Data Pipeline known issues for details.

 

4.6.1.0020

April 30, 2020

Preface

These release notes provide enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver.

For the ODBC driver, see Driver version string for details on obtaining the driver version number.

Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

Resolved Issues

Issue HDP-4465 Parenthesis in OData query not honored

When translating an OData Version 4 query to a SQL query, Hybrid Data Pipeline did not honor the parentheses in the OData query. This reordered the operator precedence and led to incorrect results. Parentheses in OData queries are now reproduced in SQL queries to maintain operator precedence.
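
For example, with hypothetical property names, an OData filter such as $filter=(Price gt 10 or Rating ge 4) and Category eq 'Books' now produces SQL along the lines of WHERE (Price > 10 OR Rating >= 4) AND Category = 'Books', rather than a predicate in which the grouping of the OR condition is lost.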

Issue HDP-4464 Intermittent error "origin=driver message=PAYLOAD_TYPE_BINARY != opcode, was 10" (On-Premises Connector version 4.6.1.8)

When connecting to an on-premises data source using the On-Premises Connector, the error origin=driver message=PAYLOAD_TYPE_BINARY != opcode, was 10 was returned. The On-Premises Connector now correctly handles pongs sent from load balancers according to the WebSocket protocol.

Known Issues

See Hybrid Data Pipeline known issues for details.

 

4.6.1.0296

December 6, 2021

Preface

These release notes provide enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver.

For the ODBC driver, see Driver version string for details on obtaining the driver version number.

Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

Enhancements

Docker image

A production instance of the Hybrid Data Pipeline server can now be deployed using a Docker image. The Hybrid Data Pipeline Docker image is available in the Hybrid Data Pipeline Docker Deployment Package. In addition, the Docker Deployment Package includes demos for a number of deployment scenarios. For details and instructions, see Deploying Hybrid Data Pipeline using Docker in the installation guide.

OpenID Connect (OIDC) support

Hybrid Data Pipeline now supports user authentication using the OIDC protocol. An identity provider and client applications can be configured to authorize users and grant access to the OData endpoints of the Hybrid Data Pipeline server. See Integrating an OIDC authentication service in the user's guide for details.

Resolved Issues

Issue HDP-5395 Third-party JDBC Oracle driver integration does not return tables

When using the third-party JDBC Oracle driver, the Hybrid Data Pipeline SQL Editor did not return tables.

Issue HDP-5433 Unable to authenticate when special character '+' (plus sign) in account password

When the special character '+' (plus sign) was used in an account password, the user was unable to authenticate with the Hybrid Data Pipeline server.

Issue HDP-5461 Unable to access Oracle Cloud Financials

Hybrid Data Pipeline was unable to access Oracle Cloud Financials REST Endpoints with the Autonomous REST Connector.

Known Issues

See Hybrid Data Pipeline known issues for details.

 

4.6.1.0032

July 16, 2020

Preface

These release notes provide enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver.

For the ODBC driver, see Driver version string for details on obtaining the driver version number.

Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

Resolved Issues

Issue HDP-4534 Unable to connect to Google Analytics data source

When connecting to a Google Analytics data source using an OAuth profile created in a previous version of Hybrid Data Pipeline, the following error was returned: There is a problem connecting to the data source. Error getting data source. System not available, try again later.

Known Issues

See Hybrid Data Pipeline known issues for details.

 

4.6.1.0306

December 15, 2021

Preface

These release notes provide enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver.

For the ODBC driver, see Driver version string for details on obtaining the driver version number.

Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

Resolved Issues

Issue HDP-5560 Resolved Log4j security vulnerability

Hybrid Data Pipeline has been updated to use Log4j version 2.15 to address the security vulnerability found in Log4j version 2.13.3 as described in CVE-2021-44228. For details, refer to CVE-2021-44228. (Hybrid Data Pipeline server 4.6.1.306, On-Premises Connector version 4.6.1.85).

Known Issues

See Hybrid Data Pipeline known issues for details.

 

4.6.1.0256

October 13, 2021

Preface

These release notes provide enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver.

For the ODBC driver, see Driver version string for details on obtaining the driver version number.

Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

Enhancements

Stored procedures

Hybrid Data Pipeline now supports invoking stored procedures for JDBC and ODBC connections. Stored procedure functionality includes support for input parameters, output parameters, and in/out parameters. Stored procedures that return multiple results are also supported. This functionality is supported in the following component versions; a minimal JDBC sketch follows the list.

  • Hybrid Data Pipeline Server 4.6.1.256 and higher
  • On-Premises Connector 4.6.1.73 and higher
  • JDBC Driver 4.6.1.23 and higher
  • ODBC Driver 4.6.1.31 and higher
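
The sketch below shows one way to invoke a stored procedure through a Hybrid Data Pipeline JDBC connection. The connection URL, credentials, and the procedure name GET_ORDER_TOTAL are hypothetical and used for illustration only.

import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Types;

// Hypothetical sketch: invoke a stored procedure with one IN and one OUT parameter
// over a Hybrid Data Pipeline JDBC connection. URL, credentials, and procedure name
// are illustrative only.
public class CallProcExample {
    public static void main(String[] args) throws Exception {
        String url = args[0];   // Hybrid Data Pipeline JDBC connection URL
        try (Connection con = DriverManager.getConnection(url, args[1], args[2]);
             CallableStatement cs = con.prepareCall("{call GET_ORDER_TOTAL(?, ?)}")) {
            cs.setInt(1, 1001);                        // IN parameter: order ID
            cs.registerOutParameter(2, Types.DECIMAL); // OUT parameter: order total
            cs.execute();
            System.out.println("Total: " + cs.getBigDecimal(2));
        }
    }
}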

Resolved Issues

Issue HDP-5020 Error message did not state reason that the required secure random instance could not be created when enabling FIPS

When attempting to enable FIPS, the error message did not state that the required secure random instance could not be created because there was not enough entropy on the host machine.

Issue HDP-5064 JDBC driver not able to follow redirects (JDBC driver 4.6.1.23)

When an HTTP redirect status was returned, the driver was unable to follow the redirection and returned an error that the HTTP endpoint had been relocated. To resolve this issue, the FollowRedirects connection property has been introduced. When FollowRedirects is enabled, the driver can follow an HTTP redirection instead of returning an error. For details, refer to FollowRedirects.
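
The following is a minimal sketch of enabling the new property from JDBC code; the connection URL and credentials are hypothetical, and the property spelling should be confirmed against the FollowRedirects documentation referenced above.

import java.sql.Connection;
import java.sql.DriverManager;
import java.util.Properties;

// Hypothetical sketch: enable the FollowRedirects connection property on a
// Hybrid Data Pipeline JDBC connection so the driver follows HTTP redirects
// instead of returning an error. URL and credentials are illustrative only.
public class FollowRedirectsExample {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.setProperty("user", args[1]);
        props.setProperty("password", args[2]);
        props.setProperty("FollowRedirects", "true"); // follow redirects rather than erroring
        try (Connection con = DriverManager.getConnection(args[0], props)) {
            System.out.println("Connected: " + !con.isClosed());
        }
    }
}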

Issue HDP-5412 "Unexpected end of stream in statement" error returned

When processing an empty result reply to a query execution request against a ServiceNow REST service, Hybrid Data Pipeline returned an "Unexpected end of stream in statement" error.

Known Issues

See Hybrid Data Pipeline known issues for details.

 

4.6.1.0138

April 16, 2021

Preface

These release notes provide enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver.

For the ODBC driver, see Driver version string for details on obtaining the driver version number.

Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

Resolved Issues

Issue HDP-4923 Performance issue querying OpenEdge database

To improve performance, the OData startswith() function is now translated to a SQL statement that uses LIKE instead of LOCATE, producing a query that takes less time to execute.
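
For example, with a hypothetical Name property, an OData filter such as $filter=startswith(Name,'Ab') is now pushed down as a predicate along the lines of WHERE Name LIKE 'Ab%' rather than WHERE LOCATE('Ab',Name) = 1, a form that databases can generally evaluate more efficiently, particularly when an index exists on the column.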

Known Issues

See Hybrid Data Pipeline known issues for details.

 

4.6.1.1391

April 27, 2023

Preface

These release notes provide enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver.

For the ODBC driver, see Driver version string for details on obtaining the driver version number.

Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

Enhancements

Web UI branding

Hybrid Data Pipeline now supports branding of its Web UI. The default branding information, such as the logo, colors, naming, and icons, can be configured before or after installation. Refer to Branding the Web UI for details.

Autonomous REST Composer

The Autonomous REST Composer is now available on the Configure Endpoints tab of the Autonomous REST Connector data store interface. The Composer allows you to create a REST data source and configure or import a REST Model file using the Web UI. Refer to Creating REST data sources with the Web UI for details.

Tomcat updates

Hybrid Data Pipeline has been updated to install and use Tomcat 9.0.73. In addition, the following Hybrid Data Pipeline Tomcat configurations have been made to improve security.

  • The autoDeploy attribute has been set to false. Hence, Tomcat does not check for new or updated web applications.
  • TLS 1.2 is the minimum supported version for TLS encryption.
  • Weak ciphers are no longer supported.
  • For Linux installations, the Shutdown Port permits a REST call to shut down the server. The default value of this port has been changed from 8005 to -1. The new default value -1 disables this port.

Oracle 19c certified as a system database

Oracle 19c has been certified to operate as a Hybrid Data Pipeline system database.

Microsoft Dynamics 365 Cross Company connection option

The Microsoft Dynamics 365 data store supports a new connection option, Cross Company, that allows access to cross-company data for users who have access to multiple companies. Refer to Microsoft Dynamics 365 parameters for details.

Resolved Issues

Issue HDP-7029 JDBC driver returned the error "unexpected end of stream reached"

When querying a SQL Server data source, the JDBC driver returned the "unexpected end of stream reached" error. (JDBC driver 4.6.1.212)

Issue HDP-7147 Resolved Hybrid Data Pipeline vulnerability CVE-2023-24998

The shipping version of the Tomcat server was upgraded from Tomcat 9.0.65 to 9.0.73 to address the vulnerability described in CVE-2023-24998. (Hybrid Data Pipeline server 4.6.1.1391, On-Premises Connector 4.6.1.524)

Issue HDP-7495 Unable to configure SSL behind a load balancer when using FIPS and an external JRE

After configuring the Hybrid Data Pipeline server to use an external JRE and run in FIPS mode, server-side SSL could not be enabled. (Hybrid Data Pipeline server 4.6.1.1391)

Known Issues

See Hybrid Data Pipeline known issues for details.

 

4.6.1.0169

June 10, 2021

Preface

These release notes provide enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver.

For the ODBC driver, see Driver version string for details on obtaining the driver version number.

Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

Enhancements

Google BigQuery support

Hybrid Data Pipeline now supports access to Google BigQuery. Hybrid Data Pipeline can be used to create and manage data sources for JDBC, ODBC, and OData client application access to Google BigQuery. OAuth 2.0 and Service Account authentication methods are supported.

Known Issues

See Hybrid Data Pipeline known issues for details.

 

4.6.1.0107

January 13, 2021

Preface

These release notes provide enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver.

For the ODBC driver, see Driver version string for details on obtaining the driver version number.

Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

Enhancements

OData query throttling for users

Hybrid Data Pipeline supports throttling the number of OData queries a user may have running concurrently against a Hybrid Data Pipeline server. OData query throttling for users may be configured with the ODataMaxConcurrentRequests and ODataMaxWaitingRequests limits. The ODataMaxConcurrentRequests limit sets the maximum number of simultaneous OData requests allowed per user, while the ODataMaxWaitingRequests limit sets the maximum number of waiting OData requests allowed per user. See Throttling in the user's guide for details.

Environment variables support for silent installation

Support for environment variables to specify server and system database credentials during the installation process has been added. The use of environment variables allows you to perform a more secure silent installation, compared to a standard silent installation where credential information must be specified in plain text in the silent installation response file. See Silent installation process in the user's guide for details.

Resolved Issues

Issue HDP-4853 Installation failed when special characters were used in system database credentials

When installing the Hybrid Data Pipeline server using SQL Server as the system database, the use of special characters in admin or user account credentials caused the installation to fail with the error Error in createSchema at Line 266.

NOTE: While installation no longer fails when special characters are used in system database account credentials, the installer cannot currently validate the necessary database schema objects when any of the following special characters are used in either database user ID or password values: space ( ), quotation mark ("), number sign (#), dollar sign ($), and apostrophe ('). Therefore, in a standard installation where these characters are used in database credentials, database validation must be skipped to proceed with the installation. Similarly, when performing a silent installation in this case, the SKIP_DATABASE_VALIDATION property should be set to true. Note that when skipping database validation in this scenario, the server should install successfully and work with the specified system database.
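
For example, a silent installation response file for an environment that uses these special characters in database credentials might include a line such as the following; the exact response file syntax is described in the silent installation documentation:

SKIP_DATABASE_VALIDATION=true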

Issue HDP-4854 Silent installation process required the specification of system database admin and user passwords in clear text in the response file

The specification of system database admin and user passwords in plain text in the response file as part of the silent installation process raised security concerns. Support for environment variables to specify server and system database credentials during the installation process has been added. See Silent installation process in the user's guide for details.

Issue HDP-4859 Firefox, Chrome, and Microsoft Edge browsers not rendering Web UI correctly for load balancer installation

When the HAProxy load balancer was configured with the setting x-content-type-options:nosniff, Firefox, Chrome, and Microsoft Edge browsers rendered the Web UI as text instead of HTML.

Known Issues

See Hybrid Data Pipeline known issues for details.

 

4.6.1.0233

September 10, 2021

Preface

These release notes provide enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver.

For the ODBC driver, see Driver version string for details on obtaining the driver version number.

Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

Enhancements

SSO/SAML support

Hybrid Data Pipeline now supports user authentication using the SSO/SAML protocol. Customers can configure SAML authentication by providing the details of an identity provider and can then configure users to use SAML authentication.

Resolved Issues

Issue HDP-4549 HDP server unreachable due to OS file handle leak

When the "FileNotFoundException (Too many open files)" error occurred, the Hybrid Data Pipeline connection was lost and the server had to be restarted.

Issue HDP-5202 Error returned when fetching MySQL zero values for date and datetime columns

When fetching invalid date and datetime values from columns or literals, such as SELECT DATE(0), against MySQL data sources, the Hybrid Data Pipeline server returned an error.

Issue HDP-5210 OData v4 Endpoint not compatible with Tableau Desktop

Tableau was unable to connect to OData v4 endpoints exposed by Hybrid Data Pipeline.

Issue HDP-5217 Some special characters not allowed in passwords

Users were unable to use special characters for Hybrid Data Pipeline passwords.

Issue HDP-5266 Load balancer not returning OData responses from the server

When HTTP was disabled on the load balancer, the load balancer did not return OData responses to the client application, as would be expected when the X-Forwarded-Proto header is configured to manage HTTP and HTTPS traffic.

Known Issues

See Hybrid Data Pipeline known issues for details.

 

4.6.1.0357

February 16, 2022

Preface

These release notes provide enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver.

For the ODBC driver, see Driver version string for details on obtaining the driver version number.

Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

Enhancements

Server-side SSL

Hybrid Data Pipeline now supports server-side SSL. Server-side SSL allows you to enable SSL behind the load balancer and secure communication between the load balancer and server nodes, as well as Hybrid Data Pipeline nodes in a cluster deployment. This functionality is supported in the following component versions.

  • Hybrid Data Pipeline Server 4.6.1.357 and higher
  • JDBC Driver 4.6.1.32 and higher
  • ODBC Driver 4.6.1.34 and higher

Note:

  • Updating On-Premises Connectors is not required to configure server-side SSL.
  • For details on server-side SSL, refer to SSL configuration.

curl Library update (ODBC driver 4.6.1.34)

The curl library files used with the ODBC driver have been upgraded to version 7.80.0.

OpenSSL library update (ODBC driver 4.6.1.34)

The default version of the OpenSSL library used with the ODBC driver has been upgraded to version 1.1.1l.

Resolved Issues

Issue HDP-5587 SYNONYMS not displayed in the Web UI

The SQL Editor was not displaying SYNONYM objects.

Issue HDP-5611 "Unexpected end of stream" error returned

When queries demanded the return of multiple large result sets, the query failed and the error "Unexpected end of stream" was returned.

Known Issues

See Hybrid Data Pipeline known issues for details.

 

4.6.1.0757

September 14, 2022

Preface

These release notes provide enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver.

For the ODBC driver, see Driver version string for details on obtaining the driver version number.

Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

Enhancements

Microsoft Dynamics 365 support

Hybrid Data Pipeline now supports access to a number of Microsoft Dynamics 365 apps. Hybrid Data Pipeline can be used to create and manage data sources for JDBC, ODBC, and OData client application access to these Dynamics 365 apps. OAuth 2.0 connectivity is supported. Refer to Microsoft Dynamics 365 parameters for details. (On-Premises Connector 4.6.1.287)

Docker trial deployment

The generally available Hybrid Data Pipeline Docker image now supports a trial Docker deployment. After you obtain the image from the Progress Enterprise Delivery site (ESD) or the Trial Download page, you may perform a trial deployment of Hybrid Data Pipeline as a Docker container on a single node with an internal system database. Refer to the Hybrid Data Pipeline Trial Docker Deployment Getting Started for details.

Power BI custom connector

A Power BI custom connector is now available from the Progress DataDirect Hybrid Data Pipeline Public GitHub repository. This custom connector may be used to implement connectivity from Power BI to Hybrid Data Pipeline resources that use OAuth 2.0 or OIDC authentication. For details, refer to Configuring a Power BI custom connector for OAuth 2.0 or Configuring a Power BI custom connector for OIDC.

Tomcat upgrade

The Hybrid Data Pipeline server and On-Premises Connector have been upgraded to install and use Tomcat 9.0.63. This addresses the CVE-2022-23181 security vulnerability that was mitigated with the resolution of Issue HDP-5924. (On-Premises Connector version 4.6.1.287)

Changed Behavior

Microsoft Dynamics CRM data store deprecated

The Microsoft Dynamics CRM data store has been deprecated. Connectivity to a number of Dynamics 365 apps is now supported with app-specific Hybrid Data Pipeline data stores. Refer to Microsoft Dynamics 365 parameters for details. (On-Premises Connector 4.6.1.287)

Docker trial image

The Docker trial image has been deprecated. A Docker trial deployment of Hybrid Data Pipeline may now be performed using the generally available Hybrid Data Pipeline Docker image. This image may be obtained from the Progress Enterprise Delivery site (ESD) or the Trial Download page. Refer to the Hybrid Data Pipeline Trial Docker Deployment Getting Started for details.

Resolved Issues

Issue HDP-5854 The ODBC driver not supporting the GUID data type

The ODBC driver did not support the GUID data type. (ODBC driver 4.6.1.67)

Issue HDP-5925 Upgrade the version of Tomcat shipped with Hybrid Data Pipeline server from Tomcat 9.0.54 to 9.0.63

The shipping version of the Tomcat server was upgraded from Tomcat 9.0.54 to 9.0.63. This addresses the CVE-2022-23181 security vulnerability that was mitigated with the resolution of Issue HDP-5924. (On-Premises Connector 4.6.1.287)

Issue HDP-6212 SQL Editor query of datetimeoffset and sql_variant data type columns returns NullPointerException

When using the SQL Editor to query datetimeoffset and sql_variant data type columns, a NullPointerException was returned.

Issue HDP-6217 Problem with the HDP_DATABASE_ADVANCED_OPTIONS setting for Docker deployments

When setting HDP_DATABASE_ADVANCED_OPTIONS to use an SSL connection to the external system database, the setting was not propagated correctly.

Issue HDP-6275 Hybrid Data Pipeline server upgrade failed in an environment using FIPS and an external JRE

When performing a Hybrid Data Pipeline server upgrade in an environment using FIPS and an external JRE, the upgrade failed with the error Error in MAIN at line 576.

Known Issues

See Hybrid Data Pipeline known issues for details.

 

4.6.1.1030

November 21, 2022

Preface

These release notes provide enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver.

For the ODBC driver, see Driver version string for details on obtaining the driver version number.

Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

Changed Behavior

Microsoft Dynamics CRM data store removed

The Microsoft Dynamics CRM data store was recently deprecated, and has now been removed from the product package. Connectivity to a number of Dynamics 365 apps, including CRM and ERP apps, is supported with app-specific Hybrid Data Pipeline data stores. Refer to Microsoft Dynamics 365 parameters for details.

Rollbase data store removed

The Rollbase data store has been removed from the product package. If you would like to reintroduce the Rollbase data store, contact Technical Support.

SugarCRM data store removed

The SugarCRM data store has been removed from the product package. If you would like to reintroduce the SugarCRM data store, contact Technical Support.

Resolved Issues

Issue HDP-6514 Use of Externally-Controlled Input to Select Classes or Code ('Unsafe Reflection') - (CVE-2022-41853)

The Hybrid Data Pipeline product and its connectors used a version of HyperSQL Database that was vulnerable to the remote code execution described in CVE-2022-41853. All impacted components have been patched to fix this vulnerability. For details on the impacted components and fixed versions, refer to the following KB article:

https://community.progress.com/s/article/DataDirect-Hybrid-Data-Pipeline-Critical-Security-Bulletin-November-2022-CVE-2022-41853

Note: In addition to updating the Hybrid Data Pipeline server, if any On-Premises Connectors are used in your environment, they should be updated with build 4.6.1.395 of the On-Premises Connector.

Issue HDP-6431 Microsoft Dynamics 365 Authorization URI auto-populated for client credentials auth flow

After an initial connection to Microsoft Dynamics 365 using the OAuth 2.0 client credentials grant, the Authorization URI field automatically populated with the default value when the data source was reopened. The value in the Authorization URI field had to be manually cleared to reconnect with Microsoft Dynamics 365.

Issue HDP-6601 Hybrid Data Pipeline unable to connect to an Azure Synapse serverless database via a SQL Server data source

Hybrid Data Pipeline was unable to connect to an Azure Synapse serverless database via a SQL Server data source.

Known Issues

See Hybrid Data Pipeline known issues for details.

 

4.6.1.1248

March 30, 2023

Preface

These release notes provide enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver.

For the ODBC driver, see Driver version string for details on obtaining the driver version number.

Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

Enhancements

curl Library Upgrade

The curl library files that are installed with the ODBC driver have been upgraded to version 7.88.1, which fixes a number of potential security vulnerabilities. For more information on the vulnerabilities resolved by this enhancement, refer to Vulnerabilities in the curl documentation. (ODBC driver 4.6.1.158)

OpenSSL Upgrade

The default version of the OpenSSL library has been upgraded to version 1.1.1t, which fixes a number of potential security vulnerabilities. For more information on the vulnerabilities resolved by this enhancement, refer to Vulnerabilities: Fixed in OpenSSL 1.1.1 in OpenSSL News. (ODBC driver 4.6.1.158)

Resolved Issues

Issue HDP-6444 Hybrid Data Pipeline Server upgrade to enable FIPS is failing

When upgrading the Hybrid Data Pipeline server to enable FIPS, the installation failed and the installer returned an account database error.

Issue HDP-6931 JDBC driver allowed statements to be executed even when the connection was dead causing "Invalid session token" error

The JDBC driver was allowing statements to be executed after a connection was terminated, resulting in an "Invalid session token" error. (JDBC driver 4.6.1.194)

Issue HDP-6973 User and password properties should be made optional in the JDBC data source

On a JDBC data source configured for OAuth and created with the DataDirect Snowflake JDBC driver, the user was prompted for a user ID and password when attempting to test connect.

Known Issues

See Hybrid Data Pipeline known issues for details.

 

4.6.1.0417

April 14, 2022

Preface

These release notes provide enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver.

For the ODBC driver, see Driver version string for details on obtaining the driver version number.

Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

Resolved Issues

Issue HDP-5675 The Metadata Exposed Schemas dropdown not loading schemas when using the PostgreSQL driver

When using the PostgreSQL JDBC driver as a third party driver to connect to backend data, the Metadata Exposed Schemas dropdown did not load PostgreSQL schemas.

Issue HDP-5780 Unable to login after upgrading server

After upgrading to server build 4.6.1.357, the introduction of a new keystore prevented successful login.

Issue HDP-5792 Unable to deploy as Docker container using environment variables

Hybrid Data Pipeline deployment failed when using environment variables to deploy the server as a Docker container.

Issue HDP-5811 Resolved Spring Framework vulnerability

Hybrid Data Pipeline has been updated to use Spring Framework version 5.3.18, Spring Boot version 2.6.6, and Spring Security version 5.6.2 to address the vulnerability described in CVE-2022-22965. (Hybrid Data Pipeline server 4.6.1.417, On-Premises Connector 4.6.1.164, JDBC driver 4.6.1.43)

Issue HDP-5813 Resolved Jackson Deserializer vulnerability

Hybrid Data Pipeline has been updated to use version 2.13.2.2 of the Jackson library to address the vulnerability described in CVE-2020-36518. (Hybrid Data Pipeline server 4.6.1.417, On-Premises Connector 4.6.1.164, JDBC driver 4.6.1.43)

Issue HDP-5841 On-Premises Connector unable to connect after upgrade

After upgrading to On-Premises Connector build 4.6.1.120, the On-Premises Connector received an HTTP 401 error from the Hybrid Data Pipeline server when attempting to connect. (On-Premises Connector 4.6.1.164)

Known Issues

See Hybrid Data Pipeline known issues for details.

 

4.6.1.0132

March 22, 2021

Preface

These release notes provide enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver.

For the ODBC driver, see Driver version string for details on obtaining the driver version number.

Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

Enhancements

Changing catalog

Hybrid Data Pipeline supports changing the catalog of data sources. The setCatalog method can be used to change catalogs in JDBC, while the connection attribute SQL_ATTR_CURRENT_CATALOG can be used in ODBC. Support for changing catalogs includes support for changing the default database on an active connection to a SQL Server data source. This support extends to any data source configured with an underlying JDBC connector that supports the setCatalog method. This enhancement is available in the latest build of the Hybrid Data Pipeline server (4.6.1.132). Components such as the Hybrid Data Pipeline ODBC and JDBC drivers, as well as the On-Premises Connector, must be reinstalled to adopt the enhancement (On-Premises Connector version 4.6.1.62, ODBC driver 4.6.1.27, JDBC driver 4.6.1.13).
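
A minimal JDBC sketch of the new behavior follows; the connection URL, credentials, catalog name, and table name are hypothetical.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Hypothetical sketch: change the default database (catalog) on an active
// connection to a SQL Server-backed Hybrid Data Pipeline data source.
public class SetCatalogExample {
    public static void main(String[] args) throws Exception {
        try (Connection con = DriverManager.getConnection(args[0], args[1], args[2])) {
            con.setCatalog("SalesDB");                  // switch the current catalog
            try (Statement stmt = con.createStatement();
                 ResultSet rs = stmt.executeQuery("SELECT COUNT(*) FROM Orders")) {
                if (rs.next()) {
                    System.out.println("Rows: " + rs.getInt(1));
                }
            }
        }
    }
}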

Resolved Issues

Issue HDP-4463 JDBC driver defaulted to service.datadirectcloud.com host name and returned inaccurate error message (JDBC driver 4.6.1.13)

When an incorrect host name was specified in the connection URL, the Hybrid Data Pipeline JDBC driver defaulted to service.datadirectcloud.com as the host name and returned an inaccurate error message.

Issue HDP-4858 ODBC driver not installing on Amazon Linux 2

The ODBC driver was not installing on Amazon Linux 2.

Known Issues

See Hybrid Data Pipeline known issues for details.

 

4.6.1.0325

February 3, 2022

Preface

These release notes provide enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver.

For the ODBC driver, see Driver version string for details on obtaining the driver version number.

Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

Resolved Issues

Issue HDP-5589 Resolved Log4j 2.17 security vulnerability

Hybrid Data Pipeline has been updated to use Log4j version 2.17.1 to address the security vulnerability found in Log4j version 2.17 as described in CVE-2021-44832. For details, refer to CVE-2021-44832. (Hybrid Data Pipeline server 4.6.1.325, On-Premises Connector version 4.6.1.99).

4.6.1.311

Resolved Issues

Issue HDP-5565 Resolved Log4j 2.15 and 2.16 security vulnerabilities

Hybrid Data Pipeline has been updated to use Log4j version 2.17 to address security vulnerabilities found in Log4j versions 2.15 and 2.16 as described in CVE-2021-45046 and CVE-2021-45105. For details, refer to CVE-2021-45046 and CVE-2021-45105. (Hybrid Data Pipeline server 4.6.1.311, On-Premises Connector 4.6.1.91).

Known Issues

See Hybrid Data Pipeline known issues for details.

 

4.6.1.0062

November 19, 2020

Preface

These release notes provide enhancements, changed behavior, and resolved issues. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver.

For the ODBC driver, see Driver version string for details on obtaining the driver version number.

Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

Resolved Issues

Issue HDP-4757 Cannot retrieve data from SQL Server table (On-Premises Connector version 4.6.1.47)

Sometimes when trying to execute SELECT * FROM table against an on-premises SQL Server database using the On-Premises Connector, the ODBC driver returned the error [HY000] [DataDirect][ODBC Hybrid driver][SQLServer]Unexpected content at the end of chunk.

Issue HDP-4574 HTTP error 404 while renaming the connector label (On-Premises Connector version 4.6.1.47)

When the name of the On-Premises Connector host machine was in all uppercase at the time of the installation of the On-Premises Connector, the Connector Label field in the On-Premises Configuration Tool did not populate with the hostname as expected. Then, when attempting to update the Connector Label field with the correct hostname, the On-Premises Configuration Tool returned Error setting connector label for user Request returned Status:404 Message.

Issue HDP-4704 Error while accessing link tables in MS Access application using Hybrid Data Pipeline data source (ODBC driver 4.6.1.12)

When using the Hybrid Data Pipeline ODBC driver to connect to a data source created with a third party JDBC driver, the following error was returned: ODBC--call failed. [DataDirect][ODBC Hybrid driver]Numeric value out of range. Error in column 16. (#0). This error was returned because the third party driver diverged from the JDBC specification when describing the data type of CHAR_OCTET_LENGTH for DatabaseMetaData.getColumns(). The ODBC driver has been modified to work with the third party JDBC driver despite this divergence from the JDBC specification.

Enhancements

SQL statement auditing

Hybrid Data Pipeline has added a SQL statement auditing feature. When SQL statement auditing is enabled, the connectivity service records SQL statements and related metrics in the SQLAudit table on the Hybrid Data Pipeline system database (also referred to as the account database). This information can then be queried directly by administrators. See SQL statement auditing in the user's guide for details.
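
As a simple illustration, an administrator with query access to the system database could inspect recorded statements with a query such as the following; only the SQLAudit table name comes from this note, so SELECT * is used rather than assuming specific column names:

SELECT * FROM SQLAudit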

Tomcat upgrade

The Hybrid Data Pipeline server and On-Premises Connector have been upgraded to install and use Tomcat 9.0.37. (On-Premises Connector version 4.6.1.14)

Known Issues

See Hybrid Data Pipeline known issues for details.

 