Progress DataDirect
Hybrid Data Pipeline

Release Notes

Enhancements, changed behavior, and resolved issues are provided according to the product version number. When applicable, a component-specific version number is provided for the On-Premises Connector, ODBC driver, or JDBC driver.

The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown.

For On-Premises Connectors, administrators can obtain version information using the Administrator Connectors API. See Obtaining information about On-Premises Connectors for details. In addition, version information can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab.

The JDBC driver version number can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory:

java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver

For the ODBC driver, see Driver version string for details on obtaining the driver version number.

Note: For the latest data source and platform support information, refer to the Product Compatibility Guide.

Latest Release

4.6.1.2529

April 18, 2024

Enhancements

Custom password policy

Hybrid Data Pipeline now supports the creation of a custom password policy. Administrators may set an expiration date for passwords and configure the minimum and maximum number of characters allowed in a password. A custom policy may also be configured to require upper case letters, lower case letters, numbers, and special characters. See Password policy for details.
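The rules such a policy can enforce can be sketched as a client-side check. The policy object below is an illustrative assumption mirroring the configurable options described above; its field names and thresholds are not the actual Hybrid Data Pipeline API schema.

```python
import string

# Hypothetical policy object; field names and values are illustrative
# assumptions, not the actual Hybrid Data Pipeline policy schema.
policy = {
    "minLength": 8,
    "maxLength": 64,
    "requireUppercase": True,
    "requireLowercase": True,
    "requireNumbers": True,
    "requireSpecialCharacters": True,
}

def check_password(password: str, policy: dict) -> list:
    """Return a list of policy violations; an empty list means the password passes."""
    violations = []
    if len(password) < policy["minLength"]:
        violations.append("too short")
    if len(password) > policy["maxLength"]:
        violations.append("too long")
    if policy["requireUppercase"] and not any(c.isupper() for c in password):
        violations.append("missing upper case letter")
    if policy["requireLowercase"] and not any(c.islower() for c in password):
        violations.append("missing lower case letter")
    if policy["requireNumbers"] and not any(c.isdigit() for c in password):
        violations.append("missing number")
    if policy["requireSpecialCharacters"] and not any(c in string.punctuation for c in password):
        violations.append("missing special character")
    return violations
```

For example, check_password("Secr3t!pass", policy) returns an empty list, while a password with no digits reports "missing number".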

Collect information about On-Premises Connectors

The new Administrator Connectors API allows administrators to retrieve information about On-Premises Connectors registered with Hybrid Data Pipeline. Administrators may obtain a full list of On-Premises Connectors with this API. They may also use it to filter the list by properties such as version number, owner, and tenant. See Obtaining information about On-Premises Connectors for details.
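As a sketch of the kind of client-side filtering this enables, the snippet below parses a sample payload and selects connectors by tenant. The payload shape and field names ("connectors", "label", "version", "owner", "tenant") are assumptions for illustration, not the documented response format of the Administrator Connectors API.

```python
import json

# Hypothetical response payload; the schema is an illustrative assumption,
# not the documented Administrator Connectors API response format.
sample_response = json.dumps({
    "connectors": [
        {"label": "opc-east", "version": "4.6.1.1001", "owner": "alice", "tenant": "Sales"},
        {"label": "opc-west", "version": "4.6.1.2529", "owner": "bob", "tenant": "Sales"},
        {"label": "opc-dev", "version": "4.6.1.2529", "owner": "carol", "tenant": "Engineering"},
    ]
})

def connectors_for_tenant(payload: str, tenant: str) -> list:
    """Parse a response payload and return the connector labels for one tenant."""
    connectors = json.loads(payload)["connectors"]
    return [c["label"] for c in connectors if c["tenant"] == tenant]
```

With the sample payload above, connectors_for_tenant(sample_response, "Sales") yields the labels "opc-east" and "opc-west".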

Download data source logs using the Web UI or the Hybrid Data Pipeline API

Hybrid Data Pipeline provides data source logging to record user activity against data sources. Data source logs may now be obtained from the Data Sources view in the Web UI or with the data source logs endpoint. In addition, data source logs may be retrieved by running the getdslogs.sh script on each node in the deployment. See Obtaining the logs for a data source for details.

Download system logs using the Web UI or the Hybrid Data Pipeline API

Hybrid Data Pipeline generates a number of log files to record events, activity, and other information. System logs may now be obtained through the System Configurations view in the Web UI or via the Nodes API. In addition, system logs may be retrieved by running the getlogs.sh script on each node in the deployment. See System logs for details.

Changed Behavior

MySQL Community Edition

For connectivity to MySQL CE, the MySQL CE Connector/J jar must be supplied during the deployment of Hybrid Data Pipeline. With this release, version 8.0 of the MySQL CE Connector/J jar has been certified with Hybrid Data Pipeline. For the latest data source and platform support information, refer to the Product Compatibility Guide.

Resolved Issues

Issue HDP-8498 MaxFetchRows limit not specifying the maximum rows allowed to be fetched

After the MaxFetchRows limit was set, Hybrid Data Pipeline ignored the limit. In addition, the SQL Editor returned incorrect results.

Issue HDP-8677 Calls to api/mgmt/datastores endpoint returning invalid JSON

When querying the api/mgmt/datastores endpoint, Hybrid Data Pipeline returned invalid JSON in the response payload.
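A regression like this can be caught client-side with a simple parse check. The helper below is a generic sketch; the sample payload key ("dataStores") is an illustrative assumption, not the documented response schema of the api/mgmt/datastores endpoint.

```python
import json

def is_valid_json(payload: str) -> bool:
    """Return True if the payload parses as JSON; a quick client-side check
    for the kind of malformed response described in HDP-8677."""
    try:
        json.loads(payload)
        return True
    except ValueError:
        return False
```

For example, a well-formed payload such as '{"dataStores": []}' passes, while a truncated one such as '{"dataStores": [' does not.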

Issue HDP-8683 UserMeter table RemoteAddress field contains the Hybrid Data Pipeline server IP address instead of the client machine IP address for OData queries

When querying the UserMeter table for information about an OData query, the RemoteAddress field contained the Hybrid Data Pipeline server IP address instead of the IP address of the client machine.

Issue HDP-8690 Error 'Value must be a valid URL' returned when registering a SAML authentication service

When registering a SAML authentication service using Azure as the Identity Provider, Hybrid Data Pipeline returned the error "Value must be a valid URL" even though the IDP entity ID was valid.

Issue HDP-8710 Special characters not supported for external system database user passwords

When a special character was used for the user password of a MySQL system database, the Hybrid Data Pipeline server installation failed.

Issue HDP-9079 OAuth2 not working against the Salesforce test instance

When attempting to connect to a Salesforce test instance using OAuth, Hybrid Data Pipeline returned the error "There is a problem connecting to the DataSource. REST Status 404 Not Found returned for GET https://login.salesforce.com/services/oauth2/userinfo."

Known Issues

See Hybrid Data Pipeline known issues for details.
