Progress DataDirect
Hybrid Data Pipeline

4.6.1 Release Notes

Enhancements and resolved issues are provided according to product version number. The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown. When applicable, a component-specific version number is provided for the On-Premises Connector or a driver. Version numbers for these components are not available through the Web UI. The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab. The version number for the JDBC driver can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory: java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver. For the ODBC driver, see Driver version string for details on obtaining the driver version number.

4.6.1.757

Enhancements

Microsoft Dynamics 365 support

Hybrid Data Pipeline now supports access to a number of Microsoft Dynamics 365 apps. Hybrid Data Pipeline can be used to create and manage data sources for JDBC, ODBC, and OData client application access to these Dynamics 365 apps. OAuth 2.0 connectivity is supported. Refer to Microsoft Dynamics 365 parameters for details. (On-Premises Connector 4.6.1.287)

Docker trial deployment

The generally available Hybrid Data Pipeline Docker image now supports a trial Docker deployment. After you obtain the image from the Progress Enterprise Delivery site (ESD) or the Trial Download page, you may perform a trial deployment of Hybrid Data Pipeline as a Docker container on a single node with an internal system database. Refer to the Hybrid Data Pipeline Trial Docker Deployment Getting Started for details.

Power BI custom connector

A Power BI custom connector is now available from the Progress DataDirect Hybrid Data Pipeline Public GitHub repository. This custom connector may be used to implement connectivity from Power BI to Hybrid Data Pipeline resources that use OAuth 2.0 or OIDC authentication. For details, refer to Configuring a Power BI custom connector for OAuth 2.0 or Configuring a Power BI custom connector for OIDC.

Tomcat upgrade

The Hybrid Data Pipeline server and On-Premises Connector have been upgraded to install and use Tomcat 9.0.63. This addresses the CVE-2022-23181 security vulnerability that was mitigated with the resolution of Issue HDP-5924. (On-Premises Connector version 4.6.1.287)

Changed Behavior

Microsoft Dynamics CRM data store deprecated

The Microsoft Dynamics CRM data store has been deprecated. Connectivity to a number of Dynamics 365 apps is now supported with app-specific Hybrid Data Pipeline data stores. Refer to Microsoft Dynamics 365 parameters for details. (On-Premises Connector 4.6.1.287)

Docker trial image

The Docker trial image has been deprecated. A Docker trial deployment of Hybrid Data Pipeline may now be performed using the generally available Hybrid Data Pipeline Docker image. This image may be obtained from the Progress Enterprise Delivery site (ESD) or the Trial Download page. Refer to the Hybrid Data Pipeline Trial Docker Deployment Getting Started for details.

Resolved Issues

Issue HDP-5854 The ODBC driver not supporting the GUID data type

The ODBC driver did not support the GUID data type. (ODBC driver 4.6.1.67)

Issue HDP-5925 Upgrade the version of Tomcat shipped with Hybrid Data Pipeline server from Tomcat 9.0.54 to 9.0.63

The shipping version of the Tomcat server was upgraded from Tomcat 9.0.54 to 9.0.63. This addresses the CVE-2022-23181 security vulnerability that was mitigated with the resolution of Issue HDP-5924. (On-Premises Connector 4.6.1.287)

Issue HDP-6212 SQL Editor query of datetimeoffset and sql_variant data type columns returns NullPointerException

When using the SQL Editor to query datetimeoffset and sql_variant columns, a NullPointerException was returned.

Issue HDP-6217 Problem with setting HDP_DATABASE_ADVANCED_OPTIONS setting for Docker deployments

When setting HDP_DATABASE_ADVANCED_OPTIONS to use an SSL connection to the external system database, the setting was not propagated correctly.

Issue HDP-6275 Hybrid Data Pipeline server upgrade failed in an environment using FIPS and an external JRE

When performing a Hybrid Data Pipeline server upgrade in an environment using FIPS and an external JRE, the upgrade failed with the error Error in MAIN at line 576.

4.6.1.607

Resolved Issues

Issue HDP-5690 Hybrid Data Pipeline reached open files limit

Idle Google BigQuery connections did not fully close, causing the Hybrid Data Pipeline server to reach the limit on open files.

Issue HDP-5866 On-Premises Connector throwing HTTP 401 error during installation

After upgrading to the On-Premises Connector, the On-Premises Connector received an HTTP 401 error from the Hybrid Data Pipeline server when attempting to connect. (On-Premises Connector 4.6.1.255)

Issue HDP-5938 "Request header is too large" exception with HDP SAML authentication

When attempting to authenticate using SAML, Hybrid Data Pipeline returned the exception "Request header is too large."

Issue HDP-6133 Account lockout not working as expected for ODBC and OData data access

After an account lockout occurred, OData queries were running successfully.

Issue HDP-6152 Not passing user credentials when using third-party connector

When using a third-party connector where the database credentials are not included in the Hybrid Data Pipeline data source, the user is prompted to enter credentials. In this scenario, Hybrid Data Pipeline returned the error message "user name is missing." (JDBC driver 4.6.1.77)

Issue HDP-6154 Unable to use Azure Database for PostgreSQL as an external database

Hybrid Data Pipeline was unable to use Azure Database for PostgreSQL as an external database because Azure Database for PostgreSQL requires a unique user naming convention.

Issue HDP-6178 Worker thread error when connecting to Azure Synapse serverless instance

When attempting to query an Azure Synapse serverless instance, Hybrid Data Pipeline returned a java.io.IOException worker thread error.

4.6.1.558

Enhancements

Support for Google Analytics, Google BigQuery, Salesforce, and REST data store implementations of OAuth 2.0

Hybrid Data Pipeline has been enhanced to support Google Analytics, Google BigQuery, Salesforce, and REST data store implementations of OAuth 2.0. To integrate Hybrid Data Pipeline with an OAuth 2.0 authorization flow, Hybrid Data Pipeline must be registered as a client application with the given data store. Then, OAuth application and profile objects must be created to manage OAuth endpoints, properties, and tokens. For details, refer to Integrating Hybrid Data Pipeline as a client application with a data store OAuth 2.0 authorization flow. (On-Premises Connector 4.6.1.241)

Changed Behavior

Google Analytics OAuth 2.0 implementation

The procedures for integrating Hybrid Data Pipeline as a client application to enable access to Google Analytics include the ability to select or create an OAuth application in the Web UI. For details, refer to Google Analytics parameters.

Resolved Issues

Issue HDP-5804 Selecting a data source in the SQLEditor results in no suitable driver found error

When selecting a data source from the dropdown in the SQL Editor, the server returned the error "No suitable driver found."

Issue HDP-5805 Error on datetime column when using OData to connect to MySQL Community Edition

When performing an insert on an OData-enabled MySQL Community Edition data source, Hybrid Data Pipeline returned an error on a datetime column.

Issue HDP-5836 NPE after receiving a merge request with an empty payload

Performing a mergeEntity operation against an OData-enabled MySQL Community Edition data source resulted in a NullPointerException.

Issue HDP-5881 Unable to configure server-side SSL between HDP nodes

Server-side SSL could not be configured because the enable_ssl.sh script was not properly setting the truststore information from the Web UI.

Issue HDP-5924 Update the context.xml file to disable Session persistence in Tomcat

To mitigate the CVE-2022-23181 security vulnerability, the Tomcat context.xml file has been modified such that session persistence is disabled by default.

4.6.1.417

Resolved Issues

Issue HDP-5675 The Metadata Exposed Schemas dropdown not loading schemas when using the PostgreSQL driver

When using the PostgreSQL JDBC driver as a third party driver to connect to backend data, the Metadata Exposed Schemas dropdown did not load PostgreSQL schemas.

Issue HDP-5780 Unable to login after upgrading server

After upgrading to server build 4.6.1.357, the introduction of a new keystore prevented successful login.

Issue HDP-5792 Unable to deploy as Docker container using environment variables

Hybrid Data Pipeline deployment failed when using environment variables to deploy the server as a Docker container.

Issue HDP-5811 Resolved Spring Framework vulnerability

Hybrid Data Pipeline has been updated to use Spring Framework version 5.3.18, Spring Boot version 2.6.6, and Spring Security version 5.6.2 to address the vulnerability described in CVE-2022-22965. (Hybrid Data Pipeline server 4.6.1.417, On-Premises Connector 4.6.1.164, JDBC driver 4.6.1.43)

Issue HDP-5813 Resolved Jackson Deserializer vulnerability

Hybrid Data Pipeline has been updated to use version 2.13.2.2 of the Jackson library to address the vulnerability described in CVE-2020-36518. (Hybrid Data Pipeline server 4.6.1.417, On-Premises Connector 4.6.1.164, JDBC driver 4.6.1.43)

Issue HDP-5841 On-Premises Connector unable to connect after upgrade

After upgrading to On-Premises Connector build 4.6.1.120, the On-Premises Connector received an HTTP 401 error from the Hybrid Data Pipeline server when attempting to connect. (On-Premises Connector 4.6.1.164)

4.6.1.372

Enhancements

PostgreSQL 14

Hybrid Data Pipeline now supports connectivity to PostgreSQL 14 databases. PostgreSQL 14 can also be used as a system database to store account and configuration information for a Hybrid Data Pipeline instance. This functionality is supported in the following component versions.

  • Hybrid Data Pipeline Server 4.6.1.372 and higher
  • On-Premises Connector 4.6.1.120 and higher

Changed Behavior

PostgreSQL call escape behavior

The default behavior for handling PostgreSQL call escape syntax has changed. Previously, Hybrid Data Pipeline only supported stored functions and treated the non-standard escape syntax {call function()} the same as the standard escape syntax {? = call function()}. With this latest patch, Hybrid Data Pipeline supports both stored functions and stored procedures, and determines which is being called based on the call escape syntax. If the return value parameter (? =) is used, the connectivity service calls a stored function. If the return value parameter is not used, the connectivity service calls a stored procedure. You can change this default behavior by setting the CallEscapeBehavior option as an extended option under the Advanced tab. These are the valid values for the CallEscapeBehavior option:

  • If set to select, treats the object being called by both the {call...} syntax and the {? = call...} syntax as a stored function and makes the applicable native call to the PostgreSQL database.
  • If set to call, treats the object being called by both the {call...} syntax and the {? = call...} syntax as a stored procedure and makes the applicable native call to the PostgreSQL database.
  • If set to callIfNoReturn (the default), the service determines whether to call a function or stored procedure based on the call escape syntax. If the return value parameter (? =) is used, the service calls a function. If not, the service calls a stored procedure.
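The default callIfNoReturn rule can be shown with a minimal sketch (illustration only, not product code): the presence of the return value parameter in the escape syntax is what marks a stored function call.

```java
public class CallEscapeSketch {
    // Mirrors the callIfNoReturn default: the return value parameter "? ="
    // at the start of the escape marks a stored function; its absence marks
    // a stored procedure.
    public static boolean isFunctionCall(String escape) {
        String s = escape.trim().toLowerCase();
        s = s.startsWith("{") ? s.substring(1).trim() : s;
        return s.startsWith("?");
    }

    public static void main(String[] args) {
        System.out.println(isFunctionCall("{? = call get_total()}")); // true  -> stored function
        System.out.println(isFunctionCall("{call refresh_cache()}")); // false -> stored procedure
    }
}
```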

Resolved Issues

Issue HDP-5459 OData $expand query fails against OpenEdge data source

When using the OData $expand functionality to query an OpenEdge data source, the query failed and an error was returned.

Issue HDP-5605 SQL Editor not displaying values when two columns had the same name

When a SQL query included columns of the same name, the SQL Editor did not display the column values.

Issue HDP-5642 SQL Editor not displaying results

The SQL Editor did not display results as expected.

4.6.1.357

Enhancements

Server-side SSL

Hybrid Data Pipeline now supports server-side SSL. Server-side SSL allows you to enable SSL behind the load balancer and secure communication between the load balancer and server nodes, as well as Hybrid Data Pipeline nodes in a cluster deployment. This functionality is supported in the following component versions.

  • Hybrid Data Pipeline Server 4.6.1.357 and higher
  • JDBC Driver 4.6.1.32 and higher
  • ODBC Driver 4.6.1.34 and higher

Note:

  • Updating On-Premises Connectors is not required to configure server-side SSL.
  • For details on server-side SSL, refer to SSL configuration.

curl Library update (ODBC driver 4.6.1.34)

The curl library files used with the ODBC driver have been upgraded to version 7.80.0.

OpenSSL library update (ODBC driver 4.6.1.34)

The default version of the OpenSSL library used with the ODBC driver has been upgraded to version 1.1.1l.

Resolved Issues

Issue HDP-5587 SYNONYMS not displayed in the Web UI

The SQL Editor was not displaying SYNONYM objects.

Issue HDP-5611 "Unexpected end of stream" error returned

When queries demanded the return of multiple large result sets, the query failed and the error "Unexpected end of stream" was returned.

4.6.1.325

Resolved Issues

Issue HDP-5589 Resolved Log4j 2.17 security vulnerability

Hybrid Data Pipeline has been updated to use Log4j version 2.17.1 to address the security vulnerability found in Log4j version 2.17, as described in CVE-2021-44832. (Hybrid Data Pipeline server 4.6.1.325, On-Premises Connector 4.6.1.99)

4.6.1.311

Resolved Issues

Issue HDP-5565 Resolved Log4j 2.15 and 2.16 security vulnerabilities

Hybrid Data Pipeline has been updated to use Log4j version 2.17 to address security vulnerabilities found in Log4j versions 2.15 and 2.16, as described in CVE-2021-45046 and CVE-2021-45105. (Hybrid Data Pipeline server 4.6.1.311, On-Premises Connector 4.6.1.91)

4.6.1.306

Resolved Issues

Issue HDP-5560 Resolved Log4j security vulnerability 

Hybrid Data Pipeline has been updated to use Log4j version 2.15 to address the security vulnerability found in Log4j version 2.13.3, as described in CVE-2021-44228. (Hybrid Data Pipeline server 4.6.1.306, On-Premises Connector 4.6.1.85)

4.6.1.296

Enhancements

Docker image

A production instance of the Hybrid Data Pipeline server can now be deployed using a Docker image. The Hybrid Data Pipeline Docker image is available in the Hybrid Data Pipeline Docker Deployment Package. In addition, the Docker Deployment Package includes demos for a number of deployment scenarios. For details and instructions, see Deploying Hybrid Data Pipeline using Docker in the installation guide.

OpenID Connect (OIDC) support

Hybrid Data Pipeline now supports user authentication using the OIDC protocol. An identity provider and client applications can be configured to authorize users and grant access to the OData endpoints of the Hybrid Data Pipeline server. See Integrating an OIDC authentication service in the user's guide for details.

Resolved Issues

Issue HDP-5395 Third-party JDBC Oracle driver integration does not return tables

When using the third-party JDBC Oracle driver, the Hybrid Data Pipeline SQL Editor did not return tables.

Issue HDP-5433 Unable to authenticate when special character '+' (plus sign) in account password

When the special character '+' (plus sign) was used in an account password, the user was unable to authenticate with the Hybrid Data Pipeline server.

Issue HDP-5461 Unable to access Oracle Cloud Financials

Hybrid Data Pipeline was unable to access Oracle Cloud Financials REST Endpoints with the Autonomous REST Connector.

4.6.1.256

Enhancements

Stored procedures

Stored procedures functionality now includes support for input parameters, output parameters, and in/out parameters. Stored procedures that return multiple results are also supported. This functionality is supported in the following component versions.

  • Hybrid Data Pipeline Server 4.6.1.256 and higher
  • On-Premises Connector 4.6.1.73 and higher
  • JDBC Driver 4.6.1.23 and higher
  • ODBC Driver 4.6.1.31 and higher

Resolved Issues

Issue HDP-5020 Error message did not state reason that the required secure random instance could not be created when enabling FIPS

When attempting to enable FIPS, the error message did not state that the required secure random instance could not be created because there was not enough entropy on the host machine.

Issue HDP-5064 JDBC driver not able to follow redirects (JDBC driver 4.6.1.23)

When an HTTP redirect status was returned, the driver was unable to follow the redirection and returned an error that the HTTP endpoint had been relocated. To resolve this issue, the FollowRedirects connection property has been introduced. When FollowRedirects is enabled, the driver can follow an HTTP redirection instead of returning an error. For details, refer to FollowRedirects.
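As a sketch of how a client might enable the new property, the example below sets FollowRedirects alongside the connection credentials. The connection URL format shown is an assumption for illustration; consult the driver documentation for the exact syntax.

```java
import java.util.Properties;

public class FollowRedirectsSketch {
    // Builds JDBC connection properties with redirect-following enabled.
    public static Properties connectionProps(String user, String password) {
        Properties props = new Properties();
        props.setProperty("user", user);
        props.setProperty("password", password);
        // Let the driver follow an HTTP redirection instead of returning an error.
        props.setProperty("FollowRedirects", "true");
        return props;
    }

    public static void main(String[] args) {
        // Hypothetical URL format, for illustration only.
        String url = "jdbc:datadirect:ddhybrid://myserver.example.com:8080";
        Properties props = connectionProps("hdpuser", "secret");
        System.out.println(url + " FollowRedirects=" + props.getProperty("FollowRedirects"));
        // A real application would then call DriverManager.getConnection(url, props).
    }
}
```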

Issue HDP-5412 "Unexpected end of stream in statement" error returned

When processing an empty result reply to a query execution request against a ServiceNow REST service, Hybrid Data Pipeline returned an "Unexpected end of stream in statement" error.

4.6.1.233

Enhancements

SSO/SAML support

Hybrid Data Pipeline now supports user authentication using the SSO/SAML protocol. Customers can configure SAML authentication by providing the details of an identity provider and can configure users to use the SAML authentication.

Resolved Issues

Issue HDP-4549 HDP server unreachable due to OS file handle leak

When the "FileNotFoundException (Too many open files)" error occurred, the Hybrid Data Pipeline connection was lost and the server had to be restarted.

Issue HDP-5202 Error returned when fetching MySQL zero values for date and datetime columns

When fetching invalid date and datetime values from columns or literals, such as SELECT DATE(0), against MySQL data sources, the Hybrid Data Pipeline server returned an error.

Issue HDP-5210 OData v4 Endpoint not compatible with Tableau Desktop

Tableau was unable to connect to OData v4 endpoints exposed by Hybrid Data Pipeline.

Issue HDP-5217 Some special characters not allowed in passwords

Users were unable to use special characters for Hybrid Data Pipeline passwords.

Issue HDP-5266 Load balancer not returning OData responses from the server

When HTTP was disabled on the load balancer and the X-Forwarded-Proto header was configured to manage HTTP and HTTPS traffic, the load balancer did not return OData responses to the client application as expected.

4.6.1.169

Enhancements

Google BigQuery support

Hybrid Data Pipeline now supports access to Google BigQuery. Hybrid Data Pipeline can be used to create and manage data sources for JDBC, ODBC, and OData client application access to Google BigQuery. OAuth 2.0 and Service Account authentication methods are supported.

4.6.1.138

Resolved Issues

Issue HDP-4923 Performance issue querying OpenEdge database

To improve performance, the OData startswith() function now translates to a SQL statement using LIKE instead of LOCATE, which executes in less time.

4.6.1.132

Enhancements

Changing catalog

Hybrid Data Pipeline supports changing the catalog of data sources. The setCatalog method can be used to change catalogs in JDBC, while the connection attribute SQL_ATTR_CURRENT_CATALOG can be used in ODBC. Support for changing catalogs includes support for changing the default database on an active connection to a SQL Server data source. This support extends to any data source configured with an underlying JDBC connector that supports the setCatalog method. This enhancement is available in the latest build of the Hybrid Data Pipeline server (4.6.1.132). Components such as the Hybrid Data Pipeline ODBC and JDBC drivers, as well as the On-Premises Connector must be reinstalled to adopt the enhancement (On-Premises Connector version 4.6.1.62, ODBC driver 4.6.1.27, JDBC driver 4.6.1.13).
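As a sketch of the JDBC side of this enhancement, the example below exercises setCatalog against a stand-in connection (a dynamic proxy), since a live data source is not assumed here; against a real Hybrid Data Pipeline connection, the same call changes the default database on the active connection.

```java
import java.lang.reflect.Proxy;
import java.sql.Connection;
import java.sql.SQLException;

public class SetCatalogSketch {
    // A stand-in Connection that only records the catalog, so the sketch
    // runs without a live data source.
    public static Connection fakeConnection() {
        final String[] catalog = {"master"};
        return (Connection) Proxy.newProxyInstance(
                Connection.class.getClassLoader(),
                new Class<?>[]{Connection.class},
                (proxy, method, margs) -> {
                    if (method.getName().equals("setCatalog")) { catalog[0] = (String) margs[0]; return null; }
                    if (method.getName().equals("getCatalog")) return catalog[0];
                    return null;
                });
    }

    public static String switchedCatalog(String name) {
        try {
            Connection conn = fakeConnection();
            conn.setCatalog(name);          // the JDBC call used to change catalogs
            return conn.getCatalog();
        } catch (SQLException e) {
            throw new IllegalStateException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(switchedCatalog("SalesDB")); // prints SalesDB
    }
}
```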

Resolved Issues

Issue HDP-4463 JDBC driver defaulted to service.datadirectcloud.com host name and returned inaccurate error message (JDBC driver 4.6.1.13)

When an incorrect host name was specified in the connection URL, the Hybrid Data Pipeline JDBC driver defaulted to service.datadirectcloud.com as the host name and returned an inaccurate error message.

Issue HDP-4858 ODBC driver not installing on Amazon Linux 2

The ODBC driver was not installing on Amazon Linux 2.

4.6.1.107

Enhancements

OData query throttling for users

Hybrid Data Pipeline supports throttling the number of simultaneous OData queries a user may have running against a Hybrid Data Pipeline server at one time. OData query throttling for users may be configured with the ODataMaxConcurrentRequests and ODataMaxWaitingRequests limits. The ODataMaxConcurrentRequests limit sets the maximum number of simultaneous OData requests allowed per user, while the ODataMaxWaitingRequests limit sets the maximum number of waiting OData requests allowed per user. See Throttling in the user's guide for details.
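How the two limits interact can be sketched conceptually. This is an illustration of the throttling semantics only, not the server's implementation: requests run up to the concurrency limit, queue up to the waiting limit, and are rejected beyond both.

```java
import java.util.concurrent.Semaphore;

public class ODataThrottleSketch {
    private final Semaphore running;  // plays the role of ODataMaxConcurrentRequests
    private final Semaphore waiting;  // plays the role of ODataMaxWaitingRequests

    public ODataThrottleSketch(int maxConcurrent, int maxWaiting) {
        running = new Semaphore(maxConcurrent);
        waiting = new Semaphore(maxWaiting);
    }

    /** Returns true if the request is admitted; false if it must be rejected. */
    public boolean tryAdmit() {
        if (running.tryAcquire()) return true;      // a concurrency slot is free
        if (!waiting.tryAcquire()) return false;    // waiting queue is also full
        try {
            running.acquireUninterruptibly();       // wait for a slot to free up
            return true;
        } finally {
            waiting.release();
        }
    }

    public void release() { running.release(); }

    public static void main(String[] args) {
        ODataThrottleSketch throttle = new ODataThrottleSketch(2, 1);
        System.out.println(throttle.tryAdmit()); // true: 1st request runs
        System.out.println(throttle.tryAdmit()); // true: 2nd request runs
        // With both slots busy, a 3rd request would wait and a 4th would be rejected.
    }
}
```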

Environment variables support for silent installation

Support for environment variables to specify server and system database credentials during the installation process has been added. The use of environment variables allows you to perform a more secure silent installation, compared to a standard silent installation where credential information must be specified in plain text in the silent installation response file. See Silent installation process in the user's guide for details.

Resolved Issues

Issue HDP-4853 Installation failed when special characters were used in system database credentials

When installing the Hybrid Data Pipeline server using SQL Server as the system database, the use of special characters in admin or user account credentials caused the installation to fail with the error Error in createSchema at Line 266.

NOTE: While installation no longer fails when special characters are used in system database account credentials, the installer cannot currently validate the necessary database schema objects when any of the following special characters are used in either database user ID or password values: space ( ), quotation mark ("), number sign (#), dollar sign ($), and apostrophe ('). Therefore, in a standard installation where these characters are used in database credentials, database validation must be skipped to proceed with the installation. Similarly, when performing a silent installation in this case, the SKIP_DATABASE_VALIDATION property should be set to true. Note that when skipping database validation in this scenario, the server should install successfully and work with the specified system database.

Issue HDP-4854 Silent installation process required the specification of system database admin and user passwords in clear text in the response file

The specification of system database admin and user passwords in plain text in the response file as part of the silent installation process raised security concerns. Support for environment variables to specify server and system database credentials during the installation process has been added. See Silent installation process in the user's guide for details.

Issue HDP-4859 Firefox, Chrome, and Microsoft Edge browsers not rendering Web UI correctly for load balancer installation

When the HAProxy load balancer was configured with the setting x-content-type-options:nosniff, Firefox, Chrome, and Microsoft Edge browsers rendered the Web UI as text instead of HTML.

4.6.1.92

Enhancements

Aggregation support

Hybrid Data Pipeline has added support for a subset of the functionality defined by the OData Version 4 extension for data aggregation. Aggregation functionality is extended with the $apply query parameter. See Aggregation support in the user's guide for details.
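An OData Version 4 aggregation request uses the $apply parameter with transformations such as groupby and aggregate. The sketch below builds such a request URL; the endpoint path, entity set, and column names are hypothetical and shown for illustration only.

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public class ODataApplySketch {
    // Appends a URL-encoded $apply expression to an OData entity set URL.
    public static String buildApplyUrl(String base, String applyExpr) {
        return base + "?$apply=" + URLEncoder.encode(applyExpr, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        // Hypothetical endpoint and columns: total the Amount column per Category.
        String url = buildApplyUrl(
                "https://hdp.example.com:8443/api/odata4/SalesDS/Orders",
                "groupby((Category),aggregate(Amount with sum as Total))");
        System.out.println(url);
    }
}
```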

Windows Server 2019

The Hybrid Data Pipeline On-Premises Connector, ODBC driver, and JDBC driver have added support for Windows Server 2019. (On-Premises Connector version 4.6.1.48, ODBC driver 4.6.1.12, JDBC driver 4.6.1.9)

Resolved Issues

Issue HDP-4478 Unable to connect using TNS connection option for Oracle

The Hybrid Data Pipeline server required the specification of the Server Name parameter, even though Server Name is not required for a TNS connection. In addition, when Server Name was specified, the server returned an inaccurate error message.

4.6.1.62

Resolved Issues

Issue HDP-4757 Cannot retrieve data from SQL Server table (On-Premises Connector version 4.6.1.47)

When executing SELECT * FROM table against an on-premises SQL Server database using the On-Premises Connector, the ODBC driver sometimes returned the error [HY000] [DataDirect][ODBC Hybrid driver][SQLServer]Unexpected content at the end of chunk.

Issue HDP-4574 HTTP error 404 while renaming the connector label (On-Premises Connector version 4.6.1.47)

When the name of the On-Premises Connector host machine was in all uppercase at the time of the installation of the On-Premises Connector, the Connector Label field in the On-Premises Configuration Tool did not populate with the hostname as expected. Then, when attempting to update the Connector Label field with the correct hostname, the On-Premises Configuration Tool returned the error Error setting connector label for user Request returned Status:404 Message.

Issue HDP-4704 Error while accessing link tables in MS Access application using Hybrid Data Pipeline data source (ODBC driver 4.6.1.12)

When using the Hybrid Data Pipeline ODBC driver to connect to a data source created with a third party JDBC driver, the following error was returned: ODBC--call failed. [DataDirect][ODBC Hybrid driver]Numeric value out of range. Error in column 16. (#0). This error was returned because the third party driver diverged from the JDBC specification when describing the data type of CHAR_OCTET_LENGTH for DatabaseMetaData.getColumns(). The ODBC driver has been modified to work with the third party JDBC driver despite this divergence from the JDBC specification.

Enhancements

SQL statement auditing

Hybrid Data Pipeline has added a SQL statement auditing feature. When SQL statement auditing is enabled, the connectivity service records SQL statements and related metrics in the SQLAudit table on the Hybrid Data Pipeline system database (also referred to as the account database). This information can then be queried directly by administrators. See SQL statement auditing in the user's guide for details.

Tomcat upgrade

The Hybrid Data Pipeline server and On-Premises Connector have been upgraded to install and use Tomcat 9.0.37. (On-Premises Connector version 4.6.1.14)

4.6.1.32

Resolved Issues

Issue HDP-4534 Unable to connect to Google Analytics data source

When connecting to Google Analytics data source using an OAuth profile created in a previous version of Hybrid Data Pipeline, the following error was returned: There is a problem connecting to the data source. Error getting data source. System not available, try again later.

4.6.1.23

Resolved Issues

Issue HDP-4490 ODataApplicationException returned when filtering on BIT/BOOLEAN field

When exposing a Microsoft SQL Server table via OData Version 4 and filtering on a BIT/BOOLEAN field, Hybrid Data Pipeline returned the ODataApplicationException An expression of non-boolean type specified in a context where a condition is expected, near ')'.

Issue HDP-4480 Shutdown script not working

With some shell configurations, the Hybrid Data Pipeline shutdown script stop.sh was not shutting down Hybrid Data Pipeline server processes.

4.6.1.20

Resolved Issues

Issue HDP-4465 Parenthesis in OData query not honored

When translating an OData Version 4 query to a SQL query, Hybrid Data Pipeline did not honor the parentheses in the OData query. This reordered operator precedence and led to incorrect results. Parentheses in OData queries are now reproduced in SQL queries to maintain operator precedence.
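The precedence problem can be reproduced with plain boolean logic: the same operands give different results once the parentheses are dropped, because AND binds tighter than OR.

```java
public class PrecedenceSketch {
    // $filter=(A or B) and C — with parentheses preserved in the translated SQL
    public static boolean withParens(boolean a, boolean b, boolean c) { return (a || b) && c; }

    // The same operands with the parentheses dropped: AND now binds first
    public static boolean withoutParens(boolean a, boolean b, boolean c) { return a || b && c; }

    public static void main(String[] args) {
        System.out.println(withParens(true, false, false));    // false
        System.out.println(withoutParens(true, false, false)); // true
    }
}
```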

Issue HDP-4464 Intermittent error "origin=driver message=PAYLOAD_TYPE_BINARY != opcode, was 10" (On-Premises Connector version 4.6.1.8)

When connecting to an on-premises data source using the On-Premises Connector, the error origin=driver message=PAYLOAD_TYPE_BINARY != opcode, was 10 was returned. The On-Premises Connector now correctly handles pongs sent from load balancers according to WebSocket protocol.

4.6.1.14 (GA)

Enhancements

Entity case conversion feature

Hybrid Data Pipeline has been enhanced to support case conversions for entity types, entity sets, and properties. The owner of a data source can now change the entity type, entity set, and property names to all uppercase or all lowercase on the OData tab in the Web UI or using the Hybrid Data Pipeline Management API.

Web UI data source sharing feature

The Web UI has been enhanced to support data source sharing. The owner of a data source can now share access to a data store with Hybrid Data Pipeline users and tenants through the Data Sources view in the Web UI.

Web UI IP address whitelist feature

The Web UI has been enhanced to fully support the IP address whitelist feature. Administrators can secure access to Hybrid Data Pipeline resources by implementing IP address whitelists through the Web UI. The Web UI can be used to create IP address whitelists at the system level, tenant level, user level, or some combination of these levels.

Web UI navigation bar

The navigation bar can be expanded to show the names of the views supported in the Web UI. The icons in the navigation bar have been reordered and updated.

PostgreSQL OData Version 4 stored functions

Hybrid Data Pipeline supports exposing stored functions for OData Version 4 connectivity to PostgreSQL data sources. When configuring a PostgreSQL data source, the OData schema map can be configured to expose stored functions.

JDBC and ODBC throttling

A new throttling limit has been introduced in the System Limits view. The XdbcMaxResponse limit can be used to set the approximate maximum size of JDBC and ODBC HTTP result data.

ODBC driver branded installation

The ODBC driver installation program has been enhanced to support branded installations for OEM customers (available in the ODBC driver installer on November 18, 2019). The branded driver can then be distributed with OEM customer client applications. For the Hybrid Data Pipeline ODBC driver distribution guide, visit the Progress DataDirect Product Books page on the Progress PartnerLink website (login required).

Tomcat upgrade

The Hybrid Data Pipeline server and On-Premises Connector have been upgraded to install and use Tomcat 9.0.20. (On-Premises Connector version 4.6.1.7)

Known Issues

See Hybrid Data Pipeline known issues for details.
