Enhancements and resolved issues are provided according to product version number. The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown. When applicable, a component-specific version number is provided for the On-Premises Connector or a driver. Version numbers for these components are not available through the Web UI. The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab. The version number for the JDBC driver can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory: java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver. For the ODBC driver, see Driver version string for details on obtaining the driver version number.
Hybrid Data Pipeline now supports branding of its Web UI. The default branding elements, such as the logo, colors, naming, and icons, can be configured before or after installation. Refer to Branding the Web UI for details.
The Autonomous REST Composer is now available on the Configure Endpoints tab from the Autonomous REST Connector data store interface. The Composer allows you to create a REST data source and configure or import a REST Model file using the Web UI. Refer to Creating REST data sources with the Web UI for details.
Hybrid Data Pipeline has been updated to install and use Tomcat 9.0.73. In addition, the following Hybrid Data Pipeline Tomcat configurations have been made to improve security.
The Tomcat shutdown port is now set to -1, which disables the port.
Oracle 19c has been certified to operate as a Hybrid Data Pipeline system database.
The Microsoft Dynamics 365 data store supports a new connection option, Cross Company, that allows access to cross-company data for users who have access to multiple companies. Refer to Microsoft Dynamics 365 parameters for details.
When querying a SQL Server data source, the JDBC driver returned the "unexpected end of stream reached" error. (JDBC driver 4.6.1.247)
The shipping version of the Tomcat server was upgraded from Tomcat 9.0.65 to 9.0.73 to address the vulnerability described in CVE-2023-24998. (Hybrid Data Pipeline server 4.6.1.1391, On-Premises Connector 4.6.1.524)
After configuring the Hybrid Data Pipeline server to use an external JRE and run in FIPS mode, server-side SSL could not be enabled. (Hybrid Data Pipeline server 4.6.1.1391)
The curl library files that are installed with the ODBC driver have been upgraded to version 7.88.1, which fixes a number of potential security vulnerabilities. For more information on the vulnerabilities resolved by this enhancement, refer to Vulnerabilities in the curl documentation. (ODBC driver 4.6.1.158)
The default version of the OpenSSL library has been upgraded to version 1.1.1t, which fixes a number of potential security vulnerabilities. For more information on the vulnerabilities resolved by this enhancement, refer to Vulnerabilities: Fixed in OpenSSL 1.1.1 in OpenSSL News. (ODBC driver 4.6.1.158)
When upgrading the Hybrid Data Pipeline server to enable FIPS, the installation failed and the installer returned an account database error.
The JDBC driver was allowing statements to be executed after a connection was terminated, resulting in an "Invalid session token" error.
On a JDBC data source configured for OAuth and created with the DataDirect Snowflake JDBC driver, the user was prompted for a user ID and password when attempting to test connect.
When fetching data from an OData-enabled Oracle database, Hybrid Data Pipeline returned Date and Time values only in UTC.
When using the SQL Editor to query a SQL Server data source, the SQL Editor was unable to browse tables, views, and procedures under any schema name that included a dot.
When deploying the server as a Docker container, using the HDP_DATABASE_ADVANCED_OPTIONS option to enable SSL (HDP_DATABASE_ADVANCED_OPTIONS=EncryptionMethod=SSL) failed to enable SSL against the system database.
The Microsoft Dynamics CRM data store was recently deprecated, and has now been removed from the product package. Connectivity to a number of Dynamics 365 apps, including CRM and ERP apps, is supported with app-specific Hybrid Data Pipeline data stores. Refer to Microsoft Dynamics 365 parameters for details.
The Rollbase data store has been removed from the product package. If you would like to reintroduce the Rollbase data store, contact Technical Support.
The SugarCRM data store has been removed from the product package. If you would like to reintroduce the SugarCRM data store, contact Technical Support.
The Hybrid Data Pipeline product and its connectors used a version of HyperSQL Database that was vulnerable to the remote code execution issue described in CVE-2022-41853. All impacted components have been patched to fix this vulnerability. For details on the impacted components and fixed versions, refer to the following KB article:
Note: In addition to updating the Hybrid Data Pipeline server, if any On-Premises Connectors are used in your environment, they should be updated with build 4.6.1.395 of the On-Premises Connector.
After an initial connection to Microsoft Dynamics 365 using the OAuth 2.0 client credentials grant, the Authorization URI field automatically populated with the default value when the data source was reopened. The value in the Authorization URI field had to be manually cleared to reconnect with Microsoft Dynamics 365.
Hybrid Data Pipeline was unable to connect to an Azure Synapse serverless database via a SQL Server data source.
Hybrid Data Pipeline now supports access to a number of Microsoft Dynamics 365 apps. Hybrid Data Pipeline can be used to create and manage data sources for JDBC, ODBC, and OData client application access to these Dynamics 365 apps. OAuth 2.0 connectivity is supported. Refer to Microsoft Dynamics 365 parameters for details. (On-Premises Connector 4.6.1.287)
The generally available Hybrid Data Pipeline Docker image now supports a trial Docker deployment. After you obtain the image from the Progress Enterprise Delivery site (ESD) or the Trial Download page, you may perform a trial deployment of Hybrid Data Pipeline as a Docker container on a single node with an internal system database. Refer to the Hybrid Data Pipeline Trial Docker Deployment Getting Started for details.
A Power BI custom connector is now available from the Progress DataDirect Hybrid Data Pipeline Public GitHub repository. This custom connector may be used to implement connectivity from Power BI to Hybrid Data Pipeline resources that use OAuth 2.0 or OIDC authentication. For details, refer to Configuring a Power BI custom connector for OAuth 2.0 or Configuring a Power BI custom connector for OIDC.
The Hybrid Data Pipeline server and On-Premises Connector have been upgraded to install and use Tomcat 9.0.63. This addresses the CVE-2022-23181 security vulnerability that was mitigated with the resolution of Issue HDP-5924. (On-Premises Connector version 4.6.1.287)
The Microsoft Dynamics CRM data store has been deprecated. Connectivity to a number of Dynamics 365 apps is now supported with app-specific Hybrid Data Pipeline data stores. Refer to Microsoft Dynamics 365 parameters for details. (On-Premises Connector 4.6.1.287)
The Docker trial image has been deprecated. A Docker trial deployment of Hybrid Data Pipeline may now be performed using the generally available Hybrid Data Pipeline Docker image. This image may be obtained from the Progress Enterprise Delivery site (ESD) or the Trial Download page. Refer to the Hybrid Data Pipeline Trial Docker Deployment Getting Started for details.
The ODBC driver did not support the GUID data type. (ODBC driver 4.6.1.67)
The shipping version of the Tomcat server was upgraded from Tomcat 9.0.54 to 9.0.63. This addresses the CVE-2022-23181 security vulnerability that was mitigated with the resolution of Issue HDP-5924. (On-Premises Connector 4.6.1.287)
When using the SQL Editor to query datetimeoffset and sql_variant columns, a NullPointerException was returned.
When setting HDP_DATABASE_ADVANCED_OPTIONS to use an SSL connection to the external system database, the setting was not propagated correctly.
When performing a Hybrid Data Pipeline server upgrade in an environment using FIPS and an external JRE, the upgrade failed with the error Error in MAIN at line 576.
Idle Google BigQuery connections did not fully close, causing the Hybrid Data Pipeline server to reach the limit on open files.
After upgrading to the On-Premises Connector, the On-Premises Connector received an HTTP 401 error from the Hybrid Data Pipeline server when attempting to connect. (On-Premises Connector 4.6.1.255)
When attempting to authenticate using SAML, Hybrid Data Pipeline returned the exception "Request header is too large."
After an account lockout occurred, OData queries continued to run successfully instead of being rejected.
When using a third-party connector where the database credentials are not included in the Hybrid Data Pipeline data source, the user was prompted to enter credentials. In this scenario, Hybrid Data Pipeline returned the error message "user name is missing." (JDBC driver 4.6.1.77)
Hybrid Data Pipeline was unable to support Azure Database for PostgreSQL as an external database because Azure Database for PostgreSQL requires a unique user naming convention.
When attempting to query an Azure Synapse serverless instance, Hybrid Data Pipeline returned a java.io.IOException worker thread error.
Hybrid Data Pipeline has been enhanced to support Google Analytics, Google BigQuery, Salesforce, and REST data store implementations of OAuth 2.0. To integrate Hybrid Data Pipeline with an OAuth 2.0 authorization flow, Hybrid Data Pipeline must be registered as a client application with the given data store. Then, OAuth application and profile objects must be created to manage OAuth endpoints, properties, and tokens. For details, refer to Integrating Hybrid Data Pipeline as a client application with a data store OAuth 2.0 authorization flow. (On-Premises Connector 4.6.1.241)
The procedures for integrating Hybrid Data Pipeline as a client application to enable access to Google Analytics include the ability to select or create an OAuth application in the Web UI. For details, refer to Google Analytics parameters.
When selecting a data source from the dropdown in the SQL Editor, the server returned the error "No suitable driver found."
When performing an insert on an OData-enabled MySQL Community Edition data source, Hybrid Data Pipeline returned an error on a datetime column.
Performing a mergeEntity operation against an OData-enabled MySQL Community Edition data source resulted in a NullPointerException.
Server-side SSL could not be configured because the enable_ssl.sh script was not properly setting the truststore information from the Web UI.
To mitigate the CVE-2022-23181 security vulnerability, the Tomcat context.xml file has been modified such that session persistence is disabled by default.
When using the PostgreSQL JDBC driver as a third party driver to connect to backend data, the Metadata Exposed Schemas dropdown did not load PostgreSQL schemas.
After upgrading to server build 4.6.1.357, the introduction of a new keystore prevented successful login.
Hybrid Data Pipeline deployment failed when using environment variables to deploy the server as a Docker container.
Hybrid Data Pipeline has been updated to use Spring Framework version 5.3.18, Spring Boot version 2.6.6, and Spring Security version 5.6.2 to address the vulnerability described in CVE-2022-22965. (Hybrid Data Pipeline server 4.6.1.417, On-Premises Connector 4.6.1.164, JDBC driver 4.6.1.43)
Hybrid Data Pipeline has been updated to use version 2.13.2.2 of the Jackson library to address the vulnerability described in CVE-2020-36518. (Hybrid Data Pipeline server 4.6.1.417, On-Premises Connector 4.6.1.164, JDBC driver 4.6.1.43)
After upgrading to On-Premises Connector build 4.6.1.120, the On-Premises Connector received an HTTP 401 error from the Hybrid Data Pipeline server when attempting to connect. (On-Premises Connector 4.6.1.164)
Hybrid Data Pipeline now supports connectivity to PostgreSQL 14 databases. PostgreSQL 14 can also be used as a system database to store account and configuration information for a Hybrid Data Pipeline instance. This functionality is supported in the following component versions.
The default behavior for handling PostgreSQL call escape syntax has changed. Previously, Hybrid Data Pipeline only supported stored functions, and treated the non-standard escape syntax {call function()} the same as the standard escape syntax {? = call function()}. With this latest patch, Hybrid Data Pipeline supports stored functions and stored procedures for JDBC and ODBC connections. Now Hybrid Data Pipeline determines whether a function or procedure is being called based on the call escape syntax. If the return value parameter ?= is used, then the connectivity service calls a stored function. If the return value parameter is not used, then the connectivity service calls a stored procedure. You can change this default behavior by setting the CallEscapeBehavior option as an extended option under the Advanced tab. These are the valid values for the CallEscapeBehavior option:
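The dispatch rule described above can be sketched as follows. This is a minimal illustration, not the connectivity service's actual implementation, and the helper name is hypothetical; it only shows how the presence of the return-value parameter distinguishes the two escape forms:

```python
def classify_call_escape(sql: str) -> str:
    """Classify a JDBC call escape as a stored function or stored procedure call.

    Mirrors the default behavior described above: a leading return-value
    parameter (? =) indicates a stored function; otherwise the call is
    treated as a stored procedure.
    """
    body = sql.strip()
    # Strip the surrounding {...} escape braces if present.
    if body.startswith("{") and body.endswith("}"):
        body = body[1:-1].strip()
    # A leading "? =" (whitespace optional) marks a return-value parameter.
    if body.replace(" ", "").startswith("?="):
        return "stored function"
    return "stored procedure"

print(classify_call_escape("{? = call compute_total()}"))  # stored function
print(classify_call_escape("{call update_inventory(?)}"))  # stored procedure
```

Setting CallEscapeBehavior overrides this syntax-based dispatch when the default does not match the backend object type.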
When using the OData $expand functionality to query an OpenEdge data source, the query failed and an error was returned.
When a SQL query included columns of the same name, the SQL Editor did not display the column values.
The SQL Editor did not display results as expected.
Hybrid Data Pipeline now supports server-side SSL. Server-side SSL allows you to enable SSL behind the load balancer and secure communication between the load balancer and server nodes, as well as Hybrid Data Pipeline nodes in a cluster deployment. This functionality is supported in the following component versions.
Note:
The curl library files used with the ODBC driver have been upgraded to version 7.80.0.
The default version of the OpenSSL library used with the ODBC driver has been upgraded to version 1.1.1l.
The SQL Editor was not displaying SYNONYM objects.
When queries demanded the return of multiple large result sets, the query failed and the error "Unexpected end of stream" was returned.
Hybrid Data Pipeline has been updated to use Log4j version 2.17.1 to address security vulnerabilities found in Log4j versions 2.17 as described in CVE-2021-44832. For details, refer to CVE-2021-44832. (Hybrid Data Pipeline server 4.6.1.325, On-Premises Connector version 4.6.1.99).
Hybrid Data Pipeline has been updated to use Log4j version 2.17 to address security vulnerabilities found in Log4j versions 2.15 and 2.16 as described in CVE-2021-45046 and CVE-2021-45105. For details, refer to CVE-2021-45046 and CVE-2021-45105. (Hybrid Data Pipeline server 4.6.1.311, On-Premises Connector 4.6.1.91).
Hybrid Data Pipeline has been updated to use Log4j version 2.15 to address the security vulnerability found in Log4j version 2.13.3 as described in CVE-2021-44228. For details, refer to CVE-2021-44228. (Hybrid Data Pipeline server 4.6.1.306, On-Premises Connector version 4.6.1.85).
A production instance of the Hybrid Data Pipeline server can now be deployed using a Docker image. The Hybrid Data Pipeline Docker image is available in the Hybrid Data Pipeline Docker Deployment Package. In addition, the Docker Deployment Package includes demos for a number of deployment scenarios. For details and instructions, see Deploying Hybrid Data Pipeline using Docker in the installation guide.
Hybrid Data Pipeline now supports user authentication using the OIDC protocol. An identity provider and client applications can be configured to authorize users and grant access to the OData endpoints of the Hybrid Data Pipeline server. See Integrating an OIDC authentication service in the user's guide for details.
When using the third-party JDBC Oracle driver, the Hybrid Data Pipeline SQL Editor did not return tables.
When the special character '+' (plus sign) was used in an account password, the user was unable to authenticate with the Hybrid Data Pipeline server.
Hybrid Data Pipeline was unable to access Oracle Cloud Financials REST Endpoints with the Autonomous REST Connector.
Hybrid Data Pipeline now supports invoking stored procedures for JDBC and ODBC connections. Stored procedures functionality includes support for input parameters, output parameters, and in/out parameters. Stored procedures that return multiple results are also supported. This functionality is supported in the following component versions.
When attempting to enable FIPS, the error message did not state that the required secure random instance could not be created because there was not enough entropy on the host machine.
When an HTTP redirect status was returned, the driver was unable to follow the redirection and returned an error that the HTTP endpoint had been relocated. To resolve this issue, the FollowRedirects connection property has been introduced. When FollowRedirects is enabled, the driver can follow an HTTP redirection instead of returning an error. For details, refer to FollowRedirects.
When processing an empty result reply to a query execution request against a ServiceNow REST service, Hybrid Data Pipeline returned an "Unexpected end of stream in statement" error.
Hybrid Data Pipeline now supports user authentication using the SSO/SAML protocol. Administrators can configure SAML authentication by providing the details of an identity provider and can then configure users to authenticate via SAML.
When the "FileNotFoundException (Too many open files)" error occurred, the Hybrid Data Pipeline connection was lost and the server had to be restarted.
When fetching invalid date and datetime values from columns or literals, such as SELECT DATE(0), against MySQL data sources, the Hybrid Data Pipeline server returned an error.
Tableau was unable to connect to OData v4 endpoints exposed by Hybrid Data Pipeline.
Users were unable to use special characters for Hybrid Data Pipeline passwords.
When HTTP was disabled on the load balancer, the load balancer did not return OData responses to the client application, even though the X-Forwarded-Proto header was configured to manage HTTP and HTTPS traffic.
Hybrid Data Pipeline now supports access to Google BigQuery. Hybrid Data Pipeline can be used to create and manage data sources for JDBC, ODBC, and OData client application access to Google BigQuery. OAuth 2.0 and Service Account authentication methods are supported.
To improve performance, the OData startswith() function is now translated using LIKE instead of LOCATE, resulting in a SQL statement that takes less time to execute.
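The translation change above can be illustrated with a small, hypothetical sketch (the function name and exact SQL shapes are illustrative, not the service's internal code): a startswith(Column,'prefix') filter maps to an index-friendly LIKE predicate instead of a LOCATE expression.

```python
def startswith_to_sql(column: str, prefix: str) -> str:
    """Translate an OData startswith(column,'prefix') filter to SQL.

    Old-style translation: LOCATE('prefix', column) = 1  -- not sargable
    New-style translation: column LIKE 'prefix%'         -- index-friendly
    Percent, underscore, and backslash in the prefix are escaped so they
    match literally rather than as LIKE wildcards.
    """
    escaped = (prefix.replace("\\", "\\\\")
                     .replace("%", "\\%")
                     .replace("_", "\\_"))
    return f"{column} LIKE '{escaped}%' ESCAPE '\\'"

print(startswith_to_sql("CompanyName", "Alf"))
```

Because the wildcard is only at the end of the pattern, the database can use an index range scan on the column, which is what makes the LIKE form faster than LOCATE.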
Hybrid Data Pipeline supports changing the catalog of data sources. The setCatalog method can be used to change catalogs in JDBC, while the connection attribute SQL_ATTR_CURRENT_CATALOG can be used in ODBC. Support for changing catalogs includes support for changing the default database on an active connection to a SQL Server data source. This support extends to any data source configured with an underlying JDBC connector that supports the setCatalog method. This enhancement is available in the latest build of the Hybrid Data Pipeline server (4.6.1.132). Components such as the Hybrid Data Pipeline ODBC and JDBC drivers, as well as the On-Premises Connector, must be reinstalled to adopt the enhancement (On-Premises Connector version 4.6.1.62, ODBC driver 4.6.1.27, JDBC driver 4.6.1.13).
When an incorrect host name was specified in the connection URL, the Hybrid Data Pipeline JDBC driver defaulted to service.datadirectcloud.cloud.com as the host name and returned an inaccurate error message.
The ODBC driver was not installing on Amazon Linux 2.
Hybrid Data Pipeline supports throttling the number of simultaneous OData queries a user may have running against a Hybrid Data Pipeline server at one time. OData query throttling for users may be configured with the ODataMaxConcurrentRequests and ODataMaxWaitingRequests limits. The ODataMaxConcurrentRequests limit sets the maximum number of simultaneous OData requests allowed per user, while the ODataMaxWaitingRequests limit sets the maximum number of waiting OData requests allowed per user. See Throttling in the user's guide for details.
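The interaction of the two limits can be sketched with a toy model (the limit names come from the documentation; the class itself is hypothetical and ignores real concurrency concerns): a request runs if fewer than ODataMaxConcurrentRequests are active, queues if the waiting count is below ODataMaxWaitingRequests, and is rejected otherwise.

```python
class ODataThrottle:
    """Toy model of per-user OData query throttling.

    max_concurrent -> ODataMaxConcurrentRequests
    max_waiting    -> ODataMaxWaitingRequests
    """
    def __init__(self, max_concurrent: int, max_waiting: int):
        self.max_concurrent = max_concurrent
        self.max_waiting = max_waiting
        self.active = 0
        self.waiting = 0

    def submit(self) -> str:
        if self.active < self.max_concurrent:
            self.active += 1
            return "run"
        if self.waiting < self.max_waiting:
            self.waiting += 1
            return "wait"
        return "reject"

    def finish(self) -> None:
        self.active -= 1
        if self.waiting > 0:
            # Promote a waiting request to the active pool.
            self.waiting -= 1
            self.active += 1

throttle = ODataThrottle(max_concurrent=2, max_waiting=1)
print([throttle.submit() for _ in range(4)])  # ['run', 'run', 'wait', 'reject']
```

In the real service, "wait" corresponds to a queued request and "reject" to an error response once both limits are exhausted.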
Support for environment variables to specify server and system database credentials during the installation process has been added. The use of environment variables allows you to perform a more secure silent installation, compared to a standard silent installation where credential information must be specified in plain text in the silent installation response file. See Silent installation process in the user's guide for details.
When installing the Hybrid Data Pipeline server using SQL Server as the system database, the use of special characters in admin or user account credentials caused the installation to fail with the error Error in createSchema at Line 266.
NOTE: While installation no longer fails when special characters are used in system database account credentials, the installer cannot currently validate the necessary database schema objects when any of the following special characters are used in either database user ID or password values: space ( ), quotation mark ("), number sign (#), dollar sign ($), and apostrophe ('). Therefore, in a standard installation where these characters are used in database credentials, database validation must be skipped to proceed with the installation. Similarly, when performing a silent installation in this case, the SKIP_DATABASE_VALIDATION property should be set to true. Note that when skipping database validation in this scenario, the server should install successfully and work with the specified system database.
The specification of system database admin and user passwords in plain text in the response file as part of the silent installation process raised security concerns. Support for environment variables to specify server and system database credentials during the installation process has been added. See Silent installation process in the user's guide for details.
When the HAProxy load balancer was configured with the setting x-content-type-options:nosniff, Firefox, Chrome, and Microsoft Edge browsers rendered the Web UI as text instead of HTML.
Hybrid Data Pipeline has added support for a subset of the functionality defined by the OData Version 4 extension for data aggregation. Aggregation functionality is extended with the $apply query parameter. See Aggregation support in the user's guide for details.
The Hybrid Data Pipeline On-Premises Connector, ODBC driver, and JDBC driver have added support for Windows Server 2019. (On-Premises Connector version 4.6.1.48, ODBC driver 4.6.1.12, JDBC driver 4.6.1.9)
The Hybrid Data Pipeline server required the specification of the Server Name parameter, even though Server Name is not required for a TNS connection. In addition, when Server Name was specified, the server returned an inaccurate error message.
Sometimes when trying to execute SELECT * FROM table against an on-premise SQL Server database using the On-Premises Connector, the ODBC driver returned the error [HY000] [DataDirect][ODBC Hybrid driver][SQLServer]Unexpected content at the end of chunk.
When the name of the On-Premises Connector host machine was in all uppercase at the time of the installation of the On-Premises Connector, the Connector Label field in the On-Premises Configuration Tool did not populate with the hostname as expected. Then, when attempting to update the Connector Label field with the correct hostname, the On-Premises Configuration Tool returned the error "Error setting connector label for user Request returned Status:404 Message."
When using the Hybrid Data Pipeline ODBC driver to connect to a data source created with a third party JDBC driver, the following error was returned: ODBC--call failed. [DataDirect][ODBC Hybrid driver]Numeric value out of range. Error in column 16. (#0). This error was returned because the third party driver diverged from the JDBC specification when describing the data type of CHAR_OCTET_LENGTH for DatabaseMetaData.getColumns(). The ODBC driver has been modified to work with the third party JDBC driver despite this divergence from the JDBC specification.
Hybrid Data Pipeline has added a SQL statement auditing feature. When SQL statement auditing is enabled, the connectivity service records SQL statements and related metrics in the SQLAudit table on the Hybrid Data Pipeline system database (also referred to as the account database). This information can then be queried directly by administrators. See SQL statement auditing in the user's guide for details.
The Hybrid Data Pipeline server and On-Premises Connector have been upgraded to install and use Tomcat 9.0.37. (On-Premises Connector version 4.6.1.14)
When connecting to Google Analytics data source using an OAuth profile created in a previous version of Hybrid Data Pipeline, the following error was returned: There is a problem connecting to the data source. Error getting data source. System not available, try again later.
When exposing a Microsoft SQL Server table via OData Version 4 and filtering on a BIT/BOOLEAN field, Hybrid Data Pipeline returned the ODataApplicationException An expression of non-boolean type specified in a context where a condition is expected, near ')'.
With some shell configurations, the Hybrid Data Pipeline shutdown script stop.sh was not shutting down Hybrid Data Pipeline server processes.
When translating an OData Version 4 query to a SQL query, Hybrid Data Pipeline did not honor the parentheses in the OData query. This caused a reordering of operator precedence and led to incorrect results. The parentheses in OData queries are now reproduced in SQL queries to maintain operator precedence.
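Preserving parentheses matters because both OData and SQL bind and more tightly than or. A small, hypothetical illustration (the filter and data are invented) of how dropping parentheses changes which rows match:

```python
# OData filter:  (Price gt 10 or OnSale eq true) and InStock eq true
# Correct SQL:   (Price > 10 OR OnSale = 1) AND InStock = 1
# If the parentheses are dropped, AND binds tighter than OR, yielding
#                Price > 10 OR (OnSale = 1 AND InStock = 1)
# which is a different predicate and can return different rows.

def rows_matching(rows, keep):
    """Return the rows for which the predicate holds."""
    return [r for r in rows if keep(r)]

rows = [
    {"Price": 20, "OnSale": False, "InStock": False},
    {"Price": 5,  "OnSale": True,  "InStock": True},
]

with_parens = rows_matching(
    rows, lambda r: (r["Price"] > 10 or r["OnSale"]) and r["InStock"])
without_parens = rows_matching(
    rows, lambda r: r["Price"] > 10 or (r["OnSale"] and r["InStock"]))

print(len(with_parens), len(without_parens))  # 1 2
```

The first (correct) form excludes the out-of-stock row; the misparenthesized form includes it, which is the class of wrong result the fix prevents.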
When connecting to an on-premise data source using the On-Premises Connector, the error "origin=driver message=PAYLOAD_TYPE_BINARY != opcode, was 10" was returned. The On-Premises Connector now correctly handles pongs sent from load balancers according to the WebSocket protocol.
Hybrid Data Pipeline has been enhanced to support case conversions for entity types, entity sets, and properties. The owner of a data source can now change the entity type, entity set, and property names to all uppercase or all lowercase on the OData tab in the Web UI or using the Hybrid Data Pipeline Management API.
The Web UI has been enhanced to support data source sharing. The owner of a data source can now share access to a data store with Hybrid Data Pipeline users and tenants through the Data Sources view in the Web UI.
The Web UI has been enhanced to fully support the IP address whitelist feature. Administrators can secure access to Hybrid Data Pipeline resources by implementing IP address whitelists through the Web UI. The Web UI can be used to create IP address whitelists at the system level, tenant level, user level, or some combination of these levels.
The navigation bar can be expanded to show the names of the views supported in the Web UI. The icons in the navigation bar have been reordered and updated.
Hybrid Data Pipeline supports exposing stored functions for OData Version 4 connectivity to PostgreSQL data sources. When configuring a PostgreSQL data source, the OData schema map can be configured to expose stored functions.
A new throttling limit has been introduced in the System Limits view. The XdbcMaxResponse limit can be used to set the approximate maximum size of JDBC and ODBC HTTP result data.
The ODBC driver installation program has been enhanced to support branded installations for OEM customers (available in the ODBC driver installer on November 18, 2019). The branded driver can then be distributed with OEM customer client applications. For the Hybrid Data Pipeline ODBC driver distribution guide, visit the Progress DataDirect Product Books page on the Progress PartnerLink website (login required).
The Hybrid Data Pipeline server and On-Premises Connector have been upgraded to install and use Tomcat 9.0.20. (On-Premises Connector version 4.6.1.7)
See Hybrid Data Pipeline known issues for details.