Enhancements and resolved issues are provided according to product version number. The product version number can be obtained from the Hybrid Data Pipeline Web UI by selecting About from the help dropdown. When applicable, a component-specific version number is provided for the On-Premises Connector or a driver. Version numbers for these components are not available through the Web UI. The On-Premises Connector version number can be obtained by opening the On-Premises Connector Configuration Tool and clicking the Version tab. The version number for the JDBC driver can be obtained (1) by calling the DatabaseMetaData.getDriverVersion() method or (2) by executing the following command from the driver installation directory: java -cp ddhybrid.jar com.ddtek.jdbc.ddhybrid.DDHybridDriver. For the ODBC driver, see Driver version string for details on obtaining the driver version number.
Stored procedures functionality now includes support for input parameters, output parameters, and in/out parameters. Stored procedures that return multiple results are also supported. This functionality is supported in the following component versions.
- Hybrid Data Pipeline Server 18.104.22.1686 and higher
- On-Premises Connector 22.214.171.124 and higher
- JDBC Driver 126.96.36.199 and higher
- ODBC Driver 188.8.131.52 and higher
Issue HDP-5020 Error message did not state reason that the required secure random instance could not be created when enabling FIPS
When attempting to enable FIPS, the error message did not state that the required secure random instance could not be created because there was not enough entropy on the host machine.
Issue HDP-5064 JDBC driver not able to follow redirects (JDBC driver 184.108.40.206)
When an HTTP redirect status was returned, the driver was unable to follow the redirection and returned an error that the HTTP endpoint had been relocated. To resolve this issue, the FollowRedirects connection property has been introduced. When FollowRedirects is enabled, the driver can follow an HTTP redirection instead of returning an error. For details, refer to FollowRedirects.
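The behavior of the new property can be pictured with a small sketch (Python is used for illustration only; the real FollowRedirects option belongs to the JDBC driver, and the function and return values here are assumptions):

```python
def handle_response(status, follow_redirects):
    """Illustrative model of how a client reacts to an HTTP status,
    mirroring the FollowRedirects behavior described above."""
    if status in (301, 302, 307, 308):   # HTTP redirect statuses
        if follow_redirects:
            return "follow"              # re-issue the request at the new location
        return "error: HTTP endpoint has been relocated"
    return "proceed"
```

With the property disabled, a redirect status still surfaces as an error, preserving the previous behavior.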
Issue HDP-5412 "Unexpected end of stream in statement" error returned
When processing an empty result reply to a query execution request against a ServiceNow REST service, Hybrid Data Pipeline returned an "Unexpected end of stream in statement" error.
Hybrid Data Pipeline now supports user authentication using the SSO/SAML protocol. Customers can configure SAML authentication by providing the details of an identity provider and can configure users to use SAML authentication.
Issue HDP-4549 HDP server unreachable due to OS file handle leak
When the "FileNotFoundException (Too many open files)" error occurred, the Hybrid Data Pipeline connection was lost and the server had to be restarted.
Issue HDP-5202 Error returned when fetching MySQL zero values for date and datetime columns
When fetching invalid date and datetime values from columns or literals, such as SELECT DATE(0), against MySQL data sources, the Hybrid Data Pipeline server returned an error.
Issue HDP-5210 OData v4 Endpoint not compatible with Tableau Desktop
Tableau was unable to connect to OData v4 endpoints exposed by Hybrid Data Pipeline.
Issue HDP-5217 Some special characters not allowed in passwords
Users were unable to use special characters for Hybrid Data Pipeline passwords.
Issue HDP-5266 Load balancer not returning OData responses from the server
When HTTP was disabled on the load balancer, the load balancer did not return OData responses to the client application, even though the X-Forwarded-Proto header had been configured to manage HTTP and HTTPS traffic.
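The role of the X-Forwarded-Proto header can be sketched as follows (an illustrative model, not the server's implementation): the backend uses the header, when present, to build response URLs with the client-facing scheme even though the load balancer forwards the request over plain HTTP.

```python
def effective_scheme(headers, transport_scheme):
    """Return the scheme a proxied server should assume for a request.
    The load balancer sets X-Forwarded-Proto to the scheme the client
    used; absent the header, the transport scheme is used as-is."""
    return headers.get("X-Forwarded-Proto", transport_scheme).lower()
```

For example, a request received over HTTP but carrying X-Forwarded-Proto: https should yield OData links with the https scheme.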
Google BigQuery support
Hybrid Data Pipeline now supports access to Google BigQuery. Hybrid Data Pipeline can be used to create and manage data sources for JDBC, ODBC, and OData client application access to Google BigQuery. OAuth 2.0 and Service Account authentication methods are supported.
Issue HDP-4923 Performance issue querying OpenEdge database
To improve performance, the OData startswith() function was changed so that it translates to a SQL statement that uses LIKE instead of LOCATE, which executes in less time.
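The difference between the two translations can be sketched as below (illustrative only; LIKE-wildcard escaping is omitted for brevity). The LIKE form allows the database to use an index on the column, which is why it executes faster than the LOCATE form:

```python
def startswith_to_sql(column, prefix, use_like=True):
    """Translate the OData startswith(column, prefix) function to SQL,
    showing both the new LIKE translation and the older LOCATE form."""
    if use_like:
        return f"{column} LIKE '{prefix}%'"
    return f"LOCATE('{prefix}', {column}) = 1"   # slower pre-fix translation
```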
Hybrid Data Pipeline supports changing the catalog of data sources. The setCatalog method can be used to change catalogs in JDBC, while the connection attribute SQL_ATTR_CURRENT_CATALOG can be used in ODBC. Support for changing catalogs includes support for changing the default database on an active connection to a SQL Server data source. This support extends to any data source configured with an underlying JDBC connector that supports the setCatalog method. This enhancement is available in the latest build of the Hybrid Data Pipeline server (220.127.116.11). Components such as the Hybrid Data Pipeline ODBC and JDBC drivers, as well as the On-Premises Connector must be reinstalled to adopt the enhancement (On-Premises Connector version 18.104.22.168, ODBC driver 22.214.171.124, JDBC driver 126.96.36.199).
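The effect of changing the catalog can be modeled with a toy sketch (Python for illustration; the real API is JDBC's Connection.setCatalog and ODBC's SQL_ATTR_CURRENT_CATALOG attribute, and the qualification scheme shown is a SQL Server-style assumption):

```python
class Connection:
    """Toy model of an active connection whose current catalog
    (default database) can be changed mid-session."""
    def __init__(self, catalog):
        self.catalog = catalog

    def set_catalog(self, catalog):
        # Switch the default database for subsequent statements.
        self.catalog = catalog

    def qualify(self, table):
        """Unqualified table names resolve against the current catalog."""
        return f"{self.catalog}.dbo.{table}"
```

After set_catalog, unqualified references in new statements resolve against the newly selected database on the same open connection.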
Issue HDP-4463 JDBC driver defaulted to service.datadirectcloud.com host name and returned inaccurate error message (JDBC driver 188.8.131.52)
When an incorrect host name was specified in the connection URL, the Hybrid Data Pipeline JDBC driver defaulted to service.datadirectcloud.com as the host name and returned an inaccurate error message.
Issue HDP-4858 ODBC driver not installing on Amazon Linux 2
The ODBC driver was not installing on Amazon Linux 2.
OData query throttling for users
Hybrid Data Pipeline supports throttling the number of simultaneous OData queries a user may have running against a Hybrid Data Pipeline server at one time. OData query throttling for users may be configured with the ODataMaxConcurrentRequests and ODataMaxWaitingRequests limits. The ODataMaxConcurrentRequests limit sets the maximum number of simultaneous OData requests allowed per user, while the ODataMaxWaitingRequests limit sets the maximum number of waiting OData requests allowed per user. See Throttling in the user's guide for details.
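The interaction of the two limits can be sketched with a small admission model (an illustrative sketch of the behavior described above, not the server's implementation):

```python
class ODataThrottle:
    """Per-user admission model: at most max_concurrent OData requests
    run at once, at most max_waiting queue behind them, and anything
    beyond both limits is rejected."""
    def __init__(self, max_concurrent, max_waiting):
        self.max_concurrent = max_concurrent
        self.max_waiting = max_waiting
        self.running = 0
        self.waiting = 0

    def admit(self):
        if self.running < self.max_concurrent:
            self.running += 1
            return "run"
        if self.waiting < self.max_waiting:
            self.waiting += 1
            return "wait"
        return "reject"
```

With ODataMaxConcurrentRequests=2 and ODataMaxWaitingRequests=1, a user's third request waits and a fourth is rejected.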
Environment variables support for silent installation
Support for environment variables to specify server and system database credentials during the installation process has been added. The use of environment variables allows you to perform a more secure silent installation, compared to a standard silent installation where credential information must be specified in plain text in the silent installation response file. See Silent installation process in the user's guide for details.
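The pattern can be sketched as follows (Python for illustration; the variable names HDP_DATABASE_USER and HDP_DATABASE_PASSWORD are assumptions for this sketch, not the installer's documented names, which are listed in the user's guide):

```python
import os

def read_db_credentials(response_file_values):
    """Prefer environment variables over plain-text response-file
    entries for system database credentials, so secrets need not be
    written to disk."""
    return {
        "user": os.environ.get("HDP_DATABASE_USER",
                               response_file_values.get("user")),
        "password": os.environ.get("HDP_DATABASE_PASSWORD",
                                   response_file_values.get("password")),
    }
```

Any credential supplied through the environment overrides the corresponding response-file entry, so the file can omit passwords entirely.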
Issue HDP-4853 Installation failed when special characters were used in system database credentials
When installing the Hybrid Data Pipeline server using SQL Server as the system database, the use of special characters in admin or user account credentials caused the installation to fail with the error Error in createSchema at Line 266.
NOTE: While installation no longer fails when special characters are used in system database account credentials, the installer cannot currently validate the necessary database schema objects when any of the following special characters are used in either database user ID or password values: space ( ), quotation mark ("), number sign (#), dollar sign ($), and apostrophe ('). Therefore, in a standard installation where these characters are used in database credentials, database validation must be skipped to proceed with the installation. Similarly, when performing a silent installation in this case, the SKIP_DATABASE_VALIDATION property should be set to true. Note that when skipping database validation in this scenario, the server should install successfully and work with the specified system database.
Issue HDP-4854 Silent installation process required the specification of system database admin and user passwords in clear text in the response file
The specification of system database admin and user passwords in plain text in the response file as part of the silent installation process raised security concerns. Support for environment variables to specify server and system database credentials during the installation process has been added. See Silent installation process in the user's guide for details.
Issue HDP-4859 Firefox, Chrome, and Microsoft Edge browsers not rendering Web UI correctly for load balancer installation
When the HAProxy load balancer was configured with the setting x-content-type-options:nosniff, Firefox, Chrome, and Microsoft Edge browsers rendered the Web UI as text instead of HTML.
Hybrid Data Pipeline has added support for a subset of the functionality defined by the OData Version 4 extension for data aggregation. Aggregation functionality is extended with the $apply query parameter. See Aggregation support in the user's guide for details.
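One common $apply form, groupby with an aggregate, maps naturally onto SQL GROUP BY. The sketch below covers only that single pattern (the data aggregation extension defines many more transformations, and this translation is an illustration, not the server's implementation):

```python
import re

def apply_to_sql(apply_expr, table):
    """Translate the groupby/aggregate form of the OData $apply
    query parameter into an equivalent SQL statement."""
    m = re.fullmatch(
        r"groupby\(\((\w+)\),aggregate\((\w+) with (\w+) as (\w+)\)\)",
        apply_expr)
    if not m:
        raise ValueError("unsupported $apply expression")
    group_col, agg_col, func, alias = m.groups()
    return (f"SELECT {group_col}, {func.upper()}({agg_col}) AS {alias} "
            f"FROM {table} GROUP BY {group_col}")
```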
Windows Server 2019
The Hybrid Data Pipeline On-Premises Connector, ODBC driver, and JDBC driver have added support for Windows Server 2019. (On-Premises Connector version 184.108.40.206, ODBC driver 220.127.116.11, JDBC driver 18.104.22.168)
Issue HDP-4478 Unable to connect using TNS connection option for Oracle
The Hybrid Data Pipeline server required the specification of the Server Name parameter, even though Server Name is not required for a TNS connection. In addition, when Server Name was specified, the server returned an inaccurate error message.
Issue HDP-4757 Cannot retrieve data from SQL Server table (On-Premises Connector version 22.214.171.124)
Sometimes when trying to execute SELECT * FROM table against an on-premises SQL Server database using the On-Premises Connector, the ODBC driver returned the error [HY000] [DataDirect][ODBC Hybrid driver][SQLServer]Unexpected content at the end of chunk.
Issue HDP-4574 HTTP error 404 while renaming the connector label (On-Premises Connector version 126.96.36.199)
When the name of the On-Premises Connector host machine was in all uppercase at the time of the installation of the On-Premises Connector, the Connector Label field in the On-Premises Configuration Tool did not populate with the hostname as expected. Then, when attempting to update the Connector Label field with the correct hostname, the On-Premises Configuration Tool returned Error setting connector label for user Request returned Status:404 Message.
Issue HDP-4704 Error while accessing link tables in MS Access application using Hybrid Data Pipeline data source (ODBC driver 188.8.131.52)
When using the Hybrid Data Pipeline ODBC driver to connect to a data source created with a third party JDBC driver, the following error was returned: ODBC--call failed. [DataDirect][ODBC Hybrid driver]Numeric value out of range. Error in column 16. (#0). This error was returned because the third party driver diverged from the JDBC specification when describing the data type of CHAR_OCTET_LENGTH for DatabaseMetaData.getColumns(). The ODBC driver has been modified to work with the third party JDBC driver despite this divergence from the JDBC specification.
SQL statement auditing
Hybrid Data Pipeline has added a SQL statement auditing feature. When SQL statement auditing is enabled, the connectivity service records SQL statements and related metrics in the SQLAudit table on the Hybrid Data Pipeline system database (also referred to as the account database). This information can then be queried directly by administrators. See SQL statement auditing in the user's guide for details.
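Since the audit records live in an ordinary table, administrators can query them with plain SQL. The sketch below uses an in-memory SQLite database and assumed column names purely for illustration; the actual SQLAudit schema is documented in the user's guide:

```python
import sqlite3

# Stand-in for the system (account) database holding the audit table.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE SQLAudit (
    UserName TEXT, SqlText TEXT, ElapsedMs INTEGER)""")
conn.execute("INSERT INTO SQLAudit VALUES ('jsmith', 'SELECT 1', 12)")
conn.execute("INSERT INTO SQLAudit VALUES ('jsmith', 'SELECT 2', 480)")

# An administrator finds a user's slowest audited statements.
rows = conn.execute("""SELECT SqlText, ElapsedMs FROM SQLAudit
                       WHERE UserName = 'jsmith'
                       ORDER BY ElapsedMs DESC""").fetchall()
```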
The Hybrid Data Pipeline server and On-Premises Connector have been upgraded to install and use Tomcat 9.0.37. (On-Premises Connector version 184.108.40.206)
Issue HDP-4534 Unable to connect to Google Analytics data source
When connecting to Google Analytics data source using an OAuth profile created in a previous version of Hybrid Data Pipeline, the following error was returned: There is a problem connecting to the data source. Error getting data source. System not available, try again later.
Issue HDP-4490 ODataApplicationException returned when filtering on BIT/BOOLEAN field
When exposing a Microsoft SQL Server table via OData Version 4 and filtering on a BIT/BOOLEAN field, Hybrid Data Pipeline returned the ODataApplicationException An expression of non-boolean type specified in a context where a condition is expected, near ')'.
Issue HDP-4480 Shutdown script not working
With some shell configurations, the Hybrid Data Pipeline shutdown script stop.sh was not shutting down Hybrid Data Pipeline server processes.
Issue HDP-4465 Parenthesis in OData query not honored
When translating an OData Version 4 query to a SQL query, Hybrid Data Pipeline did not honor the parentheses in the OData query. This reordered operator precedence and led to incorrect results. Parentheses in OData queries are now reproduced in SQL queries to maintain operator precedence.
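Because AND binds tighter than OR in SQL, dropping the parentheses from a filter changes its result. A minimal demonstration using Python's boolean operators, which share that precedence:

```python
a, b, c = True, False, False

with_parens    = (a or b) and c   # how "(A or B) and C" must evaluate
without_parens = a or b and c     # what the dropped-parentheses SQL computed
```

Here with_parens is False while without_parens is True, which is exactly the kind of incorrect result the fix prevents.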
Issue HDP-4464 Intermittent error "origin=driver message=PAYLOAD_TYPE_BINARY != opcode, was 10" (On-Premises Connector version 220.127.116.11)
When connecting to an on-premises data source using the On-Premises Connector, the error origin=driver message=PAYLOAD_TYPE_BINARY != opcode, was 10 was returned. The On-Premises Connector now correctly handles pongs sent from load balancers according to the WebSocket protocol.
Entity case conversion feature
Hybrid Data Pipeline has been enhanced to support case conversions for entity types, entity sets, and properties. The owner of a data source can now change the entity type, entity set, and property names to all uppercase or all lowercase on the OData tab in the Web UI or using the Hybrid Data Pipeline Management API.
Web UI data source sharing feature
The Web UI has been enhanced to support data source sharing. The owner of a data source can now share access to a data store with Hybrid Data Pipeline users and tenants through the Data Sources view in the Web UI.
Web UI IP address whitelist feature
The Web UI has been enhanced to fully support the IP address whitelist feature. Administrators can secure access to Hybrid Data Pipeline resources by implementing IP address whitelists through the Web UI. The Web UI can be used to create IP address whitelists at the system level, tenant level, user level, or some combination of these levels.
Web UI navigation bar
The navigation bar can be expanded to show the names of the views supported in the Web UI. The icons in the navigation bar have been reordered and updated.
PostgreSQL OData Version 4 stored functions
Hybrid Data Pipeline supports exposing stored functions for OData Version 4 connectivity to PostgreSQL data sources. When configuring a PostgreSQL data source, the OData schema map can be configured to expose stored functions.
JDBC and ODBC throttling
A new throttling limit has been introduced in the System Limits view. The XdbcMaxResponse limit can be used to set the approximate maximum size of JDBC and ODBC HTTP result data.
ODBC driver branded installation
The ODBC driver installation program has been enhanced to support branded installations for OEM customers (available in the ODBC driver installer on November 18, 2019). The branded driver can then be distributed with OEM customer client applications. For the Hybrid Data Pipeline ODBC driver distribution guide, visit the Progress DataDirect Product Books page on the Progress PartnerLink website (login required).
The Hybrid Data Pipeline server and On-Premises Connector have been upgraded to install and use Tomcat 9.0.20. (On-Premises Connector version 18.104.22.168)
See Hybrid Data Pipeline known issues for details.