4.2.1 archive

Note: This version of Hybrid Data Pipeline has reached end of life. These release notes are for reference purposes.

Changes Since Release 4.2.1

Enhancements

Change password functionality
  • Hybrid Data Pipeline change password functionality has been enhanced. When changing passwords, users must now provide a current password as well as a new password by default. The Administrator's API has been modified to support this functional change. The changepassword API now includes the currentPassword parameter, as well as the newPassword parameter, in the payload.
       {
         "currentPassword": "<mycurrentpassword>",
         "newPassword": "<mynewpassword>"
       }
    Administrators can fall back to the old functionality by setting the new secureChangePassword attribute (configuration ID 2) to false through the configurations API. For example, the following PUT operation configures the system to use the old functionality, in which the user provides only a new password.
       https://myserver:port/api/admin/configurations/2
       {
       "value": "false"
       }
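    The two payloads above can be sketched together as helper functions. This is a minimal illustration, assuming the placeholder host and endpoint paths from the examples; it only builds the request bodies, it does not call the server.

    ```python
    import json

    # Placeholder base URL from the examples above, not a live endpoint.
    BASE = "https://myserver:port/api"

    def change_password_payload(current: str, new: str) -> str:
        """Build the changepassword payload with both required fields."""
        return json.dumps({"currentPassword": current, "newPassword": new})

    def legacy_mode_payload() -> str:
        """Payload for PUT {BASE}/admin/configurations/2 (secureChangePassword)
        that falls back to the old, new-password-only behavior."""
        return json.dumps({"value": "false"})

    print(change_password_payload("<mycurrentpassword>", "<mynewpassword>"))
    print(legacy_mode_payload())
    ```

    Sending these bodies with an HTTP client (with administrator credentials) would reproduce the calls described above.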

Resolved Issues

  • 4.2.1.59. Issue 83987. Resolved an issue where editing of the OData schema map resulted in the addition of "entityNameMode":"pluralize" when a data source had been configured with OData Version 4, the OData schema map version was odata_mapping_v3, and entityNameMode had not been included.
  • 4.2.1.59. Issue 84061. Resolved issues where the Web UI was not displaying function synonyms for read-only users and where the Web UI duplicated function parameters when synonyms were created for read-only users.
  • 4.2.1.59. Issue 84480. Resolved an issue where the data access service, when configured with a delimiter for external authentication, required the user login to contain the user name, a delimiter, and the identifier for the internal authentication service, for any users authenticating with the internal authentication service. For example, if the data access service was configured with the @ character as the delimiter, then authenticating as an internal user might look like user1@Internal. Now the user login only needs to contain the user name for any users authenticating with the internal authentication service, for example, user1. When only the user name is provided, the data access service uses the internal authentication service to authenticate the user.
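    The resolved login handling can be sketched as follows. This is an illustration only, assuming "@" as the configured delimiter and "Internal" as the identifier for the internal authentication service, as in the example above.

    ```python
    # Assumed configuration values from the example above.
    DELIMITER = "@"
    INTERNAL_SERVICE = "Internal"

    def resolve_auth(login: str):
        """Split a login into (user_name, auth_service). A bare user name
        now defaults to the internal authentication service, while the
        older delimited form (e.g. "user1@Internal") remains valid."""
        if DELIMITER in login:
            user, service = login.rsplit(DELIMITER, 1)
            return user, service
        return login, INTERNAL_SERVICE

    print(resolve_auth("user1"))           # bare name: internal service
    print(resolve_auth("user1@Internal"))  # delimited form still accepted
    ```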
  • 4.2.1.59. Issue 84496. Resolved an issue where the data access server was not running in FIPS approved mode when FIPS was enabled. The Bouncy Castle BCFIPS security provider now ensures that the data access service is running in FIPS approved mode.
    When the data access and notification services start, they check to see if they are running in FIPS approved mode. You can confirm that the services are running in FIPS approved mode by checking their corresponding log files: das/server/logs/catalina.out and notification/logs/palatte/datestamp-SYSTEM.log. With result=true, the log entry confirms that the service is running in FIPS approved mode:
    Check for BouncyCastle Approved Only Mode [result=true]
    NOTE: Because the installer program is not capable of regenerating encryption keys for existing users and data sources, we currently recommend a new, clean installation of Hybrid Data Pipeline with FIPS enabled when upgrading from a non-FIPS-compliant server to a FIPS-compliant server. With a new installation, users and data sources must be re-created.
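    The log check described above can be automated with a small sketch like the following; the log entry format is taken from the note, and reading the file contents into a string is left to the caller.

    ```python
    import re

    # Matches the confirmation entry quoted above, capturing the result value.
    PATTERN = re.compile(r"Check for BouncyCastle Approved Only Mode \[result=(\w+)\]")

    def fips_approved(log_text: str) -> bool:
        """Return True if the log text confirms FIPS approved mode (result=true)."""
        match = PATTERN.search(log_text)
        return bool(match) and match.group(1) == "true"

    sample = "... Check for BouncyCastle Approved Only Mode [result=true] ..."
    print(fips_approved(sample))  # True
    ```

    Running this against the contents of das/server/logs/catalina.out or the notification service log reproduces the manual check.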
  • 4.2.1.59. Issue 84499. Resolved an issue where a log file was created for each external user when the data access service was configured to use an external authentication service. The data access service now produces a single log file per internal user and data source, with logging details for each external user associated with that internal user and data source.
  • 4.2.1.59. Issue 84527. Resolved an issue where the database host name and port numbers were included in an error message when a query was made against the data access service with the database down.

4.2.1 Release Notes

Security

FIPS compliance
  • Hybrid Data Pipeline is now FIPS 140-2 compliant. By default, Hybrid Data Pipeline is installed with FIPS disabled. We recommend a new, clean installation with FIPS enabled for production environments. With a new installation, users and data sources must be re-created. For information on how to enable FIPS, refer to the Progress DataDirect Hybrid Data Pipeline Installation Guide.

    Note: The On-Premises Connector is not currently FIPS compliant. Therefore, any connections made to an on-premises data source through an On-Premises Connector will not be fully FIPS compliant.

Support for external authentication
  • Hybrid Data Pipeline now supports two types of authentication: the Hybrid Data Pipeline internal authentication mechanism and external authentication. The external authentication feature is supported as a Java plugin. Administrators can create their own implementation and plug it into Hybrid Data Pipeline either at installation time or later. Once external authentication is set up, administrators can use APIs to configure users to authenticate against an external authentication system. Optionally, multiple external authentication users can be mapped to a single Hybrid Data Pipeline user to gain access to data sources.

Tomcat upgrade
  • The Hybrid Data Pipeline server and On-Premises Connector have been upgraded to install and use Tomcat 8.5.23.

Enhancements

Hybrid Data Pipeline Server
  • OData Version 4 functions. With 4.2.1, Hybrid Data Pipeline supports OData Version 4 functions for Oracle data sources only. If the Oracle database contains stored functions, they can be exposed through an OData Version 4 service. As part of OData function support, the OData schema map version has been changed. The Web UI automatically migrates an existing OData schema map to the newer OData schema map version when the OData schema is modified for an OData Version 4 data source.

    The following aspects of OData Version 4 functions are supported:

    • Functions that are unbound (static operations)
    • Function imports
    • Functions that return primitive types
    • Function invocation with the $filter system query option

    The following aspects of OData Version 4 functions are currently NOT supported:

    • Functions that return complex types and entities
    • Functions that are bound to entities
    • Built-in functions
    • Functions with OUT/INOUT parameters
    • Overloaded functions
    • The $select system query option
    • The $orderby system query option
    • Function invocation with parameter values
    • Parameter aliases. Consequently, function invocation with function parameters passed as URL query parameters is not supported.
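    To make the supported shape concrete, the sketch below builds an invocation URL for an unbound function exposed through a function import, optionally filtered with $filter. The service root and function name are hypothetical, not taken from the product documentation.

    ```python
    from urllib.parse import urlencode

    def function_invocation_url(service_root: str, function_import: str,
                                filter_expr: str | None = None) -> str:
        """Build a URL invoking a parameterless unbound function via its
        function import, with an optional $filter system query option."""
        url = f"{service_root}/{function_import}()"
        if filter_expr:
            url += "?" + urlencode({"$filter": filter_expr})
        return url

    # Hypothetical service root and function import name.
    print(function_invocation_url(
        "https://myserver:port/api/odata4/mydatasource", "GET_BONUS",
        "value gt 100"))
    ```

    Parameter aliases (e.g. passing function parameters as URL query parameters) are deliberately absent here, since they are listed above as unsupported.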
  • Log file cleanup. Hybrid Data Pipeline now enables you to configure the number of days for which log files are retained, preventing log files from filling up your directories. You can use the Limits API to specify the number of days of log file retention.
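    A retention setting of this kind could be applied with a payload like the one sketched below. The endpoint path and the idea of a numeric "days" value are assumptions for illustration; consult the Limits API documentation for the actual limit name and resource path.

    ```python
    import json

    # Hypothetical Limits API endpoint; not verified against the product docs.
    LIMITS_ENDPOINT = "https://myserver:port/api/admin/limits"

    def retention_payload(days: int) -> str:
        """Payload requesting that log files be retained for `days` days."""
        return json.dumps({"value": days})

    print(retention_payload(30))
    ```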
  • Support for Ubuntu. Hybrid Data Pipeline Server now supports Ubuntu Linux version 16 and higher.
  • Installation procedures and response file. The installation procedures have been modified with the introduction of support for FIPS and external authentication. New prompts have been added to the installation process. One of these prompts has a corresponding option that appears in the response file generated by the latest installer for silent installation. If you are using a response file generated by an earlier version of the installer, you should regenerate the response file with the latest installer. The new response file should then be used for silent installations. The following settings are new; they differ depending on whether you generate the response file with a GUI or console installation.
    New response file options:
      GUI setting:     D2C_USING_FIPS_CONFIG
      Console setting: D2C_USING_FIPS_CONFIG_CONSOLE
      Definition:      Specifies whether to configure the server to be FIPS-compliant.
