Progress DataDirect Hybrid Data Pipeline is a data access server that provides simple, secure access to cloud and on-premises data sources, such as RDBMS, Big Data, and NoSQL. Hybrid Data Pipeline allows business intelligence tools and applications to use ODBC, JDBC, or OData to access data from supported data sources. Hybrid Data Pipeline can be installed in the cloud or behind a firewall, and can then be configured to work with applications and data sources in nearly any business environment. Progress DataDirect Hybrid Data Pipeline consists of four primary, separately installed components.
- The Hybrid Data Pipeline server provides access to multiple data sources through a single, unified interface. The server can be hosted on premises or in the cloud.
- The On-Premises Connector enables Hybrid Data Pipeline to establish a secure connection from the cloud to an on-premises data source.
- The ODBC driver enables ODBC applications to communicate with a data source through the Hybrid Data Pipeline server.
- The JDBC driver enables JDBC applications to communicate with a data source through the Hybrid Data Pipeline server.
In addition to these four primary components, Progress DataDirect also provides a customized version of OpenAccess server. The OpenAccess server is a connectivity layer required for an Eloqua data store in a Hybrid Data Pipeline environment.
4.0.0 Release Notes
Hybrid Data Pipeline Server Installer
- When choosing an external database under "Custom Installation", the admin user and user fields are pre-populated with erroneous values.
- When configuring OpenAccess server under “Custom Installation”, the Enable OpenAccess Integration check box is checked, but the Hostname and Eloqua Port boxes are disabled. To enable them, uncheck the check box and then check it again.
- When performing an upgrade install and using an external database, you must choose the Custom installation path and re-enter the information for your external database.
- Google Analytics data sources return an error when used in the SQL Editor. However, they work with Hybrid Data Pipeline ODBC, JDBC and OData clients.
- When an administrator tries to add new users using the Add Users window, the Password and Confirm Password fields occasionally do not appear properly in the popup window.
On-Premises Connector
- If User Account Control is enabled on your Windows machine and you installed the On-Premises Connector in a system folder (such as Windows or Program Files), you must run the On-Premises Connector Configuration Tool in administrator mode.
- When using Kerberos with Microsoft Dynamics, the JRE installed with the On-Premises Connector must be configured to run with Kerberos. Take the following steps to configure the JRE.
- Download the zip file containing a new version of the Java Cryptography Extension (JCE) Unlimited Strength Jurisdiction Policy Files for JDK/JRE 7 from http://www.oracle.com/technetwork/java/javase/downloads/jce-7-download-432124.html.
- Unzip the file into the \jre\lib\security directory of the JRE installed with the On-Premises Connector to update the Java security policy files to support 256-bit encryption.
- Uninstalling and re-installing the On-Premises Connector causes the Connector ID of the On-Premises Connector to change. Any Hybrid Data Pipeline data sources using the old Connector ID must be updated to use the new Connector ID. Installing to a new directory allows the old and new On-Premises Connectors to exist side by side. However, you must update the Connector ID option in previously defined Hybrid Data Pipeline data sources to point to the new On-Premises Connector. In addition, you must update the Connector ID wherever it was used, such as in the definitions of Group Connectors and Authorized Users.
All Data Sources
- It is recommended that Login Timeout not be disabled (set to 0) for a data source.
- Using setByte to set parameter values fails when the data source does not support the TINYINT SQL type. Use setShort or setInt to set the parameter value instead of setByte.
- A "Numeric value out of range" error is returned when calling SQLStatistics with the Hybrid Data Pipeline ODBC driver.
Google Analytics
- Once a Google Analytics OAuth profile is created for a specific Google account, changing the Google account associated with the profile causes a "the configuration options used to open the database do not match the options used to create the database" error to be returned for any existing data sources.
- A validation message is not displayed when a user enters a Start Date value less than the End Date value on the Create/Update Google Analytics page.
OData
- $expand only supports one level deep. For example, with the entity hierarchy:

  Customers
  |-- Orders
  |   |-- OrderItems

  The following query is supported:

  Customers?$expand=Orders

  However, this query is not supported:

  Customers?$expand=Orders/OrderItems

  OrderItems is a second-level entity with respect to Customers. To query OrderItems, the query must be rooted at Orders. For example:

  Orders?$expand=OrderItems
- When manually editing the ODataSchemaMap value, the table names and column names specified in the value are case-sensitive. The case of the table and column names must match the case of the tables and column names reported by the data source.
Note: It is highly recommended that you use the OData Schema Editor to generate the value for the ODataSchemaMap data source option. The Schema Editor takes care of table and column name casing and other syntactic details.
- When using the substring function on properties that map to a CHAR column in the data source, whether the substring function treats trailing spaces as significant depends on the data source. Against Oracle, the trailing spaces are preserved. Against other data sources, the trailing spaces are discarded.
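The difference can be sketched with a short simulation (illustrative Python only, not driver code; the helper names and the CHAR(5) value are assumptions made for the example): a CHAR column pads values with trailing spaces, and the substring result depends on whether that padding is treated as significant.

```python
# Illustrative simulation of substring behavior on CHAR columns.
# Oracle-like data sources keep trailing padding; others strip it first.

def substring_preserving(value: str, start: int, length: int) -> str:
    # Trailing spaces are significant (Oracle-like behavior)
    return value[start - 1:start - 1 + length]

def substring_discarding(value: str, start: int, length: int) -> str:
    # Trailing CHAR padding is discarded before the substring is taken
    return value.rstrip()[start - 1:start - 1 + length]

char_value = "AB".ljust(5)  # a CHAR(5) value: "AB" plus three pad spaces

print(repr(substring_preserving(char_value, 1, 4)))  # 'AB  '
print(repr(substring_discarding(char_value, 1, 4)))  # 'AB'
```

The same OData $filter expression over a substring can therefore match on one data source and not on another.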
- The $expand clause is not supported with OpenEdge data sources.
- The day scalar function does not work when specified in a $filter clause when querying a DB2 data source.
Oracle Database
- Executing queries against a column of type XMLType results in the following error: "This column type is not currently supported by this driver."
Oracle Sales Cloud
- Create Mapping is not fully supported for the Oracle Sales Cloud data source. Typically, when editing a data source from the Data Sources page, a user would need to select "Force New" for Create Mapping under the Mapping tab to refresh a schema. However, this currently results in an input/output error. As a workaround, create a new data source with the desired configuration.
- External storage for processing large results is not currently supported for Oracle Sales Cloud. All processing currently takes place in memory. This primarily impacts queries with post-processing options and limits the size of query that can be successfully processed to the system resources available to the Hybrid Data Pipeline connectivity service.
- The drivers currently report ATTACHMENT type fields in the metadata but do not support retrieving data for these fields. These fields are set to NULL.
- Join queries between parent and child tables are not supported.
- Queries on child tables whose parent has a composite primary key are not supported. For example, the children of ACTIVITIES_ACTIVITYCONTACT and LEADS_PRODUCTS are not accessible.
- Queries on grandchildren with multiple sets of Parent IDs and Grand Parent IDs used in an OR clause are not supported. For example, the following query is not supported.
SELECT * FROM ACCOUNTS_ADDRESS_ADDRESSPURPOSE
WHERE (ACCOUNTS_PARTYNUMBER = 'OSC_12343' AND
       ACCOUNTS_ADDRESS_ADDRESSNUMBER = 'AUNA-2XZKGH')
   OR (ACCOUNTS_PARTYNUMBER = 'OSC_12344' AND
       ACCOUNTS_ADDRESS_ADDRESSNUMBER = 'AUNA-2YZKGH')
- When querying documented objects like "CATALOGPRODUCTITEMS" and "CATEGORYPRODUCTITEMS", no more than 500 records are returned, even when more records may be present. This behavior is also seen with some custom objects. We are currently working with Oracle support to resolve this issue.
- A query on OPPORTUNITIES_CHILDREVENUE_PRODUCTS or LEADS_PRODUCTGROUPS with a filter on the primary key column returns 0 records even when more records are present. We are currently working with Oracle support to resolve this issue.
- Queries that contain subqueries returning more than 100 records are not supported. For example, the following query is not supported.
SELECT * FROM ACCOUNTS_ADDRESS
WHERE ACCOUNTS_PARTYNUMBER IN (SELECT TOP 101 PARTYNUMBER FROM ACCOUNTS)
- When you create custom objects, your Oracle Sales Cloud administrator must enable these objects for REST API access through Application Composer. Otherwise, you will not be able to query against these custom objects.
Oracle Service Cloud
- When you create a custom object, your Oracle Service Cloud administrator must enable all four columns of the Object Fields tab of the Object Designer, or you cannot query against the custom objects.
- The initial connection when the relational map is created can take some time. It is even possible to receive an error "504: Gateway Timeout". When this happens, Hybrid Data Pipeline continues to build the map in the background such that subsequent connection attempts are successful and have full access to the relational map.
Microsoft Dynamics CRM
- Testing has shown the following two errors from Microsoft Dynamics CRM Online when executing queries against the ImportData and TeamTemplate tables:
  - Attribute errortype on Entity ImportData is of type picklist but has Child Attributes Count 0
  - Attribute issystem on Entity TeamTemplate is of type bit but has Child Attributes Count 0
  Note: We have filed a case with Microsoft and are waiting to hear back about the cause of the issue.
- The initial on-premises connection when the relational map is created can take some time. It is even possible to receive an error "504: Gateway Timeout". When this happens, Hybrid Data Pipeline continues to build the map in the background such that subsequent connection attempts are successful and have full access to the relational map.
OpenEdge
- Setting the MaxPooledStatements data source option in an OpenEdge data store to a value other than zero can cause "statement not prepared" errors to be returned in some situations.
SugarCRM
- Data sources that use the deprecated enableExportMode option will continue to see the problem described below until they are migrated to the new data source configuration.
- Data source connections now use Export Mode by default to communicate with the SugarCRM server, providing increased performance when querying large sets of data. However, bulk export mode causes NULL values in currency columns to be returned as the value 0, so there is no way to differentiate between a NULL value and 0 when operating in export mode. This can be a problem when currency columns are used in SQL statements, because Hybrid Data Pipeline must evaluate some filter conditions on queries itself, such as the operations =, <>, >, >=, <, <=, IS NULL, and IS NOT NULL. For example, suppose a currency column in a SugarCRM table has 3 NULL values and 5 values that are 0. When a query is executed to return all NULL values (SELECT * FROM <table> WHERE <currency column> IS NULL), 3 rows are returned. However, when a query is executed in which the column appears in an arithmetic operation (SELECT * FROM <table> WHERE <currency column> + 1 = 1), all 8 rows are returned, because the 3 NULL values are seen as 0.
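This behavior can be illustrated with a small simulation (illustrative Python only, not actual Hybrid Data Pipeline or driver code; the row values are assumptions taken from the example above): an IS NULL filter can be satisfied against the real NULLs on the server side, while an arithmetic filter is evaluated against the exported values, where NULL has already been coerced to 0.

```python
# Illustrative simulation of the export-mode NULL-vs-0 ambiguity.
# A currency column with 3 NULL values and 5 values that are 0:
rows = [None, None, None, 0, 0, 0, 0, 0]

# IS NULL evaluated against the original data still sees real NULLs:
is_null_matches = [v for v in rows if v is None]
print(len(is_null_matches))  # 3

# Export mode coerces NULL -> 0, so an arithmetic filter evaluated on the
# exported values cannot tell the original NULLs apart from real zeros:
exported = [0 if v is None else v for v in rows]
arithmetic_matches = [v for v in exported if v + 1 == 1]
print(len(arithmetic_matches))  # 8
```

The mismatch (3 rows versus 8 rows for logically equivalent NULL tests) is exactly the discrepancy described in the note above.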
Hybrid Data Pipeline JDBC Driver
- Executing certain queries against MS Dynamics CRM may result in a “Communication failure. Protocol error."
- When using JNDI data sources, the encryptionMethod connection option must be configured through setExtendedOptions.
- The default value for the Service connection option does not connect to the Hybrid Data Pipeline server. Set Service=<my hybrid data pipeline server> in your connection URL to connect to your server successfully.
Hybrid Data Pipeline ODBC Driver
- The default odbc.ini generated by the installer is missing required entries for Service=, PortNumber=, and HybridDataPipelineDataSource=.
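A data source entry containing the missing keys might look like the following sketch (the section name, driver path, server name, port, and data source name are illustrative assumptions; only the three key names come from this note):

```ini
[MyHybridSource]
; Path to the Hybrid Data Pipeline ODBC driver library -- illustrative only
Driver=/opt/progress/datadirect/lib/ddhybrid.so
; Hybrid Data Pipeline server and port -- replace with your own values
Service=myserver.example.com
PortNumber=8080
; Name of the Hybrid Data Pipeline data source to connect to
HybridDataPipelineDataSource=MyDataSource
```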