Install and Run Hybrid Data Pipeline in Docker

Introduction

In this tutorial, we will walk you through how to quickly deploy Hybrid Data Pipeline using Docker.

Prerequisite

  1. Download and install Docker on your machine. Your machine can be running macOS, Windows, or Linux.
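
Once Docker is installed, you can confirm that the Docker CLI is available by checking its version from a terminal:

    docker --version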

Download Hybrid Data Pipeline Docker Image

  1. Go to our website and click Try Now – 90 Days Free Trial.
  2. After registration, you will find the download link for the Docker image.

  3. Download the Docker image.

Load and Run Hybrid Data Pipeline

  1. After the download completes, the first thing you need to do is load the image into the Docker image repository. Run the command below to do that:


    docker load -i hdp_eval_docker.tar.gz


  2. To check that your HDP image was loaded into the Docker image repository, run the following command:

    docker images
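
The output should include an entry for the loaded image. The repository and tag below match the image name used in the example later in this tutorial; the remaining columns are placeholders, since the image ID, age, and size vary by release:

    REPOSITORY          TAG       IMAGE ID        CREATED        SIZE
    hdp_eval_4.4.0      48        <image id>      <created>      <size>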



  3. Now, to run Hybrid Data Pipeline, execute the following command with these options:

    · Accept the EULA by passing the environment variable "ACCEPT_EULA" with its value set to "true".

    · Pass the Docker platform hostname in the HDP_HOSTNAME environment variable.

    · Provide the password to be set for the d2cadmin user in the HDP_ADMIN_PASSWORD variable.

    · Provide the password to be set for the d2cuser user in the HDP_USER_PASSWORD variable.

    · Provide the Docker host port numbers to map to the container ports (the HDP server will be accessible at <Docker platform hostname>:<port>).

    docker run -dt -p <Docker platform port 1>:8080 -p <Docker platform port 2>:8443 -e "ACCEPT_EULA=true" -e "HDP_HOSTNAME=<Docker platform hostname>" -e "HDP_ADMIN_PASSWORD=<d2cadmin user's password to be set>" -e "HDP_USER_PASSWORD=<d2cuser user's password to be set>" <HDP Server docker image name>:<TAG>


    Example:

    docker run -dt -p 8080:8080 -p 8443:8443 -e "ACCEPT_EULA=true" -e "HDP_HOSTNAME=localhost" -e "HDP_ADMIN_PASSWORD=d2cadmin" -e "HDP_USER_PASSWORD=d2cuser" hdp_eval_4.4.0:48

     

  4. When the HDP image is run for the first time, the HDP product is installed and the 90-day evaluation period is set.
  5. The HDP server will be accessible via the Docker platform hostname on the mapped port, for example: http://localhost:8080
  6. If you plan to install the HDP On-Premises Connector or the ODBC/JDBC clients, expose the additional ports shown in the command below:


    docker run -dt -p <Docker platform port 1>:8080 -p <Docker platform port 2>:8443 -p <Docker platform port 3>:11443 -p <Docker platform port 4>:40501 -p <Docker platform port 5>:11235 -p <Docker platform port 6>:11280 -e "ACCEPT_EULA=true" -e "HDP_HOSTNAME=<Docker platform hostname>" -e "HDP_ADMIN_PASSWORD=<d2cadmin user's password to be set>" -e "HDP_USER_PASSWORD=<d2cuser user's password to be set>" <docker image name>:<TAG>
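
As a concrete sketch, here is the earlier localhost example extended with identical host-side mappings for each of the additional ports; the image name and passwords are the same illustrative values used above:

    docker run -dt -p 8080:8080 -p 8443:8443 -p 11443:11443 -p 40501:40501 -p 11235:11235 -p 11280:11280 -e "ACCEPT_EULA=true" -e "HDP_HOSTNAME=localhost" -e "HDP_ADMIN_PASSWORD=d2cadmin" -e "HDP_USER_PASSWORD=d2cuser" hdp_eval_4.4.0:48

Whichever form you use, you can confirm that the container came up with docker ps; the ancestor filter below assumes the image name from that example:

    docker ps --filter "ancestor=hdp_eval_4.4.0:48"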

Stopping and Starting Hybrid Data Pipeline Containers

  1. To shut down the Hybrid Data Pipeline server, run the following command:

    docker stop <container-name>


  2. To start the Hybrid Data Pipeline server, run the following command:

    docker start <container-name>
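
If you did not name the container when you started it, Docker assigns a random name; you can look it up in the NAMES column of docker ps, or pass --name on the initial docker run so that it is predictable (hdp-server below is an arbitrary name chosen for illustration):

    docker ps --format "table {{.ID}}\t{{.Names}}\t{{.Status}}"
    docker stop hdp-server
    docker start hdp-server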


Access Hybrid Data Pipeline Server container shell

Run the following command to get access to the HDP Server Docker container's terminal.

docker exec -it <ContainerID> /bin/bash
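
If you do not have the container ID handy, docker ps prints it in the first column of its output. For example, with the illustrative ID that also appears in the copy example below:

    docker ps
    docker exec -it e75165b4257d /bin/bash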


Copying redist files from HDP Server container

Redist files are needed to install and use the ODBC and JDBC clients or the On-Premises Connector. These files are included in the server installation inside the container. To copy them to your machine, run the following command.

docker cp <container_id>:/opt/Progress/DataDirect/Hybrid_Data_Pipeline/Hybrid_Server/redist <destination_folder>


Example:

docker cp e75165b4257d:/opt/Progress/DataDirect/Hybrid_Data_Pipeline/Hybrid_Server/redist /test
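
Assuming the destination folder already exists as a directory, docker cp places the redist directory inside it, so you can sanity-check the copy by listing the result (paths as in the example above):

    ls /test/redist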


With Docker support for Hybrid Data Pipeline, we have made it easier for you to get started. Feel free to download the Docker image and try Hybrid Data Pipeline, and if you have any questions, please contact us.
