At Progress NEXT 2019, Yogesh spoke in his keynote about how Progress is accelerating digital innovation. During that presentation, we showed off a little demo of event-driven architecture in which a baseball company updates its inventory and pricing in Sitefinity based on production-rate data from IoT devices and on demand for its baseball products. We heard good things about the demo and received requests to share how we built it, so I want to do that in a series of articles.

One of the main parts of the demo is streaming changes from OpenEdge to Kafka, which can be done very easily using Change Data Capture (CDC). In this article, I will walk you through how to create this integration between Progress OpenEdge and Apache Kafka. Let's get started.

For the demo, we used the Sports2000 database, which ships with OpenEdge, and we will use the same database in this article. First, create a structure file that defines the storage areas CDC will use to track change data and its indexes:
## add_cdc.st

```
#
d "CDC_Track_Data":20,64;512 . f 102400
d "CDC_Track_Data":20,64;512 . f 102400
d "CDC_Track_Data":20,64;512 .
#
d "CDC_Track_Idx":21,1;64 . f 10240
d "CDC_Track_Idx":21,1;64 . f 10240
d "CDC_Track_Idx":21,1;64 .
```

Add the new areas to the database online:

```
prostrct addonline <databasename> cdc_sports2019.st
```
Next, enable CDC on the database, pointing it at the new data and index areas:

```
proutil <databasename> -C enablecdc area "CDC_Track_Data" indexarea "CDC_Track_Idx"
CDC feature has been successfully enabled. (18039)
```
Once CDC is tracking the Order table, the captured changes land in the PUB."CDC_Order" change table. Create a SQL view over it that exposes the CDC metadata columns alongside the order fields:

```sql
CREATE VIEW PUB.CDCOrderView AS
SELECT "_Tran-id"               AS CDCTransactionID,
       "_Time-Stamp"            AS CDCTimeStamp,
       "_Change-Sequence"       AS CDCChangeSequence,
       "_Continuation-Position" AS CDCContinuationPosition,
       "_ArrayIndex"            AS CDCArrayIndex,
       "_Fragment"              AS CDCFragment,
       "_Operation"             AS CDCOperation,
       "BillToID", "Carrier", "Creditcard", "CustNum", "Instructions",
       "OrderDate", "Ordernum", "OrderStatus", PO, "PromiseDate",
       "SalesRep", "ShipDate", "ShipToID", "Terms", "WarehouseNum"
FROM PUB."CDC_Order";
```

Then start the Confluent Platform services:

```
confluent start
```
Now configure a Kafka Connect JDBC source connector (cdcorder.properties) that polls the view in incrementing mode, using the CDC transaction ID as the incrementing column:

```
name=cdcorder
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
connection.url=jdbc:datadirect:openedge://hostname:port;databaseName=sports2019;User=<user>;Password=<password>
query=SELECT * FROM CDCORDERVIEW
mode=incrementing
incrementing.column.name=CDCTRANSACTIONID
topic.prefix=cdcorder
numeric.mapping=best_fit
```
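To make the connector's `mode=incrementing` behavior concrete, here is a minimal sketch of the polling logic: the connector remembers the largest value of the incrementing column it has seen and fetches only rows above it on each poll. This uses Python's built-in sqlite3 as a stand-in for the OpenEdge JDBC connection; the table and column names mirror the view above, but the data and helper function are purely illustrative.

```python
import sqlite3

def poll_new_rows(conn, last_seen):
    """Fetch rows added since the last poll, mimicking the JDBC source
    connector's incrementing mode on the CDCTRANSACTIONID column."""
    cur = conn.execute(
        "SELECT CDCTRANSACTIONID, CDCOPERATION, ORDERNUM "
        "FROM CDCORDERVIEW WHERE CDCTRANSACTIONID > ? "
        "ORDER BY CDCTRANSACTIONID", (last_seen,))
    rows = cur.fetchall()
    new_last = rows[-1][0] if rows else last_seen
    return rows, new_last

# Stand-in data: a sqlite table playing the role of the OpenEdge view.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE CDCORDERVIEW "
             "(CDCTRANSACTIONID INTEGER, CDCOPERATION INTEGER, ORDERNUM INTEGER)")
conn.executemany("INSERT INTO CDCORDERVIEW VALUES (?, ?, ?)",
                 [(101, 1, 5001), (102, 1, 5002)])

rows, offset = poll_new_rows(conn, last_seen=0)   # first poll sees both rows
conn.execute("INSERT INTO CDCORDERVIEW VALUES (103, 4, 5001)")
rows2, offset = poll_new_rows(conn, offset)       # second poll sees only the new row
print(len(rows), len(rows2), offset)              # → 2 1 103
```

Because the transaction ID only ever grows, each committed change is picked up exactly once and published to the `cdcorder` topic.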
Load the connector with its properties file:

```
confluent load cdcorder -d cdcorder.properties
```
Verify that the topic was created:

```
kafka-topics --list --zookeeper localhost:2181
```
You can watch the change events flowing into the topic with the Avro console consumer:

```
kafka-avro-console-consumer --bootstrap-server localhost:9092 --topic cdcorder --from-beginning --property schema.registry.url=http://localhost:8081
```

With your database changes now being streamed into Kafka topics, you can write a Kafka consumer that reads these events and acts on them: send emails or notifications to users, run real-time analytics, or use the CDC metadata in the events to replicate the data to a different system.
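As a sketch of such a consumer, the event-handling logic can be kept as a pure function with the Kafka wiring around it. This assumes the third-party kafka-python package and JSON-encoded values for simplicity (the connector above actually produces Avro, so a production consumer would deserialize via the schema registry instead); the routing rules and operation codes are purely illustrative, not the documented OpenEdge values.

```python
import json

def route_event(event):
    """Decide what to do with one CDC change event.
    The CDCOPERATION codes used here are illustrative placeholders."""
    op = event.get("CDCOPERATION")
    order = event.get("ORDERNUM")
    if op == 1:                       # sample "row created" code
        return f"notify: new order {order}"
    return f"replicate: order {order} changed"

def consume(bootstrap="localhost:9092", topic="cdcorder"):
    """Kafka wiring, assuming the third-party kafka-python package."""
    from kafka import KafkaConsumer
    consumer = KafkaConsumer(
        topic,
        bootstrap_servers=bootstrap,
        value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    )
    for message in consumer:
        print(route_event(message.value))

# Dry run of the routing logic on a sample event:
print(route_event({"CDCOPERATION": 1, "ORDERNUM": 5001}))
# → notify: new order 5001
```

Keeping `route_event` separate from the consumer loop makes the business logic testable without a running broker, and the same pattern extends naturally to sending notifications or feeding a downstream replication target.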