Question
TEMA
US
Last activity: 18 Feb 2019 3:27 EST
Can Pega 7.3 publish messages to a Kafka topic?
We are planning to integrate Pega 7.3 with Kafka. We have successfully integrated with Kafka to read messages from a topic; however, I would like to know whether Pega 7.3 also supports publishing messages to Kafka.
Pegasystems Inc.
NL
Hi,
Yes. You can use a Kafka data set as both a source and a destination in a real-time data flow run, that is, both to read messages from and to publish messages to Kafka.
Please refer to the link below for more information:
https://community.pega.com/knowledgebase/articles/kafka-data-sets-decision-management
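For context, this is roughly what the publish side amounts to at the Kafka client level. The sketch below uses the plain Kafka Java client, not Pega code, and the broker address, topic name, and payload are made-up placeholders; in Pega you only configure the Kafka data set and the data flow run performs the equivalent work:

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class PublishSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Hypothetical broker address; use your cluster's bootstrap servers
        props.put("bootstrap.servers", "kafka-broker:9092");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // JSON payload standing in for the record the data set would serialize
            String payload = "{\"CustomerID\":\"C-1001\",\"Status\":\"Open\"}";
            producer.send(new ProducerRecord<>("pega-events", "C-1001", payload));
            producer.flush();
        }
    }
}
```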
Regards
Nithin
CIBC Imperial Bank of Commerce
CA
Hi Nithin,
Following up on my request to you on LinkedIn; I'm not sure if I'm posting my query in the right place.
Use case: we would like to stream data in real time to our big data environment using Kafka; we are currently using the BIX utility.
1> What would be the steps for configuration?
2> Would the data stream into a Kafka stream after creating a Kafka data set, thus bypassing the database? If so, what are the pros and cons?
3> What happens if you want to stream the data into the big data environment from the database, considering that Pega obfuscates the BLOB before handing the data to Oracle for writing?
4> Do we need a license to use Kafka? We are currently on Pega 7.2; does Kafka work with 7.2?
5> I am looking for configuration steps if there is a solution for our use case.
6> I am also looking for some good documentation; I am not looking to understand Kafka concepts on a Pega site.
Thanks,
Selwyn
Pegasystems Inc.
NL
Hi Selwyn,
What do you mean by "big data environment"? Is it a Hadoop data lake? Could you give some more information here?
As far as Kafka support is concerned, from Pega 7.3 onwards we have the Kafka data set, which can be used along with a real-time data flow to process real-time data.
Please refer to the link below for information on configuring the data set:
https://community.pega.com/knowledgebase/articles/kafka-data-sets-decision-management
Kafka data sets are available out of the box from Pega 7.3 onwards, and you will not need any additional license.
CIBC Imperial Bank of Commerce
CA
Hi Nithin,
We currently use the BIX utility to send data to our data lake in batch mode, since BIX is not real time. (This is not the optimal solution; GoldenGate could be used for real-time replication, but since the BLOB is obfuscated and only BIX can read it, we have limited choices.) We are looking to use Kafka for a similar use case:
1> Stream data out of the production database into the data lake (big data/Hadoop).
Can we use Kafka for this use case, assuming we move to 7.3, which is scheduled for next year or so?
If so, at what point in time do the Kafka data sets get created? Do they get created before the Pega engine hands control to Oracle to write the data to the database?
I haven't understood the context for creating Kafka data sets as explained in the article linked above; it does not explain any use case, but only gives the steps to create Kafka data sets.
Permit me to ask another question:
What if you want to stream data out of the database using Kafka? Is this possible, considering that by the time the data is in the database it is in the BLOB and obfuscated? In that scenario, how could we push data from the database into Kafka topics, and why does Pega obfuscate the data?
Thanks in advance; I appreciate your time and hope to get the answers I'm looking for.
Selwyn
Pegasystems Inc.
NL
Hi Selwyn,
In your case, since you want to move data from the database to a data lake/HDFS, I would recommend starting with the Database data set and the HDFS data set in a data flow.
For more information on data flows and data sets, please refer to the academy course below:
https://academy.pega.com/library/717/testing-strategies-using-customer-data
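To make the shape of that pipeline concrete, here is a rough sketch of what such a flow does conceptually: read rows from the database and write them out to HDFS. This uses plain JDBC and the Hadoop client rather than any Pega API, and the JDBC URL, credentials, schema, table, columns, and HDFS path are all hypothetical placeholders:

```java
import java.net.URI;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class DbToHdfsSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical JDBC URL, credentials, schema, and columns
        try (Connection conn = DriverManager.getConnection(
                 "jdbc:oracle:thin:@dbhost:1521/PEGA", "user", "pass");
             Statement stmt = conn.createStatement();
             // Only exposed (optimized) columns are readable via plain SQL
             ResultSet rs = stmt.executeQuery(
                 "SELECT pyID, pyStatusWork FROM PEGADATA.PC_WORK")) {

            // Hypothetical name node address and landing path
            FileSystem fs = FileSystem.get(
                URI.create("hdfs://namenode:8020"), new Configuration());
            try (FSDataOutputStream out = fs.create(new Path("/landing/pega/work.csv"))) {
                while (rs.next()) {
                    out.writeBytes(rs.getString("pyID") + ","
                        + rs.getString("pyStatusWork") + "\n");
                }
            }
        }
    }
}
```

Note that plain SQL like this only sees exposed columns; the BLOB column stays opaque outside Pega, which is the same limitation you described with GoldenGate. A Pega data flow reads through the engine, so it can emit the full record.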
Miguel Calderon
CIBC Imperial Bank of Commerce
CA
Hi Nithin,
Thanks for the reply, which I definitely appreciate; however, I do not have access to that training. I guess it's only available to the internal team or perhaps the product team.
Thanks
Selwyn
TEMA
US
Thanks, Nithin. I read the article before but missed the point about using it as a source or a destination.
I have a follow-up question on real-time data flows.
Does it mean we can use Kafka integration from Pega only in real-time processes?
Thanks,
Shan
Pegasystems Inc.
NL
You can publish messages to Kafka either by using the Kafka data set as a destination in a data flow, or by calling it via the DataSet-Execute method, with the Save operation, in an activity step.
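For reference only, a Save against a Kafka data set is, under the hood, an ordinary Kafka produce. Here is a minimal sketch of that operation with a delivery acknowledgement callback, using the plain Kafka Java client rather than any Pega API; the broker address and topic name are placeholders:

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class AckedPublishSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "kafka-broker:9092"); // placeholder broker
        props.put("acks", "all"); // wait for all in-sync replicas to acknowledge
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(
                new ProducerRecord<>("pega-events", "C-1001", "{\"Status\":\"Open\"}"),
                (metadata, exception) -> {
                    if (exception != null) {
                        exception.printStackTrace(); // delivery failed
                    } else {
                        System.out.printf("delivered to %s-%d@%d%n",
                            metadata.topic(), metadata.partition(), metadata.offset());
                    }
                });
            producer.flush();
        }
    }
}
```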
Tujian Peng
Rulesstack Solutions LTD
US
Did you try this, and with which version of Pega did it work? I tried implementing the same, but it would be helpful if you could share how you achieved this.
Tomfashion
HK
Hi Shan,
We have a requirement to integrate Pega 7.3.1 with Kafka. Would you mind sharing guidance on connectivity between Pega and Kafka? Many thanks.