I have configured the Kafka connection and created Kafka data sets for the topics in my broker. Everything looks fine so far. I am working on two scenarios.
Scenario 1, Pega as a publisher: it is working as expected. I am able to push data to a particular topic from Pega using a data flow. No issues so far.
Scenario 2, Pega as a subscriber: I created a real-time data flow run and initiated it. The data flow calls an activity at its destination, where I used a log step to check whether each message is received. When I send messages into the configured Kafka topic, they stream in and the log entries appear as expected.
Now I am sending JSON text into the topic. How do I read it in the data flow's destination activity? What is the page context of the input messages from the topic? I think I am missing something.
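For example, the JSON text I am sending into the topic looks roughly like this (the field names here are just an illustration, not my actual payload):

```json
{
  "CustomerID": "C-1001",
  "Status": "Active"
}
```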
I have a similar use case for reading messages from a topic. If I run the data set manually, the results come back fine. But if I run it through a data flow, no records are processed (0 records processed) and there are no logs. I am not sure why the data flow is not picking up the records. Could someone help with this?
Posted: 5 Sep 2020 22:15 EDT
Abdul Shaik (AbdulShaik)
I am facing the same problem you described. Have you found a solution? I can create a message in the Kafka topic, but I do not know which property/page to map it to. I have the same problem when retrieving messages from Kafka.
Posted: 7 Sep 2020 3:57 EDT
Oguz Cavli (cavlo)
Principal Decisioning Architect
Do you see the incoming record count on the Real-Time processing landing page?
If your real-time data flow runs in, for example, the MyCRM-Data-Response class, then keep your destination activity in the same class as well. This is the class that represents the properties of the incoming JSON payload. Since the activity runs in the same class, you should be able to access the required properties right away.
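To make that concrete, here is a rough sketch (the class name MyCRM-Data-Response and the property names are assumptions for illustration, not your actual model): if a message arrives on the topic as {"CustomerID": "C-1001", "Status": "Active"} and the data flow runs in MyCRM-Data-Response, then the destination activity's primary page is a MyCRM-Data-Response page with those keys already mapped to properties. The activity steps could then look something like this pseudocode of standard activity methods:

```
// Step 1: copy a mapped property into a parameter (Property-Set method)
Property-Set   Param.Status = .Status

// Step 2: write the received value to the log (Log-Message method)
Log-Message    "Received customer: " + .CustomerID
```

Note that the properties (.CustomerID, .Status in this sketch) typically need to be defined on the class with names matching the top-level JSON keys so the payload can map onto them.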