I'm currently looking into the implementation of an event processing capability using Pega, and I'm seeking some guidance to verify my thinking.
Fundamentally we need to create a process which:
1) integrates with a system to ingest a large volume of real-time events into Pega
2) depending on the type of event, invokes the correct services and aggregates this information
3) if the information surpasses a certain score (or is otherwise deemed important), creates a case to be processed by a user
My key questions are:
1) Can event ingestion be performed using Pega Decision Strategy Manager's event strategies?
2) Can event strategies invoke external integrations/services, especially when dealing with a large volume of events, or does it need to be a case at this point?
3) Can information be collected and bundled with the event as it progresses through the process, or does it need to be a case at this point?
The key concern is that the volume of events is significant; if every event became a case, performance issues could arise.
Would anyone be able to provide some insights around this?
Event processing is implemented as a Data Flow run which reads from a Stream data set, augments the incoming data, applies basic filtering or, using Event Strategies, computes aggregations and filters on the aggregated data, and then triggers an action (case, activity, or insert into a database).
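To make the pattern concrete, here is a minimal language-agnostic sketch of that pipeline shape: ingest events from a stream, aggregate a score per key, and trigger an action (case creation) only when the aggregate crosses a threshold. This is purely illustrative of the concept; it is not Pega API, and the names (`create_case`, `THRESHOLD`) are hypothetical stand-ins.

```python
from collections import defaultdict

THRESHOLD = 100  # hypothetical score threshold for case creation

def create_case(customer_id, score):
    """Stand-in for the 'trigger an action' step (e.g. creating a case)."""
    return {"customer": customer_id, "score": score}

def process_stream(events):
    totals = defaultdict(int)  # running aggregate per customer key
    cases = []
    for event in events:                         # 1) ingest from the stream
        totals[event["customer"]] += event["score"]  # 2) aggregate per key
        if totals[event["customer"]] >= THRESHOLD:   # 3) filter on the aggregate
            cases.append(create_case(event["customer"],
                                     totals[event["customer"]]))
            totals[event["customer"]] = 0        # reset after triggering
    return cases
```

The important design point this illustrates is that only the aggregated, filtered output becomes a case, so the case volume stays far below the raw event volume, which is exactly the performance concern raised in the question.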