Question
Cognizant Technology Solutions
IN
Last activity: 20 Apr 2023 6:32 EDT
Upload large docs to another system from Pega portal
Hi Pega practitioners/gurus,
Please refer to the following scenario and let me know your thoughts on the best design for it.
- The client uses a customer service application (CPM framework) in which, from within an Intent task, the CSR is expected to upload multiple large documents (each up to 10 MB).
- Although the functionality requires the Pega upload look and feel, the uploaded document must be sent to another system backed by Amazon S3 (through REST services, using binary file attachments).
- There is no limit to the number of documents the CSR can upload on a case.
- As soon as an upload completes, the Intent task should refresh to show the link to the uploaded document (a pre-signed URL) in the embedded "Documents" section.
- There can be hundreds of CSRs uploading these documents concurrently during peak hours in production.
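For context on the pre-signed URL requirement: such links generally embed an expiry time and a signature in the query string, so the browser can download the document without ever seeing the storage credentials. Below is a minimal, stdlib-only Python sketch of that general mechanism; it is an illustration only, not the actual AWS SigV4 presigning that S3 performs, and the secret, URL, and function names are made up for the example.

```python
import hmac
import hashlib
import time
from urllib.parse import urlencode

SECRET = b"demo-secret"  # stand-in for a real signing key held by the server


def presign(base_url, key, ttl_seconds, now=None):
    """Build an expiring, tamper-evident download URL (illustrative only)."""
    expires = (now if now is not None else int(time.time())) + ttl_seconds
    payload = f"{key}:{expires}".encode()
    sig = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return f"{base_url}/{key}?" + urlencode({"expires": expires, "sig": sig})


def verify(key, expires, sig, now=None):
    """Reject links that are expired or whose signature does not match."""
    current = now if now is not None else int(time.time())
    if current > expires:
        return False
    expected = hmac.new(SECRET, f"{key}:{expires}".encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, sig)


url = presign("https://files.example.com", "case-123/doc.pdf", ttl_seconds=900)
```

Since the signature covers both the object key and the expiry, neither can be altered without invalidating the link.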
Considering the size of the attachments, the approach we have taken is asynchronous service calls using queue processors to send the attachments to the non-Pega system, with reconciliation back to the Pega case via a notification channel. The high-level details are described below:
- We'll use Pega's OOTB upload icon as the UI, which opens a local action in a modal window to upload or drag and drop files.
- On submit of this Attach-File local action, we'll call a queue processor (running on the background-processing nodes) that handles the upload service calls.
- After the call to the queue processor, the attachment stream (pyAttachmentPage.pyAttachStream) will be removed from the clipboard so that the user's current session (the JVMs of the web user nodes) won't have to hold the high volume of attachments in memory.
- Once the queue processor has processed the attachments through REST service calls, we'll publish the response received to a notification channel.
- The Intent task will be subscribed to the same channel, so when the notification is published, the response will be used to update the live case UI with the latest links.
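The steps above amount to a queue-plus-notification pipeline: enqueue the work, process it on background workers, and push a completion event back for the UI. A small stdlib-only Python sketch of that pattern outside of Pega follows; `upload_fn` and `notify_fn` are hypothetical stand-ins for the REST connector call and the notification-channel publish, and the stub implementations are made up for the example.

```python
import queue
import threading


def run_upload_pipeline(documents, upload_fn, notify_fn, workers=4):
    """Queue documents, upload them on background workers, notify on completion."""
    work = queue.Queue()
    for doc in documents:
        work.put(doc)

    def worker():
        while True:
            try:
                doc = work.get_nowait()
            except queue.Empty:
                return  # queue drained, worker exits
            try:
                link = upload_fn(doc)   # e.g. REST call returning a pre-signed URL
                notify_fn(doc, link)    # e.g. publish to the notification channel
            finally:
                work.task_done()

    threads = [threading.Thread(target=worker, daemon=True) for _ in range(workers)]
    for t in threads:
        t.start()
    work.join()  # block until every document has been processed


# Illustrative usage with stub upload/notify implementations
case_links = {}
lock = threading.Lock()


def fake_upload(doc):
    return f"https://files.example.com/{doc}?sig=stub"


def record_link(doc, link):
    with lock:
        case_links[doc] = link  # stands in for refreshing the case UI


run_upload_pipeline(["a.pdf", "b.pdf", "c.pdf"], fake_upload, record_link)
```

The worker count caps concurrent uploads, which matters once hundreds of CSRs are attaching files at the same time; in Pega terms the equivalent knob is the queue processor's thread/node configuration.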
Please share any concerns you see, or your thoughts on how this requirement could be implemented in a better way.
Thanks,
Arka